Lumetric Inscription

Category:

Art & Technology

Service:

Research & Design

Year:

2025

This project presents a mobile, web-based computational tool operating within the framework of Autographic Design. It recontextualizes the smartphone as an active probe for Spatial Interrogation. Distinct from traditional linear scanning or LiDAR imaging, the tool uses gyroscopic data to map temporality onto a 360-degree field of spatiality.


The software functions through human-machine vision co-agency, a hybrid mode of perception in which algorithmic interpretation is coupled with somatic input. Through real-time camera sensor fusion, the system executes a Lumetric Extrusion: translating RGBI (color and intensity) input from the camera into spatial coordinates, directed by the displacement and rotational data of the user’s hand.
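The core of this extrusion can be sketched as a mapping from device orientation to a direction on the sphere around the user. The sketch below assumes Euler angles like those reported by the browser's DeviceOrientationEvent (alpha as yaw, beta as pitch, in degrees); the function name and axis conventions are illustrative, not the project's actual API.

```javascript
// Sketch of the gyroscopic half of the "Lumetric Extrusion", assuming
// alpha = yaw (rotation around the vertical axis) and beta = pitch (tilt),
// both in degrees, as in DeviceOrientationEvent. Returns a unit direction
// vector pointing into the 360-degree field around the user.
function orientationToDirection(alphaDeg, betaDeg) {
  const yaw = (alphaDeg * Math.PI) / 180;
  const pitch = (betaDeg * Math.PI) / 180;
  return {
    x: Math.cos(pitch) * Math.sin(yaw),
    y: Math.sin(pitch),
    z: Math.cos(pitch) * Math.cos(yaw),
  };
}
```

Each camera sample can then be extruded along this direction, so the hand's rotation, not a fixed scan path, decides where light lands in space.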


This process inscribes temporality into specific spatial coordinates relative to the user. The user’s scanning behavior transcends simple video recording, constructing instead an encoded panoramic model. The resulting continuous geometry (point cloud or mesh) is a product of linear time acting upon non-linear space, digitally solidifying the human body’s interventional gesture in three-dimensional space and its relationship to the environment.
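The inscription step can be sketched as accumulating timestamped, colored points into a growing cloud: each sample carries its RGBI values and the moment it was captured, so the geometry literally encodes linear time. The data shape below is an assumption for illustration, not the project's actual format.

```javascript
// Sketch: inscribing one temporal camera sample into spatial coordinates.
// `dir` is a unit direction vector (e.g. from the gyroscope), `rgbi` holds
// the sampled color and intensity, and `timestampMs` preserves linear time
// inside the non-linear spatial record. Radius is the extrusion distance.
function inscribeSample(cloud, dir, rgbi, timestampMs, radius = 1.0) {
  cloud.push({
    x: dir.x * radius,
    y: dir.y * radius,
    z: dir.z * radius,
    r: rgbi.r, g: rgbi.g, b: rgbi.b, i: rgbi.i,
    t: timestampMs,
  });
  return cloud;
}
```

Replaying the points in `t` order recovers the gesture; rendering them all at once yields the panoramic model.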

Point Cloud Mode

Mesh Mode

I propose a framework of synchronous spatiality, as illustrated in the digital visualization, where interaction unfolds in real time rather than retrospectively. The conceptualized telematic system aggregates dispersed ‘traces’ from global participants into a unified, interactive environment. This establishes a performative archive, inscribing individual records into a living, encoded spectacle. The proposal aims to bridge the ontological gap between the static artifact and the fluid nature of human temporality.
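The proposed aggregation could be sketched as merging per-participant point clouds into one shared, time-ordered scene. Participant IDs, the trace format, and the interleaving-by-timestamp choice are all illustrative assumptions about the conceptual system, not an implemented protocol.

```javascript
// Sketch of the proposed telematic aggregation: merge each participant's
// trace (a list of timestamped points) into one shared scene, tagging
// every point with its origin. Sorting by timestamp interleaves the
// participants' gestures in a single performative archive.
function aggregateTraces(traces) {
  const scene = [];
  for (const [participantId, points] of Object.entries(traces)) {
    for (const p of points) scene.push({ ...p, participantId });
  }
  scene.sort((a, b) => a.t - b.t);
  return scene;
}
```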
