Spatial Imprint
SPATIAL IMPRINT PROJECT
Spatial Imprint explores the intersection of light, form, and perception through an active shadow projection device. By manipulating the position, intensity, and radius of LED point lights, the project casts complex patterns and evolving geometries onto surrounding surfaces. A custom-designed 3D simulation system computes how variations in distance, object shape, and light radius affect shadow sharpness and behavior. Through a series of calibrated optical experiments, Spatial Imprint reveals how ephemeral shadows can become spatial tools—transforming intangible light into a medium for architectural expression, responsive environments, and performative installations that imprint space with dynamic, interactive light geometries.
SHADOW SHARPNESS
The shadow sharpness experiments were conducted using a custom 3D simulation tool. These tests focused on three key variables: the radius of the point light, the distance between the light source and the shadow-casting object, and the distance from the object to the projection surface. The findings revealed that the smaller the LED radius, the sharper the shadow edges—mimicking the behavior of an ideal point light. Conversely, larger radii produced penumbras, creating blurrier, more diffused shadows. Additionally, increasing the distance between the light source and the shadow caster enhanced the definition and scale of the shadow pattern, but also reduced brightness intensity. Lastly, the proximity of the shadow-casting object to the projection surface proved crucial: closer objects yielded sharper, smaller shadows, while increased distance resulted in larger, softer silhouettes. These calibrated variables allowed the design of a system capable of fine-tuning shadow resolution to match spatial and narrative requirements.
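As a rough illustration of the geometry behind these findings, the penumbra width and silhouette scale can be approximated with similar triangles. The sketch below is not the project's simulation code; the function names and example measurements are purely illustrative.

```python
def penumbra_width(light_radius, light_to_object, object_to_surface):
    """Approximate penumbra width (same units as inputs) via similar triangles.
    A disc light of radius R seen past the occluder edge spreads the shadow edge over
        width ~= 2R * (object_to_surface / light_to_object)
    so a smaller LED radius or a greater light-to-object distance gives sharper edges.
    """
    return 2.0 * light_radius * object_to_surface / light_to_object

def shadow_scale(light_to_object, object_to_surface):
    """Magnification of the silhouette on the projection surface."""
    return (light_to_object + object_to_surface) / light_to_object

# Illustrative numbers: a 2 mm LED, 150 mm from the pattern, wall 1.5 m behind it
print(penumbra_width(1.0, 150.0, 1500.0))   # ~20 mm of soft edge
print(shadow_scale(150.0, 1500.0))          # silhouette enlarged ~11x
```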
LIGHT OVERLAPPING
The light-overlapping principle is explored through the use of high-intensity CREE LEDs combined with magnifier lenses to create a focused and sharply defined projection. The diagram on the left shows a layered configuration where multiple light beams intersect within a symmetrical structure, producing intricate shadow patterns on the surrounding surfaces. Each lighting setup experiment demonstrates how shifting the orientation and number of light sources affects the density and direction of shadows, creating zones of overlapping intensity.
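A minimal way to visualize these overlapping zones is to sum the contribution of each LED across a sampled wall plane. The snippet below is an assumption-laden sketch (placeholder LED positions, a flat wall at z = 0, simple inverse-square falloff), not the project's own tooling.

```python
import numpy as np

# Hypothetical LED positions (m) on a symmetric frame, all facing the z = 0 wall plane
led_positions = np.array([[0.3, 0.0, 1.0], [-0.3, 0.0, 1.0], [0.0, 0.3, 1.0]])

# Sample a grid of points on the wall
xs, ys = np.meshgrid(np.linspace(-1, 1, 200), np.linspace(-1, 1, 200))
wall = np.stack([xs, ys, np.zeros_like(xs)], axis=-1)

# Additive inverse-square falloff: areas reached by several LEDs appear brighter,
# while regions occluded from only some LEDs produce partial, layered shadows.
intensity = np.zeros(xs.shape)
for led in led_positions:
    d2 = np.sum((wall - led) ** 2, axis=-1)
    intensity += 1.0 / d2
```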
SHADOW PROJECTION DEVICE
A custom-designed apparatus engineered to cast intricate shadow patterns by projecting light through interchangeable pattern modules. The core structure features six high-intensity CREE LEDs arranged symmetrically on a 3D-printed frame, each directed through a magnifier lens and a patterned filter. To optimize the design, custom 3D simulation software was developed to compute and visualize how light interacts with the geometry of the shadow maker. This tool enabled precise calculation of projection behavior, allowing real-time adjustments to the orientation, scale, and curvature of components. The simulation also revealed how the LED beam radius directly affects the sharpness of the projected shadows — a narrower beam creates sharper, more defined edges, while a wider beam softens the patterns. This dynamic control is essential for tailoring the visual outcome to specific spatial contexts or narrative intentions.
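The relationship between beam radius and edge sharpness can also be framed as area-light sampling: treat the LED as a disc, sample points across it, and measure what fraction of the samples reach a given surface point. The sketch below assumes a hypothetical `occluder_hit` ray-cast callback and is not taken from the project's simulation software.

```python
import numpy as np

def soft_shadow_fraction(surface_point, led_center, led_radius, occluder_hit, samples=64):
    """Estimate how lit a surface point is by sampling the LED as a disc.

    occluder_hit(origin, target) should return True when the segment from the
    sampled LED point to the surface point is blocked by the shadow maker
    (in a real tool this would be a ray cast against the pattern geometry).
    A small led_radius collapses the samples toward a point light, giving hard
    0/1 shadows; a larger radius yields intermediate fractions — the penumbra.
    """
    rng = np.random.default_rng(0)
    lit = 0
    for _ in range(samples):
        # Uniform random point on the LED disc (assumed to lie in the local XY plane)
        r = led_radius * np.sqrt(rng.random())
        theta = 2 * np.pi * rng.random()
        sample = led_center + np.array([r * np.cos(theta), r * np.sin(theta), 0.0])
        if not occluder_hit(sample, surface_point):
            lit += 1
    return lit / samples
```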
DIGITAL TWIN - 3D PHOTOGRAMMETRY
Using photogrammetry and accurate room measurements, a digital twin of the physical environment is created. This includes both a highly detailed 3D scan and a simplified, dimensionally accurate model for performance and control. The digital twin becomes the canvas for projection and lighting simulation. By aligning digital and physical geometries, this foundational step allows for seamless virtual-physical integration, essential for projection mapping, lighting analysis, and dynamic spatial experiments. The twin also serves as a calibration reference and simulation environment inside Blender, ensuring all subsequent layers of the system are grounded in real-world scale and spatial relationships.
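Inside Blender, the scale calibration step might look like the following sketch, which uniformly rescales the imported scan so that one measured room dimension matches reality. The object name and the measured width are illustrative assumptions, not values from the project.

```python
import bpy

# Hypothetical names: the photogrammetry scan is assumed to be imported as "RoomScan",
# and the real room width measured on site at 4.25 m.
scan = bpy.data.objects["RoomScan"]
measured_room_width = 4.25               # metres, from tape or laser measurement
scanned_room_width = scan.dimensions.x   # the same wall span in the unscaled scan

# Uniformly rescale so one known dimension matches reality; the rest of the scan
# then inherits real-world scale for projection and lighting simulation.
factor = measured_room_width / scanned_room_width
scan.scale = tuple(s * factor for s in scan.scale)

# Snap the scan's origin to the world origin used by the simplified control model
scan.location = (0.0, 0.0, 0.0)
```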
REALTIME OBJECT TRACKING
Using HTC Vive trackers and base stations, real-time object tracking is achieved through Blender integration via OpenVR and Python OSC. The tracked object (in this case, a stool or the user’s hand) updates its position and orientation live inside Blender, allowing digital elements to follow or react to movement in physical space. This enables dynamic projection mapping that can follow a person or object as it moves, allowing for interaction-driven installations. The real-time tracking system bridges motion between the physical and virtual worlds, offering a responsive, reactive framework for immersive environments.
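A minimal receiving-side sketch, assuming the tracker pose arrives as an OSC message (the OpenVR sender runs as a separate script and is not shown here): a python-osc server collects poses on a background thread and a Blender timer applies the newest one to a proxy object. The OSC address, port, and object name are placeholders.

```python
import threading
import bpy
from pythonosc.dispatcher import Dispatcher
from pythonosc import osc_server

TRACKED_OBJECT = "Stool_Proxy"   # hypothetical proxy object inside the Blender scene
latest_pose = None               # written by the OSC thread, read on Blender's main thread

def on_pose(address, x, y, z, rx, ry, rz):
    """Store the most recent tracker pose received over OSC."""
    global latest_pose
    latest_pose = (x, y, z, rx, ry, rz)

def apply_pose():
    """Timer callback on the main thread: push the newest pose onto the proxy object."""
    if latest_pose is not None:
        obj = bpy.data.objects[TRACKED_OBJECT]
        obj.location = latest_pose[:3]
        obj.rotation_euler = latest_pose[3:]
    return 1 / 60  # re-run roughly 60 times per second

dispatcher = Dispatcher()
dispatcher.map("/tracker/1/pose", on_pose)

# Serve OSC on a background thread so Blender's UI stays responsive
server = osc_server.ThreadingOSCUDPServer(("127.0.0.1", 9000), dispatcher)
threading.Thread(target=server.serve_forever, daemon=True).start()
bpy.app.timers.register(apply_pose)
```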
INTERACTIVE DIMENSIONAL APPLICATION
This stage explores three projection mapping methods for spatial engagement. The first uses static projectors with tracked objects, allowing dynamic visuals to follow moving elements. The second attaches a tracker to a moving projector, enabling real-time adaptive projections onto static scenes. The third and most advanced method tracks both projector and objects using multiple fixed projectors, allowing full spatial synchronization where light, object, and architecture interact seamlessly. These setups vary in complexity and interactivity, offering escalating levels of immersion—from reactive surfaces to environments that respond holistically to movement and proximity.
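For the second method, the core step is composing the live tracker pose with a fixed, once-calibrated offset between the tracker mount and the projector lens, so the virtual camera that renders the projection content follows the physical projector. A minimal sketch of that composition, with placeholder matrices:

```python
import numpy as np

# Hypothetical 4x4 transforms: the tracker's world pose (streamed live each frame)
# and a rigid tracker-to-projector offset measured once during calibration.
tracker_world = np.eye(4)          # replace with the live tracked pose
tracker_to_projector = np.eye(4)   # fixed offset from mounting the tracker on the projector

# World pose of the projector, used as the virtual camera for rendering the mapped content
projector_world = tracker_world @ tracker_to_projector
```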
SPATIAL IMPRINT
This interactive installation explores how real-time human movement can reshape architectural space. Using a volumetric grid as both structure and interface, the environment responds to the viewer’s motion—tracked via Vive Tracker—by dynamically distorting its virtual geometry. This redefines architecture as participatory, where the body imprints itself onto space, reversing conventional relationships. Projection mapping is precisely calibrated using Blender, aligning digital visuals with the physical room. Despite minor system latency, real-time responsiveness is maintained through optimized smoothing. Set in a dim, immersive environment, the piece blends architectural principles with technology, proposing space as a living blueprint—fluid, reactive, and shaped by human agency.
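One plausible reading of the smoothing and distortion described above is an exponential moving average on the tracker position followed by a radial push on nearby grid vertices. The parameters and falloff in this sketch are illustrative assumptions, not values from the installation.

```python
import numpy as np

ALPHA = 0.2      # smoothing factor: lower = smoother but laggier response
RADIUS = 1.5     # metres of influence around the tracked body
STRENGTH = 0.4   # maximum displacement applied to a grid vertex

smoothed = np.zeros(3)

def update(grid_points, rest_points, raw_tracker_pos):
    """Smooth the raw tracker reading, then push nearby grid vertices away from it."""
    global smoothed
    # Exponential moving average keeps the motion responsive while damping jitter
    smoothed = ALPHA * np.asarray(raw_tracker_pos) + (1 - ALPHA) * smoothed

    offsets = rest_points - smoothed                   # vector from body to each vertex
    dist = np.linalg.norm(offsets, axis=1, keepdims=True)
    falloff = np.clip(1.0 - dist / RADIUS, 0.0, 1.0)   # 1 near the body, 0 beyond RADIUS
    grid_points[:] = rest_points + STRENGTH * falloff * offsets / np.maximum(dist, 1e-6)
    return grid_points
```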