this project uses galvanic skin response (gsr) to translate subtle physiological changes into the movement of a candle. i built the system using arduino and touchdesigner, mapping live sensor data to control the flame's shape and flicker in real time. gsr is the same technology used in lie detectors; it doesn't identify which exact emotion you're feeling, but rather how intense an emotion is.
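the mapping itself is simple in principle: smooth the raw gsr reading, normalize it, and let the result drive the flame. here's a minimal python sketch of that idea (the real version lives in the arduino + touchdesigner pipeline; the 0–1023 sensor range, the smoothing constant, and names like `flame_params` are my assumptions, not the actual project code):

```python
# sketch of the gsr -> flame mapping idea, not the actual touchdesigner
# network. assumed: raw readings arrive as 0-1023 from an arduino analog
# pin, smoothed with an exponential moving average to tame jitter.

def smooth(prev, raw, alpha=0.1):
    """exponential moving average: blend a new reading into the state."""
    return prev + alpha * (raw - prev)

def normalize(value, lo=0, hi=1023):
    """clamp and scale a raw reading into the 0..1 range."""
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

def flame_params(level):
    """map normalized intensity (0..1) to hypothetical flame controls."""
    return {
        "height": 0.3 + 0.7 * level,    # calmer body -> shorter flame
        "flicker": 0.05 + 0.5 * level,  # higher intensity -> more flicker
    }

# simulate a few incoming readings
state = 512.0
for raw in [510, 530, 600, 700, 650]:
    state = smooth(state, raw)
params = flame_params(normalize(state))
```

the smoothing step matters more than it looks: raw gsr is noisy, and without it the flame twitches instead of breathing with the viewer.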
the goal was to create a quiet moment of feedback between the body and the environment. when i was testing it out, a lot of people shared that they felt more present while using it, simply by watching their breath and attention influence the candle. this ended up being one of the most meaningful projects i've worked on, and i'm really grateful for the chance to explore it. i have so much more in the works, and i'm excited to keep developing it into a larger, more user-friendly installation and to share more from the process soon. just thought i'd share what i have so far!
projected particle animation on 3d-scanned mannequin for xr exhibition on physical-digital interaction
//
this piece was created in touchdesigner for an xr exhibition focused on the intersection of physical and digital space. i 3d-scanned a mannequin, built a particle system based on its geometry, and animated it using SOP and CHOP networks. the final render was projected back onto the original mannequin to simulate real-time scanning. although pre-rendered, the projection creates a loop that explores how xr can alter our sense of presence and embodiment. (excuse the light leaks, i wasn't allowed to set up the projector, boo)
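the particle system boils down to seeding one particle per point of the scanned geometry and displacing each with a little noise every frame. a rough python sketch of that idea (the actual version is built from SOP and CHOP operators in touchdesigner; the placeholder `vertices`, the jitter amount, and the helper name are mine):

```python
import math
import random

# stand-in for points sampled from the 3d scan of the mannequin;
# the real geometry comes out of the SOP network, not this helix.
vertices = [(math.cos(t), math.sin(t), t / 10) for t in range(100)]

def emit_particles(verts, jitter=0.02, rng=None):
    """spawn one particle per vertex, displaced by a small random offset.

    re-running this each frame with fresh noise gives the shimmering
    'live scan' look without ever moving the underlying geometry.
    """
    rng = rng or random.Random(0)
    particles = []
    for x, y, z in verts:
        particles.append((
            x + rng.uniform(-jitter, jitter),
            y + rng.uniform(-jitter, jitter),
            z + rng.uniform(-jitter, jitter),
        ))
    return particles

particles = emit_particles(vertices)
```

in touchdesigner the same effect comes from feeding noise CHOPs into the particle SOP's point positions, which is what keeps the projected body looking like it's being scanned in real time.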