THE LATENT BOX
… a tactile interface for navigating latent space
Latent Box is an interactive instrument that translates touch, pressure, and gesture into an evolving ambient soundscape generated by a custom RAVE neural audio model.
About the Project
The work responds to Alyson Denny's Photon Motion (2022), a series of looping silent videos created by recording refracted light through motorized glass. The visuals are meditative and constantly shifting — abstract enough to read as alternative graphic scores, yet too fluid to pin down. Latent Box takes that ambiguity as an invitation: rather than simply watching, participants step into a feedback loop where their physical gestures shape the sonic texture of the piece, becoming collaborators in its ongoing abstraction.
The interface is intentionally minimal. A capacitive touch grid maps gesture across an X/Y plane. A pressure-sensitive Velostat strip enables expressive modulation. A rotary encoder navigates between sonic palettes. Together, these inputs control a RAVE generative model trained to produce ambient textures that mirror the flowing, refracted quality of the source visuals. There is no visual display of interaction — feedback is entirely sonic and immersive.
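The three inputs reduce to three control signals: an X/Y position, a modulation depth, and a palette index. A minimal sketch of that normalization, in Python — all ranges, grid dimensions, and function names here are illustrative assumptions, not the actual patch:

```python
# Hypothetical normalization of the three physical inputs into control
# signals for the generative model. All value ranges are assumptions.

def normalize_touch(col, row, cols=12, rows=12):
    """Map a touch-grid cell to X/Y coordinates in [0, 1]."""
    return col / (cols - 1), row / (rows - 1)

def normalize_pressure(raw, raw_min=80, raw_max=950):
    """Clamp and scale a raw Velostat ADC reading to [0, 1]."""
    span = raw_max - raw_min
    return min(max((raw - raw_min) / span, 0.0), 1.0)

def palette_index(encoder_steps, n_palettes=4):
    """Wrap an unbounded encoder count onto a fixed set of palettes."""
    return encoder_steps % n_palettes

x, y = normalize_touch(6, 3)     # touch at column 6, row 3
mod = normalize_pressure(512)    # mid-range pressure
palette = palette_index(-3)      # encoder turned back three detents
```

Clamping the pressure reading matters in practice: Velostat's resistance drifts with temperature and wear, so a fixed calibrated range with saturation at both ends keeps the modulation signal stable.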
The goal is not precision or polish. Like Denny's visuals, the system embraces emergence, entropy, and blur. Sound doesn't explain the image — it adds a sensory dimension that deepens interpretation through embodiment. The audience becomes part of the system, another layer of abstraction re-processing each looping frame.
How It Works
An Arduino Mega reads two MPR121 capacitive touch sensors (12 electrodes each, scanned as a 12 × 12 matrix for up to 144 touch points), Velostat pressure strips, and I2C rotary encoders — all inlaid beneath laser-cut acrylic panels. Sensor data routes via serial to a Max patch, which maps gestures to parameters inside a RAVE model running in a Max for Live device within Ableton. The full signal chain — touch to timbre — runs in real time with no pre-composed audio.
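On the Max side, the serial stream has to be split back into discrete sensor events before it can drive model parameters. The sketch below shows one way the host-side parsing could look in Python; the line format, message tags, and port path are assumptions for illustration, not the project's actual protocol:

```python
# Hypothetical parser for the Arduino -> host serial link.
# Assumed line format (one reading per line, space-separated):
#   "T <electrode> <state>"  -- capacitive touch on/off
#   "P <strip> <raw>"        -- Velostat pressure reading
#   "E <delta>"              -- rotary encoder step

def parse_line(line):
    """Turn one serial line into a tagged event tuple."""
    parts = line.strip().split()
    kind = parts[0]
    if kind == "T":
        return ("touch", int(parts[1]), parts[2] == "1")
    if kind == "P":
        return ("pressure", int(parts[1]), int(parts[2]))
    if kind == "E":
        return ("encoder", int(parts[1]))
    raise ValueError(f"unknown message: {line!r}")

# Reading from the board would look like this (requires pyserial;
# the port path is an example):
#   import serial
#   with serial.Serial("/dev/ttyACM0", 115200) as port:
#       for raw in port:
#           event = parse_line(raw.decode())
```

A plain line-per-reading text protocol like this is easy to inspect with a serial monitor while debugging; a real-time installation might instead batch readings or use a binary framing to cut latency, at the cost of readability.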
Collaborators
Kyle Smith — Hardware design, local infrastructure, sensor integration, spatialization
Ishaan Jagyasi — RAVE model, TouchDesigner integration, generative audio