Interactive video and sound performance
In collaboration with Jessica Kee, Sachiko Murakami, and Adam Owen.
bodyForTheTrees uses wearable sensors to convert a performer's movements into video animations and generative sound. The system consists of a flex sensor on one of the performer's knees, two distance sensors in the palms of the hands, and an orientation sensor on top of the head. The orientation data is also translated into colour variations in the RGB LEDs worn on the backs of the hands and in the background of the projected video scene.
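One way the orientation-to-colour translation could work is to treat the head's yaw angle as a position on the colour wheel. The sketch below is a minimal illustration in Java (the language underlying Processing, which the project uses); the function name and the specific HSB mapping are assumptions for illustration, not the project's actual code.

```java
import java.awt.Color;

public class OrientationColour {
    /**
     * Hypothetical mapping: yaw angle in degrees from the head-worn
     * orientation sensor becomes a hue on the HSB colour wheel, returned
     * as an RGB triple that could drive LEDs or a projected background.
     */
    static int[] yawToRgb(float yawDegrees) {
        // Normalise any angle (including negatives) into [0, 360), then to [0, 1)
        float hue = (((yawDegrees % 360f) + 360f) % 360f) / 360f;
        // Full saturation and brightness: only the hue varies with orientation
        Color c = Color.getHSBColor(hue, 1.0f, 1.0f);
        return new int[] { c.getRed(), c.getGreen(), c.getBlue() };
    }

    public static void main(String[] args) {
        int[] rgb = yawToRgb(120f); // 120° sits at green on the colour wheel
        System.out.println(rgb[0] + "," + rgb[1] + "," + rgb[2]);
    }
}
```

In a live system, this mapping would be applied continuously to the sensor stream, so turning the head sweeps the LEDs and the projection smoothly through the spectrum.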
- Technical: Arduino, XBee radios, wearable sensors, three-channel video projection, stereo audio, Processing with Csound.
- Roles: Original concept, electronic sensor system design and construction, generative sound design, interactive video processing and visual design.