This is another piece in my ongoing space_ series of media art works.

The works in the series are derived directly from the attentive and phenomenological exploration of a specific geographical/physical location. Most of the visual and acoustic elements in these pieces come from processing photogrammetry scans, spatial (Ambisonic) audio, and still images (textures, HDRI) captured during a period of attentive immersion and observation that I conduct while physically present at the chosen location. This audiovisual capture seeks to document, in an intuitive way, the textures, lighting, shapes, and sounds of the location, following mindfulness processes of attentive perception and connection with the environment. This new piece in the cycle was commissioned by New Adventures in Sound Art in honour of its 20th anniversary.

Access to the location explored for this piece was facilitated through a residency at NAISA’s Warbler’s Roost in South River, Ontario. The materials captured in the area were used to create a photorealistic virtual place composed of 3D geometry, textures, dynamic lighting and reactive spatial audio.

The visual virtual environment is rendered in real time using Unreal Engine. During the live performance, I explore this environment through a hand-controlled virtual camera (via a 3D mouse), which determines the final audiovisual sequence presented. To control the camera, I programmed a custom real-time control system that allows me to bookmark camera locations, switch between locations while remembering the last position at each bookmark, perform visual cross-fades when switching, and change the camera's motion speed and sensitivity.
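The bookmark behaviour described above can be sketched in a few lines. This is a minimal illustration, not the actual Unreal Engine implementation (which lives in engine-side code); the class and method names are my own, and the pose representation is an assumption.

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    position: tuple  # (x, y, z) in world units -- illustrative, not UE's types
    rotation: tuple  # (pitch, yaw, roll) in degrees

class BookmarkController:
    """Sketch of a camera-bookmark system: each bookmark remembers the
    last pose the camera had while that bookmark was active, so switching
    back resumes the exploration where it left off."""

    def __init__(self, fade_seconds=2.0, sensitivity=1.0):
        self.bookmarks = {}          # name -> last CameraPose
        self.active = None           # currently active bookmark name
        self.fade_seconds = fade_seconds
        self.sensitivity = sensitivity

    def add_bookmark(self, name, pose):
        self.bookmarks[name] = pose

    def update_pose(self, pose):
        # Called every frame: keep the active bookmark's stored pose current.
        if self.active is not None:
            self.bookmarks[self.active] = pose

    def switch_to(self, name):
        # Return the pose to cross-fade toward, plus the fade duration.
        pose = self.bookmarks[name]
        self.active = name
        return pose, self.fade_seconds

    def scale_input(self, delta):
        # Apply motion sensitivity to raw 3D-mouse deltas.
        return tuple(d * self.sensitivity for d in delta)
```

Keeping the "last position per bookmark" state in one place is what makes a switch-and-return feel continuous rather than resetting to a fixed viewpoint.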

The sound synthesis was done in a Reaper multitrack session that receives the camera location from Unreal Engine via OSC messaging. The audio spatialization of the piece is determined by the location of the virtual camera and presented over a multichannel sound system (12 channels were used during the premiere) surrounding the audience. I developed a custom system that lets me place audio sources in Unreal Engine and automatically update each source's location in the acoustic simulation plugins (IEM, SPARTA) hosted in Reaper. The sound reflections are produced by an impulse-response reverb that I generated using acoustic simulation software, based on the actual geometry of the room depicted in the virtual place. The source sounds for both the processed material and the pure soundscape elements come from third-order Ambisonic recordings (Zylia microphone) that I made at the location. A combination of VST plugins and custom Max/MSP patches was used for sound processing.
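The OSC link between the engine and Reaper boils down to packets like the one below. This is a generic OSC 1.0 encoding sketch, not the project's actual code; the `/camera/position` address is a hypothetical example, since the real address scheme is not given in the text.

```python
import struct

def _osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *values: float) -> bytes:
    """Encode an OSC 1.0 message carrying big-endian 32-bit floats:
    padded address string, padded type-tag string, then the arguments."""
    packet = _osc_pad(address.encode("ascii"))
    packet += _osc_pad(("," + "f" * len(values)).encode("ascii"))
    for v in values:
        packet += struct.pack(">f", v)  # big-endian float32
    return packet

# Hypothetical camera-position message, ready to send over UDP:
msg = osc_message("/camera/position", 12.5, -3.0, 1.75)
```

A packet like this, sent every frame to Reaper's OSC port, is enough for the spatialization plugins to track the listener; libraries such as python-osc wrap the same wire format.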

By controlling the camera live, I'm able to perform a mindful audiovisual exploration of the virtual place in front of an audience.

The video included here is a recording of the premiere performance, with binaural audio intended for listening on headphones.

This piece can also be presented as a live performance or exhibited as a fixed work. The video can be shown as a wall projection or on a high-resolution (4K or 8K) display. The audio can be delivered binaurally over headphones, or over a stereo or multichannel speaker system.


  • Visuals: Unreal Engine 5, photogrammetry, scanned textures, HDRI lighting
  • Audio: Reaper with real-time spatialization, playback, and synthesis; impulse-response simulation; Max/MSP
  • Interactivity: 3D Mouse and custom UE based camera control system