Four Dimensions Development Update 3
Four Dimensions is a work for Orchestra, EWI, and Surround Electronics tracks, commissioned by the Orlando Philharmonic Orchestra for an April 21st premiere at Full Sail University’s “Live” Venue. This is one of several videos documenting the process of getting a Wiimote to provide a pulse that controls the tempo of the prepared surround electronic score. Nathan Selikoff is the visual artist who wants to use the data for his live immersive computer art. Marc Pinsky is the all-important behind-the-scenes guy who will make the synchronization possible. Sean McKeowan will be designing the lighting – what an experience it should be!

We’ve come quite a ways since my last update on “Four Dimensions”:

1. The piece is about 80% composed now; the bits and pieces heard in this video are not yet final. It will run about six and a half minutes.
2. Marc Pinsky is confident he’ll have a workable system ready for the April 20th rehearsal. When the surround electronics composition is completed, I’ll export it from Logic into Ableton Live for time-slice identification; Marc’s system will control the speed of playback from the Ableton software.
3. The production staff is a combined force bringing together the Orlando Phil’s production and administration teams with seasoned Full Sail University faculty and students for what will be a blockbuster night full of animations, films, lighting, and sound – and great classical music. This concert is a fundraiser for the Orlando Philharmonic and includes a full night of festivities – call the Orchestra’s box office @ 407.770.0071.
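Point 2 above describes the core idea: beat points captured from the Wiimote set the speed of playback of the prepared track. As a rough illustration only – this is not Marc’s actual system, and the function names, smoothing window, and reference tempo are all assumptions – here is a minimal sketch of turning recent beat timestamps into a playback-rate multiplier for a track rendered at a fixed tempo:

```python
# Hypothetical sketch of tempo-following playback (NOT the real system).
# Beat timestamps from the conductor's Wiimote pulse are converted into
# a playback-rate multiplier for a surround track prepared at a
# reference tempo. REFERENCE_BPM is an assumed value.

REFERENCE_BPM = 120.0  # assumed tempo the surround track was rendered at


def playback_rate(beat_times, reference_bpm=REFERENCE_BPM, window=4):
    """Estimate a playback-rate multiplier from recent beat timestamps (seconds)."""
    if len(beat_times) < 2:
        return 1.0  # no tempo information yet; play at unity speed
    # Average the last few inter-beat intervals to smooth out jitter.
    recent = beat_times[-(window + 1):]
    intervals = [b - a for a, b in zip(recent, recent[1:])]
    avg_interval = sum(intervals) / len(intervals)
    live_bpm = 60.0 / avg_interval
    return live_bpm / reference_bpm


# Beats arriving every 0.5 s imply 120 BPM, i.e. unity playback rate.
print(playback_rate([0.0, 0.5, 1.0, 1.5, 2.0]))  # → 1.0
```

Averaging the last few inter-beat intervals smooths conductor jitter, at the cost of reacting a beat or two late to a sudden tempo change – one plausible trade-off in a live-following system like this.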




    15 02 2012

    Hi Keith
    This is very cool stuff.



    16 02 2012

    This is wonderful and exciting!!!!!
    I’m probably overstepping my bounds here talking like I know what’s going on, but let me check in to see if I understand what Marc and you have done.
    Marc mapped your Wiimote gestures to your score. Ultimately, then, you will be able to spool the surround track off the disk in real time using the same gestures, right? Cool. The concert performance requires the conductor’s gestures to be pre-mapped like you’re doing here, right? Then during the performance Marc’s software can correlate the pre-mapped gestures with the real-time ones. Is that right?
    Otherwise there’d have to be some cross-correlation between your gestures and the maestro’s (meaning you’d have to train him how to conduct it!).
    Looks like a crazy good time.


    Keith Lay

    16 02 2012

    Hi Kev,
    This test only proved that the system captures data that should provide us with an accurate beat point – live. How nuanced it can be has yet to be discovered, but, at this juncture, it seems pretty adaptable to different kinds of beat patterns and conducting styles. I wasn’t just being a metronome; I was trying to throw it off with all kinds of directions and speeds.
    The next steps will be for Marc to choose what data to use and how to make it a ‘beat event’, and for me to carve the surround work into specific beats.
    I wish you were here helping us!
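    One simple way to turn raw motion data into the kind of ‘beat event’ described above is a threshold detector with a refractory period. This is purely a hypothetical sketch of that general approach – the threshold and lockout values are invented for illustration and are not Marc’s actual choices:

```python
# Hypothetical sketch: turning Wiimote accelerometer samples into
# discrete 'beat events' with a simple threshold + refractory period.
# THRESHOLD and REFRACTORY are illustrative assumptions, not measured values.

THRESHOLD = 1.8    # assumed acceleration magnitude (in g) marking a beat
REFRACTORY = 0.25  # assumed minimum seconds between successive beats


def detect_beats(samples, threshold=THRESHOLD, refractory=REFRACTORY):
    """samples: list of (time_sec, accel_magnitude) tuples, in time order."""
    beats = []
    last_beat = -refractory  # permit a beat right at t = 0
    for t, mag in samples:
        # Fire a beat event on a strong spike, then ignore further
        # spikes until the refractory window has elapsed.
        if mag >= threshold and (t - last_beat) >= refractory:
            beats.append(t)
            last_beat = t
    return beats
```

    The refractory window is what keeps one vigorous downbeat from registering as several beats – the same kind of data-selection decision Keith says Marc still has to make.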
