VIX Sound+Light+Motion

Research into the Immersive Environment Instrument

by Victoria Gibson on Sep. 19, 2009, under Immersive Environment Instrument, Media, Open Source, Projects, Technology


UniverCity Net – digital art by Victoria Gibson

As an Integrated Media Artist, I have many project ideas in constant development. My most recent project proposals have sought support to develop a new instrument for me to play in performance. My current goal is a performance instrument that uses gesture control and sensors to create an alternative-reality environment.

Although my core training is in music, I have studied dance and motion and have spent years of my life performing on stage as a musician. In my recent presentations I have played computer-based instruments, and I have come to realize that much of the impact of a musical presentation lies in the gesture of producing the sound.

If I am to continue using the computer as a performance instrument, I need to develop a controller that responds to dance-like movements. The vision of focusing physical energy through dance to control audio and visual elements crystallized into my need for an Immersive Environment Instrument (IEI).

My research into this topic has been detailed, and I will construct a page to organize information about the IEI. [UPDATE: Find it Here] This post shows work already being done at Arizona State University's School of Art, Media and Engineering that uses motion-tracking technology and dance with light and projections.

Here is a link to their Gallery page. I have no permission to repost their material, and I am very sensitive about these issues, so you will have to look at it on their site; it is worth a visit to see the dancers and the light responding to their movements. Video and photographic documentation is available from the link.

What I want to do is different because it is based in Integrated Media performance, where dance is a significant portion but is used to focus energy that results in other events. My concept is to immerse the audience in an environment that surrounds them with sound+light+motion, creating a beautiful virtual-reality experience that brings dreams to life.

Screens will surround the audience to the limit of their peripheral vision, and surround sound will envelop them; because we cannot see behind us but we hear in 360 degrees, there is no need to completely surround the audience with visual elements. The artists will become part of the projection as light falls on them, and as they move, their shadows will create an effect similar to the shadow play of Indonesia.

The most practical scenario I can envision is for the screens to show images in panels, each panel having an individual projector, so that an image or video can be moved to a new location or a fresh image brought forward in real time.
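
To make the panel idea concrete, here is a minimal sketch of the routing state such a system might keep, with one entry per projector so any panel's content can be swapped in real time. The panel names and media files are hypothetical placeholders, not part of any finished design.

```python
# Minimal sketch: one projector per panel, each addressable in real time.
# Panel names and media paths are hypothetical placeholders.

panels = {
    "north": "clips/forest_loop.mov",
    "east": "clips/water_texture.mov",
    "west": "stills/univercity_net.png",
}

def swap_panel(panel: str, media: str) -> None:
    """Point one panel's projector at a fresh image or video."""
    panels[panel] = media
    print(f"panel {panel!r} now shows {media!r}")

# A single cue can move an image to a new location mid-performance:
swap_panel("west", "clips/forest_loop.mov")
```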

The light will fill the areas between the projections with colour that I also want to control in real time. Parameters like diffusion, colour, intensity and brightness can all be computer controlled with Vari-Lites.
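
As a rough illustration of the kind of real-time control this implies, here is a hedged Python sketch that pushes one frame of DMX512 levels (the protocol most computer-controlled fixtures respond to) through an Enttec DMX USB Pro interface. The interface, serial port and channel assignments are all assumptions for the example, and it needs the pyserial package.

```python
# Hedged sketch of computer lighting control over DMX512. Assumes an Enttec
# DMX USB Pro on /dev/ttyUSB0 and hypothetical channel assignments
# (1 = intensity, 2 = red, 4 = blue). Requires the pyserial package.
import serial

def send_dmx(port: serial.Serial, levels: bytes) -> None:
    """Frame one DMX universe in the Enttec 'Output Only Send DMX' message."""
    payload = bytes([0]) + levels  # DMX start code 0, then channel data
    header = bytes([0x7E, 6, len(payload) & 0xFF, len(payload) >> 8])
    port.write(header + payload + bytes([0xE7]))

universe = bytearray(512)
universe[0] = 255   # channel 1: full intensity (hypothetical mapping)
universe[1] = 180   # channel 2: warm red wash
universe[3] = 60    # channel 4: a touch of blue

with serial.Serial("/dev/ttyUSB0", baudrate=57600) as port:
    send_dmx(port, bytes(universe))
```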

The sound will surround the audience from all directions and will be generated using traditional sound sources such as voice, percussion and guitar, augmented, effected and accompanied by computer-generated sound and pre-recorded samples.
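
One simple way to place a sound anywhere in that 360-degree field is equal-power panning across a ring of speakers: the two speakers nearest the source carry it with cosine-tapered gains whose squares sum to one, so loudness stays steady as the sound travels. A minimal sketch, assuming an eight-speaker ring:

```python
# Equal-power panning across a ring of N speakers: only the pair nearest
# the source angle gets signal, tapered so the squared gains sum to one.
import math

def ring_gains(source_deg: float, n_speakers: int = 8) -> list[float]:
    spacing = 360.0 / n_speakers
    gains = []
    for k in range(n_speakers):
        # Angular distance from source to speaker k, wrapped to [0, 180]
        d = abs((source_deg - k * spacing + 180.0) % 360.0 - 180.0)
        # Cosine taper over one speaker spacing; zero beyond it
        g = math.cos(math.radians(d) * (n_speakers / 4.0)) if d < spacing else 0.0
        gains.append(max(g, 0.0))
    norm = math.sqrt(sum(g * g for g in gains)) or 1.0
    return [g / norm for g in gains]

print(ring_gains(22.5))  # a voice placed midway between speakers 0 and 1
```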

The musicians will use dance-like gestures to control all of these parameters at once. Some parameters may be cross-linked so that related image, video, light and sound events can be activated together.
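
A minimal sketch of what such a cross-linked mapping could look like in code, with one normalized gesture reading fanned out to several related parameters at once; every parameter name and range here is hypothetical:

```python
# One gesture parameter (a normalized 0.0-1.0 sensor reading, e.g. arm
# height) drives several cross-linked sound/light/video parameters.
# All names and ranges are hypothetical placeholders.

def scale(x: float, lo: float, hi: float) -> float:
    """Map a normalized 0-1 sensor value into a target parameter range."""
    return lo + max(0.0, min(1.0, x)) * (hi - lo)

def on_arm_height(x: float) -> dict[str, float]:
    # One gesture, several linked events: the filter opens, the light
    # wash brightens, and the projection cross-fade advances together.
    return {
        "filter_cutoff_hz": scale(x, 200.0, 8000.0),
        "wash_intensity": scale(x, 0.1, 1.0),
        "video_crossfade": scale(x, 0.0, 1.0),
    }

print(on_arm_height(0.75))
```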

Once I have the IEI constructed, I will compose works that use pre-organized materials and ideas, presented in an improvised flow that responds to the energy of the moment.

To truly realize this concept, I plan to link all performances to a virtual-reality computer world, such as Second Life, so that the performance may be shared in mixed reality. The virtual reality may also be projected for the live audience to share the unique effects that can only be experienced virtually. This will be accomplished using the remarkably low-latency software developed at Stanford University under the direction of Chris Chafe. It is built on the JACK audio connection kit (jackd) used on Linux systems, such as the Ubuntu distribution that I run. The program used for telematic concerts, including the one I wrote about in my post on Chris Chafe, is called JackTrip; it can link musicians in remote locations with very low latency, depending on the speed of the connection. Before I arrived in Banff, John Adams had recorded a multitrack session at a 24-bit/96 kHz sampling rate. This astounded me, and the results I witnessed myself, at a lower sampling rate, were exceptionally good.
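
The link itself is started from the command line; here is a hedged sketch of how a supervising script might launch it, assuming jackd is already running and jacktrip is installed (the hub address is a placeholder):

```python
# Hedged sketch: launch JackTrip in server or client mode from Python.
# Assumes jackd is already running; the peer address is a placeholder.
import subprocess

def start_jacktrip(peer: str | None = None) -> subprocess.Popen:
    """Server mode when no peer is given, otherwise dial a remote peer."""
    cmd = ["jacktrip", "-s"] if peer is None else ["jacktrip", "-c", peer]
    return subprocess.Popen(cmd)

# One end listens with start_jacktrip(); the other dials in:
proc = start_jacktrip(peer="telematic.example.org")
```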

Due to the interesting work done by dance technologists in the world of motion tracking, I have now joined dance-tech.net, and you can visit me there. I am freeing my inner dancer after all these years of trying to limit myself. In the music world, all the business people and successful artists on panels tell you to limit yourself and categorize your work in a recognizable way so you can market it successfully. I did try, but at this point in my life it has become important to me to do the work as I perceive it. Commercial success has become far less important than the satisfaction I get from achieving my goals.

Go to Immersive Environment Instrument Page
