Immersive Environment Instrument
I am counting down the days until I leave for Amsterdam, and I still have to get ready for the wireless sensor workshop by purchasing some Arduino parts!
This is my first trip to Europe, so I am very excited and a bit nervous.
June — Amsterdam
Early July — Germany
July 11th — Mons, Belgium for my friend and collaborator Sonia’s show!
No return date is set, as I have applied for a few events and residencies in Europe that might still come through and extend my stay.
If you have any helpful tips, please contact me!
My first week in New York City has been so busy that I have not had much time for a retrospective of my eventful stay at EMPAC in Troy, NY, from August 15th to 22nd.
I am happy to report that I succeeded in creating a “visual music” presentation on three screens, with one aspect controlled in real time through gesture.
Adafruit shipped my Arduino boards to EMPAC, and with research on the internet and help from others in the workshop I came to understand how to make the system work. I chose the infrared distance sensor and used information from the Adafruit website to help me program the board. My trusty Ubuntu computer handled all of the Arduino code uploading while I ran the Isadora program on the MacBook.
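For anyone setting up something similar, here is a minimal sketch of the sensor-reading side. I am assuming a Sharp-style analog infrared distance sensor wired to analog pin A0 (the exact model and wiring in my setup may differ); the board simply streams readings over serial, where a program on the computer can watch them.

```cpp
// Minimal Arduino sketch, assuming a Sharp-style analog infrared
// distance sensor on analog pin A0 -- the model and pin are assumptions.
const int SENSOR_PIN = A0;

void setup() {
  // Open the serial port so a program on the computer can read the values.
  Serial.begin(9600);
}

void loop() {
  // analogRead returns 0-1023; on the Sharp analog rangers,
  // closer objects produce higher readings.
  int reading = analogRead(SENSOR_PIN);
  Serial.println(reading);
  delay(50);  // roughly 20 readings per second is plenty for gesture work
}
```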
My performance was thrilling for me, and I was equally inspired and informed by the work that everyone else in the class showed on Saturday, August 21, 2010.
The first work we saw was by Joff, a theatre instructor whose company puts on plays in Second Life. The work showed a grassy field with human figures that were actually short videos. The figures would fade in, seem to sleep, move in some way, and then fade out. We were invited to interact with the figures, but I found it fascinating just to watch them.
It was eerie and engaging as an installation, and Joff might develop it further as a performance or make it more interactive.
Research into the Immersive Environment Instrument (IEI) continues, and this post is about the visual surround aspect of the presentation. For background information on this project, click here. There are lots of links to examples, but none that I have the right to re-post. My digital artwork UniverCity Net has become a kind of logo for the IEI, so I post it here to break up the text-heavy research.
My goal is to bring the cinematic experience of film into live performance within a virtual reality simulation. I am exploring a non-linear narrative that references three time periods (past, present, and future) to construct a communication that is open to individual interpretation. To resonate with an audience on an emotional level, the performer improvises each show as a unique construct. Using a sensor array, the performer triggers events with dance-like motions that communicate emotional intention through gesture.
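To make the triggering idea concrete, here is a small sketch of how a single sensor could fire an event when a motion crosses a threshold. This is only an illustration in Arduino-style C++; the pin, threshold, and smoothing values are my own placeholder assumptions, not a finished design.

```cpp
// Sketch of gesture-triggered events, assuming an analog distance
// sensor on pin A0. Threshold and smoothing values are placeholders.
const int SENSOR_PIN = A0;
const int THRESHOLD = 500;  // hypothetical trigger level
float smoothed = 0.0;
bool above = false;

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Exponential smoothing tames sensor jitter, so a sweeping arm
  // motion reads as one gesture instead of many spikes.
  smoothed = 0.9 * smoothed + 0.1 * analogRead(SENSOR_PIN);

  // Fire a single event on each upward threshold crossing.
  if (!above && smoothed > THRESHOLD) {
    above = true;
    Serial.println("TRIGGER");  // the listening program maps this to a cue
  } else if (above && smoothed < THRESHOLD - 50) {
    above = false;              // hysteresis prevents rapid re-triggering
  }
  delay(20);
}
```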
The University of California, Santa Barbara is the host institution for the AlloSphere, an incredible facility designed by Dr. JoAnn Kuchera-Morin. Dr. Kuchera-Morin assembled a team that included the world-renowned architect Robert Venturi to design and construct this feat of engineering. She talks about her work in a really interesting TED video.
In 2000 she began the creation, design, and development of a Digital Media Center within the California NanoSystems Institute. The culmination of her design is the AlloSphere Research Laboratory, a three-story metal sphere inside an echo-free cube, designed for immersive, interactive scientific and artistic investigation of multi-dimensional data sets. She serves as Director of the AlloSphere Research Laboratory and Center. — XMedia Lab
There is a lot of interesting research in Santa Barbara that may apply to the construction of the Immersive Environment Instrument, but the AlloSphere is not portable, and it was designed to facilitate research, not performance. There is no room for an audience, though compositions can be shown on video, as seen on the TED site. The challenge would be to translate as much of the effect of this multi-million-dollar facility as possible into a portable touring show.
Many of the elements of the IEI have been used in performance before; surround sound has been in use since its introduction, which is generally credited to Stockhausen. The idea of surround video has been used by the collective Workspace Unlimited, among others. Their example, Hybrid Space 360, is really interesting because it has already been presented at EMPAC, the Experimental Media and Performing Arts Center at Rensselaer in New York, in 2008. There are photos on their site, but as it is in Flash, you have to navigate yourself and click on Projects to view the work; I cannot even link directly.
Their concept, interesting as it is, is an interactive installation, not a performer-controlled environment. The Immersive Environment Instrument will tour with Victoria Gibson in Girl Can Dream, the first composition for the instrument. Later, other musicians will be invited to join Victoria to expand the sensor array and integrate other personality styles.
A lot of recent software development has made panoramic still photos possible, and there are even very inexpensive point-and-shoot cameras that offer the ability to select the nodes (nodes are individual photos — see Wikipedia) required to stitch photos together this way. With the proper viewer, an on-line panorama can be achieved. See the Apple VR site for examples of QuickTime panoramas (plug-in required).
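As a rough illustration of how nodes get stitched, here is a small example using OpenCV's high-level Stitcher class. The cameras and viewers above do this for you behind the scenes; this sketch (with hypothetical file names, and assuming a recent OpenCV build) just shows the principle in code.

```cpp
// Minimal panorama-stitching sketch using OpenCV's Stitcher class.
// Assumes an OpenCV 3.x/4.x build; the input file names are hypothetical.
#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main() {
    // Each "node" is one still photo taken from the same position,
    // rotated a little between shots so that the frames overlap.
    std::vector<cv::Mat> nodes;
    for (const auto& name : {"node1.jpg", "node2.jpg", "node3.jpg"}) {
        cv::Mat img = cv::imread(name);
        if (img.empty()) {
            std::cerr << "Could not read " << name << "\n";
            return 1;
        }
        nodes.push_back(img);
    }

    // Let OpenCV find the overlaps and blend the seams into one image.
    cv::Mat panorama;
    cv::Ptr<cv::Stitcher> stitcher = cv::Stitcher::create(cv::Stitcher::PANORAMA);
    if (stitcher->stitch(nodes, panorama) != cv::Stitcher::OK) {
        std::cerr << "Stitching failed -- check that the nodes overlap\n";
        return 1;
    }
    cv::imwrite("panorama.jpg", panorama);
    return 0;
}
```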
Ubuntu users — the most promising project in this area seems to be freepv, which is not yet found in a repository by Synaptic. Build instructions are on the site.
Anyone with the QuickTime plug-in can view movies made from panoramic photos on this company’s site, Studio 360. These examples are not the photos themselves, but movies of the panoramic photos.
The concept of surround video is tricky, so I propose to set up separate nodes, the same way panoramic photos are really separate photos. This will also give me more flexibility in using lights to soften the edges of each node to aid in blending, or to use the nodes as unique, distinct images. I do not expect true surround video to be available soon, as large studios are having enough of a challenge with IMAX, Omnimax, and conventional 3D films. I think it is more cost-effective and realistic to project each node independently and let the instrument create a Cinematic Virtual Reality (CVR) illusion. Cinematic Virtual Reality is a term I created for the visual component of the Immersive Environment Instrument.
UniverCity Net – digital art by Victoria Gibson
As an Integrated Media Artist, I have many project ideas that I am constantly working on. My most recent project proposals have sought support to develop a new instrument for me to play in performance. My current goal is a performance instrument that uses gesture control and sensors to create an alternative reality environment.
Although my core training is in music, I have studied dance and motion and have spent years of my life performing on stage as a musician. In my recent presentations I have played computer-based instruments, and I have realized that a lot of the impact of a musical presentation comes from the gesture of producing the sound.
If I am to continue using the computer as a performance instrument, I need to develop a controller that responds to dance-like movements. The vision of focusing physical energy through dance to control audio and visual elements crystallized into my need for an Immersive Environment Instrument (IEI).