Group Submission

We wanted to challenge the accepted conventions for interacting with film. We created a textural environment unique to each viewer: one that moves with the audience's perspective and develops through their communication with the film. The situation we created can only truly be experienced by the user, even when spectators are present in the same room. We brought 3D film into a physical space and invited our audience to ask questions of the environment and its relationship to the screen.

How it works.

Introduction Video

We set this film up at the entrance to the installation to allow the audience to understand how to manipulate the environment.

Creating the environment essentially boiled down to three distinct elements: the film, the sound, and the user's relationship to both.

VISUAL WORLD

Following on from our initial beta testing, we took part in a masterclass with world-renowned 3D film expert Ludger Pfanz from the Staatliche Hochschule für Gestaltung Karlsruhe (Karlsruhe College of Arts and Design). Over two days, Ludger explained the importance of removing ourselves from the structures of 2D filming and challenged us to build new artistic relationships with the unique technical possibilities of 3D film.

For this particular part of the process we used a professional, all-in-one rig, which made it easy to talk critically about the concepts of 3D filming and their relationship to the narrative of the film. However, we did not have access to such equipment for the remainder of the project, so we drew on the techniques we had researched and filmed with two standard SLR cameras.

The finished film guides us through three scenes that loosely follow the course of a day, in an attempt to broaden the focus beyond a single object on the screen. We used three different overall depths of convergence (the distance at which the two cameras' views align, placing that point on the screen plane) to explore the images' relationship to the user, and in turn the user's relationship to the screen within the space.
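To make the idea of convergence depth concrete, the following is a small worked sketch in Python of a standard shifted-sensor stereo model: objects at the convergence distance land on the screen plane with zero parallax, while nearer or farther objects take on negative or positive parallax. All rig, screen and viewing figures here are invented example values, not measurements from our setup.

```python
# Illustrative stereoscopy maths: a simplified shifted-sensor model.
# All numbers are example values, not taken from our actual rig.

EYE_SEPARATION_MM = 65.0  # average human interocular distance

def screen_parallax_mm(obj_dist_m, converge_dist_m, interaxial_mm,
                       focal_mm, sensor_w_mm, screen_w_mm):
    """On-screen parallax for an object at obj_dist_m.

    Zero at the convergence distance (the object sits on the screen
    plane), positive behind it, negative in front of it.
    """
    # Disparity on the sensor, with the zero-parallax plane set by
    # shifting the sensors to converge at converge_dist_m.
    disparity_mm = focal_mm * interaxial_mm * (1.0 / converge_dist_m
                                               - 1.0 / obj_dist_m) / 1000.0
    # Scale from sensor width up to projection-screen width.
    return disparity_mm * (screen_w_mm / sensor_w_mm)

def perceived_depth_m(parallax_mm, viewing_dist_m):
    """Where the eyes place the object (valid while parallax stays
    below the eye separation; beyond that the eyes would diverge)."""
    return viewing_dist_m * EYE_SEPARATION_MM / (EYE_SEPARATION_MM - parallax_mm)

if __name__ == "__main__":
    for z in (1.0, 3.0, 10.0):  # object distances in metres
        p = screen_parallax_mm(z, converge_dist_m=3.0, interaxial_mm=65.0,
                               focal_mm=35.0, sensor_w_mm=36.0,
                               screen_w_mm=2000.0)
        print(f"object at {z:4.1f} m -> parallax {p:+6.1f} mm, "
              f"perceived at {perceived_depth_m(p, 4.0):.1f} m")
```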

FINAL 3D FILM


Please use red/cyan glasses to view the film online. These can be made simply by following these steps.
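For anyone curious how the online red/cyan version works: an anaglyph takes the red channel from the left-eye image and the green and blue channels from the right-eye image, so each lens of the glasses passes only its intended view. The following is a minimal sketch of that idea in Python (Pillow/NumPy), with placeholder filenames; it illustrates the principle rather than our actual export pipeline.

```python
# Minimal red/cyan anaglyph encoder: left eye -> red channel,
# right eye -> green + blue channels. Filenames are placeholders,
# and both frames are assumed to be the same size.
import numpy as np
from PIL import Image

left = np.asarray(Image.open("left_frame.png").convert("RGB"))
right = np.asarray(Image.open("right_frame.png").convert("RGB"))

anaglyph = np.zeros_like(left)
anaglyph[..., 0] = left[..., 0]      # red from the left-eye view
anaglyph[..., 1:] = right[..., 1:]   # green and blue from the right-eye view

Image.fromarray(anaglyph).save("anaglyph_frame.png")
```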

 ————————-

 

SOUND WORLD

Our intention was to create a soundtrack to the film that the user could manipulate, respond to and control. We wanted to create a sonic environment that blurred the lines between film score, diegetic sound and stand-alone textured sound.

As stereoscopic 3D works through the communication between two independent sources focused on the same object, both composers on the project approached the musical element in the same way: collaborating through a fixed set of rules but working independently, to create a soundtrack/world with parallels to the depth of convergence on the screen.

[soundcloud url="http://api.soundcloud.com/tracks/89422609" params="" width="100%" height="166" iframe="true" /]

The sound design for the film was also brought into the user's control, but was built as a bridge between the on-screen action and the installation itself, being more direct than the music in its relationship to the image.

[soundcloud url="http://api.soundcloud.com/tracks/89427396" params="" width="100%" height="166" iframe="true" /]

The final outcome created a space where the sound world grew out from the screen and enveloped the user's environment, supporting and developing the three-dimensional aspect of the project.

The following is a binaural recording demonstrating the spatial audio interaction of our three-dimensional environment. Using two omnidirectional DPA microphones taped to headphones, we tried to recreate the head-related transfer function (HRTF) of the ears. The result is a representation of what the user may hear while interacting with our 3D installation.

NOTE: HEADPHONES REQUIRED
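Binaural rendering can also be approximated in software by convolving a mono source with a pair of head-related impulse responses (HRIRs). The sketch below is a toy illustration of that principle, not our recording setup: instead of measured HRIRs it fakes a source to the listener's right using a crude interaural time and level difference.

```python
# Toy binaural rendering: convolve a mono tone with a crude HRIR pair.
# Real systems use measured HRIRs; the delay/attenuation here only
# approximates the interaural time and level differences.
import numpy as np
from scipy.io import wavfile

SR = 44100
t = np.arange(SR) / SR                      # one second of samples
mono = 0.5 * np.sin(2 * np.pi * 440 * t)    # 440 Hz test tone

# Source to the listener's right: the left ear hears it later and quieter.
itd_samples = int(0.0007 * SR)              # ~0.7 ms interaural time difference
hrir_right = np.zeros(64); hrir_right[0] = 1.0
hrir_left = np.zeros(64); hrir_left[itd_samples] = 0.6

stereo = np.stack([np.convolve(mono, hrir_left)[:SR],
                   np.convolve(mono, hrir_right)[:SR]], axis=1)
wavfile.write("binaural_demo.wav", SR, (stereo * 32767).astype(np.int16))
```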

 ————————-

 

TEXTURAL WORLD

The project revolved around our ability to immerse the user in the space and make the film a truly three-dimensional experience. Max/MSP allowed us to achieve this level of interaction, using an Xbox Kinect to feed data about the user's movement into the patch.

The Max/MSP Patch

Alex guides us through the Max/MSP Patch built for the project.

The patch can be downloaded here: AG_KinectVia

Note: The Max/MSP externals required for this patch can be downloaded from the sites referenced at the bottom of this page.

The system brings together two pieces of software. Synapse receives data from the Xbox Kinect, tracks the user on the X, Y and Z axes, and sends this out as Open Sound Control (OSC) data. That OSC data is fed into the Kinect-Via-Synapse Max/MSP patch developed by Jon Bellona, which reads the OSC messages and outputs numerical information that is much simpler to work with. After analyzing the numbers received, we were able to think about how to utilize all of this information creatively.

As shown in the film above, the interactor's arms control a music stem and a sound-effects stem respectively. By raising or lowering an arm and pointing in the direction of the desired loudspeaker, the user controls both the volume of that stem and which loudspeaker the sound comes from (front left, front right, back left or back right). The interactor's torso controls the main music stem, which moves contrary to the body as the interactor moves around the space (i.e. if the interactor moves in front of the back-left speaker, the stem plays out of the front-right, etc.).
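For anyone who doesn't read Max/MSP patches, the following Python sketch (using the python-osc library) outlines the same mapping logic in text form. The joint addresses and port are assumptions based on Synapse's usual defaults and should be checked against your Synapse version; the gain curve and the left/right stem assignment are simplified stand-ins for what the patch actually does.

```python
# A simplified outline of the Synapse -> mapping logic, in Python
# (python-osc). Joint addresses and the port are assumptions based on
# Synapse's usual defaults; the mappings are illustrative, not the
# exact curves used in the Max/MSP patch.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def quadrant(x, z):
    """Which speaker a body-relative position points toward."""
    return ("front" if z > 0 else "back") + "_" + ("right" if x > 0 else "left")

def opposite(speaker):
    """Diagonally opposite speaker, for the contrary torso mapping."""
    row, col = speaker.split("_")
    return ("back" if row == "front" else "front") + "_" + \
           ("left" if col == "right" else "right")

def hand_handler(address, x, y, z):
    # Arm height (y) sets the stem's volume; pointing direction picks
    # the loudspeaker it plays from.
    gain = max(0.0, min(1.0, (y + 1.0) / 2.0))   # normalise y into 0..1
    stem = "music" if "lefthand" in address else "sfx"
    print(f"{stem} stem: gain {gain:.2f} -> {quadrant(x, z)} speaker")

def torso_handler(address, x, y, z):
    # The main stem plays from the speaker opposite the user's position.
    print(f"main stem -> {opposite(quadrant(x, z))} speaker")

dispatcher = Dispatcher()
dispatcher.map("/lefthand_pos_body", hand_handler)
dispatcher.map("/righthand_pos_body", hand_handler)
dispatcher.map("/torso_pos_body", torso_handler)

# Synapse typically sends OSC to localhost on port 12345 (verify locally).
BlockingOSCUDPServer(("127.0.0.1", 12345), dispatcher).serve_forever()
```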

 

The Outcome

The Life in 3D project brought together our collective experiences and interests and allowed us to embark on a process that was new to each of us. Though the environment presented above was a full representation of our technical abilities and creative direction at the time, we have since gone on to develop the application further and to explore new situations in which to interact with the technology. Life in 3D brought us together to explore “how to bring stereoscopic vision onto the screen and ambisonic/binaural/surround sound into the cinematic experience in order to create immersive audiovisual environments.” We answered this brief as sophisticatedly and creatively as we could; as we see it, however, this is only the beginning of our collective's investigation into the possibilities the project holds.

 

References and Research.

1. Bellona, Jon P. Kinect-Via-Synapse. Computer software. Jon Bellona: Intermedia Artist. Jon Bellona, n.d. Web. <deecerecords.com/>

2. Schacher, Jan. Ambipanning~. Computer software. ICST Institute for Computer Music and Sound Technology. Zurich University of the Arts, n.d. Web. <www.icst.net>.

3. www.electronicproducts.com/News/Surround_sound_vs_3D_sound.aspx

4. en.wikipedia.org/wiki/3D_audio_effect

5. cinematography-howto.wonderhowto.com/how-to/build-3d-camera-rig-for-recording-and-shooting-3d-videos-and-films-424098/

6. Gardner, William G. “Spatial Audio Reproduction: Toward Individual Binaural Sound.” Tenth Annual Symposium on Frontiers In Engineering (2005). Web. 1 Feb. 2013.

7. IJsselsteijn, Wijnand A., Huib de Ridder, and Joyce Vliegen. "Effects of Stereoscopic Filming Parameters and Display Duration on the Subjective Assessment of Eye Strain." Proc. SPIE 3957, Stereoscopic Displays and Virtual Reality Systems VII (3 May 2000). doi:10.1117/12.384448.

8. IJsselsteijn, W. A., H. de Ridder, and J. Vliegen. "Subjective Evaluation of Stereoscopic Images: Effects of Camera Parameters and Display Duration." IEEE Transactions on Circuits and Systems for Video Technology 10.2 (Mar. 2000): 225-233.

9. Bamford, Jeffrey S., and John Vanderkooy. "Ambisonic Sound for Us." AES E-Library. Audio Engineering Society, Oct. 1995. Web. 3 Feb. 2013.

10. Vinodh, Kumar, and Ertu Unver. Stereoscopic 3D Filming: Huddersfield University Case Study. 2010. [Video]