Study: Color, Sound, Light, and Lissajous Figures
My goal was to create an instrument that artistically actualizes the phenomenon of phase interference, which can be heard as auditory beats and seen as Lissajous curves. I initially added color in the way historically described by one of my favorite composers, Alexander Scriabin. After experimentation, however, I found a color and shape palette that is truer to my own musical-visual experiences.
Because of our group’s discussions about the importance of a final performance that is interactive rather than passive, I decided to use controllers to modify the audio and visual components in real time. In this version, a Gametrak (piloted by Jessmine XinYan Zhang) controls the camera angles and object rotation (based on a patch from x37v.com), and a knock-off “Air-Flow” PS3 controller (bought for £2 at a car-boot sale in the Omni-Center in Edinburgh) controls the sonic elements and visual palettes.
Here are the screen capture and computer audio from the February 26, 2014 performance:
Overall I am pleased with the interaction between sound and visuals. The live performance aspects, however, have room for improvement. Here is the live performance in the Alison House Atrium on February 26, 2014:
Overall, the sound/visual interaction was successful, but the performers are too dark to be seen! It is nearly impossible in this video to make out the intimate interactions between performers, sounds, and visuals. For the next performance, better lighting will be used.
After searching the C74 blog (the Max/MSP website) for a way to create Lissajous curves, I found an efficient way to render a two-dimensional figure using Jitter and OpenGL. After carefully studying a patch by Oli Larken, I managed to make the shape three-dimensional, with the Z dimension and brightness modulated by live sound input. This is my hacked version, not in presentation mode:
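The rendering idea can be sketched outside of Max. Below is a minimal NumPy illustration (not the Jitter/OpenGL patch itself) of generating the point list for a Lissajous figure, with a hypothetical `z_scale` parameter standing in for the live-audio modulation of the third dimension:

```python
import numpy as np

def lissajous_points(fx, fy, phase=np.pi / 2, z_scale=0.0, n=1024):
    """Return an (n, 3) array of points on a Lissajous figure.

    fx, fy  -- the two frequencies; their ratio sets the figure's shape
    phase   -- phase offset between the two sines
    z_scale -- depth of the third dimension (0 gives the flat 2-D figure);
               in the instrument this would be driven by live sound input
    """
    t = np.linspace(0, 2 * np.pi, n)
    x = np.sin(fx * t + phase)
    y = np.sin(fy * t)
    z = z_scale * np.sin((fx - fy) * t)  # slow difference term adds depth
    return np.column_stack([x, y, z])

pts = lissajous_points(3, 2)        # classic 3:2 figure, flat
assert pts.shape == (1024, 3)
assert np.allclose(pts[:, 2], 0.0)  # z stays zero until audio drives it
```

Feeding these points to a line renderer (in Jitter, a matrix into an OpenGL drawing object) traces the figure; animating the phase or frequencies makes it rotate and morph.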
You can see the Gametrak patch mentioned earlier in the top-left corner.
Some data mapping was necessary to make the sound and visual interaction aesthetically pleasing, and most of these mapping decisions were based on psychoacoustic boundaries. For example, Lissajous figures look most interesting (to me) when the frequencies drawing them are under 20 Hz, but human hearing only begins at 20 Hz. In addition to this perceptual consideration, I noted that Lissajous figures become fabric-like once the two sine waves generating the figure are separated by more than about 5-10 Hz (depending on the initial frequency). But with a frequency separation this great, human hearing segregates the sounds into separate tones instead of a single timbre. I mapped the data accordingly, so as not to detract from either the sonic or the visual aesthetic of the instrument.
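A mapping that respects both boundaries might look like the following sketch. This is a hypothetical Python stand-in for the scaling objects in the patch, not the actual mapping: the drawing frequency is kept below 20 Hz, the audible oscillator above 20 Hz, and the detune between the paired sines is clamped so the ear still fuses them into one timbre and the figure does not turn fabric-like:

```python
def map_controls(knob, detune_knob, max_detune=5.0):
    """Map normalized controller values (0..1) to instrument parameters.

    Illustrative mapping only: the drawing frequency stays sub-audio
    (under 20 Hz, where the figures look best), the audible oscillator
    stays above 20 Hz (where hearing begins), and the detune is capped
    at max_detune Hz to avoid tone segregation and fabric-like figures.
    """
    draw_freq = 0.5 + knob * 19.0      # 0.5 .. 19.5 Hz, sub-audio drawing rate
    audio_freq = 40.0 + knob * 400.0   # 40 .. 440 Hz, audible oscillator
    detune = detune_knob * max_detune  # 0 .. 5 Hz beat rate between the pair
    return draw_freq, audio_freq, detune

draw, audio, det = map_controls(0.5, 1.0)
assert 0 < draw < 20.0 and audio >= 20.0 and det <= 5.0
```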
As mentioned above, the sonic and visual components are based on the phenomenon of phase cancellation that occurs between two slightly out-of-tune sine waves. There are three sound generators in this instrument, each of which controls the visuals via shared data input or mapped envelope followers. This is my performance patch:
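The underlying phenomenon is easy to reproduce: summing two sines whose frequencies differ by a few hertz produces an amplitude envelope that pulses at the difference frequency, since the waves drift in and out of phase alignment. A minimal NumPy sketch (illustrative, not the Max sound generators themselves):

```python
import numpy as np

SR = 44100  # sample rate, Hz

def detuned_pair(f, detune, dur=1.0, sr=SR):
    """Sum of two sines at f and f + detune Hz.

    Because sin(a) + sin(b) = 2 sin((a+b)/2) cos((a-b)/2), the sum's
    amplitude pulses at the beat rate |detune| Hz -- the effect the
    instrument makes both audible and visible.
    """
    t = np.arange(int(dur * sr)) / sr
    return np.sin(2 * np.pi * f * t) + np.sin(2 * np.pi * (f + detune) * t)

sig = detuned_pair(220.0, 4.0)  # beats four times per second
assert sig.shape == (44100,)
assert np.max(np.abs(sig)) > 1.9  # near-doubling where the waves align
```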
Here is the patch that is processing the audio:
You can see the PS3 controller patch that I am using at the top of the screen.
The only extra sound effect used in this version (other than the exploitation of phase cancellation in sine waves to trigger low-pass, high-resonance filters in a rhythmic way) is reverb. Specifically, I am using a series of Max/MSP externals called HISSTools. These externals allow me to take incoming sound and convolve it with multiple reverbs. Although many impulses are loaded, the main impulse response I am using is one that I personally recorded at the University of Edinburgh pool in the fall.
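The idea behind convolution reverb can be shown in a few lines of NumPy. This is a plain offline sketch of the concept, not the HISSTools externals themselves (which run in real time with far more efficient partitioned convolution):

```python
import numpy as np

def convolve_reverb(dry, impulse_response, wet=0.5):
    """Apply convolution reverb: convolve the dry signal with a recorded
    impulse response (e.g. one captured in a swimming pool), then mix the
    result back with the original signal.
    """
    wet_sig = np.convolve(dry, impulse_response)  # dry "played through" the room
    out = np.zeros_like(wet_sig)
    out[: len(dry)] += (1.0 - wet) * dry
    out += wet * wet_sig
    return out

# Toy example: a single click through a short decaying "room".
click = np.zeros(8)
click[0] = 1.0
ir = 0.5 ** np.arange(4)            # impulse response: 1, 0.5, 0.25, 0.125
out = convolve_reverb(click, ir, wet=1.0)
assert np.allclose(out[:4], ir)     # a click reproduces the IR exactly
```

Convolving an impulse (a click) with the impulse response reproduces the response itself, which is exactly why recording a pool's response to a burst of sound captures that space's reverb.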
By using controllers, the Lissajous Beat Organ takes on an exciting life of its own! Even though I sometimes forget my own controls, the way in which the PS3 controller is mapped allows for new sounds to be created from a wide array of gestures. The sonic and visual textures that can be created in real time would be nearly impossible to achieve with only a mouse or track pad.
For future versions, the controls will be mapped without using global send and receive objects. In this way, I will have more flexibility to change my sounds and visuals throughout the piece. In addition, as rehearsals take place, the sonic and visual content of this instrument will be modified to meet the needs of the group as a whole. Perhaps the Gametrak will control a sonic element, or audio generated by another performer will be modified in my instrument. I look forward to seeing how this instrument will create a dialogue and an eventual performance with the other members of the Audio-Visual Ensemble in the coming weeks!