Audio Visual Instrument: Bass Lamp

Concept

To develop sonic occurrences that correlate directly with their visual counterparts, perceptually inseparable in intent, and to acknowledge silence and the absence of visual stimuli as a necessary and effective contrast.

Drawing inspiration from Tim Ingold’s article “Against Soundscape” and discussions with Martin Parker, I became fascinated with the idea of manipulating light, rather than image, to see how this might prove compelling in its own right.

Tim Ingold observes, “It is of course to light, and not to vision, that sound should be compared. The fact however that sound is so often and apparently unproblematically compared to sight rather than light reveals much about our implicit assumptions regarding vision and hearing, which rest on the curious idea that the eyes are screens which let no light through, leaving us to reconstruct the world inside our heads, whereas the ears are holes in the skull which let the sound right in so that it can mingle with the soul.”

Even with our eyes closed, it is still possible to perceive flashes, flickering or the presence of light, and to gain some indication that there is activity and movement. Light can be projected onto surfaces, broken by other objects, used to cast shadows, or applied in subtle touches that lend mood or ambience.

My Role

To construct a highly responsive, expressively dynamic and diverse system that can be improvised in real time using a “traditional” instrument (a standard bass guitar) to play electronic sounds that trigger and modulate specific DMX light movement, directly connecting sound and vision (light). Submission one outlines my progress, observations, trials and tribulations, and discusses plans for further development.

DMX setup

When I began this course, I had no previous experience with DMX and needed to conduct an extensive amount of research to overcome numerous technical issues and get my system working. Step one was to get four VISAGE 0493 LED lights controlled remotely through Ableton Live running DMX Max for Live devices. Most of the preliminary documentation is explained in detail on my AVE blog.

DMX First Run

DMX Progress

DMX… a bit further

Findings:

The easiest way to bridge Live and the DMX LED lights was to send MIDI data out of an audio/MIDI interface into the school’s DMX console, which converts MIDI to DMX. After modifying Matthew Collings’s Max for Live patches to accommodate four lights, I was able, to some extent, to control them from within Ableton Live.
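For anyone unfamiliar with this signal path, here is a minimal sketch of the idea outside of Max: intended light levels get squeezed into 7-bit MIDI control changes before the console expands them back to DMX, which is where some resolution is lost. The port name and CC assignments are hypothetical placeholders, not the console’s actual mapping.

```python
# Illustrative sketch (not the actual Max for Live patch): sending light
# levels as MIDI control changes to a console that converts MIDI to DMX.
import mido

PORT_NAME = "MIDI-to-DMX Console"          # hypothetical MIDI output port
LIGHT_CCS = {1: 20, 2: 21, 3: 22, 4: 23}   # hypothetical CC number per light

def dmx_to_cc(value_255: int) -> int:
    """Scale an intended DMX level (0-255) down to the MIDI CC range (0-127)."""
    return max(0, min(127, value_255 // 2))

with mido.open_output(PORT_NAME) as out:
    # Fade light 1 up; the console re-expands 0-127 back to DMX 0-255,
    # so half of the DMX resolution is lost along the way.
    for level in range(0, 256, 8):
        out.send(mido.Message('control_change',
                              control=LIGHT_CCS[1],
                              value=dmx_to_cc(level)))
```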

The easiest way, however, is not necessarily the best way: the DMX lights performed sluggishly and with considerable latency. They would at times remain on when switched off, and were unpredictable and difficult to control precisely. Additionally, they would flicker intermittently and pulse on and off of their own accord. After speaking with Collings, he confirmed having had the same issue, which he was not able to resolve.

Matt M4L DMX Devices

Although I experienced limitations in only being able to control two channels with the DMAX devices via the Enttec DMX USB Pro, the setup was much more responsive, less latent, did not flicker, and handled more accurately. Seeking perfection, I went back to troubleshooting the Enttec box and, after much tinkering, discovered that the issue lay with Olaf Matthes’s Max/MSP external ‘dmxusbpro’. I was able to overcome the channel limitations by using a beta abstraction by David Butler, imp.dmx, which uses jitter matrices to store, read and write data rather than reading straight MIDI values. Using the imp.dmx help file, I turned this into a 27-channel Max for Live device (four lights, seven channels each).

T-Ø_DMX M4L Device Presentation

T-Ø impdmx Max

Up to this point, the Enttec setup has been more stable and the device functions somewhat as intended. I did, however, need to limit the number of channels to 27 instead of 512 in order to sustain a higher frame rate and avoid overloading the device when modulating large amounts of control data.
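As a point of reference for the reduced frame, here is a small sketch of what the Enttec DMX USB Pro expects over its serial port, assuming the widget’s documented “Output Only Send DMX” message (label 6). The device path and the single channel assignment shown are assumptions for illustration, not details of my actual Max for Live device.

```python
# Minimal sketch of an Enttec DMX USB Pro output packet sent over serial.
import serial

PORT = "/dev/tty.usbserial-EN000000"   # hypothetical device path
NUM_CHANNELS = 27                      # reduced frame, as in the M4L device

def dmx_packet(levels):
    """Wrap channel levels (0-255) in an Enttec USB Pro 'Output Only Send DMX' packet."""
    data = bytes([0x00]) + bytes(levels)   # DMX start code + channel data
    length = len(data)
    return bytes([0x7E, 0x06, length & 0xFF, length >> 8]) + data + bytes([0xE7])

levels = [0] * NUM_CHANNELS
levels[0] = 255        # e.g. light 1, first channel, full on (assumed layout)

with serial.Serial(PORT, baudrate=57600) as widget:
    widget.write(dmx_packet(levels))
```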

Audio Setup

The way in which a performer interacts in real-time performance adds another dimension to the visual component. I aim to hide my light-emitting, distracting computer from the audience’s view, and have toyed with the idea of performing behind a screen backlit by DMX lights (see the video on DMX improv with shadows). Although there is still much work to be done refining a setup that will allow me to do so, I have put together a working model that uses audio-to-MIDI conversion to control virtual instruments. The bass’s audio input can also be added as another voice, processed and manipulated in real time.

Equipment:

Ableton Live 9
PUSH Controller
Korg NanoKontrol
Roland EV-5 Foot controller
Max For Live
Bass Guitar
SoftStep Foot Controller
NI Virtual Instruments

Audio Setup

Instrument voicing:

CH 1 – Bass Audio Input (Amp-simulated bass sounds, Rhythmic Clicks/Beats, Distorted)
CH 2 – Sub Bass (Sustained or Arpeggiated)
CH 3 – Pads, Leads, Atmospheric Noise

T-Ø AVE instrument Live

Mapping Sound to Light

As a means to bridge visual and sonic events, I have experimented with different methods of mapping audio frequency to DMX control data.

EX 1- Three Lights, Three Voices, Three Colors

In the first example, using the DMX console setup, I daisy-chained and mapped three different colored lights (Red, Blue, White). Each responds to a different audio source in Live, routed through an envelope follower that is mapped to control an individual light’s DMX values (0-255); a sketch of this mapping follows the list below.

Rhythmic – Red > Light 1 (right)
Sub – Blue > Light 2 (middle)
Noise Lead – White > Light 3 (left)
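To make the mapping concrete, here is a rough sketch of the idea outside of Live and Max: a simple one-pole envelope follower per voice, whose output is clipped into the DMX intensity range. The attack and release times and the three-voice layout are illustrative assumptions, not the settings of the actual envelope follower devices.

```python
# Sketch of EX 1's mapping: one envelope follower per voice drives one light.
import numpy as np

class EnvelopeFollower:
    def __init__(self, sample_rate=44100, attack_ms=5.0, release_ms=150.0):
        self.level = 0.0
        self.attack = np.exp(-1.0 / (sample_rate * attack_ms / 1000.0))
        self.release = np.exp(-1.0 / (sample_rate * release_ms / 1000.0))

    def process(self, block: np.ndarray) -> float:
        """Track the amplitude envelope of one audio block, sample by sample."""
        for x in np.abs(block):
            coeff = self.attack if x > self.level else self.release
            self.level = coeff * self.level + (1.0 - coeff) * x
        return self.level

def to_dmx(level: float) -> int:
    """Clip a 0.0-1.0 envelope level into the DMX range 0-255."""
    return int(max(0.0, min(1.0, level)) * 255)

followers = {"rhythmic": EnvelopeFollower(),    # -> light 1 (red)
             "sub": EnvelopeFollower(),         # -> light 2 (blue)
             "noise_lead": EnvelopeFollower()}  # -> light 3 (white)

# For each incoming audio block per voice:
# dmx_value = to_dmx(followers["sub"].process(block))
```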


Findings: Although this scenario might be interesting for a short period of time, it did not communicate an expansive dynamic range of expression. However, projecting onto an object or wall might be worth further investigation.

EX 2- Screen, Lights, Proximity

Wanting to explore greater dimension and possibility, I brought in a huge 10-by-10-foot back-projection screen borrowed from LTSTS (no easy feat to transport or assemble).

Screen

Due to its size, we were limited to conducting experiments in a bright, noisy atrium in Alison House. Below are two video examples of a three-light setup without audible instrumentation.

Lights_Screen_No Sound_Close

Lights_Screen_No Sound_Distant

Findings: In this well-lit environment, when the LEDs are off you can see a grey background caused by the screen itself. A dark space is needed for this to be optimally effective. Additionally, the effects of the lighting change with proximity. It could prove interesting to stagger light distances at different stages of the performance.

EX 3- Giving Sounds Color and Movement

Using a Sony Handycam, we filmed in a lit atrium. I am using two synced lights and the improved Enttec DMX Pro setup, controlled by three different instrument voices. Each instrument voice is assigned a color. The control is driven by each voice’s individual audio output, linked to a corresponding envelope follower and mapped to DMX color values; a sketch of this color mapping follows the list below.

Angelic Pad – White
Rhythmic Bass – Red
Sub – Blue
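As a rough illustration of this voice-to-color idea, the sketch below drives one color channel per voice on both synced lights. The seven-channel fixture layout (dimmer, red, green, blue, white, strobe, mode) is an assumption made for the example, not taken from the VISAGE documentation.

```python
# Sketch of EX 3: each voice's envelope level drives one color channel
# on every (synced) light in the frame.
CHANNELS_PER_LIGHT = 7
DIMMER, RED, GREEN, BLUE, WHITE = 0, 1, 2, 3, 4   # assumed channel offsets

def to_dmx(level: float) -> int:
    """Clip a 0.0-1.0 envelope level into the DMX range 0-255."""
    return int(max(0.0, min(1.0, level)) * 255)

def color_frame(pad_level, bass_level, sub_level, num_lights=2):
    """Angelic pad -> white, rhythmic bass -> red, sub -> blue, on every light."""
    frame = [0] * (CHANNELS_PER_LIGHT * num_lights)
    for light in range(num_lights):
        base = light * CHANNELS_PER_LIGHT
        frame[base + DIMMER] = 255               # assumed master dimmer at full
        frame[base + RED] = to_dmx(bass_level)
        frame[base + BLUE] = to_dmx(sub_level)
        frame[base + WHITE] = to_dmx(pad_level)
    return frame

# The resulting frame could be written out through the Enttec packet sketched earlier.
```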

EX 4- Combined Voicing, New Permutations

The following is an example of how the basic colors and voicings work in conjunction with one another, generating new effects and color combinations, yet are still able to return to their original states (red, white, blue) when played individually.

EX 5- Adding Shadows

Improvisation combining and switching between voices and lighting control while experimenting with effects produced by shadows.

Findings: Using the Enttec DMX PRO and a projection screen yields a higher-quality dynamic range of expression. The lights are significantly more responsive, but the setup still requires tweaking to generate a greater range of fade values (i.e. contrasting quiet with loud, and creating a pulsing effect for sustained sounds).

Our camera distorts when recording sounds linked to the strobe parameter. This phenomenon creates an effect in itself and could possibly be captured, projected, and even fed back and looped, as it looks quite interesting.

Shadows cast from behind the screen create an intriguing result and may be useful for extending meaning, depth and character if carefully executed or choreographed. Additionally, experimenting with placing myself with my bass guitar, or another performer, behind the screen might help tie together ideas for a more integrated and engaging audio visual instrument.

Critical Analysis and Moving Forward:

Although these experiments show progress and potential, there is still much work to be done on both the audio and visual fronts. At this stage, as it stands and on its own, I do not envision my instrument being dynamic or compelling enough to sustain meaning and interest for extended periods of time. As my group has been working on experiments individually, it has been difficult to assess directly how this will function as part of a larger performance. Moving forward, I aim to develop my instrument further in its own right as well as work towards integrating it as a subsection of the ensemble.

Plans for further development:

Dial in specific sounds, effects and performance techniques that optimize sound generation and DMX feedback, and that work well standalone as well as with the rest of the ensemble.

The audio-to-MIDI conversion needs to be refined. Tracking bass frequencies is no easy task, and I often get unexpected, falsely triggered notes; one possible gating approach is sketched below.
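As a sketch of one refinement I could try, the gate below only emits a note when the detected pitch stays stable across a few analysis frames and the input level clears a threshold. The thresholds and frame counts are placeholder values, and this illustrates the idea rather than the tracker currently used in the Live set.

```python
# Simple stability gate for a pitch tracker's output, to suppress false triggers.
class NoteGate:
    def __init__(self, stable_frames=3, level_gate=0.05, semitone_tolerance=0.5):
        self.stable_frames = stable_frames      # frames of agreement required
        self.level_gate = level_gate            # minimum input level (0.0-1.0)
        self.tolerance = semitone_tolerance     # allowed pitch wobble
        self.candidate = None
        self.count = 0
        self.current_note = None

    def update(self, midi_pitch: float, level: float):
        """Return a MIDI note number to trigger, or None."""
        if level < self.level_gate or midi_pitch <= 0:
            # Signal too quiet or no pitch detected: reset the gate.
            self.candidate, self.count, self.current_note = None, 0, None
            return None
        if self.candidate is not None and abs(midi_pitch - self.candidate) <= self.tolerance:
            self.count += 1
        else:
            self.candidate, self.count = midi_pitch, 1
        note = round(midi_pitch)
        if self.count >= self.stable_frames and note != self.current_note:
            self.current_note = note
            return note
        return None
```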

Set up foot controllers to aid in modulating sonic and visual elements. Up to this point, these have not been implemented.

Tweak and scale modulation sources and envelope followers to be more dynamic with mode, strobe and fade values.  Develop precise control mechanisms that will allow for better ways of expressing relationships between sound and silence.

Investigate incorporating a dimmer pack and setting up additional lighting sources (lamps) that can be placed around the stage or in the audience. One idea involves switching various audio effect processes on/off, or changing instrument voicing, to trigger corresponding light states (on, off, dim, bright, flicker); a sketch of how such states might be recalled follows below.
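Purely as a sketch of how those states might be organized, the snippet below maps named light states to dimmer values that could be recalled whenever an effect is toggled or the voicing changes. The state names and values are hypothetical placeholders.

```python
# Named light states recalled by effect toggles or voicing changes (placeholder values).
import time

LIGHT_STATES = {"off": 0, "on": 200, "dim": 60, "bright": 255}

def dimmer_value(state: str) -> int:
    """Return a dimmer level (0-255) for the named light state."""
    if state == "flicker":
        # Crude flicker: alternate roughly every 100 ms.
        return 255 if int(time.time() * 10) % 2 == 0 else 0
    return LIGHT_STATES.get(state, 0)

# e.g. switching a distortion effect on could call dimmer_value("flicker")
# each frame, while bypassing it could fall back to dimmer_value("dim").
```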

Work as a group to quickly identify a unifying theme.  Create a map of how our performance will move throughout time and devise how we can keep it engaging throughout its duration. Schedule regular group rehearsals in an effort to better understand how we operate as a dynamic and cohesive unit.
