Bass Guitar DMX Instrument
My goal was to construct a highly responsive, expressively dynamic and diverse system that could be improvised in real time with a “traditional” instrument (a standard bass guitar), playing electronic sounds that trigger and modulate specific DMX light movement in a way that directly connects sound and vision (light).
Submission one and my blog describe in detail my initial progress, observations, trials and tribulations.
Post-Submission 1 Progress
Besides handling the bulk of the administration work, organizing and booking equipment, coordinating with the venue, acting as liaison between DMSP groups, etc., I worked on refining my audio visual instrument and rehearsing.
To add diversity and flexibility to the visual side of my instrument, I devised a way to incorporate standard household lighting: I purchased a four-channel DMX dimmer pack, rewired five lamps to use Euro plug IEC adaptors and added an additional par can stage lamp.
Although the lamps and lights were directly controlled by my bass guitar’s audio input signal, I needed a way to shape state changes over time. By automating envelope parameters, which could be triggered automatically or manually from my setup, I gained finer control, which in turn kept the lighting changes in our performance more varied.
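The core of this audio-to-light mapping can be sketched in a few lines of Python (an illustrative sketch only, not the actual patch; the attack and release coefficients are arbitrary placeholders):

```python
def envelope_follower(samples, attack=0.5, release=0.05):
    """Smooth the absolute audio signal with separate attack and
    release coefficients; returns one envelope value per sample (0.0-1.0)."""
    env = 0.0
    out = []
    for s in samples:
        level = abs(s)
        # Rise quickly on attack, fall slowly on release.
        coeff = attack if level > env else release
        env += (level - env) * coeff
        out.append(env)
    return out

def to_dmx(env_value):
    """Clamp an envelope value to 0..1 and scale it to an 8-bit DMX byte."""
    return int(max(0.0, min(1.0, env_value)) * 255)
```

Automating the attack and release values over time is what lets the same input signal produce slow swells in one section and sharp flickers in another.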
Sonically, my instrument needed to be flexible, producing a wide range of content over the course of the performance: sub bass, low bass, glitched rhythmic passages, percussive events, angelic synth pads, thunder, abstract mid-range noise and more.
I split the audio input across three tracks, each corresponding to a different frequency-banded instrument voicing, and built a series of multipurpose effect racks, which I mapped to several hardware controllers.
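Conceptually, the three-way split works like a crossover: subtracting progressively low-passed copies of the signal yields low, mid and high bands that sum back to the original. A minimal sketch, with arbitrary one-pole coefficients standing in for the real crossover points:

```python
def one_pole_lowpass(samples, coeff):
    """Simple one-pole low-pass filter; a higher coeff means a higher cutoff."""
    y = 0.0
    out = []
    for s in samples:
        y += (s - y) * coeff
        out.append(y)
    return out

def split_bands(samples, low_coeff=0.05, high_coeff=0.4):
    """Split a signal into low / mid / high bands by subtraction,
    so the three bands always sum back to the input."""
    low = one_pole_lowpass(samples, low_coeff)
    low_plus_mid = one_pole_lowpass(samples, high_coeff)
    mid = [a - b for a, b in zip(low_plus_mid, low)]
    high = [s - a for s, a in zip(samples, low_plus_mid)]
    return low, mid, high
```

Each band then feeds its own effect chain, so sub bass, mid-range noise and high percussive material can be processed independently.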
Additionally, since I was able to convert my audio input to MIDI data, I built a multilayer instrument rack that allowed me to select and switch between combinations of virtual instruments and modulate their parameters in real time.
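The audio-to-MIDI step boils down to estimating the fundamental frequency of the input and quantizing it to the nearest MIDI note. A rough sketch (using a crude zero-crossing estimator, far simpler than a real pitch tracker):

```python
import math

def freq_to_midi(freq):
    """Quantize a detected fundamental frequency (Hz) to the nearest
    MIDI note number (A4 = 440 Hz = note 69)."""
    return round(69 + 12 * math.log2(freq / 440.0))

def estimate_freq_zero_crossings(samples, sample_rate):
    """Crude pitch estimate: count rising zero crossings per second."""
    crossings = 0
    for a, b in zip(samples, samples[1:]):
        if a < 0 <= b:
            crossings += 1
    duration = len(samples) / sample_rate
    return crossings / duration if duration > 0 else 0.0
```

The resulting note stream is what drives the layered virtual instruments, so a single plucked bass note can voice several synths at once.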
After the group devised a theme, we divided the performance into sections. This was helpful as I was able to automate and interpolate between states by launching scene changes flexibly as needed.
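Interpolating between states amounts to crossfading every DMX channel between two stored scenes. A minimal sketch (the scene format here, one list of channel values per scene, is a hypothetical simplification):

```python
def lerp_scene(scene_a, scene_b, t):
    """Linearly interpolate each DMX channel between two scenes; t runs 0..1."""
    return [round(a + (b - a) * t) for a, b in zip(scene_a, scene_b)]

def crossfade(scene_a, scene_b, steps):
    """Yield intermediate scenes for a timed crossfade between two cues."""
    for i in range(steps + 1):
        yield lerp_scene(scene_a, scene_b, i / steps)
```

Launching a section change then simply means picking a target scene and a fade duration rather than setting every channel by hand.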
Please refer to my essay on spectromorphology in submission 3 for theoretical context and reference to existing scholarship in the field.
Allegorical Irony of the Documentation
It is worth mentioning that, in my opinion, the video footage of the performance failed to capture its full scope. It is ironic and disappointing that the footage is only a ‘shadow’ of the live experience. Although we were using three separate cameras, the stage, performers, lighting and live video exceeded the boundaries of what was recorded.
We were advised to have another group document the performance. I met with them days ahead of time to discuss what needed to be captured and how, but the results were still, for the most part, unsatisfying. Although I am grateful for the help, had we known, we could have made slight adjustments to compensate. With more carefully chosen camera angles, a readjustment of props and a slight repositioning of the performers, the documentation could have captured a much better perspective and more completely conveyed what we were trying to achieve.