going pro with a second screen

In this session, I talked about the main issues that relate to performing live with a laptop and a screen.
It turns out that the skills required for mixing and presenting work for loudspeakers are similar to those for mixing and presenting work for a screen. Just as laptop audio performers develop skill and flexibility with soundcards, digital-to-analogue conversion, XLR cables, mixers, EQ and compression, audiovisual performers develop skills with VGA cables, video splitters, different screen resolutions, and the ways different cables restrict the resolution that can be passed to a projector. AV people learn to tweak visual material so that it reads well on weaker projectors or in brighter or darker environments.
Understanding how your computer behaves with a second screen is essential to gaining confidence on stage and during soundchecks and setups, so it’s a good idea to always work with a second monitor attached.
In your software you need to be able to detect that monitor and read its resolution, so that you can adapt your programme material to fit without stretching pixels.

In particular, we looked at Jitter and OpenGL. We discovered how to make an OpenGL video plane adapt its scale so that it does not stretch its pixels on screen, whatever the resolution of the second monitor.
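The maths behind that scale adaptation can be sketched outside Max. The following is a minimal Python sketch, not Jitter code: it assumes the usual default view mapping, where y spans [-1, 1] and x spans [-screen_aspect, screen_aspect], and that a plane at scale (sx, sy) is 2·sx wide and 2·sy tall.

```python
def fit_scale(video_w, video_h, screen_w, screen_h):
    """Return (sx, sy) for a video plane so the video letterboxes
    into the screen without stretching its pixels.

    Assumes a view where y spans [-1, 1] and x spans
    [-screen_aspect, screen_aspect], and that a plane at
    scale (sx, sy) is 2*sx wide and 2*sy tall.
    """
    video_aspect = video_w / video_h
    screen_aspect = screen_w / screen_h
    if video_aspect >= screen_aspect:
        # video is wider than the screen: fill the width,
        # letterbox top and bottom
        return screen_aspect, screen_aspect / video_aspect
    # video is narrower than the screen: fill the height,
    # pillarbox the sides
    return video_aspect, 1.0
```

For example, a 1280×720 film on a 1024×768 monitor gives sx ≈ 1.33 and sy = 0.75 — the film fills the width and is letterboxed vertically. Those are, roughly, the values you would then set as the videoplane’s scale.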

I also showed how to use Jitter shaders and the jit.gl.slab object to create effects processes on live and pre-recorded video streams.
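A slab is essentially a small per-pixel program run on the GPU. As a language-neutral illustration of the kind of one-operator pass a simple brightness slab performs — this is NumPy on the CPU, not Jitter’s API — consider:

```python
import numpy as np

def brightness(frame, gain):
    """Multiply every pixel by `gain` and clip back to 8-bit range:
    the per-pixel maths of a minimal brightness shader.

    `frame` is a (height, width, 3) uint8 array standing in for
    one video frame.
    """
    out = frame.astype(np.float32) * gain
    return np.clip(out, 0.0, 255.0).astype(np.uint8)
```

Chaining several passes like this, each feeding the next, is what a jit.gl.slab chain does — with the advantage that the work stays on the graphics card.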
We did not look in detail at geometry processing and rendering, or at using pre-made 3D objects with jit.gl.model. There is so much information about this on the Cycling ’74 website, especially Andrew Benson’s extremely helpful Jitter Recipes series (cycling74.com/category/articles/jitter-recipes/), that it seemed unnecessary to open that can of worms until questions about it start coming in from the group.

See the linked Max patches for some of the fun: AVE2014.

However, think about the flow of information first:

  1. Load a film into jit.qt.movie.
  2. If the film loads successfully, ask for more information about it (framecount, moviedim, etc.).
  3. Once you know the size of the movie, you can work out how its dimensions fit the second screen.
  4. Next, adapt the scale of the jit.gl.videoplane to match the proportions of the video, in the context of the monitor’s resolution.
  5. Do some processing on the video with jit.gl.slab before sending it to the videoplane.
  6. etc…

we are beyond mapping these days

First session with the group today. I was a bit pushy and asked everyone to meet again on Thursday morning with an audiovisual instrument.
There was a wealth of ideas floating around and a great skill-base for starting the project. This weekend, there must have been some healthy group bonding thanks to Marco, Russell, Shuman and Jessamine going to Newcastle for the Live Visuals conference: www.realtimevisuals.org/conference/.

I felt it was important to talk and work across the domains of sound and vision, to jump out of our disciplinary silos and really think about sound and light. Importantly then, for me, an audiovisual ensemble is a group playing with sound and light. This doesn’t necessarily mean computers and projectors and instruments — far from it — rather, it means integrating context, performance and ideas via the two things we primarily see and hear in. (Read Tim Ingold, “Against Soundscape” – www.st-andrews.ac.uk/soundanth/work/ingold/ – for more on that idea.)

The group punted some great ideas and terms that are bound to be fruitful as things develop:

  • Synesthesia
  • A blended approach to audiovision
  • DMX control of light through sound (cf. David Butler’s Max for Live library)
  • Plastic man
  • Trigger and spectacle
  • Colour Organs
  • Systems to create things
  • Interactive assets
  • Some elements are fixed, others flexible
  • Libraries and categories of audiovision – what things work as audiovisual complexes, and could these be defined and used as performative tools?
  • Performing to video
  • MadMapper and projection mapping
  • iPhones/iPads as useful controllers, and potentially audiovisual interfaces in their own right
  • Actual instruments
  • Metaphors we want to work with
  • Messages and narratives – what’s going on in the world? Can we reflect opinions or provoke with audiovisual engines?