The Birth of the ‘Hairy Ball’ – Work in Progress on the Coding

The ‘Hairy Ball’ is one of the visual effects we made for the project ‘Brain Drain’. It aims to interpret the performer’s emotional state in a vivid and readable way, and I am mainly responsible for this design.

This visualization is based on the Processing example ‘Noise Sphere’ by David Pena (Ref: Location/Processing/File/Examples/Topics/Geometry/NoiseSphere), but I did a lot of further work in the code to develop and re-create the visual effects.

Here is the work-in-progress log of the design.

Timeline

26 Feb 2014 – Created the basic visual effects for the ‘Hairy Ball’:

· Excitement: hairs grow randomly longer, so the ball appears to shimmer

· Engagement: the Hairy Ball rotates through 360°

· Meditation: hairs move steadily from the centre of the ball to its edge and back again

· Frustration: the Hairy Ball trembles

27 Feb 2014 – Tried duplicating the ‘Hairy Ball’ so that four of them appeared together, but eventually dropped the idea

12 March 2014

· Added an alarm function with a TV-noise effect to indicate that the headset is not working properly

· Played with colour by colouring the hair according to the emotional state, but eventually dropped the idea

· Added a fifth parameter, Boredom: the Hairy Ball stays fixed and ‘breathes’

20 March 2014

· Modified the meditation effect: sped up the movement to make the transition neat and clean

· Added a text field for displaying real-time data

27 March 2014

· Had the idea of projecting the ‘Hairy Ball’ visualization onto a real sphere, such as a yoga ball

1 April 2014

· Combined the Processing code with the Arduino code

· Connected the Processing code to the other visuals over IP

Processing Code: ‘Hairy Ball’

import oscP5.*;
import netP5.*;
import ddf.minim.*;
NetAddress myRemoteLocation1;
NetAddress myRemoteLocation2;
NetAddress myRemoteLocation3;
int cuantos = 3000;//change the density of the hair
Pelo[] lista ;
float[] z = new float[cuantos];
float[] phi = new float[cuantos];
float[] largos = new float[cuantos];
float radio;
float rx = 0;
float ry =0;
boolean checkPlay=false;
Minim minim;
AudioPlayer player;

float excitement=0.8;
float engagement=0.6;
float frustration=0.4;
float meditation=0.2;
float boredom=0;
float lastExcitement;
float lastEngagement;
float lastFrustration;
float lastMeditation;
float lastBoredom;
int excount=0;
int encount=0;
int fcount=0;
int mcount=0;
int bcount=0;
int allcount=1500;//number of frames a value must stay unchanged before the headset is treated as not working

OscP5 oscP5;

int angle;

void setup() {
oscP5 = new OscP5(this, 7400);
minim = new Minim(this);
player = minim.loadFile("WhiteNoiseSmall.aif");

myRemoteLocation1 = new NetAddress("172.20.187.146",5001);
myRemoteLocation2 = new NetAddress("172.20.187.146",5001);
myRemoteLocation3 = new NetAddress("172.20.187.146",5001);

size(displayWidth, displayHeight, P3D);
radio = 150;//can change the size of the hairy ball
lista = new Pelo[cuantos];
for (int i=0; i<cuantos; i++) {//drawing the hair
lista[i] = new Pelo();
}
noiseDetail(3);//change the sharpness of the display
}

void draw() {

noCursor();

print("excitement="+excitement+"\n");
print("engagement="+engagement+"\n");
print("frustration="+frustration+"\n");
print("meditation="+meditation+"\n");
print("boredom="+boredom+"\n");
print("excount="+excount+"\n");
print("encount="+encount+"\n");
print("fcount="+fcount+"\n");
print("mcount="+mcount+"\n");
print("bcount="+bcount+"\n");
excitement=excitement+0.0001;
engagement=engagement+0.0005;
frustration=frustration+0.001;
meditation=meditation+0.0015;
//boredom=boredom+0.002;

if(mousePressed==false){ // create an osc message
OscMessage myMessage = new OscMessage("/test");

println(meditation+"-"+frustration+"-"+engagement+"-"+excitement+"-"+boredom);
myMessage.add(meditation);
myMessage.add(frustration);
myMessage.add(engagement);
myMessage.add(excitement);
myMessage.add(boredom);
// send the message
oscP5.send(myMessage, myRemoteLocation1);
oscP5.send(myMessage, myRemoteLocation2);
oscP5.send(myMessage, myRemoteLocation3);
}

background(0);

//Text field for monitoring the data values
textSize(20);
fill(255);

text("EXCITEMENT "+excitement, width-250, height-100);
text("ENGAGEMENT "+engagement, width-250, height-80);
text("FRUSTRATION "+frustration, width-250, height-60);
text("MEDITATION "+meditation, width-250, height-40);
text("BOREDOM "+boredom, width-250, height-20);
translate(width/2, height/2);//P3D set up
//count how many frames each parameter has stayed unchanged; a stalled parameter indicates the headset is not working
if (excitement==lastExcitement) {
excount++;
if (excount>allcount) {
excount=allcount+1;
}
}
if (engagement==lastEngagement) {
encount++;
if (encount>allcount) {
encount=allcount+1;
}
}
if (frustration==lastFrustration) {
fcount++;
if (fcount>allcount) {
fcount=allcount+1;
}
}
if (meditation==lastMeditation) {
mcount++;
if (mcount>allcount) {
mcount=allcount+1;
}
}
if (boredom==lastBoredom) {
bcount++;
if (bcount>allcount) {
bcount=allcount+1;
}
}

if (excitement!=lastExcitement) {
excount=0;
}
if (engagement!=lastEngagement) {
encount=0;
}
if (frustration!=lastFrustration) {
fcount=0;
}
if (meditation!=lastMeditation) {
mcount=0;
}
if (boredom!=lastBoredom) {
bcount=0;
}

lastExcitement=excitement;
lastEngagement=engagement;
lastFrustration=frustration;
lastMeditation=meditation;
lastBoredom=boredom;

if (excount>allcount||encount>allcount||fcount>allcount||mcount>allcount||bcount>allcount) {

//here is the effect of TV noise when the headset is not working properly
//reference: www.openprocessing.org/sketch/24107
translate(-width/2, -height/2);
for (int i=width; i>=0; i-=4) {
for (int j=width; j>=0; j-=4) {
fill(random(255));
rect(i, j, 4, 4);
noStroke();
}
}
if(checkPlay==false){
minim = new Minim(this);
player = minim.loadFile("WhiteNoiseSmall.aif");
player.play();
checkPlay=true;
}
}
else {
checkPlay=false;
player.close();
minim.stop();
if (boredom>engagement && boredom>frustration && boredom> meditation&& boredom> excitement) {
//ADD YOUR effect FOR BOREDOM HERE
rotateY(0);
rotateX(0);
fill(0);
noStroke();
radio = height/4*boredom;
sphere(radio);

for (int i = 0;i < cuantos; i++) {
lista[i].dibujar();
}
}

if (excitement>engagement && excitement>frustration && excitement> meditation&& excitement>boredom) {

rotateY(radians(0));
rotateX(radians(0));
fill(0);
radio = height/4*excitement; //radius scales with the excitement value
noStroke();
sphere(radio);

for (int i = 0;i < cuantos; i++) {
lista[i].dibujar();
}
}

if (engagement>excitement && engagement>frustration && engagement>meditation&&engagement>boredom) {

angle++;//wrap the angle back to 0 after a full 360° rotation
if(angle==360){angle=0;}
rotateY(radians(angle));//ball can spin and rotate
rotateX(radians(angle));
radio = height/4*engagement; //radius scales with the engagement value
fill(0);
noStroke();
sphere(radio);

for (int i = 0;i < cuantos; i++) {
lista[i].dibujar();
}
}

if (frustration>excitement && frustration>engagement && frustration>meditation && frustration>boredom) {

float rxp = ((random(150, 450)-(width/2))*0.005);//ball can fluctuate
float ryp = ((random(150, 450)-(height/2))*0.005);
rx = (rx*0.9)+(rxp*0.1);
ry = (ry*0.9)+(ryp*0.1);
rotateY(rx);
rotateX(ry);
radio = height/4*frustration; //radius scales with the frustration value
fill(0);
noStroke();
sphere(radio);

for (int i = 0;i < cuantos; i++) {
lista[i].dibujar();
}
}

if (meditation>excitement && meditation>engagement && meditation>frustration&& meditation>boredom) {

rotateY(radians(0));
rotateX(radians(0));
fill(0);
noStroke();
radio = height/4*meditation; //radius scales with the meditation value
sphere(radio);

for (int i = 0;i < cuantos; i++) {
lista[i].dibujar();
}
}
}
}
class Pelo {

float z = random(-radio, radio);
float phi = random(TWO_PI);
float largo = random(1.15, 1.2);//control the length of the hair
float theta = asin(z/radio);

void dibujar() {//grow hair

if (meditation>excitement && meditation>engagement && meditation>frustration &&meditation>boredom&& excount<allcount && encount<allcount && fcount<allcount && mcount<allcount&&bcount<allcount) {
float largo = random(1.15, 1.2);//control the length of the hair
float off = (random(millis()* 0.0005, sin(phi))-0.5) * 0.6;//random() instead of noise() keeps the hair looking vivid and gives a falling effect
float offb = (random(millis() * 0.0007, sin(z) * 0.01)-0.5) * 0.6;//random() instead of noise() keeps the hair looking vivid and gives a falling effect
float thetaff = theta+off;
float phff = phi+offb;
float x = radio * cos(theta) * cos(phi);
float y = radio * cos(theta) * sin(phi);
float z = radio * sin(theta);
float xo = radio * cos(thetaff) * cos(phff);
float yo = radio * cos(thetaff) * sin(phff);
float zo = radio * sin(thetaff);
float xb = xo * largo*1;//multiplying by a factor greater than 1 lengthens each hair
float yb = yo * largo*1;//multiplying by a factor greater than 1 lengthens each hair
float zb = zo * largo*1;//multiplying by a factor greater than 1 lengthens each hair
beginShape(LINES);//drawing lines
//strokeWeight(2);
stroke(0);
vertex(x, y, z);
stroke(230,200);
vertex(xb, yb, zb);
endShape();
}

else if (excitement>engagement && excitement>frustration && excitement> meditation &&excitement>boredom&& excount<allcount && encount<allcount && fcount<allcount && mcount<allcount&&bcount<allcount) {
float largo = random(1.2, 1.35);//control the length of the hair
float off = (noise(millis()* 0.0005, sin(phi))-0.5) * 0.3;//noise() makes the hair look vivid; changing it to random() would give a falling effect
float offb = (noise(millis() * 0.0007, sin(z) * 0.01)-0.5) * 0.3;//noise() makes the hair look vivid; changing it to random() would give a falling effect
float thetaff = theta+off;
float phff = phi+offb;
float x = radio * cos(theta) * cos(phi);
float y = radio * cos(theta) * sin(phi);
float z = radio * sin(theta);
float xo = radio * cos(thetaff) * cos(phff);
float yo = radio * cos(thetaff) * sin(phff);
float zo = radio * sin(thetaff);
float xb = xo * largo*1;//multiplying by a factor greater than 1 lengthens each hair
float yb = yo * largo*1;//multiplying by a factor greater than 1 lengthens each hair
float zb = zo * largo*1;//multiplying by a factor greater than 1 lengthens each hair
beginShape(LINES);//drawing lines
//strokeWeight(4);
stroke(255);
vertex(x, y, z);
stroke(150,100);
vertex(xb, yb, zb);
endShape();
}

else {
float off = (noise(millis()* 0.0005, sin(phi))-0.5) * 0.3;//noise() makes the hair look vivid; changing it to random() would give a falling effect
float offb = (noise(millis() * 0.0007, sin(z) * 0.01)-0.5) * 0.3;//noise() makes the hair look vivid; changing it to random() would give a falling effect
float thetaff = theta+off;
float phff = phi+offb;
float x = radio * cos(theta) * cos(phi);
float y = radio * cos(theta) * sin(phi);
float z = radio * sin(theta);
float xo = radio * cos(thetaff) * cos(phff);
float yo = radio * cos(thetaff) * sin(phff);
float zo = radio * sin(thetaff);
float xb = xo * largo*1;//multiplying by a factor greater than 1 lengthens each hair
float yb = yo * largo*1;//multiplying by a factor greater than 1 lengthens each hair
float zb = zo * largo*1;//multiplying by a factor greater than 1 lengthens each hair
beginShape(LINES);//drawing lines
// strokeWeight(2);
stroke(255);
vertex(x, y, z);
stroke(150,100);
vertex(xb, yb, zb);
endShape();
}
}
}

 

void oscEvent(OscMessage theOscMessage) {
// check if theOscMessage has an address pattern we are looking for
if(theOscMessage.checkAddrPattern("/AFF/Excitement") == true) {
// parse theOscMessage and extract the values from the OSC message arguments
//excitement = ceil(theOscMessage.get(0).floatValue()*255);
excitement = theOscMessage.get(0).floatValue();
} else if (theOscMessage.checkAddrPattern("/AFF/Meditation") == true) {
meditation = theOscMessage.get(0).floatValue();
}
if(theOscMessage.checkAddrPattern("/AFF/Engaged/Bored") == true) {
// parse theOscMessage and extract the values from the OSC message arguments

engagement = theOscMessage.get(0).floatValue();
boredom = 1-engagement; //to separate boredom from engagement
//println("ENTERED");
} else if (theOscMessage.checkAddrPattern("/AFF/Frustration") == true) {
frustration = theOscMessage.get(0).floatValue();
}
}

Sound Object Spectrograms and Waveforms

The spectrograms display frequency on the Y axis and time on the X axis.

The waveform diagrams show the sounds’ intensity/SPL over time.

Ebow on piano string – spectrogram and waveform

This is the Ebow on the piano string. Visible are the even envelope and the layered, continuous pitches of the fundamental frequency and overtones. The transients in the latter part stem from the attached vibrating guitar string ball end. The granular pitch shifter we used for this sound is not employed here.

 

Tibetan singing bowl – spectrogram and waveform

This is the Tibetan singing bowl. In terms of envelope it has similar characteristics to the piano string (the waveform looks exaggerated due to the zoom factor). Likewise, we again see continuous pitches/frequencies representing the fundamental plus overtones, though with a less rich spectrum. The visible low-end rumble is due to room and background noise.

 

Light sensor ‘Theremin’ – spectrogram and waveform

This is the light sensor ‘Theremin’. Again we see a steady envelope with no harsh transients. The sharp onset depicts the sound being switched on. Fundamental and overtones are discernible, too. The parabolic shapes in the mid and high frequencies mirror hand movement over the sensor, and the frequency modulation this results in. Of note is the fairly dense high range.

 

Windchimes with motor and patch – spectrogram and waveform

In the windchimes’ diagrams we can see some mid to high frequency vertical lines. These display the sound of the servo motor that sets the chimes in motion. The fairly crowded low midrange and low end is due in part to room noise, but mainly stems from the granular harmonizer patch that enhances the chimes’ sound.

 

Metal arm moving over piano strings – spectrogram and waveform

In the diagrams for the metal arm moving over the piano strings, we can see much sharper onsets and more transients (the vertical lines are much more dominant in both the spectrogram and the waveform), followed by sustained sounds (represented by the horizontal lines, which also illustrate how high frequency sustain is much shorter). Of note is the amazing density over a very wide frequency spectrum. (I haven’t actually included the full audible frequency range in the spectrograms, but capped them off at around 16 kHz to get a little more detail in, and stay a little closer to what many beyond a certain age can hear).

 

Wooden fish – spectrogram and waveform

The diagrams for the wooden fish display the more transient, percussive nature of the sound, with many needle-like peaks in the waveform. In the low mids and low end we can discern the descending melody generated by the employed granulation Maxpatch. Still, some seemingly (but not really) continuous horizontal lines in the spectrogram suggest some steadiness in terms of frequency/pitch.

 

Laser mic (two modes) – spectrogram and waveform

The laser mic diagrams represent two different sounds: the melody created by the laser reflecting off the evenly-spaced moving dots on the turntable, and the flashlight hitting the solar panel a few times near the end. In the first part of the spectrogram we can discern the melodic contour (and pause) the player created. Also visible is the narrow band the sound occupies – many people who listened to this sound found it resembled an old vinyl recording or the sound of their grandparents’ radio. The flashing light’s sharp attacks are more clearly visible in the waveform, as a rapid succession of very short peaks. (The spectrogram does not really convey that sound’s characteristics at this fairly broad zoom level.) This sound was almost uniformly experienced as very disturbing and stressful.

Pictures from presentation day

Tibetan singing bowl operated by turntable.
Photo courtesy of Sonia Ali

Marco Melis “playing” the Tibetan singing bowl.
Photo courtesy of Sonia Ali

Wheelbearings moving on a turntable.
Photo courtesy of Sonia Ali

Arduino in custom-made transparent box.
Photo courtesy of Sonia Ali

Projected visuals representing the emotional state of the headset-wearing performer in real-time.
Photo courtesy of Sonia Ali

Self-built speaker connected to light-sensor controlled sound source.
Photo courtesy of Dan McGurty

Projected visuals representing the emotional state of the headset-wearing performer in real-time. “Speak into me”-pipe in foreground.
Photo courtesy of Dan McGurty

Timo Preece playing the light-sensor.
Photo courtesy of Dan McGurty

Cameron McNair playing the wheelbearing-moving turntable.
Photo courtesy of Dan McGurty

Motion-sensor controlled windchimes.
Photo courtesy of Dan McGurty

Final Presentation room setting

We are going to use three different visualizations for our final presentation. The first visualization is the “hairy ball” by Whitney, the second is “generative boxes” by Asuka, and the last is the “radar chart” by Alvin.

Hairy Ball

Generative Boxes

Radar chart

Setting

To project the three visualizations, we need three walls and a completely dark room. First of all, we covered all the windows to shut out light coming from outside. Secondly, we set up three projectors and screens as below.

  1. Hairy ball
  2. Radar chart
  3. Generative boxes

We tried to put a projector on top of the rack on the left side to create a big, cinema-like screen, but we found it difficult because there was not enough space left for the other visualizations.

The sound objects will be placed around the participant and along the left side of the room, because we want the audience to play them. For the projection, the space between the screens and the projectors must be kept clear.

OSC communication between multiple laptops

As there is only one headset, we need to send OSC messages to multiple laptops to show the three different visualizations.

Communication method

  1. The Processing code on laptop A receives the OSC messages from the headset.
  2. Laptop A sends the OSC messages on to laptops B and C.


To send and receive messages between laptops, the Processing code on each laptop has to include a port definition in void setup().

void setup() {

//start oscP5, listening for incoming messages on port 5001
//(laptop A must instead listen on port 7400, to match the port in Mind Your OSCs)
oscP5 = new OscP5(this, 5001);
size(displayWidth, displayHeight, OPENGL);
}

“oscP5 = new OscP5(this, 7400);” starts oscP5 listening for incoming messages on port 7400. The port number is arbitrary, but laptop A, which receives the OSC messages directly from the headset, must listen on port 7400 because that is the port used by Mind Your OSCs.
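
For laptops B and C, the receiving side works the same way as in the ‘Hairy Ball’ sketch: the values are read out of the incoming message in oscEvent(). Below is a minimal sketch of such a receiver, assuming the “/test” address pattern and argument order (meditation, frustration, engagement, excitement, boredom) sent by the ‘Hairy Ball’ code above.

// Minimal receiver sketch for laptops B and C (an illustration, assuming the
// "/test" message layout sent by laptop A: five floats in the order
// meditation, frustration, engagement, excitement, boredom).
import oscP5.*;
import netP5.*;

OscP5 oscP5;
float meditation, frustration, engagement, excitement, boredom;

void setup() {
  size(400, 400);
  // listen on port 5001, the port laptop A sends to
  oscP5 = new OscP5(this, 5001);
}

void draw() {
  background(0);
  // the received values would drive the visualization here
}

// read the five floats in the order sent by laptop A
void oscEvent(OscMessage m) {
  if (m.checkAddrPattern("/test")) {
    meditation  = m.get(0).floatValue();
    frustration = m.get(1).floatValue();
    engagement  = m.get(2).floatValue();
    excitement  = m.get(3).floatValue();
    boredom     = m.get(4).floatValue();
  }
}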

Code for sending message

In addition, laptop A includes the following code.

void setup() {

//start oscP5, listening for incoming messages on port 7400
//make sure this matches the port in Mind Your OSCs
oscP5 = new OscP5(this, 7400);
size(displayWidth, displayHeight, OPENGL);

myRemoteLocationB = new NetAddress("127.0.0.1",5001); // laptop B
myRemoteLocationC = new NetAddress("127.0.0.2",5001); // laptop C

}

The “myRemoteLocationB” and “myRemoteLocationC” lines define the IP address and port number of laptops B and C. The Processing sketch on laptop A then sends its messages to these remote locations. Using this method, a Processing sketch can send and receive not only numeric values but also text.
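
As a small illustration of that point (using the “myRemoteLocationB” and “myRemoteLocationC” variables defined above and a made-up “/status” address pattern), a single OSC message can carry a mix of numbers and text and be sent to both laptops:

// Hypothetical helper on laptop A, for illustration only:
// one OSC message can carry both numbers and text,
// and the same message is sent to both remote laptops.
void sendStatus(float excitement, String note) {
  OscMessage msg = new OscMessage("/status"); // made-up address pattern for this example
  msg.add(excitement);                        // a float argument
  msg.add(note);                              // a text argument
  oscP5.send(msg, myRemoteLocationB);
  oscP5.send(msg, myRemoteLocationC);
}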


Visualization prototype: Generative Boxes

This visualization is an improved version of the “Generative Polygon”. To create more dynamic motions, I modified some of the emotion movements according to suggestions from the audience.

Previous version

Visualization prototype: Generative Polygon

Effects

  • Excitement: expanding and shrinking boxes / orange
  • Engagement: falling boxes / light green
  • Boredom: hovering box / yellow
  • Frustration: trembling boxes / pink
  • Meditation: breathing box / blue

These motions scale with the value of each emotion.

Improvement

I changed the motions for engagement, boredom and meditation because they were not very active. The motions for excitement and frustration are exaggerated in the latest visualization, and it seems to be more successful than the previous version in terms of audience engagement.

I got the idea for the engagement motion from the links below.

The expanding, shrinking and breathing motions all use the same trigonometric functions, sine and cosine of an angle theta, which produce smooth, regular curves.
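
As a minimal sketch of this idea (an illustration, not the project’s actual code), a box can ‘breathe’ by letting its size follow a sine curve scaled by the meditation value:

// Minimal breathing-box sketch, for illustration only:
// the box size follows a sine curve scaled by the meditation value.
float theta = 0;
float meditation = 0.8; // placeholder value; in the project this arrives over OSC

void setup() {
  size(400, 400, P3D);
}

void draw() {
  background(0);
  translate(width/2, height/2);
  // sin(theta) swings between -1 and 1, so the size swells and shrinks smoothly
  float boxSize = 50 + 30 * meditation * sin(theta);
  rotateY(theta * 0.5); // slow rotation so the 3D box is visible
  stroke(0, 0, 255);    // meditation is shown in blue
  noFill();
  box(boxSize);
  theta += 0.05;
}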

Further improvement

I received suggestions that the boredom effect is not very successful because its motion does not relate to boredom very well. There is also an issue that only engagement draws flat rectangles instead of 3D boxes. I tried to solve this, but the falling function did not work well.

Visualization prototype: Geometric boxes with Colours

Previous works

This prototype is a new version of the previous Geometric Boxes code.

Playing with colours

We received some suggestions to play with colours in our visualization. We have focused on creating more active motions rather than on colour because of colour-blindness issues, so in this code I added colour only as a complementary function.

The colour indicates the strongest of the four emotion parameters (a small sketch of this mapping follows the list below).

  • Excitement: red
  • Engagement: yellow
  • Frustration: green
  • Meditation: blue
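
A small sketch of this mapping (an illustration, not the project code): pick the largest of the four emotion values and return the corresponding colour, which can then be passed straight to fill() before the boxes are drawn.

// Illustration only: return the colour of the strongest emotion
color emotionColour(float excitement, float engagement,
                    float frustration, float meditation) {
  float strongest = max(max(excitement, engagement), max(frustration, meditation));
  if (strongest == excitement)  return color(255, 0, 0);   // red
  if (strongest == engagement)  return color(255, 255, 0); // yellow
  if (strongest == frustration) return color(0, 255, 0);   // green
  return color(0, 0, 255);                                 // blue (meditation)
}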

Code is here.

dmsp.digital.eca.ed.ac.uk/blog/braindrain2014/2014/03/11/visualization-prototype-geometric-boxes-with-colours-2/