The Birth of the ‘Hairy Ball’ – Coding Work in Progress

The ‘Hairy Ball’ is one of the visual effects we made for the project ‘Brain Drain’, aiming to interpret the performer’s emotional state in a vivid and legible way. I am the main person working on this design.

This visualization is based on the Processing example ‘Noise Sphere’ by David Pena (in Processing: File / Examples / Topics / Geometry / NoiseSphere), but I did a lot of work to extend it and re-create the visual effects in code.

Here is the progress log for the design.

Timeline

  • 26 Feb 2014 – Created the basic visual effects for the ‘Hairy Ball’:
      - Excitement: hairs grow randomly longer and the ball looks shimmering
      - Engagement: the Hairy Ball rotates 360°
      - Meditation: hairs move regularly from the center of the ball to the edge and back again
      - Frustration: the Hairy Ball trembles
  • 27 Feb 2014 – Tried duplicating the ‘Hairy Ball’ so that four balls appear together, but eventually dropped the idea
  • 12 March 2014 – Added an alarm function with a TV-noise effect to indicate that the headset is not working properly; played with colour by colouring the hairs according to the emotional state, but eventually dropped the idea; added a fifth parameter, Boredom: the Hairy Ball stays fixed and “breathes”
  • 20 March 2014 – Modified the meditation effect: sped up the movement to make the transition neat and clean; added a text field for displaying real-time data
  • 27 March 2014 – Had the idea of projecting the ‘Hairy Ball’ visualization onto a real sphere, such as a yoga ball
  • 1 April 2014 – Combined the Processing code with the Arduino code; connected the Processing code to the other visuals over IP

Visualization prototype: Generative Boxes

This visualization is an improved version of the “Generative Polygon”.  To make the motion more dynamic, I modified some of the emotion movements according to suggestions from the audience.

Previous version

Visualization prototype: Generative Polygon

Effects

  • Excitement: expanding and shrinking boxes / orange
  • Engagement: falling boxes / light green
  • Boredom: hovering box / yellow
  • Frustration: trembling boxes / pink
  • Meditation: breathing box / blue

The intensity of each motion reflects the value of the corresponding emotion.

Improvement

I changed the motions for engagement, boredom and meditation because they were not very active. The motions for excitement and frustration are exaggerated in the latest visualization, and it seems to be more successful than the previous version in terms of audience engagement.

I got the idea for the engagement motions from the links below.

The expanding, shrinking and breathing motions use the same trigonometric functions: sine and cosine of an angle theta. These functions produce regular, periodic curves according to the mathematical calculation.
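A minimal sketch of how such a sine-based “breathing” size can be computed. This is my own illustration of the technique described above, not the project’s actual code; the names (`breathingSize`, `baseSize`, `amplitude`, `theta`) are assumptions.

```java
// Sketch of a sine-based "breathing" motion: a size that oscillates
// regularly between baseSize - amplitude and baseSize + amplitude.
// Names are illustrative, not taken from the project code.
public class Breathing {
    // Returns the object size at angle theta.
    static float breathingSize(float baseSize, float amplitude, float theta) {
        return baseSize + amplitude * (float) Math.sin(theta);
    }

    public static void main(String[] args) {
        // In a Processing sketch this would run once per draw() frame,
        // incrementing theta a little each time.
        float theta = 0;
        for (int frame = 0; frame < 5; frame++) {
            System.out.println(breathingSize(100, 20, theta));
            theta += 0.1f;
        }
    }
}
```

Swapping `sin` for `cos` only shifts the phase, so the same function covers both the expanding/shrinking and the breathing motion.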

Further improvement

I received feedback that the boredom visual is not very successful because the motion does not relate well to boredom.  There is also an issue that only engagement shows flat rectangles instead of 3D boxes.  I tried to solve the issue, but the falling function did not work well.

Visualization prototype: Generative Polygon


Previous version

Visualization prototype: Geometric boxes with Colours

Effects

  1. Change of object shapes
  2. Separation of engagement from boredom

This “Generative Polygon” is an improved version of the “Geometric Boxes” code.  The first recognizable modification is the object shape: I changed the objects from boxes to polygons because polygons make it easy to vary the shape for each emotion.  This code also distinguishes engagement from boredom.  Previous versions could not distinguish them because the headset reads the two emotions together on one channel.  In the code, output values 0–50 are mapped to engagement and values 51–100 to boredom.  The motions for the five emotions are as follows.

  • Excitement: multiplying objects
  • Engagement: rotating
  • Boredom: moving around
  • Frustration: trembling
  • Meditation:  changing the number of shape points
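The engagement/boredom split described above can be sketched in a few lines. This is an illustration of the 0–50 / 51–100 mapping from the post, under my own assumed names (`classify` is not the project’s actual function):

```java
// Sketch of splitting the headset's single shared channel into
// engagement and boredom: per the post, values 0-50 are read as
// engagement and 51-100 as boredom. The method name is illustrative.
public class EngagementBoredom {
    static String classify(int value) {
        if (value <= 50) return "engagement";
        return "boredom";
    }

    public static void main(String[] args) {
        System.out.println(classify(30)); // engagement
        System.out.println(classify(75)); // boredom
    }
}
```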

Visualization prototype: Geometric boxes with Colours

Previous works

This prototype is a new version of the previous Geometric Boxes code.

Playing with colours

We got some suggestions to play with colours in our visualization. We have focused on creating more active motions rather than playing with colours because of colour-blindness concerns.  In this code, I added a colour function as a complementary feature.

The colour indicates the strongest of the four emotion parameters.

  • Excitement: red
  • Engagement: yellow
  • Frustration: green
  • Meditation: blue
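Picking the colour of the strongest emotion is a simple arg-max over the four values. A sketch of that idea, using the colour mapping from the list above (the function name and tie-breaking order are my assumptions, not the project’s actual code):

```java
// Sketch of mapping the strongest of the four emotion parameters to a
// colour, following the post's mapping: excitement=red, engagement=yellow,
// frustration=green, meditation=blue. Ties go to the earlier emotion in
// this order. Names are illustrative, not from the project code.
public class EmotionColour {
    static String dominantColour(float excitement, float engagement,
                                 float frustration, float meditation) {
        String[] colours = {"red", "yellow", "green", "blue"};
        float[] values = {excitement, engagement, frustration, meditation};
        int best = 0;
        for (int i = 1; i < values.length; i++) {
            if (values[i] > values[best]) best = i;
        }
        return colours[best];
    }
}
```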

Code is here.

dmsp.digital.eca.ed.ac.uk/blog/braindrain2014/2014/03/11/visualization-prototype-geometric-boxes-with-colours-2/

Visualization prototype: Alert for headset problems

This prototype is an improved version of Visualization prototype: Geometric visualization.

The EEG headset often does not work properly; the most common problem is sensor disconnection.  To detect these problems, I added an alert function that shows a different motion when the headset has problems.

Method

Every 10 seconds, the code saves the four emotion values as temporary values so that the previous and current values can be compared.  If a previous value and the current value are exactly the same, the headset is probably not sending data.

  1. Every 10 seconds, read each value.
    - excitement 11, engagement 22, frustration 33, meditation 44
  2. Save them as temporary values.
    - excitementTemp 11, engagementTemp 22, frustrationTemp 33, meditationTemp 44
  3. After 10 seconds, read the current values.
    - excitement 44, engagement 88, frustration 33, meditation 66
  4. Compare the temporary values (from 10 seconds ago) with the current values:
    excitement == excitementTemp? engagement == engagementTemp? …
  5. If a value is the same as its temporary copy, the alert function starts:
    frustration == frustrationTemp == 33 (same!) → the headset is not sending values!
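The comparison step above can be sketched as a small function. Following the post’s example, even a single unchanged value triggers the alert; a stricter variant would require all four values to be unchanged. The names are my own illustration, not the project’s actual code:

```java
// Sketch of the 10-second headset check: compare the values saved 10
// seconds ago (temp) with the current values. Per the post's example,
// any unchanged value triggers the alert; a stricter check would require
// all four to be unchanged. Names are illustrative.
public class HeadsetAlert {
    // True if any current value equals its value from 10 seconds ago.
    static boolean headsetStuck(float[] temp, float[] current) {
        for (int i = 0; i < temp.length; i++) {
            if (temp[i] == current[i]) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        float[] temp = {11, 22, 33, 44};     // values saved 10 s ago
        float[] current = {44, 88, 33, 66};  // frustration unchanged!
        System.out.println(headsetStuck(temp, current));
    }
}
```

In the real sketch this check would run on a timer in `draw()` and switch the display to the TV-noise alert motion when it returns true.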

Effects

In the prototype, the code shows noisy lines on the screen.

LED/6V bulbs serial code (Processing + Arduino)


Processing (calculates the maximum value and sends it to the Arduino):

import processing.serial.*;
import oscP5.*;
import netP5.*;

Serial port;
OscP5 oscP5;
float meditation = 0;
float frustration = 0;
float engagement = 0;
float excitement = 0;
boolean start = false;
float max = 0;

void setup() {
  // size(256, 150);
  oscP5 = new OscP5(this, 7400);
  println("Available serial ports:");
  println(Serial.list());

  port = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  if (meditation == 0 && frustration == 0 && engagement == 0 && excitement == 0) {
    port.write(0); // LED fading from corner to center
    // start = true;
  }
  // else if (meditation != 0 || frustration != 0 || engagement != 0 || excitement != 0) {
  //   if (start == true) {
  //     port.write(1); // LED all off for 3 sec
  //     start = false;
  //   }
  // (the rest of the sketch is truncated in the original post)
}

Visualization prototype: Geometric visualization


This Processing code focuses on generating dynamic, quick transitions between emotions.

Effects

A significant issue with previous versions is that the visualization stalls easily if a participant feels one emotion for a long time.  In this code the boxes, which are the main objects in the installation, transform quickly in proportion to the emotion values.  The values are used not only to define the biggest value but also to change the objects’ size, speed and range of vibration.  I also prepared a threshold for each motion: if an emotion value exceeds its threshold, the corresponding motion is exaggerated.

  • Excitement: generating multiple boxes in proportion to value size
  • Engagement: changing object’s rotation speed in proportion to value size
  • Frustration: changing range of vibration in proportion to value size
  • Meditation: changing object’s opacity
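The value-proportional scaling plus threshold exaggeration described above can be sketched as one small function. The function name, the 0–100 range and the exaggeration factor are my assumptions for illustration, not the project’s actual numbers:

```java
// Sketch of the per-motion threshold logic: each motion grows in
// proportion to its emotion value (assumed 0-100 here), and crossing
// the threshold exaggerates it by an extra factor. The name and the
// factor of 2 are illustrative, not taken from the project code.
public class ThresholdMotion {
    static float motionAmount(float value, float threshold) {
        float amount = value / 100f;          // proportional to value size
        if (value > threshold) amount *= 2f;  // exaggerate past the threshold
        return amount;
    }
}
```

The same amount can then drive whichever parameter the emotion controls: number of boxes for excitement, rotation speed for engagement, vibration range for frustration, opacity for meditation.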

Feedback: quick transition, but complicated

Transitions became much more responsive and quicker than in the previous version.  However, the visualization seemed complicated because multiple motions appear at the same time: if all emotion values exceed their thresholds, opaque boxes rotate, vibrate and multiply simultaneously.  The audience could see quick transitions, but it was hard for them to recognize which motion was dominant.

Further improvement

We removed the previous logic that defines the biggest value from the code; however, we need to put it back into this new version.  The next version will be simpler.

Code

dmsp.digital.eca.ed.ac.uk/blog/braindrain2014/2014/02/26/335/

Visualization prototype: Generative Neuron


This visualization generates neuron-like objects according to the biggest emotion value.  After seeing the results of the previous experiment, “Emotive Particle”, we developed more generative code.

Effects

The basic logic is the same as in the previous version: Processing calculates the biggest value and shows the motion related to that emotion.  This code focuses not only on colour but also on expansion and vanishing.  There are four motions, as follows.

  • Excitement: green neurons expand in every direction from the center point
  • Engagement: a horizontal neuron line appears
  • Frustration: black neurons eat the previous neurons
  • Meditation: blue neurons expand from random points across the whole screen

Feedback: still static, also complicated

To address the issue that changing colour alone is not effective, this version applied transformation and different patterns of motion.  However, the visualization stopped moving when a participant continued to feel one particular emotion.  Even though the neurons generate new branches every second, the motion seemed static.  This result also suggests that the code is much heavier than the previous one: after two or three minutes the screen was full of neurons and the audience could not recognize any changes.

Further improvement

Although the neuron-like object was more suitable for the concept, the output was not generative and responsive enough.  To improve this, we will change the object shapes dramatically and focus on responsive scaling and transformation in proportion to the values.

Code

dmsp.digital.eca.ed.ac.uk/blog/braindrain2014/2014/02/21/processing-code-generative-neuron/

 

Generative neuron

This week we are experimenting with a new visualization that looks more organic and is closer to the image of a neuron. We tested the code with the headset and the result was not what we expected, so we changed several things in the code, such as the radius (strength of the lines), the colours (four different colours, one for each parameter) and the velocity. The visualization will be processed even more by adding some vibration and other kinds of effects that are applicable to this code. The attached video is a recording of the visualization without the headset, made only by running the Processing script.

Our inspiration originates from the visual that can be found on the following link www.openprocessing.org/sketch/60845

