Report on the Brain Drain Project

Donghui (Whitney) XIE s1318968

The Brain Drain Project aims to interpret the human brain’s response to stimuli in a vivid way. Stimuli can come from any of the senses: as the reading material shows, sound, heat, odour and colour can all be considered stimuli (Classen, C., 2005). For our project, however, we decided to use sound as the stimulus input and visualization as the output.

  • Project Mechanism

The information flow of the project is as follows: the sounds we make stimulate the performer’s brain; the sensors of the electroencephalogram (EEG) headset pick up the resulting brain waves; the headset sends the raw data to the EPOC software it works with on the computer; EPOC passes the data on to Processing over OSC, where it is received with the oscP5 library; and from that data Processing produces the final visualization.
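To make the Processing end of this chain concrete, the sketch below is a minimal example of receiving the four emotion values over OSC with the oscP5 library. The port number and the /emotiv/... address patterns are placeholders of mine for illustration; the real ones depend on how the EPOC data is bridged to OSC.

    // Minimal sketch: receive the four emotion values over OSC and keep them for drawing.
    // Port 12000 and the /emotiv/... address patterns are assumed, not the real EPOC ones.
    import oscP5.*;
    import netP5.*;

    OscP5 osc;
    float excitement, engagement, frustration, meditation;

    void setup() {
      size(400, 200);
      osc = new OscP5(this, 12000);  // listen on the port the OSC bridge sends to
    }

    void oscEvent(OscMessage msg) {
      if (msg.checkAddrPattern("/emotiv/excitement"))  excitement  = msg.get(0).floatValue();
      if (msg.checkAddrPattern("/emotiv/engagement"))  engagement  = msg.get(0).floatValue();
      if (msg.checkAddrPattern("/emotiv/frustration")) frustration = msg.get(0).floatValue();
      if (msg.checkAddrPattern("/emotiv/meditation"))  meditation  = msg.get(0).floatValue();
    }

    void draw() {
      background(0);
      text("excitement "  + excitement,  20, 40);
      text("engagement "  + engagement,  20, 70);
      text("frustration " + frustration, 20, 100);
      text("meditation "  + meditation,  20, 130);
    }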

The study of affective descriptors helped us a lot when defining human emotion. It breaks emotional quality down into eight variables: arousing, exciting, pleasant, relaxing, sleepy, gloomy, unpleasant and distressing. It also provides an eight-point rating scale for these descriptors, from “extremely inaccurate” to “extremely accurate” (James A. Russell and Geraldine Pratt, 1980). The EEG headset and the EPOC software work with four specific emotions: excitement, engagement, frustration and meditation. These are the four parameters for which we receive data from the brain. At this stage we cannot choose the emotions ourselves; they come by default with the EEG headset and EPOC software. The visualization is mainly based on the state of these four emotions, and we have developed several versions of it within this constraint.
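As a sketch of how these four parameters can be handled in code, the small helper below simply reports which emotion currently dominates, so the visualization knows which of its four states to show. It assumes each value arrives normalized to the range 0.0–1.0, which is my working assumption rather than something documented here.

    // Assumes the four values are normalized to 0.0 - 1.0 and returns the strongest one.
    String dominantEmotion(float excitement, float engagement,
                           float frustration, float meditation) {
      String name = "excitement";
      float  best = excitement;
      if (engagement  > best) { name = "engagement";  best = engagement;  }
      if (frustration > best) { name = "frustration"; best = frustration; }
      if (meditation  > best) { name = "meditation";  best = meditation;  }
      return name;
    }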

  • Visualization

The whole Brain Drain group is divided into two sub-groups: the sound team, who build the sound stimuli with Arduino, and the visualization team, who design the visual output. I work in the visualization team to deliver an artistic yet reasonable visual effect, and we have experimented extensively with Processing. In general, the visualization features four different effects corresponding to the four emotional states (excitement, engagement, frustration and meditation) while keeping a consistent overall look. We manipulate the visual effect through colour, motion, scaling, shaping and so on.
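One possible way to make “colour, motion, scaling, shaping” concrete in Processing is sketched below. It is an illustration rather than the team’s final design: excitement drives hue and scale, engagement drives rotation, frustration adds jitter, and meditation desaturates the colour. The fixed emotion values at the top would normally be fed in from the OSC stream.

    // An illustration of mapping the four emotion values to colour, motion and scale.
    float excitement = 0.7, engagement = 0.4, frustration = 0.2, meditation = 0.1;

    void setup() {
      size(600, 600);
      colorMode(HSB, 360, 100, 100);
      noStroke();
    }

    void draw() {
      background(0);
      translate(width/2, height/2);
      rotate(frameCount * 0.01 * engagement);              // motion: engagement rotates the shape
      translate(random(-10, 10) * frustration,
                random(-10, 10) * frustration);            // motion: frustration adds jitter
      fill(map(excitement, 0, 1, 200, 360),                // colour: excitement shifts the hue
           map(1 - meditation, 0, 1, 20, 100), 90);        // colour: meditation desaturates
      float d = 100 + 150 * excitement;                    // scale: excitement grows the shape
      ellipse(0, 0, d, d);
    }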

I personally experimented with several versions of the visualization. As a beginner in Processing, I started with some very simple sketches that output lines, lines with dots, matrices and nets. Screenshots are below:

Awkward Experimenting Visual Designs
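For reference, a sketch in the spirit of those first experiments looks like this: a plain grid of lines with heavier dots at the intersections, not yet driven by any EEG data.

    // A simple "net": grid lines with dots at the intersections, drawn once.
    void setup() {
      size(600, 400);
      background(255);
      int step = 40;
      stroke(0);
      strokeWeight(1);
      for (int x = 0; x <= width; x += step) line(x, 0, x, height);   // vertical lines
      for (int y = 0; y <= height; y += step) line(0, y, width, y);   // horizontal lines
      strokeWeight(5);
      for (int x = 0; x <= width; x += step) {
        for (int y = 0; y <= height; y += step) {
          point(x, y);                                                // dots at crossings
        }
      }
    }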

These experiments were not presentable at the time, but they served as the basis of the later visualizations. As a group we worked on a more complex Processing sketch that symbolizes a neuron network. The visualization is abstract and artistic: it evokes a neuron network and shows the transition between emotions through changes of colour. We received some very helpful feedback criticizing the limitation of changing colour only. The feedback reminded me of the reading on the difference between visualism in the West and among the Desana: “Yet the visualism of Desana, with its emphasis on integrating and animating color energies is surely very different from the visualism of the West, with its emphasis on linearity, detached observation and surface appearance” (Classen, C., 2005). As a result, we decided to work towards a visualization that emphasizes linearity instead of playing with colour alone.

"Neuron Network " Visualization

“Neuron Network ” Visualization
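The colour-transition idea in the “Neuron Network” version can be boiled down to something like the sketch below, where lerpColor() blends between a colour assigned to each emotion as its level changes. The colour choices, and the use of noise() to fake a changing signal, are mine for illustration.

    // Blend between an "excitement" colour and a "meditation" colour as the level changes.
    color excitementCol, meditationCol;
    float excitement = 0.0;   // would normally be updated from the OSC data

    void setup() {
      size(600, 400);
      excitementCol = color(255, 60, 60);   // assumed: warm colour for excitement
      meditationCol = color(60, 120, 255);  // assumed: cool colour for meditation
    }

    void draw() {
      excitement = noise(frameCount * 0.01);                       // fake a slowly changing level
      background(lerpColor(meditationCol, excitementCol, excitement));
    }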

Then I started to experiment with simple 3D objects such as boxes and spheres. Lately I have been working on a 3D “hairy ball” visualization, presented in a vivid way as if it were a living creature. I manipulate its size, motion and appearance to suit the four emotional states. Although the hairy ball is based on an example that ships with Processing, all of its motion and behaviour is my own work. When the performer is in a state of excitement, the hair grows longer and shakes; when the performer is engaged, the ball spins and rotates; when the performer is frustrated, the ball trembles and fluctuates as if struggling; when the performer is meditating, the hair moves towards the edges of the ball in a regular way. I found it interesting that excitement is often followed by frustration, and the hairy ball behaves accordingly. In fact, this is not a coincidence: “The obtained results have suggested not only a strong correlation between insights and feelings like frustration and excitement, but also that EEG measurements have the potential of detecting emotions corresponding to Aha! moments in a non-intrusive way. Further, it seems that the most accurate detection can be achieved if the generated insights required more thinking time, had already generated frustration in the subject, and contained potentially complex and unexpected information” (Cernea, D., Kerren, A., & Ebert, A., 2011).

the "Hairy Ball"

the “Hairy Ball”
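A condensed sketch of the hairy ball’s behaviour is given below. The geometry is simplified compared with the Processing example we started from, and the four emotion values are set by hand here instead of coming from the OSC stream, but the mapping follows the description above: excitement lengthens and shakes the hair, engagement spins the ball, frustration makes the tips tremble, and meditation pulls the hairs towards a regular arrangement.

    // Simplified "hairy ball": emotion values drive hair length, rotation, jitter and order.
    float excitement = 0.8, engagement = 0.3, frustration = 0.1, meditation = 0.0;
    int   hairs = 400;
    float[] theta = new float[hairs];
    float[] phi   = new float[hairs];

    void setup() {
      size(600, 600, P3D);
      for (int i = 0; i < hairs; i++) {
        theta[i] = random(TWO_PI);
        phi[i]   = random(PI);
      }
    }

    void draw() {
      background(0);
      stroke(255);
      translate(width/2, height/2);
      rotateY(frameCount * 0.02 * engagement);                 // engagement: spin the ball
      float radius  = 120;
      float hairLen = 30 + 80 * excitement
                    + 10 * excitement * sin(frameCount * 0.3); // excitement: longer, shaking hair
      for (int i = 0; i < hairs; i++) {
        float p = lerp(phi[i], HALF_PI, meditation * 0.5);     // meditation: regular drift of hairs
        float x = radius * sin(p) * cos(theta[i]);
        float y = radius * sin(p) * sin(theta[i]);
        float z = radius * cos(p);
        float jitter = 8 * frustration;                        // frustration: trembling hair tips
        line(x, y, z,
             x * (1 + hairLen / radius) + random(-jitter, jitter),
             y * (1 + hairLen / radius) + random(-jitter, jitter),
             z * (1 + hairLen / radius) + random(-jitter, jitter));
      }
    }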

The major challenge for me is that it is hard to find enough field research explaining how to interpret emotions through visualization, or how a particular emotion can be tied to a particular visual effect. What I did find is this: “the graphical elements used as agents are produced by generative artist Ludivine Lechat. She drew inspiration from nanotechnology to give the game a microscopic look and feel. The basic elements have a neutral, oval shape and blue colour. High EEG valence readings (normalized on a –1.0 to +1.0 axis) draw more eccentric elements to the screen, such as pink ‘butterflies’ or green ‘caterpillars’. Figure 3 shows different elements interacting in the game, where more colourful elements represent emotional brain activity” (De Smedt, T., & Menschaert, L., 2012). The way Lechat visualizes emotion may serve as a reference for us in achieving an artistic yet reasonable visualization. We are still exploring.

Figure 3
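If we were to borrow the VALENCE approach directly, the mapping might look something like the sketch below: a valence value normalized to –1.0…+1.0 (faked here with noise()) leaves the neutral blue ovals alone but adds more eccentric, colourful elements as it rises. The thresholds, counts and colours are placeholders of mine, not values from the paper.

    // Neutral blue ovals are always drawn; higher valence adds more colourful elements.
    void setup() {
      size(600, 400);
      noStroke();
    }

    void draw() {
      background(230);
      randomSeed(1);                                            // keep element positions stable
      float valence = map(noise(frameCount * 0.005), 0, 1, -1, 1);
      fill(80, 120, 200);                                       // neutral, oval, blue elements
      for (int i = 0; i < 20; i++) {
        ellipse(random(width), random(height), 30, 18);
      }
      int eccentric = int(map(max(valence, 0), 0, 1, 0, 20));   // more elements as valence rises
      for (int i = 0; i < eccentric; i++) {
        fill(random(150, 255), random(50, 150), random(150, 255));
        ellipse(random(width), random(height), 12, 12);
      }
    }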

References

1. Classen, C., 2005. McLuhan in the rainforest: the sensory world of oral cultures. In D. Howes (ed.), Empire of the Senses: The Sensual Culture Reader, pp. 147-163. Oxford: Berg.

2. Russell, J. A. and Pratt, G., 1980. A description of the affective quality attributed to environments. Journal of Personality and Social Psychology, 38(2), pp. 311-322.

3. Cernea, D., Ebert, A. and Kerren, A., 2013. A study of emotion-triggered adaptation methods for interactive visualization. SwePub, EBSCOhost, viewed 27 February 2014.

4. Cernea, D., Kerren, A. and Ebert, A., 2011. Detecting insight and emotion in visualization applications with a commercial EEG headset. SwePub, EBSCOhost, viewed 28 February 2014.

5. De Smedt, T. and Menschaert, L., 2012. VALENCE: affective visualisation using EEG. Digital Creativity, 23(3/4), pp. 272-277. Business Source Alumni Edition, EBSCOhost, viewed 28 February 2014.