Interacting with Kinect

Submission 2: AO SHEN (Dora)

 

Idea:

The project is named Environment in (e)motion; the environment we provide is the scene, as well as the feeling, of the sea. Our theme is “Recovery”. For the projection mapping onto the sculptures, the animations are designed to show hints of life on the statues, so that they might look alive. The interactive part inherits the same key idea, self-recovery. Whenever a motion triggers the Kinect, the ocean scene projected on the wall is no longer static: the audience’s motion is shown as waves, and the waves appear at the same position as their moving bodies. It therefore looks as if they are actually swimming in the sea, which may make them feel relaxed and free. With physical interaction it is much easier for the audience to feel engaged and to actually take part in the activity.

Processing:

I found some useful tutorials on using the SimpleOpenNI library with Processing 2.0 to connect the Kinect and build interactions (all referenced below). Among all the libraries designed to work with the Kinect, I think SimpleOpenNI is the most powerful one. It is widely used for playing with the Kinect, so many helpful tutorials and resources can be found on the internet. There are also solutions in the Processing forum to the various problems that come up while designing interactive sketches with the Kinect, which cover most of the questions a beginner might ask.
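For context, the basic setup those tutorials describe looks roughly like this (a minimal sketch of my own, assuming SimpleOpenNI 1.96 under Processing 2.x; older versions name a few methods differently):

import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth();  // turn on the depth camera
  context.enableUser();   // turn on user (body) detection
}

void draw() {
  context.update();                  // read a new frame from the Kinect
  image(context.depthImage(), 0, 0); // show the raw depth image for debugging
}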

I use the userMap() function instead of depthMap() to detect multiple users. The user map is an array with one value per pixel: 0 if the Kinect decides the pixel belongs to the background, and a higher value, starting from 1, identifying which user the pixel belongs to.
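Reading that array looks roughly like the following (a small sketch assuming the window is the same 640*480 size as the depth image, so the indices line up); user pixels are painted white and the background black:

// inside draw(), after context.update()
int[] userMap = context.userMap();  // one entry per pixel: 0 = background, 1, 2, ... = user id

loadPixels();
for (int i = 0; i < userMap.length; i++) {
  pixels[i] = (userMap[i] > 0) ? color(255) : color(0);
}
updatePixels();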

Another useful tool is the fullscreen library. The Kinect is hard-coded to deliver its image at 640*480 px, which looks terrible when projected onto the wall, so the fullscreen library, as its name suggests, helps us enlarge the sketch to fill the projector’s output.
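Its use is roughly as follows (sketched from the library’s examples as I remember them, so treat the exact calls as an assumption); the 640*480 Kinect image itself still has to be drawn stretched to the sketch size:

import fullscreen.*;

FullScreen fs;

void setup() {
  size(640, 480);
  fs = new FullScreen(this);  // attach the fullscreen manager to this sketch
  fs.enter();                 // switch to fullscreen as soon as the sketch starts
}

void draw() {
  background(0);
  // draw the 640*480 Kinect image stretched over the whole output, e.g.
  // image(context.depthImage(), 0, 0, width, height);
}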

Original Thought:

The original idea was basically to create a deep-ocean scene and to draw the outlines of the participants. Undersea creatures (fish, jellyfish, whales, etc.) would swim around their bodies, and the background would also be animated.


In this sketch, I use the blobDetection library to detect large blobs belonging to the human body and combine them into contours, then a Flock class to make the jellyfish swim as a flock, but only outside the human body, so that the participants can interact with the jellyfish.
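The contour step looked roughly like this (a reconstruction for illustration, assuming the v3ga blobDetection library; drawBodyContours() is a hypothetical helper, and the Flock class plus the code that pushes the jellyfish away from the contour are left out):

import blobDetection.*;

BlobDetection blobs = new BlobDetection(640, 480);

void drawBodyContours() {               // called from draw(), after context.update()
  blobs.setThreshold(0.2f);             // brightness threshold for what counts as a blob

  // build a black-and-white image from the Kinect user map
  PImage userImg = createImage(640, 480, RGB);
  int[] userMap = context.userMap();    // context: the SimpleOpenNI object from earlier
  userImg.loadPixels();
  for (int i = 0; i < userMap.length; i++) {
    userImg.pixels[i] = (userMap[i] > 0) ? color(255) : color(0);
  }
  userImg.updatePixels();

  blobs.computeBlobs(userImg.pixels);   // find the blobs in the binary user image

  // trace and draw the contour of each blob
  noFill();
  stroke(255);
  for (int n = 0; n < blobs.getBlobNb(); n++) {
    Blob b = blobs.getBlob(n);
    if (b == null) continue;
    beginShape();
    for (int m = 0; m < b.getEdgeNb(); m++) {
      EdgeVertex v = b.getEdgeVertexA(m);  // edge vertices are normalised to 0..1
      if (v != null) vertex(v.x * width, v.y * height);
    }
    endShape(CLOSE);
  }
}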

Improvement:

However, after actually testing the above Processing sketch in the Sculpture Court, I found it not as interactive as I had thought. Though it reflects the movements people make, simply drifting the jellyfish away from them turned out not to be very interesting.

Then one day I saw a sketch on OpenProcessing in which dragging the mouse triggers ripples that follow its movement. Although it can only be played on a computer and has nothing to do with the Kinect, it gave me huge inspiration. What if the audience’s figure is not directly projected onto the scene, but their movement triggers changes in it? This implicit way of interacting seems more interesting and surprising than my first idea, and the audience discover the difference when they pass in front of the Kinect and notice that the ocean scene is no longer static.
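The referenced water simulations are based on the classic two-buffer ripple algorithm; the sketch below is my own minimal reconstruction of that idea, where disturb() is a hypothetical stand-in for the point at which Kinect motion, rather than the mouse, drops a ripple:

int cols = 320, rows = 240;                  // resolution of the ripple grid
float[] current  = new float[cols * rows];
float[] previous = new float[cols * rows];
float damping = 0.96;

// drop a ripple at grid cell (x, y) -- called wherever motion is detected
void disturb(int x, int y) {
  if (x > 0 && x < cols - 1 && y > 0 && y < rows - 1) {
    previous[x + y * cols] = 500;
  }
}

// one simulation step: each cell becomes the average of its four
// neighbours minus its own old value, then is damped so ripples fade
void updateWater() {
  for (int y = 1; y < rows - 1; y++) {
    for (int x = 1; x < cols - 1; x++) {
      int i = x + y * cols;
      current[i] = (previous[i - 1] + previous[i + 1] +
                    previous[i - cols] + previous[i + cols]) / 2 - current[i];
      current[i] *= damping;
    }
  }
  float[] tmp = previous;  // swap the buffers for the next frame
  previous = current;
  current = tmp;
}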

The second version is definitely better. However, when we tested it only a few days before the presentation, our supervisor Rocio gave me another idea. At that time we were short of extension cords, so I had to set the projector far away from the wall to reach the socket and put the Kinect between the wall and the projector, and this change of position caused the participants’ shadows to appear on the wall. Rocio, however, said it would be more interesting to let the audience play with their shadows than with only an image of the sea.


Reflection:

Though people said this interactive installation impressed them, only a few of them actually took part during our presentation, which was clearly not what we expected. The audience were not well informed about what they could do next and felt confused. In fact, after the presentation, some of my friends who attended told me that they had not been aware they were allowed to join in and thought they were only supposed to watch. It might be because our projection mapping and the interactive installation were gathered in a relatively small area; people could feel uncomfortable walking through a space holding four projectors. The projection mappings were also still running when the interactive part began, so the audience might have been afraid of blocking the light of those projectors and preferred to keep a safe distance from the whole project. The result is not exactly what we wanted, but at least some of them had fun 🙂


Reference:

Kinect Physics Tutorial for Processing [Online]. Available from: <www.creativeapplications.net/processing/kinect-physics-tutorial-for-processing/> [Accessed 10 February 2016].

Microsoft Kinect with Processing [Online]. Available from: <interactivemechanics.com/news/2015/10/kinect-with-processing/> [Accessed 22 February 2016].

Kinect Open Source Programming Secrets [Online]. Available from: <fivedots.coe.psu.ac.th/~ad/kinect/> [Accessed 02 March 2016].

Let’s Connect Shadows [Online]. Available from: <www.bencz.com/hacks/2013/11/06/let's-connect-shadows/> [Accessed 05 March 2016].

Blob contour to box2d b2EdgeChainDef [Online]. Available from: <forum.openframeworks.cc/t/blob-contour-to-box2d-b2edgechaindef/2347> [Accessed 11 March 2016].

Processing Kinect Resources [Online]. Available from: <kinectsen.wikispaces.com/Processing+Kinect+Resources> [Accessed 11 March 2016].

Water Simulation [Online]. Available from: <www.openprocessing.org/sketch/43543> [Accessed 18 March 2016].

Java Water Simulation [Online]. Available from: <neilwallis.com/projects/java/water/index.php> [Accessed 18 March 2016].

 
