Articulating Gestures

Winter semester 2015/2016

Smartphone, DMX, Moving Head, OpenFrameworks, Processing

This interaction study explores ways to remotely control a light with a smartphone, beyond the common touch and voice commands.

Wider interaction space

Common smartphones carry a GPS unit for location tracking, position and acceleration sensors for motion detection, a light sensor for measuring ambient brightness, a magnetic field sensor for finding north, a proximity sensor for detecting whether the display is covered, and cameras for filming or taking photos. By drawing on these features, a much wider range of interactions can be created; a sketch of how such sensor data might be streamed follows below.
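
As an illustration, the phone side might stream its sensor readings to the computer driving the light over OSC. The following openFrameworks sketch is a minimal, hypothetical version of that idea; the host address, port, and OSC address are assumptions for illustration, not the project's actual implementation.

    #include "ofMain.h"
    #include "ofxOsc.h"
    #include "ofxAccelerometer.h"

    // Minimal phone-side sketch (assumes openFrameworks on iOS/Android with
    // the core ofxOsc and ofxAccelerometer addons).
    class ofApp : public ofBaseApp {
    public:
        ofxOscSender sender;

        void setup() {
            ofxAccelerometer.setup();            // start reading the motion sensor
            sender.setup("192.168.0.10", 9000);  // hypothetical host/port of the light controller
        }

        void update() {
            auto acc = ofxAccelerometer.getForce();  // acceleration incl. gravity, roughly -1..1 g per axis

            ofxOscMessage m;
            m.setAddress("/phone/accel");        // assumed OSC address
            m.addFloatArg(acc.x);
            m.addFloatArg(acc.y);
            m.addFloatArg(acc.z);
            sender.sendMessage(m, false);        // send one message per frame
        }
    };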

Controlling a moving head

To create a legible and easy-to-set-up scenario, I decided to work with a moving head stage light. The device offers full control over the position, intensity, and spread of its light beam, parameters I could map to gestures; a sketch of such a mapping follows below.
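
Moving heads are addressed over DMX512, where each of up to 512 channels carries an 8-bit value (0 to 255). On the computer side, incoming sensor data can be mapped onto those channels. The sketch below is a minimal, assumed version of that mapping using the ofxDmx addon; the channel layout (pan, tilt, dimmer) and the serial port are fixture- and setup-specific placeholders.

    #include "ofMain.h"
    #include "ofxOsc.h"
    #include "ofxDmx.h"   // assumed addon for USB-DMX output

    // Receiver-side sketch: maps phone tilt to the fixture's pan/tilt channels.
    class ofApp : public ofBaseApp {
    public:
        ofxOscReceiver receiver;
        ofxDmx dmx;
        float pan = 0.5f, tilt = 0.5f;   // normalized 0..1

        void setup() {
            receiver.setup(9000);              // same port the phone sends to
            dmx.connect("/dev/ttyUSB0", 512);  // hypothetical serial port of the DMX interface
        }

        void update() {
            while (receiver.hasWaitingMessages()) {
                ofxOscMessage m;
                receiver.getNextMessage(m);
                if (m.getAddress() == "/phone/accel") {
                    // crude mapping: tilting the phone steers the beam
                    pan  = ofMap(m.getArgAsFloat(0), -1, 1, 0, 1, true);
                    tilt = ofMap(m.getArgAsFloat(1), -1, 1, 0, 1, true);
                }
            }
            // assumed fixture layout: ch1 = pan, ch2 = tilt, ch3 = dimmer
            dmx.setLevel(1, (int)(pan  * 255));
            dmx.setLevel(2, (int)(tilt * 255));
            dmx.setLevel(3, 255);              // full intensity
            dmx.update();                      // push the DMX frame to the fixture
        }
    };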

I examined many gestures alone, while others were tested with fellow students. At the end of the five-week study, an interactive performance was set up where visitors could grab the phone and play with the light beam. There were no instructions, only the direct feedback of the moving head and a simple interface on the phone. Many users intuitively discovered gestures and understood how to control the light.

A prototypical interaction was filmed for documentation; the video contains a selection of the examined gestures.