CAM WOW!

Published: The Interactive Graffiti Project
Hands-Free Interaction with the Kinect

For the “closed” opening night of the new Contemporary Art Museum (CAM) in Raleigh, I co-developed a Kinect-based interactive art project with David Rieder, Patrick Fitzgerald, and Lee Cherry.

Presented at the (Closed) Opening of the Contemporary Art Museum (CAM) in Raleigh
Friday, 29 April 2011

The funding for this project came from the 2010-2011 Digital Humanities Scholarship and Research Award (DH-SRA), which was co-funded by the College of Humanities and Social Sciences (CHASS) and the Office of Information Technology (OIT) at NC State University. We thank CHASS Dean Jeffrey Braden and Vice Chancellor of Information Technology and CIO Marc Hoit for their interest in both the digital humanities and interdisciplinary collaboration at NC State.

Project developed by:

Lee Cherry, Manager of the Advanced Media Lab
Patrick Fitzgerald, Associate Professor of Art + Design
David M. Rieder, Associate Professor of English and program faculty in CRDM
Bishon Bopanna, MA student in Computer Science

Summary of project:

The “Interactive Graffiti Project” was developed collaboratively by faculty in Art + Design and English at NC State University. It is the first of several projects planned by the group, based on their work with the Microsoft Kinect.

The project is a proof-of-concept prototype for a next generation of digital humanities projects built on the capabilities of, and data from, the Microsoft Kinect. The Kinect is part of a shift in computer design toward natural user interfaces (NUIs). The “Interactive Graffiti Project” proved to us that the Kinect can be a platform for robust project development.

The Flash project, projected on the wall during the pre-opening of CAM Raleigh, enabled performers to “write” complex graphics and text on one of ten different “stages” zooming across the scene. Once the Kinect “calibrates” a user, it streams that user's position as x-y-z coordinates through a TCP server to a Flash application. That data is then used as the basis for the user's interactions with the visual assets on the screen.

Patrick Fitzgerald conceived of the project, guided its development as our project lead, and designed all of the visual assets for it, which included complex, multi-layered movie clips, short videos, text, and line drawings.

Bishon Bopanna rewrote the “stick man” program from the PrimeSense/NITE open source initiative so that the x-y-z data comprising a user's position and movements streams to an ActionScript 3 class in the Flash project. He also created a TCP server in node.js, which allows Flash to receive the data from the C++ Kinect application.

David Rieder developed the user registration program in ActionScript 3, which parses, sorts, and prioritizes the user data streaming from the Kinect sensor over a socket connection to the node-based TCP server.
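The parse-and-prioritize step on the receiving side might look like the following (shown in JavaScript for consistency; the actual program was written in ActionScript 3). The field order and the rule of prioritizing the user closest to the sensor are illustrative assumptions:

```javascript
// Sketch: parse incoming "id,x,y,z" packet lines and pick the user
// nearest the sensor (smallest z) as the active performer.
// Packet format and the nearest-user rule are assumptions.
function parsePacket(line) {
  const [id, x, y, z] = line.trim().split(',').map(Number);
  return { id, x, y, z };
}

function activeUser(lines) {
  const users = lines.map(parsePacket);
  // Sort ascending by depth and take the closest user.
  users.sort((a, b) => a.z - b.z);
  return users[0];
}
```

For example, given the packets `'1,0.1,0.2,2.5'` and `'2,0.0,0.0,1.2'`, the second user (z = 1.2 m) would be selected as the active performer.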

Lee Cherry developed the ActionScript 3 “engine” that translated the x-y-z user data into the dynamic movement and transformation of the visual assets on the screen. He also developed a sound visualization that represented the constantly changing levels of ambient sound in the performance space.

Video:

video of performance


Photographs:

Photo taken by Lee Cherry
