Visualizing Spatial and Temporal Data from the Xbox Kinect

Gesture and sign language research is primarily concerned with the movement, location, and orientation of articulators in space. Although it is possible to quantify these variables, coding is often imprecise. With a team of four other students, I worked on a project to develop new ways to visualize spatial and temporal data using the Xbox Kinect, and to integrate these visualizations into current research practices. The group accomplished these goals through a series of contextual interviews, design sessions, and prototype testing. In the end, our group developed a flexible, intuitive interface that is easily modifiable by researchers, builds on current methods, and provides informative visualizations of Kinect data.
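As an illustration of the kind of temporal data involved, the sketch below computes a simple movement summary from Kinect-style skeleton frames. The joint name, coordinate values, and frame structure are hypothetical, not the project's actual schema; this is only a minimal example of quantifying articulator movement over time.

```python
import math

# Hypothetical sketch: a Kinect recording is modeled as a time-ordered list of
# frames, each mapping a joint name to a 3-D position in metres. The data and
# joint name below are illustrative, not the project's real format.
frames = [
    {"hand_right": (0.0, 1.0, 2.0)},
    {"hand_right": (0.1, 1.1, 2.0)},
    {"hand_right": (0.3, 1.2, 1.9)},
]

def path_length(frames, joint):
    """Total distance travelled by one joint across the recording --
    a basic temporal summary a researcher might plot over a sign."""
    total = 0.0
    for prev, curr in zip(frames, frames[1:]):
        total += math.dist(prev[joint], curr[joint])
    return total

print(round(path_length(frames, "hand_right"), 3))
```

A summary like this could be plotted per sign or per session, alongside the raw joint trajectories, to give researchers a quantitative counterpart to manual coding.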

Margaret Paveza
User Experience Researcher, San Diego, CA