January 9, 2012 / Danii Oliver

Full Body Interaction with the Kinect

My focus in the digital world is interactivity: greater human-computer interaction for the purposes of developing, perfecting, and marketing VR (virtual reality) and Holodeck technology. In pursuit of this goal I will endeavor to build applications for entertainment, education, therapy, and exercise that utilize motion, depth, audio, fiducial, and IR sensors.

I will be building a full-body interactive platform to control and navigate two applications: one I have built and one that is off the shelf.
I will use the off-the-shelf game to test interactions and determine which ones suit my program's immediate needs. Once that research is complete, I will be able to begin building full-body interactions from scratch.

Working with the Kinect, I have been able to create gestural activation of simple buttons. I will be taking this a step further, to full-body position and movement activation of more complex interactions. I will work with the FAAST framework to facilitate my research and development. Examples will be posted on this site with links from this post.
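To make the posture-to-input idea concrete before the setup details, here is a minimal C++ sketch of the technique: check one body pose against tracked joint positions and, when it matches, inject a keyboard event with the Win32 SendInput API (the same emulation approach FAAST offers for off-the-shelf games). The Joint struct, the joint values, and the 10 cm threshold are placeholders of my own for illustration, not FAAST internals; real joint data would come from the OpenNI/NITE skeleton tracker.

```cpp
// Illustrative sketch only: FAAST's real posture detection is configurable
// and far more robust. The Joint struct and positions stand in for skeletal
// data obtained from OpenNI/NITE.
#include <windows.h>

struct Joint { float x, y, z; };  // joint position in meters

// Returns true when the hand is raised above the head by more than the
// threshold (10 cm here, an arbitrary example value).
bool handRaised(const Joint& hand, const Joint& head)
{
    const float threshold = 0.10f;
    return hand.y > head.y + threshold;
}

// Emulate a key press and release via the Win32 SendInput API, the same
// technique used to drive games that only listen for keyboard input.
void pressKey(WORD virtualKey)
{
    INPUT input[2] = {};
    input[0].type = INPUT_KEYBOARD;
    input[0].ki.wVk = virtualKey;            // key down
    input[1].type = INPUT_KEYBOARD;
    input[1].ki.wVk = virtualKey;
    input[1].ki.dwFlags = KEYEVENTF_KEYUP;   // key up
    SendInput(2, input, sizeof(INPUT));
}

int main()
{
    // Placeholder joints; in practice these update every frame from the
    // skeleton tracker.
    Joint head      = {0.0f, 1.60f, 2.0f};
    Joint rightHand = {0.2f, 1.85f, 2.0f};

    if (handRaised(rightHand, head))
        pressKey(VK_SPACE);  // e.g., mapped to "jump" in a game
}
```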

Setup:
Xbox Kinect camera
Windows OS
FAAST download: http://projects.ict.usc.edu/mxr/faast/
OpenNI Unstable Build for Windows x86 (32-bit) v1.3.4.3
PrimeSense NITE Unstable Build for Windows x86 (32-bit) v1.4.2.4
Hardware sensor drivers:
Microsoft Kinect: SensorKinect-Win-OpenSource32-5.0.3.4.msi
PrimeSensor: PrimeSensor Module Unstable Build for Windows x86 (32-bit) v5.0.3.4

FAAST was developed at the University of Southern California Institute for Creative Technologies in collaboration with Belinda Lange, Skip Rizzo, David Krum, and Mark Bolas.

FAAST is middleware that facilitates the integration of full-body control with games and VR applications. The toolkit relies on software from OpenNI and PrimeSense to track the user's motion using the PrimeSensor or the Microsoft Kinect sensors. FAAST includes a custom VRPN server to stream the user's skeleton over a network, allowing VR applications to read the skeletal joints as trackers using any VRPN client. The toolkit can also emulate keyboard input triggered by body posture and specific gestures, which lets the user add custom body-based control mechanisms to existing off-the-shelf games that have no official support for depth sensors.
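To show what the receiving end of that skeleton stream might look like, here is a minimal VRPN client sketch in C++. It assumes FAAST's server is running locally and exposing the skeleton as a tracker named Tracker0 (the name given in FAAST's documentation); the exact joint-to-sensor mapping should be confirmed against the FAAST docs rather than taken from this sketch.

```cpp
// Minimal VRPN client sketch: reads skeletal joint updates streamed by
// FAAST's built-in VRPN server. Requires the VRPN library.
#include <cstdio>
#include <vrpn_Tracker.h>

// Called once per joint update; t.sensor identifies the joint, t.pos holds
// its position (t.quat would hold its orientation).
void VRPN_CALLBACK handleJoint(void* /*userdata*/, const vrpn_TRACKERCB t)
{
    std::printf("joint %2ld  pos = (%6.3f, %6.3f, %6.3f)\n",
                (long)t.sensor, t.pos[0], t.pos[1], t.pos[2]);
}

int main()
{
    // "Tracker0@localhost" assumes FAAST is streaming on this machine
    // under its default tracker name.
    vrpn_Tracker_Remote tracker("Tracker0@localhost");
    tracker.register_change_handler(nullptr, handleJoint);

    // Pump the connection; each call delivers any pending joint updates.
    while (true)
        tracker.mainloop();

    return 0;
}
```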

E. Suma, B. Lange, A. Rizzo, D. Krum, and M. Bolas, “FAAST: The Flexible Action and Articulated Skeleton Toolkit,” Proceedings of IEEE Virtual Reality, pp. 247-248, 2011.
