Researchers at USC recently demoed their Flexible Action and Articulated Skeleton Toolkit, which uses Kinect and the OpenNI drivers to enable users to control PC games with body movements and gestures.
A group of researchers at the University of Southern California (USC) has announced new technology that will likely blow the socks off modern gaming. The Flexible Action and Articulated Skeleton Toolkit (FAAST) uses Microsoft Kinect for the Xbox 360, along with the OpenNI drivers, to enable gamers to control their favorite PC games with body movements and gestures instead of a keyboard and mouse.
Kinect uses motion sensors that build a skeletal map of your body. This map tells games with Kinect support what part of your body has moved -- and to where -- so the game can respond to that movement. Kinect also has voice recognition so you can control games, movies, and other media with your voice. In addition, Kinect ID remembers all of this information so that you are instantly recognized and can jump into a game without having to retrain the system.
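Conceptually, a skeletal map is just a set of named joints, each with a 3-D position that updates every frame, and a gesture test is a comparison between those positions. The joint names, coordinates, and gesture below are illustrative examples, not Kinect's actual API (the real skeleton stream tracks roughly 20 joints):

```python
# A skeletal map as a dictionary of named joints. Each joint has an
# (x, y, z) position; y is height and z is distance from the sensor,
# both in meters. These specific values are made up for illustration.
skeleton = {
    "head":           (0.0, 1.70, 2.0),
    "right_shoulder": (0.2, 1.45, 2.0),
    "right_hand":     (0.3, 1.85, 2.0),
}

def right_hand_raised(joints: dict) -> bool:
    """A simple gesture test: is the right hand above the head?

    Comparing the y (height) components of two joints is enough to
    detect this pose once the skeletal map is available.
    """
    return joints["right_hand"][1] > joints["head"][1]

print(right_hand_raised(skeleton))  # → True (1.85 m hand vs. 1.70 m head)
```

A game built on this data would run such tests every frame and respond when a pose appears or disappears, which is how movement translates into in-game actions.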
OpenNI is a not-for-profit organization that certifies and promotes applications, devices, and middleware used for Natural Interaction (NI). Fortunately for us, OpenNI was founded by the same company that created the technology behind Kinect -- PrimeSense.
OpenNI creates drivers for camera and sensor devices like the Kinect that can be used to interface with PCs and other systems. This enables PC software developers to build applications and games that are controlled by body movements instead of traditional controllers. Currently, drivers are available for Windows and Ubuntu, and Mac OS X support is coming soon (it's currently in an unstable early build).
FAAST is middleware written by a team at USC's Institute for Creative Technologies that provides an interface for applications to access movement data from the Kinect and other motion-sensing devices. In addition, FAAST emulates keyboard, mouse, joystick, and other input devices so that movements and gestures can be bound to a specific key or button. This allows for amazing possibilities in controlling virtually any game or application, as the team at USC demonstrated in their video announcing FAAST (the technical explanation begins at about the 1:15 mark in the video).
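The key idea behind that emulation is a binding table: each rule pairs a gesture (with a threshold for how pronounced it must be) with the key or button to emulate. FAAST's real configuration syntax and gesture names may differ; the rule format below is a simplified sketch of the concept:

```python
# Sketch of gesture-to-key binding in the style FAAST describes: each
# rule maps a body gesture past a threshold to an emulated key press.
# Gesture names, units, and the config format are illustrative only.
from dataclasses import dataclass

@dataclass
class Binding:
    gesture: str       # e.g. "lean_forward"
    threshold: float   # how far (e.g. degrees or inches) before the rule fires
    key: str           # keyboard key to emulate

def parse_bindings(config: str) -> list:
    """Parse lines like 'lean_forward 15 w' into Binding objects."""
    bindings = []
    for line in config.strip().splitlines():
        gesture, threshold, key = line.split()
        bindings.append(Binding(gesture, float(threshold), key))
    return bindings

def keys_for_pose(bindings: list, measurements: dict) -> list:
    """Return the keys to emulate, given the current gesture measurements."""
    return [b.key for b in bindings
            if measurements.get(b.gesture, 0.0) >= b.threshold]

config = """
lean_forward 15 w
left_arm_out 18 a
right_arm_out 18 d
"""
bindings = parse_bindings(config)
print(keys_for_pose(bindings, {"lean_forward": 20.0, "right_arm_out": 19.0}))
# → ['w', 'd']
```

Because the output is an ordinary key press, the game on the receiving end needs no Kinect support at all, which is why virtually any existing title can be controlled this way.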
FAAST uses a custom VRPN server that can broadcast the body movement data across a network, so the sensors do not need to be connected to the device processing the information. This can enable a variety of research applications where the user's movements are later analyzed. As the video states, the USC ICT team plans to continue expanding the software to support more precise, higher-level gestures. They also plan to develop applications for medical purposes, including rehabilitation after stroke or brain injury.
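Separating capture from processing comes down to packing each tracked frame into a message the sensor host can send over the network and an analysis machine can decode later. The real VRPN wire protocol is binary and considerably more involved; this sketch uses JSON purely to keep the round trip readable:

```python
# Sketch of streaming skeleton frames to a remote machine, in the
# spirit of FAAST's VRPN server. JSON stands in for the actual binary
# VRPN protocol; joint names and values are illustrative.
import json

def encode_frame(frame_id: int, joints: dict) -> bytes:
    """Pack one tracked frame as a network message (sensor side)."""
    return json.dumps({"frame": frame_id, "joints": joints}).encode("utf-8")

def decode_frame(payload: bytes) -> dict:
    """Unpack a received message (analysis side)."""
    return json.loads(payload.decode("utf-8"))

# The sensor host would send these over TCP or UDP; a machine elsewhere
# on the network decodes and logs them for later movement analysis.
msg = encode_frame(42, {"right_hand": [0.3, 1.85, 2.0]})
received = decode_frame(msg)
print(received["frame"], received["joints"]["right_hand"])
# → 42 [0.3, 1.85, 2.0]
```

Logging the decoded frames with timestamps is what makes the after-the-fact analysis mentioned above possible, since a researcher can replay a whole session without the subject or the sensor present.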
In its current state, this technology is a little slow, but it is a great start on a new road toward immersive and active gaming. As new hardware arrives and the software improves, this kind of control could become the norm rather than a working proof of concept. What uses can you see for this technology besides the demonstrated World of Warcraft? Post your answer in the comments below.