Imagine playing a video game and feeling — yes, feeling — the jolt of a collision. Or the sensation of skidding, braking and acceleration.
Called Surround Haptics, it's one of three technologies in development by researchers from Carnegie Mellon University and Disney Research, Pittsburgh.
Surround Haptics is a new tactile technology that allows video game players and film viewers to feel a wide variety of sensations.
For the demo at the SIGGRAPH 2011 International Conference on Computer Graphics and Interactive Techniques in Vancouver, the technology enhanced a high-intensity driving simulator game.
Surround Haptics enabled players to feel road imperfections and objects falling on the car, sense skidding, braking and acceleration, and experience ripples of sensation when cars collide or jump and land.
Another technology being developed by the team involves motion capture.
While traditional techniques use cameras to record the movements of actors, this new motion capture technology uses outward-facing cameras on the actors themselves.
Body-mounted cameras enable the capture of motions, such as running outside or swinging on monkey bars, that would be difficult — if not impossible — to record with traditional stationary camera setups.
The wearable camera system makes it possible to reconstruct the motions of an actor thanks to a process called structure from motion (SfM).
CMU's Takeo Kanade, a professor of computer science and robotics and a pioneer in computer vision, developed SfM 20 years ago.
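At the heart of SfM is the idea that a 3D point can be recovered from its 2D projections in multiple views. A minimal sketch of that core step, assuming the camera projection matrices are already known (full SfM estimates them from the images as well), is linear triangulation with NumPy; the function and variable names here are illustrative, not from the researchers' system:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one scene point seen in two views.

    P1, P2: 3x4 camera projection matrices (assumed known in this sketch).
    x1, x2: 2D image observations of the same point in each view.
    Returns the estimated 3D point.
    """
    # Each observation contributes two linear constraints on the
    # homogeneous 3D point; stack them and solve via SVD.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # null vector of A = homogeneous solution
    return X[:3] / X[3]        # dehomogenize to (x, y, z)
```

With noiseless observations from two known cameras, this recovers the original 3D point exactly; real SfM pipelines repeat this over thousands of tracked feature points while jointly refining the camera poses.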
The third development is a group of computerized models that reflect a full range of natural expressions — while also giving animators the ability to manipulate them.
The researchers created a method that not only translates the motions of actors into a 3D face model, but also subdivides it into facial regions, enabling animators to create the poses they need.
The models could be used to animate characters for films, video games and exhibits.
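The idea of a face model subdivided into independently controllable regions can be sketched as a region-based blendshape rig: a neutral mesh plus per-region displacement shapes that an animator mixes with weights. The miniature four-vertex "face" and region names below are invented for illustration and are not the researchers' actual model:

```python
import numpy as np

# Hypothetical miniature face: 4 vertices, each an (x, y, z) position.
neutral = np.zeros((4, 3))

# Per-region displacement shapes (illustrative values): the first two
# vertices belong to the brow region, the last two to the mouth region.
regions = {
    "brow":  np.array([[0.0, 0.1, 0.0], [0.0, 0.1, 0.0],
                       [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]),
    "mouth": np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 0.0],
                       [0.0, -0.2, 0.0], [0.0, -0.2, 0.0]]),
}

def pose(weights):
    """Blend the neutral face with weighted regional shapes."""
    out = neutral.copy()
    for name, delta in regions.items():
        out += weights.get(name, 0.0) * delta
    return out

# An animator raises the brows fully while opening the mouth halfway,
# without the two regions interfering with each other.
surprised = pose({"brow": 1.0, "mouth": 0.5})
```

Because each region has its own weight, captured expressions can be reproduced faithfully while still leaving animators free to exaggerate or adjust one part of the face at a time.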