Carnegie Mellon University
April 25, 2013

News Brief: With Wave of the Hand, Carnegie Mellon Researchers Create Touch-based Interfaces on Everyday Surfaces

Depth Cameras and Projectors Combine To Make Smart Environments

Contact: Byron Spice / 412-268-9068 / bspice@cs.cmu.edu

PITTSBURGH - Researchers previously have shown that a depth camera system, such as Kinect, can be combined with a projector to turn almost any surface into a touchscreen. But now researchers at Carnegie Mellon University have demonstrated how these touch-based interfaces can be created almost at will, with the wave of a hand.

CMU's WorldKit system enables someone to rub the arm of a sofa to "paint" a remote control for her TV, or to swipe a hand across an office door to post his calendar, from which subsequent users can "pull down" an extended version. These ad hoc interfaces can be moved, modified or deleted with similar gestures, making them highly personalized.

Researchers at Carnegie Mellon's Human-Computer Interaction Institute (HCII) used a ceiling-mounted camera and projector to record room geometries, sense hand gestures and project images on desired surfaces.
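To give a flavor of the sensing side, the short sketch below is illustrative only, not WorldKit's actual code: it shows one way a "painted" region could be accumulated from a stream of depth frames, assuming the frames and a background depth map of the empty room are available as NumPy arrays of distances in meters, and using touch thresholds chosen purely for the example.

    import numpy as np

    TOUCH_MIN_M = 0.01  # hand must be at least ~1 cm in front of the surface
    TOUCH_MAX_M = 0.05  # ...and no more than ~5 cm, so distant motion is ignored

    def paint_region(background: np.ndarray, frames) -> np.ndarray:
        """Return a boolean mask of the pixels the user's hand swept over."""
        painted = np.zeros(background.shape, dtype=bool)
        for frame in frames:
            # How much closer to the camera each pixel is than the empty room.
            closer = background - frame
            touching = (closer > TOUCH_MIN_M) & (closer < TOUCH_MAX_M)
            painted |= touching  # accumulate the swept area across frames
        return painted

The resulting mask marks the patch of sofa arm or office door the user "painted," which the projector can then fill with an interface.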

But Robert Xiao, an HCII doctoral student, said WorldKit does not require such an elaborate installation. "Depth sensors are getting better and projectors just keep getting smaller," he said. "We envision an interactive 'light bulb' - a miniaturized device that could be screwed into an ordinary light fixture and pointed or moved to wherever an interface is needed."

The system does not require prior calibration, automatically adjusting its sensing and image projection to the orientation of the chosen surface. Users can summon switches, message boards, indicator lights and a variety of other interface designs from a menu. Ultimately, the WorldKit team anticipates that users will be able to custom design interfaces with gestures.
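One common way to adjust a projection to a surface's orientation, sketched below as an assumption rather than a description of WorldKit's internals, is to fit a plane to the 3-D depth points inside the painted region and recover its normal, which can then drive how the projected image is warped. Here `points` is assumed to be an (N, 3) array of x, y, z coordinates in meters.

    import numpy as np

    def fit_plane_normal(points: np.ndarray) -> np.ndarray:
        """Least-squares plane normal via SVD of the centered point cloud."""
        centered = points - points.mean(axis=0)
        # The right singular vector with the smallest singular value is the
        # direction of least variance, i.e. the plane's normal.
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        normal = vt[-1]
        return normal / np.linalg.norm(normal)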

Xiao developed WorldKit with Scott Hudson, an HCII professor, and Chris Harrison, a Ph.D. student. They will present their findings April 30 at CHI 2013, the Conference on Human Factors in Computing Systems, in Paris.

"People have talked about creating smart environments, where sensors, displays and computers are interwoven," said Harrison, who will join the HCII faculty this summer. "But usually, that doesn't amount to much besides mounting a camera up on the ceiling. The room may be smart, but it has no outlet for that smartness. With WorldKit, we say forget touchscreens and go straight to projectors, which can make the room truly interactive."

Though WorldKit now focuses on interacting with surfaces, the researchers anticipate future work may enable users to interact with the system in free space. Likewise, higher resolution depth cameras may someday enable the system to sense detailed finger gestures. In addition to gestures, the system also could be designed to respond to voice commands.

"We're only just getting to the point where we're considering the larger questions," Harrison said, noting a multitude of applications in the home, office, hospitals, nursing homes and schools have yet to be explored.

This work was sponsored in part by a Qualcomm Innovation Fellowship, a Microsoft Ph.D. Fellowship and grants from the Heinz College Center for the Future of Work, the Natural Sciences and Engineering Research Council of Canada and the National Science Foundation.

The HCII is part of Carnegie Mellon's School of Computer Science. Follow the school on Twitter @SCSatCMU.

###