Friday, January 21, 2011
A 'leaning' edge idea
Zoom in on a fourth-floor office in Carnegie Mellon's Newell Simon Hall. A tech entrepreneur reads a spreadsheet on his laptop. Wait. Is that number in cell E5 correct?
As he leans forward to get a closer look, the image magnifies and zooms in -- suddenly, the cell fills the screen.
Making this possible is Lean & Zoom, a software download that uses the cameras built into most laptops to automatically magnify the screen when the user leans in for a closer look. Lean & Zoom LLC, the Carnegie Mellon spinoff company behind the technology, recently premiered it at the Consumer Electronics Show in Las Vegas.
Zoom out for a second: The excitement that Lean & Zoom created at the trade show tapped into a growing interest in computing-by-movement. This form of artificial intelligence interprets movement the way a desktop registers a mouse click. Game systems like the Microsoft Kinect, which recently exceeded estimates and sold more than 8 million units over the holidays, can read a user's dance moves and compare them with the dancing avatars on screen.
The Lean & Zoom software, created by CMU grad student Chris Harrison, tracks how close your nose gets to the screen.
Mr. Harrison began developing Lean & Zoom when he came to CMU about 3 1/2 years ago to start the computer science doctoral program. His previous big success was Skinput, a technology that projected buttons onto human skin so that pressing buttons on your forearm was like pressing buttons on a television remote.
Around that time, most laptops had started to come automatically equipped with built-in cameras. They were typically used for telecommunication services like Skype, but Mr. Harrison saw a different potential use: a "remarkably sophisticated sensor" capable of "Superman-like magnification," he said.
He first envisioned the technology as software's answer to the schoolmarm: a chance to correct your slouching by watching the screen adjust as you hunched over.
To establish where the user normally sits, the Lean & Zoom software first takes a photo of the user at a comfortable distance. It then uses that photo as a baseline to measure how much closer to or farther from the screen the face is.
As you lean in for closer detail, the screen magnifies; as you move back, the display returns to normal. Leaning back beyond the baseline position leaves the display at its normal dimensions.
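The mechanics described above can be sketched in a few lines. This is a hypothetical illustration, not the company's actual code: it assumes the apparent width of the user's face in the webcam frame serves as a rough proxy for distance, compared against the width measured in the calibration photo.

```python
# Hypothetical sketch of the lean-to-zoom mapping -- not Lean & Zoom's actual code.
# Lean in and the face appears larger than in the calibration photo; lean back
# and it appears smaller. All names and thresholds are invented for illustration.

def zoom_factor(face_width_px: float, baseline_width_px: float,
                sensitivity: float = 0.1, max_zoom: float = 4.0) -> float:
    """Map the current face width to a screen magnification factor.

    baseline_width_px comes from the calibration photo taken while the user
    sits at a comfortable distance. Small shifts inside the sensitivity dead
    band, and any lean backwards, leave the display at its normal size (1.0).
    """
    ratio = face_width_px / baseline_width_px
    if ratio <= 1.0 + sensitivity:      # leaning back, or only a slight shift
        return 1.0
    return min(ratio, max_zoom)         # cap magnification at max_zoom
```

Under these assumptions, leaning in until the face appears twice as wide as in the calibration photo would give 2x magnification, while backing away leaves the display unchanged. The dead band plays the role of the configurable sensitivity mentioned below, so the slightest shift doesn't trigger a zoom.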
To develop the technology that tracked user movement, the team worked with Pittsburgh Pattern Recognition, a Strip District face-tracking firm that had never been involved with a project that used the computer camera to identify distance, said Michael Sipe, vice president of product development.
"It's changing the basic interactions that you would do with a remote control or mouse and keyboard," said Mr. Sipe. "People are thinking there may be more natural ways."
Movement-based computing is still playing catch-up with human intuition, said Mr. Harrison. Humans know not to shout when talking to a person 6 inches away, but computers and artificial intelligence are just beginning to interpret distance and how it affects a display, he said.
The sensitivity of Lean & Zoom software can be configured so that the slightest shift doesn't cause the screen to go crazy like a kaleidoscope. And any screen display -- be it a vacation photo, spreadsheet or YouTube video -- works with the technology.
The software is available for download on the company website, www.LeanAndZoomLLC.com, and costs $27.99. So far, hundreds of copies have been sold, said Curt Stone, an executive-in-residence at the Carnegie Mellon Quality of Life Technology Center who has become the company's chief executive.
Lean & Zoom became a registered company last fall after entering the center's startup-incubation program.
Mr. Stone helped secure a partnership with KDDI Corp., a Tokyo-based telecommunications firm he called "the AT&T of Japan" that has funded most of the research. A smartphone application of the technology will premiere on Google Android phones in Japan later this year, and the company has started courting angel investors willing to fund a company expansion.
The company now has five part-time workers but expects to assemble a workforce of nine full-time employees by the end of the year, said Mr. Stone. The Lean & Zoom team is building a prototype that applies the technology to Google Maps, so leaning in would automatically zoom in on the map and display the image in closer detail.
Everyone has ideas for uses of the technology: Mr. Sipe and his team thought it could be applied to assembly drawings or blueprints that rotate as your head turns to view other sides or angles.
The intuitive nature of leaning in for a zoomed image could be applied across many devices, said Mr. Harrison.
Imagine a computer screen on your refrigerator door, he said. When you're standing across the room, the display could read, "You Have Two E-mails" in large type. But as you move closer, the information might grow more detailed until finally the text of the message is displayed.
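The refrigerator scenario boils down to picking a level of detail from the viewer's estimated distance. A minimal sketch of that idea, with thresholds invented purely for illustration:

```python
# Hypothetical distance-to-detail mapping for a proximity-aware display.
# The tiers and distance thresholds are illustrative, not from any product.

def detail_level(distance_m: float) -> str:
    """Return how much e-mail detail to show for a viewer distance_m meters away."""
    if distance_m > 2.0:
        return "summary"      # e.g. "You Have Two E-mails" in large type
    if distance_m > 0.75:
        return "headers"      # sender and subject lines
    return "full_text"        # close enough to read the message body
```

The same tiered approach would carry over to any device that can estimate how far away its viewer is standing.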
And even though Mr. Harrison has moved on to other projects and is now conducting research for Microsoft in Seattle, he said he carried with him a side effect of all that Lean & Zoom research.
"My posture's pretty terrible as a computer scientist," he said. "It comes with the job, but I have gotten into some better habits."
Article courtesy of Pittsburgh Post-Gazette