By Michael Sims

“Sitting at my computer, my arm hurting,” Manu Kumar remembers, “I asked myself, ‘Why the heck do I have to move the mouse to click on that link, and why do I have to keep scrolling?’” Kumar (CIT’95, CS’97), a former Alumni Association board member, has just completed his doctoral work at Stanford University. He created EyePoint, an interaction technique for pointing and clicking using a camera that tracks the user’s gaze.

How does EyePoint work?

You look at whatever you want to click on. Then you press and hold a key, or a foot pedal, or any other such trigger. EyePoint brings up a magnified view of the region you were looking at. You look at the target again, and then you simply release the key to perform the action.
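The interaction Kumar describes is essentially a two-phase, press-and-release loop. The sketch below is a minimal illustration of that flow, not EyePoint's actual code; the tracker and screen calls (get_gaze, show_magnified, click_at, and so on) are hypothetical placeholders, and the magnification factor is assumed.

```python
# Illustrative sketch of the press-magnify-release interaction described above.
# All classes and method names here are hypothetical placeholders.

MAGNIFICATION = 4  # assumed zoom factor for the focus region


def on_trigger_press(tracker, screen):
    """User looks at a target, then presses and holds the trigger key."""
    gaze = tracker.get_gaze()                      # coarse gaze estimate
    region = screen.region_around(gaze)            # area the user was looking at
    screen.show_magnified(region, MAGNIFICATION)   # enlarge it to compensate for
                                                   # eye-tracker inaccuracy
    return region


def on_trigger_release(tracker, screen, region):
    """User looks at the target again in the magnified view, then releases."""
    gaze = tracker.get_gaze()                      # refined gaze in the zoomed view
    target = screen.unmagnify(gaze, region, MAGNIFICATION)
    screen.hide_magnified()
    screen.click_at(target)                        # perform the click at release
```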

What’s your primary goal?

Trying to provide a more lightweight alternative to the mouse. For me and a lot of other people, the mouse is the biggest culprit in repetitive stress injury.

When will EyePoint be available outside labs?

Probably in a few years. We’re using an eye tracker that currently costs about $25,000, but given the technology and economic trends, we expect that cost to fall below a hundred dollars fairly soon. Some laptops already have Web cameras embedded in them; as the resolution of those cameras increases, it will become possible to use them for eye tracking.

Do you see everyday computer users someday using EyePoint?

Ultimately, you could have one in every computer.