Carnegie Mellon University
February 27, 2013

Press Release: Camera Inside Spiraling Football Provides Ball's-Eye View of Field

Carnegie Mellon, Japanese Researchers Produce Wide-Angle Video With Single Camera

Contact: Byron Spice / 412-268-9068 / bspice@cs.cmu.edu

PITTSBURGH—Football fans have become accustomed to viewing televised games from a dozen or more camera angles, but researchers at Carnegie Mellon University and the University of Electro-Communications (UEC) in Tokyo suggest another possible camera position: inside the ball itself.

The researchers have shown that a camera embedded in the side of a rubber-sheathed plastic foam football can record video while the ball is in flight, giving spectators a unique, ball's-eye view of the playing field. Because a football can spin at 600 rpm, the raw video is an unwatchable blur, but the researchers developed a computer algorithm that converts it into a stable, wide-angle view.
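To put that spin rate in perspective, here is a rough back-of-the-envelope calculation in Python. Only the 600 rpm figure comes from the release; the frame rate and exposure time are assumed values chosen for illustration.

    # Why raw BallCam footage is an unwatchable blur: at 600 rpm the camera
    # sweeps a wide arc during each exposure and captures only a handful of
    # frames per revolution.
    SPIN_RPM = 600           # spin rate cited in the release
    FRAME_RATE_FPS = 60      # assumed camera frame rate
    EXPOSURE_S = 1 / 120     # assumed exposure time per frame

    revolutions_per_second = SPIN_RPM / 60                                # 10 rev/s
    frames_per_revolution = FRAME_RATE_FPS / revolutions_per_second       # ~6 frames
    sweep_per_exposure_deg = 360 * revolutions_per_second * EXPOSURE_S    # ~30 degrees

    print(frames_per_revolution, sweep_per_exposure_deg)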

Kris Kitani, a post-doctoral fellow in Carnegie Mellon's Robotics Institute, is aware that a football league is unlikely to approve camera-embedded footballs for regular play. Even so, the BallCam might be useful for TV, movie productions or training purposes. Two demonstration videos are available on his website: http://www.cs.cmu.edu/~kkitani/Top.html.

One of his co-authors, UEC's Kodai Horita, a visiting graduate student last year at the Robotics Institute, will present a paper about BallCam on March 8 at the Augmented Human International Conference in Stuttgart, Germany.

Kitani said BallCam was developed as part of a larger exploration of digital sports. "We're interested in how technology can be used to enhance existing sports and how it might be used to create new sports," he explained. In some cases, athletic play may be combined with arts or entertainment; a camera-embedded ball, for instance, might be used to capture the expressions on the faces of players as they play catch with it.

Other researchers have developed throwable cameras that produce static images or use multiple cameras to capture stabilized video. The BallCam system developed by Kitani and Horita, along with Hideki Sasaki and Professor Hideki Koike of UEC, uses a single camera with a narrow field of view to generate a dynamic, wide-angle video.

When the ball is thrown in a clean spiral, the camera records a succession of frames as the ball rotates. When processing these frames, the algorithm uses the sky to determine which frames were made when the camera was looking up and which were made when it was looking down. The upward frames are discarded and the remaining, overlapping frames are stitched together with special software to create a large panorama. Similar stitching software is used by NASA to combine images from Mars rovers into large panoramas and is increasingly found in digital cameras.
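As a concrete sketch of that frame-selection-and-stitching step, the Python code below uses OpenCV. The sky test (mean brightness of the upper half of each frame) and the use of OpenCV's general-purpose stitcher are stand-ins chosen for illustration; the release does not describe the team's actual classifier or stitching software.

    import cv2

    def looks_upward(frame, brightness_threshold=180):
        # Guess whether the camera was pointing at the sky: sky frames tend to
        # be much brighter, especially in the upper half of the image.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        upper_half = gray[: gray.shape[0] // 2, :]
        return upper_half.mean() > brightness_threshold

    def build_panorama(frames):
        # Discard sky-facing frames and stitch the overlapping field-facing
        # frames into one wide panorama.
        field_frames = [f for f in frames if not looks_upward(f)]
        stitcher = cv2.Stitcher_create()
        status, panorama = stitcher.stitch(field_frames)
        if status != cv2.Stitcher_OK:
            raise RuntimeError("stitching failed with status %d" % status)
        return panorama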

The algorithm also corrects some of the image distortions, caused by the speed of the ball's rotation, that twist the yard lines. Further work will be necessary to eliminate all of the distortion, Kitani said, and a faster camera sensor or other techniques will be needed to reduce blurring. Multiple cameras might also be added to the football to improve the finished video.
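The release does not explain how the rotation-induced distortion is corrected. One common approach, sketched below purely as an assumed illustration, is to estimate the in-plane rotation between consecutive field-facing frames from matched features and undo it before the frames are stitched.

    import cv2
    import numpy as np

    def derotate(prev_frame, frame):
        # Estimate the similarity transform (rotation, scale, translation) that
        # maps this frame onto the previous one, then warp the frame to undo
        # the ball's spin before stitching.
        gray_prev = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
        gray_curr = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        orb = cv2.ORB_create()
        kp1, des1 = orb.detectAndCompute(gray_prev, None)
        kp2, des2 = orb.detectAndCompute(gray_curr, None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(des1, des2)
        src = np.float32([kp2[m.trainIdx].pt for m in matches])
        dst = np.float32([kp1[m.queryIdx].pt for m in matches])
        M, _ = cv2.estimateAffinePartial2D(src, dst)
        h, w = frame.shape[:2]
        return cv2.warpAffine(frame, M, (w, h))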

The Robotics Institute is part of Carnegie Mellon's School of Computer Science. Follow the school on Twitter @SCSatCMU.

###

Photo caption: The BallCam system uses a single camera with a narrow field of view to generate a dynamic, wide-angle video.