How can visually impaired people better understand their surroundings?
The World Health Organization reports that 285 million individuals worldwide are blind or visually impaired. People with low vision are four times more likely to suffer a pedestrian injury and face greater risk of harm as they commute, work, and move around the home. For those living with visual impairments, a thorough understanding of the surrounding environment is key to safety and navigation.
The Snapberry device aids people with blindness by translating physical settings into speech, describing the world around them. Snapberry pairs a single-board computer with the user's smartphone to create an assistive IoT device that vividly describes the wearer's surroundings and establishes a communication channel between the user and their contacts.
Using a camera, an ultrasonic distance sensor for gauging proximity to nearby obstacles, and a Raspberry Pi computer, Snapberry connects to the user’s smartphone and a cloud server. The interface is built on a pair of 3D-printed glasses, allowing users to customize the size and fit. As the user moves through their environment, the Microsoft Computer Vision API generates descriptions of the scene captured by the attached camera, which the smartphone reads aloud to the wearer. Snapberry uses Firebase to connect the Raspberry Pi to the smartphone in real time and to support speech-to-text communication between the wearer and an emergency contact or guardian.
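The core loop described above — capture a frame, ask the Computer Vision API to describe it, and relay the caption through Firebase to the phone — can be sketched roughly as follows. This is a minimal illustration, not Snapberry's actual code: the endpoint, key, and Firebase placeholders and the helper names (`extract_caption`, `describe_image`, `push_caption`) are assumptions for the sake of the example.

```python
"""Hypothetical sketch of a describe-and-relay loop for a device like
Snapberry. All endpoint/key/URL values are placeholders, and the
function names are illustrative, not taken from the project."""
import json
import urllib.request

AZURE_ENDPOINT = "https://<region>.api.cognitive.microsoft.com"  # placeholder
AZURE_KEY = "<subscription-key>"                                 # placeholder
FIREBASE_URL = "https://<project>.firebaseio.com"                # placeholder


def extract_caption(describe_json):
    """Pull the highest-confidence caption out of a Computer Vision
    'describe' response; fall back to a stock message if none exists."""
    captions = describe_json.get("description", {}).get("captions", [])
    if not captions:
        return "No description available."
    best = max(captions, key=lambda c: c.get("confidence", 0.0))
    return best["text"]


def describe_image(jpeg_bytes):
    """Send one camera frame to the Computer Vision 'describe' operation
    and return a plain-text caption for the smartphone to speak."""
    req = urllib.request.Request(
        f"{AZURE_ENDPOINT}/vision/v3.2/describe",
        data=jpeg_bytes,
        headers={
            "Ocp-Apim-Subscription-Key": AZURE_KEY,
            "Content-Type": "application/octet-stream",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return extract_caption(json.loads(resp.read()))


def push_caption(caption):
    """Append the caption to a Firebase Realtime Database path so the
    paired smartphone app can read it aloud to the wearer."""
    payload = json.dumps({"text": caption}).encode("utf-8")
    req = urllib.request.Request(
        f"{FIREBASE_URL}/descriptions.json",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```

In a real deployment, a loop on the Raspberry Pi would grab frames from the camera module, call `describe_image`, and hand the result to `push_caption`, while the smartphone app listens on the same Firebase path and speaks each new entry.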