Dustin Adams, a Ph.D. student at the University of California, Santa Cruz, has teamed up with colleagues at his school to craft an app that helps visually impaired users line up the ideal snapshot.
The researchers also built their own app, which dispenses with a “shutter” button, since one can be hard for people with a visual impairment to locate. Instead, the app snaps a picture in response to a simple upward swipe gesture. It also combines face detection with the phone’s voice accessibility features so that the phone speaks aloud the number of faces detected, helping the user get everyone in the shot. Audio cues help get the main subject of a shot in frame and in focus.
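The article doesn’t detail how the app turns detection results into speech, but the spoken feedback it describes can be sketched roughly as follows. This is a hypothetical model, not the researchers’ actual code: `face_count_announcement` and `framing_cue` are assumed names, and the face detector and text-to-speech layer are stand-ins for whatever platform APIs the app uses.

```python
def face_count_announcement(faces):
    """Build the phrase the phone speaks after face detection.

    `faces` is a list of bounding boxes, one per detected face.
    """
    n = len(faces)
    if n == 0:
        return "No faces detected"
    if n == 1:
        return "One face detected"
    return f"{n} faces detected"


def framing_cue(face, frame_width):
    """Audio cue nudging the user to centre the main subject.

    `face` is an (x, y, w, h) bounding box in frame coordinates.
    Panning the camera left shifts the scene rightward in the frame,
    so a subject sitting in the left third calls for a leftward pan.
    """
    centre_x = face[0] + face[2] / 2
    if centre_x < frame_width / 3:
        return "Move camera left"
    if centre_x > 2 * frame_width / 3:
        return "Move camera right"
    return "Subject centred"
```

In a real app these strings would be handed to the platform’s screen-reader or text-to-speech service rather than returned to the caller.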
As soon as the app’s camera mode is turned on, the phone also begins recording a 30-second audio file, which can be restarted at any time with a double tap on the screen. This is to help with photo organising and sharing, serving as an aide-memoire as to who is in the shot. The user can choose to save this sound file along with the time, date and GPS data, the last translated into audio giving the name of the neighbourhood, district or city the shot was taken in.
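The memo-and-metadata behaviour described above can be sketched as a small model. Everything here is an assumption for illustration: the 30-second cap comes from the article, but `MemoRecorder`, `spoken_location` and the place-dictionary shape are invented names, and real reverse geocoding would come from a platform location service.

```python
import time

MEMO_SECONDS = 30  # recording length stated in the article


class MemoRecorder:
    """Toy model of the audio memo: starts with camera mode and
    restarts from zero on a double tap to the screen."""

    def __init__(self, now=time.monotonic):
        self._now = now
        self.started_at = self._now()

    def on_double_tap(self):
        # A double tap restarts the 30-second recording.
        self.started_at = self._now()

    def remaining(self):
        elapsed = self._now() - self.started_at
        return max(0.0, MEMO_SECONDS - elapsed)


def spoken_location(place):
    """Turn a reverse-geocoded place into the phrase read aloud.

    `place` maps the three levels the article mentions
    ('neighbourhood', 'district', 'city') to names; the most
    specific available level wins.
    """
    for level in ("neighbourhood", "district", "city"):
        if place.get(level):
            return f"Taken in {place[level]}"
    return "Location unknown"
```

Injecting the clock (`now`) keeps the timer logic testable; on a phone the same state machine would hang off the platform’s gesture recogniser and audio-recording APIs.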