Inspiration
Technology is expanding in every direction, and we felt it was about time disabled people stopped feeling handicapped by it. A smartphone could and should be enough, so that a disabled person feels full control over their own life and no dependency on others.
What it does
Our application uses the back camera of a smartphone to act as the 'eyes' of a blind person. It detects the objects it sees in the frame and announces them to the user through voice output. A timer repeats the process at a fixed interval, and the user is informed only when something in their surroundings changes.
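Here is a minimal sketch of that capture-compare-announce loop in Kotlin. The names here (VisionPipeline, SceneWatcher, captureFrame, detectObjects, speak) are illustrative placeholders rather than code from our app, and the five-second interval is just an example, not our actual setting:

```kotlin
import java.util.concurrent.Executors
import java.util.concurrent.TimeUnit

// Hypothetical hooks: these stand in for the camera, object detection, and
// text-to-speech pieces; none of these names come from a real SDK.
interface VisionPipeline {
    fun captureFrame(): ByteArray                     // JPEG bytes from the camera
    fun detectObjects(frame: ByteArray): Set<String>  // e.g. setOf("chair", "door")
    fun speak(text: String)
}

class SceneWatcher(
    private val pipeline: VisionPipeline,
    private val intervalSeconds: Long = 5  // assumed interval, not our real value
) {
    private val scheduler = Executors.newSingleThreadScheduledExecutor()
    private var lastSeen: Set<String> = emptySet()

    fun start() {
        scheduler.scheduleWithFixedDelay({
            val seen = pipeline.detectObjects(pipeline.captureFrame())
            // Speak only the difference from the previous frame, so the user
            // hears something only when the surroundings actually change.
            val appeared = seen - lastSeen
            val gone = lastSeen - seen
            if (appeared.isNotEmpty()) pipeline.speak("New: " + appeared.joinToString(", "))
            if (gone.isNotEmpty()) pipeline.speak("No longer visible: " + gone.joinToString(", "))
            lastSeen = seen
        }, 0, intervalSeconds, TimeUnit.SECONDS)
    }

    fun stop() = scheduler.shutdown()
}
```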
How I built it
We built our own camera application around the phone's back camera, which automatically captures pictures at fixed intervals. Each picture is sent to the Clarifai API, which returns a list of strings naming the objects it detects in the image. We then use IBM Watson's Text to Speech API to read the results aloud to the user.
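For the curious, here is roughly what the object-detection request looks like in Kotlin. This is a sketch based on Clarifai's v2 predict REST endpoint as we understood it; double-check the current docs before relying on the exact payload shape, and note that apiKey and modelId are placeholders you would fill in yourself:

```kotlin
import android.util.Base64
import org.json.JSONArray
import org.json.JSONObject
import java.net.HttpURLConnection
import java.net.URL

// Hedged sketch of the detection call. Endpoint and JSON shape follow
// Clarifai's v2 "predict" REST API as we used it; verify against current docs.
// Must run off the Android main thread (e.g. on a background thread).
fun detectObjects(jpegBytes: ByteArray, apiKey: String, modelId: String): List<String> {
    val image = JSONObject().put(
        "base64", Base64.encodeToString(jpegBytes, Base64.NO_WRAP))
    val body = JSONObject().put("inputs",
        JSONArray().put(JSONObject().put("data", JSONObject().put("image", image))))

    val conn = URL("https://api.clarifai.com/v2/models/$modelId/outputs")
        .openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.setRequestProperty("Authorization", "Key $apiKey")
    conn.setRequestProperty("Content-Type", "application/json")
    conn.doOutput = true
    conn.outputStream.use { it.write(body.toString().toByteArray()) }

    // Pull the predicted concept names out of the response JSON.
    val concepts = JSONObject(conn.inputStream.bufferedReader().readText())
        .getJSONArray("outputs").getJSONObject(0)
        .getJSONObject("data").getJSONArray("concepts")
    return (0 until concepts.length()).map { concepts.getJSONObject(it).getString("name") }
}
```

The returned list of names is what feeds the change-detection loop above; the speech half of the pipeline simply passes the announcement text to IBM Watson's Text to Speech service.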
Challenges I ran into
We all worked independently, so merging our code and getting everything to compile together was a bit of a task.
Accomplishments that I'm proud of
We have an early working prototype (not the final version) that could help blind people in a variety of scenarios.
What I learned
We learned how to use the Android SDKs for various APIs. Building a camera application suited to our needs and interfacing it with the required APIs was challenging, but an equally interesting and rewarding task.
What's next for SoundVision
We want to make the system more interactive and more efficient in the output it generates. With machine learning, richer speech output, and a better hardware interface, we can make the application far more usable for the visually impaired.