Inspiration

Just imagine what it would be like to live a single day with your eyes closed. More than 250 million visually impaired people live their entire lives that way! iSee helps them see!

What it does

Take a picture or stream video of your surroundings, and iSee will tell you what is around you. The specific niche it focuses on is reading medicine labels and detecting doors and traffic lights!

How we built it

We built it using Google's Vertex AI, Gemini Pro Vision, Google Cloud Text-to-Speech, and Python, with a React Native app for both Android and iOS.
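The flow behind that stack can be sketched roughly as follows. This is a minimal illustration, not our production code: the model name, prompt wording, and helper functions (`build_prompt`, `describe_image`, `speak`) are assumptions made for the example.

```python
# Sketch of the iSee pipeline: image -> Gemini description -> spoken audio.
# Model name, prompts, and function names are illustrative assumptions.

def build_prompt(mode: str) -> str:
    """Return a task-specific prompt for the scene-description model."""
    prompts = {
        "scene": "Describe the surroundings in this image for a blind user.",
        "medicine": "Read the medicine label in this image: name, dosage, warnings.",
        "door": "Is there a door in this image? If so, describe where it is.",
        "traffic_light": "Is there a traffic light in this image? State its color.",
    }
    return prompts.get(mode, prompts["scene"])


def describe_image(image_bytes: bytes, mode: str = "scene") -> str:
    """Send the image to Gemini Pro Vision on Vertex AI and return its text."""
    # Imported here so the module loads even without the Google Cloud SDK.
    from vertexai.generative_models import GenerativeModel, Part

    model = GenerativeModel("gemini-pro-vision")
    response = model.generate_content(
        [Part.from_data(image_bytes, mime_type="image/jpeg"), build_prompt(mode)]
    )
    return response.text


def speak(text: str) -> bytes:
    """Synthesize the description with Google Cloud Text-to-Speech (MP3 bytes)."""
    from google.cloud import texttospeech

    client = texttospeech.TextToSpeechClient()
    response = client.synthesize_speech(
        input=texttospeech.SynthesisInput(text=text),
        voice=texttospeech.VoiceSelectionParams(language_code="en-US"),
        audio_config=texttospeech.AudioConfig(
            audio_encoding=texttospeech.AudioEncoding.MP3
        ),
    )
    return response.audio_content
```

The mode-specific prompts are what let one vision model cover scene description, medicine labels, doors, and traffic lights without separate detectors.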

Challenges we ran into

We pitched our first prototype to visually impaired people, but we didn't know about TalkBack, Android's accessibility mode, until they told us about it. Based on their feedback, we updated the app to support TalkBack by labeling every button. Because Google TTS doesn't fully support our native language, we also integrated the Mohir AI voice assistant.

Accomplishments that we're proud of

Our visually impaired tester was happy with the updated app: it offers a native-language AI assistant the user can converse with, plus integration with messengers like Telegram and WhatsApp, features that are not available in our competitors' apps.

What we learned

Our biggest lesson came from interviewing visually impaired people, who told us about accessibility mode. We learned how to build an application that supports accessibility mode and how to connect generative AI to it.

What's next for iSee

We are working on adding Russian-language AI support, integrating generative AI into the live video stream, and adding a "Call to Family" feature.
