Inspiration

We believe there is a massive gap between AI and the physical world around us; we wanted to bridge this gap by building OpenGlass. It enables users to interact with their surroundings through an AI wearable that can be attached to any pair of glasses.

So, we built open source smart glasses.

What it does

The user puts on OpenGlass and continues with their day as normal. At any point, the user can ask questions about anything they have seen throughout the day, as well as about anything they are currently looking at.

Example: I put on OpenGlass and walk around Shack15 for a few minutes. At some point I can't find where I left my phone, so I say, 'Hey OpenGlass, where is my phone?', and it responds with a description of where it last saw my phone. Now let's say that, while walking around, I'm looking at various pieces of hardware on my table and don't understand the components. I say, 'Hey OpenGlass, what hardware am I currently looking at?', and it explains each piece of hardware.
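Under the hood, a query like 'where is my phone?' can be answered from a rolling log of what the camera has seen. The sketch below is purely illustrative (the type and function names are hypothetical, not taken from the OpenGlass codebase): each captured frame is reduced to a short timestamped description, and the most recent entries are later handed to the language model together with the user's question.

```typescript
// Hypothetical "visual memory": each frame the wearable captures is stored as a
// short timestamped caption, which later questions are answered against.
interface FrameMemory {
  timestamp: Date;     // when the frame was captured
  description: string; // caption produced by the vision model
}

const memory: FrameMemory[] = [];

// Record the caption for a newly captured frame.
function remember(description: string): void {
  memory.push({ timestamp: new Date(), description });
}

// Collect the most recent captions as plain-text context for the language model.
function recentContext(maxEntries: number = 50): string {
  return memory
    .slice(-maxEntries)
    .map((m) => `[${m.timestamp.toLocaleTimeString()}] ${m.description}`)
    .join("\n");
}
```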

How we built it

MIT License

1x Seeed Studio XIAO ESP32 S3 Sense - https://www.amazon.com/dp/B0C69FFVHH/ref=dp_iou_view_item?ie=UTF8&psc=1

1x EEMB LP502030 3.7v 250mAH battery - https://www.amazon.com/EEMB-Battery-Rechargeable-Lithium-Connector/dp/B08VRZTHDL

For the AI components, we use Moondream running on Ollama to understand the stream of images, and Llama 3 on Groq to let users talk about that stream, i.e., to explain what the user is looking at and answer any questions. We then generate speech (TTS) from the response. We integrated these pieces in TypeScript, and used Hugging Face to test individual components of our stack before integrating them.
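As a concrete illustration of how these pieces fit together, here is a minimal sketch of the two model calls: Ollama's REST API runs Moondream over a base64-encoded camera frame, and Groq's OpenAI-compatible chat endpoint runs Llama 3 over the accumulated frame descriptions plus the user's question. The endpoint URLs, Groq model ID, and prompts shown are assumptions for illustration rather than the exact OpenGlass configuration, and the TTS step is omitted.

```typescript
const OLLAMA_URL = "http://localhost:11434/api/generate";           // local Ollama server (assumed default port)
const GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"; // Groq's OpenAI-compatible endpoint

// Caption one camera frame (base64-encoded JPEG) with Moondream via Ollama.
async function describeImage(imageBase64: string): Promise<string> {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "moondream",
      prompt: "Describe this image in one or two sentences.",
      images: [imageBase64],
      stream: false,
    }),
  });
  const data = await res.json();
  return data.response as string;
}

// Answer a user's question with Llama 3 on Groq, grounded in the frame descriptions.
async function answerQuestion(descriptions: string[], question: string): Promise<string> {
  const res = await fetch(GROQ_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.GROQ_API_KEY}`,
    },
    body: JSON.stringify({
      model: "llama3-8b-8192", // assumed Groq-hosted Llama 3 model ID
      messages: [
        {
          role: "system",
          content:
            "You are OpenGlass, a wearable assistant. Answer using only what the camera has seen:\n" +
            descriptions.map((d, i) => `Frame ${i + 1}: ${d}`).join("\n"),
        },
        { role: "user", content: question },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content as string;
}
```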

Challenges we ran into

We ran into a few hardware challenges, chiefly attaching the device to the glasses frame in a way that works for any pair of glasses. We persevered, printing our own custom case to hold the attachment, and ultimately overcame this challenge.

Accomplishments that we're proud of

We're proud to have integrated all the components of the product into a fully usable whole. We also succeeded in making the attachment fit any pair of glasses while keeping it as aesthetically pleasing as possible.

What we learned

We learned a lot this weekend. From a development perspective, we learned how to integrate the hardware with the software, and how to combine various technologies and APIs in a coherent, performant way so that we can understand what is happening in our users' surroundings.

We also learned how best to interpret the ways users want to interact with their environment through OpenGlass, for example, how to process their questions about their surroundings and provide the best possible experience.

What's next for OpenGlass

We will launch OpenGlass as an open-source project. Keen to continue developing it as a team, we will also work closely with the open-source community to improve OpenGlass further.

We truly believe that OpenGlass can improve people's lives, enabling them to achieve and do more!
