Note: Hand tracking must be enabled on the Quest before starting the app.
Inspiration
We noticed throughout our careers that XR dev firms are typically B2B or, if they are B2C, manage their own proprietary platforms with high purchase costs. Our goal is to make training modules and simulations more accessible to the average XR user by publishing apps directly to the Meta Quest store and selling them at a more affordable price point for secondary and post-secondary students.
What it does
Live On is a mixed reality training module prototype in which the user learns to apply first aid to an open, bleeding cut. The demo includes three steps: cleaning blood from the wound, applying a gauze pad, and wrapping a bandage around the limb and gauze.
How we built it
We built this application using Mixed Reality passthrough, the Presence Platform Interaction SDK (hands), custom interaction scripts (Robert), custom 3D models and blend shapes (Susana), and custom UI design (Mariya). The custom scripts include an Interaction class and its subclasses (Absorb, Attach, Collect/Wrap), as well as a Progress Tracker class that tracks the user's sequential progress; a simplified sketch of how these pieces fit together appears below. The leg model is open source; the remaining models, textures, and blend shapes for the blood-soaking effect are custom made. The UI design was initially created in Illustrator, replicated with 3D effects in Unity, and managed via the custom Progress Tracker script.
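For a rough idea of how the scripts fit together, here is a minimal sketch. The class names (Interaction, Absorb, ProgressTracker) come from our project, but the fields, the "Swab" tag, and the exact logic shown are illustrative stand-ins rather than our production code. It assumes a trigger-collider swab object and a blend shape whose weight represents how bloody the wound looks.

```csharp
using UnityEngine;

// Simplified sketch, not our production code. Base class for all step
// interactions; each subclass (Absorb, Attach, Collect/Wrap) reports
// completion to the ProgressTracker.
public abstract class Interaction : MonoBehaviour
{
    [SerializeField] protected ProgressTracker tracker;
    [SerializeField] protected int stepIndex; // which step this interaction belongs to

    protected void Complete()
    {
        tracker.CompleteStep(stepIndex);
    }
}

// Example subclass: soaking up blood. Drives a blend shape on the wound
// mesh so the wound visibly cleans up as the user dabs it with a swab.
public class Absorb : Interaction
{
    [SerializeField] private SkinnedMeshRenderer woundRenderer;
    [SerializeField] private int bloodBlendShapeIndex;
    [SerializeField] private float absorbPerDab = 20f; // weight removed per dab

    private float soaked; // 0..100

    private void OnTriggerEnter(Collider other)
    {
        // Ignore dabs that happen out of sequence or from other objects.
        if (!tracker.IsCurrentStep(stepIndex) || !other.CompareTag("Swab")) return;

        soaked = Mathf.Min(100f, soaked + absorbPerDab);
        // Blend shape weight runs from 100 (bloody) down to 0 (clean).
        woundRenderer.SetBlendShapeWeight(bloodBlendShapeIndex, 100f - soaked);

        if (soaked >= 100f) Complete();
    }
}

// Tracks the user's sequential progress and gates steps so they must be
// done in order (clean -> gauze -> bandage).
public class ProgressTracker : MonoBehaviour
{
    [SerializeField] private string[] stepNames = { "Clean wound", "Apply gauze", "Wrap bandage" };
    private int currentStep;

    public bool IsCurrentStep(int stepIndex) => stepIndex == currentStep;

    public void CompleteStep(int stepIndex)
    {
        if (stepIndex != currentStep) return; // enforce sequential order

        currentStep++;
        Debug.Log(currentStep < stepNames.Length
            ? $"Next step: {stepNames[currentStep]}"
            : "Training complete!");
    }
}
```

Having the tracker ignore out-of-order completions keeps each interaction self-contained while the overall flow stays strictly sequential.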
Challenges we ran into
We found that the SDKs don't quite support Unity 6 just yet, so we had to revert to Unity 2022.3 LTS to make full use of the available MR tools. The last-minute decision to enter the hackathon (we built this in 4-5 days) also meant putting other efforts on hold (e.g., a 3D nuclear reactor environment for VR, pitch deck brand design) during a month of tight deadlines. The team lead (Robert) wanted to ensure that no one overworked themselves on this project (as a hackathon/game jam veteran, he felt confident that a project of this scope was achievable in the timeframe), so art assets were created on the Thursday, Friday, and Monday surrounding the hackathon deadline weekend, and code was written on Thursday, Friday, Sunday, and Monday. We also couldn't add controller support in time, only hand interaction. Finally, we didn't have access to the Meta Canvas prefabs and the UI docs were limited, so we had to figure out how to work with those components from scratch.
Accomplishments that we're proud of
We managed to create a solid demo on a tight production timeline! It's shorter than we would've liked, but we believe the demo is enough to convey the concept of our project.
What we learned
We gained a newfound understanding of the dev tools Meta provides for Unity developers working in XR and, in particular, MR. The project also renewed our interest in mixed reality as a potential medium for SimAcademy projects; we are now assessing it as an alternative to virtual reality for certain training modules. For our developer who has previously worked with HoloLens, it was also enjoyable to build in a mixed reality environment again. Lastly, while our team has worked together to some extent at OssoVR, this project let us really lean on each other's skills and see what each of us was capable of; seeing each person's individual contributions motivated all of us to keep improving our own work.
What's next for Live On: A First Aid Simulation
If Meta would have us, we'd like to continue working on Live On through the Oculus Start program and test the waters for users becoming certified through XR training modules. With additional funding, we could create a more accurate and robust first aid simulation, integrate the Accredible platform into our app, and expand further into CPR/AED training and other critical volunteer roles. Without Meta's support, our team will likely source funding through B2B contract work while developing Live On on the side, albeit at a much slower pace due to competing priorities.