- Our project's first UI design
- New buttons and colorways
- UI button work
- Connecting Android to test and view the UI
- Made and integrated the logo into the app
- First successfully designed pop-up
- Finally gained access to camera permissions on Android
- Successful camera opening and closing
- Achieved accurate machine learning inference percentages
- Successfully finished, tested, and operated the app
Inspiration
Our journey began with a simple yet profound realization: healthcare should be accessible, understandable, and right in the palm of your hand. We were inspired by the potential to bridge the gap between patients and healthcare providers through technology. The idea of MediScan was born from the desire to empower individuals to take charge of their health by providing a tool for early detection and management of skin conditions.
What it does
MediScan transforms smartphones into powerful diagnostic tools. It uses a custom-built machine learning model to analyze images of skin abnormalities. Users can snap a picture, and MediScan will identify the condition, provide an assessment with a confidence percentage, and suggest treatments. This not only facilitates early detection but also reduces unnecessary medical visits, saving time and resources.
How we built it
We leveraged TensorFlow tools to train a custom machine learning model and converted it to TensorFlow Lite for mobile integration. We built the app in Android Studio, with Nick programming the front end in XML and me (Alex) programming the backend in Java. Together we crafted the app to ensure a seamless user experience and accurate results.
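To give a rough picture of how the pieces fit together, below is a minimal sketch of loading and running the converted TensorFlow Lite model from the Java backend. The class name, the asset name mediscan.tflite, and the output shape are placeholders for illustration, not the exact code in the app.

```java
import android.app.Activity;
import android.content.res.AssetFileDescriptor;
import org.tensorflow.lite.Interpreter;

import java.io.FileInputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class SkinClassifier {
    private final Interpreter interpreter;

    public SkinClassifier(Activity activity) throws IOException {
        // Memory-map the TensorFlow Lite model bundled under assets/ (file name is a placeholder).
        interpreter = new Interpreter(loadModelFile(activity, "mediscan.tflite"));
    }

    private static MappedByteBuffer loadModelFile(Activity activity, String name) throws IOException {
        AssetFileDescriptor fd = activity.getAssets().openFd(name);
        try (FileInputStream input = new FileInputStream(fd.getFileDescriptor());
             FileChannel channel = input.getChannel()) {
            return channel.map(FileChannel.MapMode.READ_ONLY,
                    fd.getStartOffset(), fd.getDeclaredLength());
        }
    }

    public void classify(ByteBuffer input, byte[][] output) {
        // output holds one UINT8 score per skin-condition class.
        interpreter.run(input, output);
    }
}
```

Memory-mapping the model file lets the Interpreter read it straight from the app's assets without copying the whole model onto the Java heap first.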
Challenges we ran into
- Getting the UI exactly how we wanted it, with the proper pop-ups triggered by certain buttons, was difficult at first, but we found helpful YouTube videos that guided us through a multitude of errors and let us reach an outcome we were happy with (see the dialog sketch after this list).
- The majority of our time was spent building the best possible custom machine learning model using TensorFlow tools to ensure its accuracy. We scoured the internet and collected a large set of images depicting the different skin conditions to train the model on, making sure it covered a variety of skin tones as well as deviations in skin type. We also had to determine the optimal batch size and learning rate for our model. After we successfully trained and tested the model, we converted it to TensorFlow Lite to prepare for integration into our Android app.
- One of our main issues when integrating the TensorFlow Lite model into Android Studio was that we initially believed our data type was FLOAT32 when it turned out to be UINT8. Because of this, we had to research how to change the way we prepared the input buffer for the TensorFlow Lite model. After a lot of time searching the internet, we eventually found the correct way to adjust the classifyImage method and change how our loop iterated over pixels and extracted the R, G, and B values (see the buffer sketch after this list).
- Getting the front end (XML) to connect to the back end (Java) was another challenging aspect of the project that took us a while to get right, as we had to make sure the buttons we created actually executed functions in the backend. For example, when a user presses the "Take Picture" button, it opens the camera on an Android device, asks for permission, and then allows the user to take a picture (see the camera sketch after this list).
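For the pop-up behaviour described in the first point, the dialog sketch below shows the general idea of wiring a button to an AlertDialog; the view id and strings are placeholders rather than the app's actual values.

```java
import android.app.AlertDialog;
import android.widget.Button;

// Inside the Activity's onCreate(), after setContentView(...).
// R.id.info_button and the strings are placeholders for illustration.
Button infoButton = findViewById(R.id.info_button);
infoButton.setOnClickListener(v ->
        new AlertDialog.Builder(this)
                .setTitle("About this result")
                .setMessage("Placeholder explanation shown in the pop-up.")
                .setPositiveButton("OK", null)
                .show());
```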
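The buffer sketch below illustrates the kind of change the UINT8 input required in classifyImage: packing one unsigned byte per colour channel instead of four-byte floats. The 224x224 input size and the helper name are assumptions for illustration; the real dimensions come from the model.

```java
import android.graphics.Bitmap;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Prepares a UINT8 input buffer for the TensorFlow Lite model.
// The 224x224 input size is an assumption; the actual size comes from the model.
private ByteBuffer toUint8Buffer(Bitmap bitmap) {
    final int inputSize = 224;
    Bitmap scaled = Bitmap.createScaledBitmap(bitmap, inputSize, inputSize, true);

    // One byte per channel (R, G, B) rather than four bytes per FLOAT32 value.
    ByteBuffer buffer = ByteBuffer.allocateDirect(inputSize * inputSize * 3);
    buffer.order(ByteOrder.nativeOrder());

    int[] pixels = new int[inputSize * inputSize];
    scaled.getPixels(pixels, 0, inputSize, 0, 0, inputSize, inputSize);
    for (int pixel : pixels) {
        buffer.put((byte) ((pixel >> 16) & 0xFF)); // red
        buffer.put((byte) ((pixel >> 8) & 0xFF));  // green
        buffer.put((byte) (pixel & 0xFF));         // blue
    }
    return buffer;
}
```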
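The camera sketch below outlines the "Take Picture" flow: check the CAMERA permission, request it if it hasn't been granted, and otherwise launch the camera with an image-capture intent. The request codes and the method name are placeholders.

```java
import android.Manifest;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.provider.MediaStore;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

// Placeholder request codes used to match permission and activity results.
private static final int REQUEST_CAMERA_PERMISSION = 1;
private static final int REQUEST_IMAGE_CAPTURE = 2;

// Called from the "Take Picture" button's onClick listener.
private void onTakePictureClicked() {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED) {
        // Ask for the camera permission; the capture runs after it is granted.
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.CAMERA}, REQUEST_CAMERA_PERMISSION);
    } else {
        Intent takePicture = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        startActivityForResult(takePicture, REQUEST_IMAGE_CAPTURE);
    }
}
```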
Accomplishments that we're proud of
Together we are proud that we stuck with the entire development of the app from start to finish, as there were many points where it felt like we should just give up. Yet despite every obstacle, we developed an app that really does have the potential to revolutionize digital healthcare. It's crazy to think that our hard work has resulted in a tool that can truly make a difference in people's lives.
What we learned
This project taught us the intricacies of machine learning, especially the importance of testing the model, as well as the power of perseverance. From the nuances of UI/UX design to the complexities of backend integration, we gained a tremendous amount of insight into real-world app development, as neither my teammate nor I had ever dived this deep into creating a market-ready app.
What's next for MediScan
Given the success of our app's development, my teammate and I both believe there is great potential for the app to go public, and it is something we would like to see happen. We believe healthcare should be accessible to everyone, and this app could be part of that solution. We plan to expand our database, continue refining our machine learning model, and potentially start partnerships with medical professionals to enhance the accuracy and reliability of our app.
Built With
- android-studio
- java
- tensorflow
- xml