Inspiration

Several members of our group are runners, so we wanted to create an application that fit our hobby and helped other runners facing similar challenges with their form.

What it does

To fit the theme of AI for Good, we decided to create a program that analyzes a video of a person running in profile view and provides the user with information on different factors of their running gait and how each could be improved. These factors include the deviation of the runner's arm angle from the ideal 90 degrees, the difference in stride length between the left and right foot, the deviation of the back leg from the ideal 180 degrees, and the runner's cadence. This feedback advises runners on how to improve their form, which could reduce running injuries and promote a healthy, active lifestyle.
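
To make the four metrics concrete, here is a minimal sketch of the report our analysis produces; the structure and field names are hypothetical, chosen only for illustration.

```python
from dataclasses import dataclass

@dataclass
class GaitReport:
    """Hypothetical summary of the metrics our analysis reports."""
    arm_angle_deviation_deg: float  # deviation of the elbow angle from the ideal 90 degrees
    stride_length_diff: float       # difference between left and right stride lengths
    back_leg_deviation_deg: float   # deviation of the back-leg angle from the ideal 180 degrees
    cadence_spm: float              # cadence, in steps per minute
```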

How we built it

Our program utilizes YOLO, an object detection and image segmentation model, to extract the locations of specific points, referred to as keypoints, in each frame of the video. These keypoints represent important landmarks on a person's body, such as the major joints and facial features like the eyes and nose. Using these points, we can determine the angles between different joints of the body and perform the calculations needed to evaluate a person's running gait.
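
As a rough sketch of this step, assuming the pretrained Ultralytics YOLOv8 pose model (our actual weights and settings may differ), the per-frame keypoint extraction looks something like this:

```python
import cv2
from ultralytics import YOLO

model = YOLO("yolov8n-pose.pt")  # pretrained pose model with 17 COCO keypoints

def extract_keypoints(video_path):
    """Yield a (17, 2) array of (x, y) keypoints for the first detected person in each frame."""
    cap = cv2.VideoCapture(video_path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        results = model(frame, verbose=False)
        kpts = results[0].keypoints
        if kpts is not None and kpts.xy.shape[0] > 0:
            yield kpts.xy[0].cpu().numpy()  # person 0: joints, eyes, nose, etc.
    cap.release()
```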

To evaluate the arm angle, we determined the angle between the wrist, elbow, and shoulder of the runner in each frame and took the standard deviation of the difference between that angle and 90 degrees. Our analyses of stride length and of the back-leg angle both relied on finding the frame in which the foot hits the ground. To do this, we fit the y-coordinate of the foot to a sine function and used the minima of that function to find the frames where the foot is at its lowest point, which correspond to the frames where the foot strikes the ground; this also gives us the duration and starting frame of one complete stride. At each footstrike frame, we then calculated the distance between the feet for stride length, and the angle between the hip, knee, and foot of the back leg, comparing it to 180 degrees. By counting the number of frames per stride and using the video's frames per second, we could also calculate the runner's cadence, or steps per minute. We sketch both calculations below.
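
The elbow-angle calculation can be sketched as follows, assuming the COCO keypoint ordering that YOLO pose models use (e.g. index 6 = right shoulder, 8 = right elbow, 10 = right wrist); the function names are ours for illustration:

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at point b (degrees) formed by points a-b-c, e.g. wrist-elbow-shoulder."""
    u, v = a - b, c - b
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def arm_angle_deviation(frames):
    """Standard deviation of (elbow angle - 90 degrees) across all frames."""
    angles = [joint_angle(f[10], f[8], f[6]) for f in frames]
    return np.std(np.array(angles) - 90.0)
```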

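And here is a sketch of the sine-fit footstrike detection and the cadence calculation, with illustrative initial guesses rather than our exact tuning (note that, depending on the coordinate convention, the foot's lowest point may instead be a maximum of the raw image y-coordinate; we follow the framing above):

```python
import numpy as np
from scipy.optimize import curve_fit

def footstrike_frames(ankle_y, fps):
    """Fit the ankle's y-coordinate to a sine; return estimated footstrike frames and cadence."""
    t = np.arange(len(ankle_y), dtype=float)

    def model(t, A, w, phi, c):
        return A * np.sin(w * t + phi) + c

    # Initial guess: amplitude from the data's range, period of roughly one second.
    guess = [np.ptp(ankle_y) / 2, 2 * np.pi / fps, 0.0, np.mean(ankle_y)]
    (A, w, phi, c), _ = curve_fit(model, t, ankle_y, p0=guess)

    # Minima of the fitted sine (assuming A > 0) occur where w*t + phi = -pi/2 + 2*pi*k.
    k = np.arange(-3, int(np.ceil(w * len(ankle_y) / (2 * np.pi))) + 3)
    t_min = (-np.pi / 2 - phi + 2 * np.pi * k) / w
    t_min = t_min[(t_min >= 0) & (t_min < len(ankle_y))]

    period_frames = 2 * np.pi / w                  # frames in one complete stride
    cadence_spm = 2 * (fps / period_frames) * 60   # two steps per stride
    return np.round(t_min).astype(int), cadence_spm
```

A sine is a reasonable model here because the foot's vertical position during steady running is close to periodic, so its minima line up well with ground contact.
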
Challenges we ran into and what we learned

One of the biggest challenges to overcome was learning how YOLO works and how to extract data from it. Our team also had no experience connecting a frontend to a backend, and it was challenging to learn how to use Flask to create an API so our interface could talk to our backend data model (a minimal sketch of the idea appears below). Determining the point at which the foot hits the ground was another challenge, which we overcame with the sine-fitting technique described above. We also struggled to improve the accuracy of the keypoint tracking, but we did not have time to retrain the model for more accurate, running-specific tracking.
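
As an illustration of the kind of Flask API we mean, here is a minimal sketch in which the frontend posts a video and receives the gait metrics as JSON; the endpoint name and helper function are hypothetical:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def run_gait_analysis(path):
    """Placeholder for the analysis pipeline sketched under 'How we built it'."""
    return {"cadence_spm": 0.0}  # stub result

@app.route("/analyze", methods=["POST"])
def analyze():
    video = request.files["video"]  # video uploaded by the frontend
    video.save("upload.mp4")
    return jsonify(run_gait_analysis("upload.mp4"))

if __name__ == "__main__":
    app.run(debug=True)
```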

Accomplishments that we're proud of

We're proud of picking up a new piece of software, specifically YOLO, as well as finding a creative solution to a challenge that stumped us for a while: determining when the foot hits the ground.

What's next for Running Gait Analysis

If given the chance to expand on our project, one of the biggest improvements would be training YOLO on our own data instead of using the pretrained model. This would let us detect keypoints with higher confidence and produce more accurate results. We could also add more keypoints, such as the toes, so we can analyze other important aspects of a runner's form, such as foot strike. Extra features, such as estimating the distance the runner has run in the video, would give the runner more useful information, and we could also improve the frontend UI. Lastly, extending our code to analyze other points of view, such as the front and back of a runner, would allow us to evaluate even more aspects of a runner's form.
