An upcoming US-based sports training startup focused on analytics.
Acquiring and providing high-quality video for movement analysis is a core component of the training process. No standard training data was available, and different classes of clubs were difficult to distinguish with the naked eye. Because the club moves through 360 degrees during a swing, it was also troublesome to tell individual players' shots apart.
Algoscale developed a model that distinguishes between different classes of clubs used by a golf player from a video feed and then grades a player's shot based on its trail. The angle of the club relative to the body was derived across frames to obtain the trail of each shot. The same operation was performed on videos of professional players so a rookie's shot could be compared against them, and the cosine similarity between the two trails produced the final grade. Training data was extracted from progolfswingvideos.com and golfloopy.com. Each video was broken down into frames at different time intervals, object detection was applied to the images to obtain features at a granular level, and a deep learning algorithm, Fast R-CNN, was trained on them.
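The grading step described above can be sketched roughly as follows. This is a minimal illustration, not the production pipeline: it assumes each shot's trail has already been reduced to a sequence of club-to-body angles (one per frame), resamples both trails to a common length since videos differ in frame count, and converts the cosine similarity into a 0-100 grade. The function names and the resampling choice are illustrative assumptions.

```python
import numpy as np


def cosine_similarity(a, b):
    """Cosine similarity between two equal-length angle trails."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def grade_shot(rookie_trail, pro_trail, n_points=32):
    """Grade a rookie's shot (0-100) against a professional's trail.

    Both trails are sequences of club-body angles extracted per frame.
    They are linearly resampled to n_points so that shots captured at
    different frame rates or durations remain comparable.
    """
    def resample(trail, n):
        x_old = np.linspace(0.0, 1.0, num=len(trail))
        x_new = np.linspace(0.0, 1.0, num=n)
        return np.interp(x_new, x_old, trail)

    rookie = resample(rookie_trail, n_points)
    pro = resample(pro_trail, n_points)
    return round(100.0 * cosine_similarity(rookie, pro), 1)
```

A trail identical to the professional reference grades at 100, and the grade falls as the rookie's swing path diverges from it.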
Club identification accuracy of up to 85% was achieved, while shots were graded with an accuracy of 75%.
Fast R-CNN & Python.