Thanks. A few quick questions: what frame rate did you use for your high-speed video, and how did you define initial contact and toe off? In my experience, even manual digitization of ‘gold standard’ high-speed video has a margin of error.
Just to give you some context: before I joined RunScribe, I was on a university research team that RunScribe hired to run an external validation study to test and fine-tune the foot strike and toe off detection algorithms. The study used 16 participants, each running at 3 speeds, with a variety of foot strike types and levels of running experience; all wore their own choice of shoes, so we could test the robustness of the algorithm in as many conditions as possible. To allow for millisecond accuracy, running was filmed at 1000 fps, with the frame focused entirely on a small area right at the treadmill belt. The RunScribe data and high-speed video were synchronized, so each step from the high-speed video was compared to the exact same step detected by RunScribe. Ten consecutive steps of steady-state running were digitized for each condition, and left and right were measured simultaneously (4 RunScribes in total).
Even with top-of-the-line high-speed video equipment, there were often 3-5 frames for foot strike and 2-3 frames for toe off where it was incredibly hard to pinpoint exactly when contact starts and ends. Is it when the first deformation of the shoe is visible? Is it when there appears to be no space between the shoe and the treadmill? To mitigate this source of human error, shoes were marked on the heel and toe, and digital lines were superimposed on the video to ensure every footstep for a participant was digitized at the exact same point. So while there may be some error in choosing the correct frame, the difference should be consistent within each participant: participant A may have a human error of 5 ms and participant B an error of -5 ms, but the error is consistent for each participant.
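To show why a consistent bias matters less than a random one, here's a toy sketch (made-up numbers, not our actual analysis pipeline): a constant per-participant frame-picking offset shifts every event by the same amount, so within-participant timing metrics are unaffected.

```python
# Toy illustration (not the study code): a constant digitization bias per
# participant shifts every event by the same amount, so derived metrics
# like ground contact time are unaffected.

# Hypothetical (foot strike, toe off) event times in ms for three steps.
true_contacts = [(1000, 1240), (1760, 2005), (2520, 2770)]

def digitize(events, bias_ms):
    """Simulate a digitizer who is consistently off by bias_ms on every frame pick."""
    return [(fs + bias_ms, to + bias_ms) for fs, to in events]

participant_a = digitize(true_contacts, +5)   # always picks the frame 5 ms late
participant_b = digitize(true_contacts, -5)   # always picks the frame 5 ms early

# Absolute event times differ between the two digitizations...
print([fs for fs, _ in participant_a])  # [1005, 1765, 2525]
print([fs for fs, _ in participant_b])  # [995, 1755, 2515]

# ...but ground contact time is identical, because the constant bias cancels.
gct_a = [to - fs for fs, to in participant_a]
gct_b = [to - fs for fs, to in participant_b]
print(gct_a == gct_b)  # True
```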
Sorry, this is getting long-winded! It was a painstaking and time-consuming process that I wouldn’t wish on anyone. The results showed that RunScribe agreed with the gold-standard high-speed video data within an average of 3 ms. This includes running at a 12 minute per mile pace, where foot strike and toe off are harder to detect because steps are less impulsive, but we wanted to see how the algorithm performed across all running conditions. I've attached one of the regression plots from this study below.
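For anyone running their own validation, once the video and device events are synchronized the comparison itself is simple. Here's a minimal sketch of the paired-error and regression computation (illustrative made-up numbers only, not the study's data):

```python
# Illustrative sketch of comparing device-detected event times against
# synchronized gold-standard video event times. Data below is invented.
video_ms  = [102.0, 405.0, 711.0, 1013.0, 1320.0]   # video foot-strike times
device_ms = [105.0, 403.0, 714.0, 1010.0, 1324.0]   # device times, same steps

# Per-step error and mean absolute error.
errors = [d - v for d, v in zip(device_ms, video_ms)]
mean_abs_error = sum(abs(e) for e in errors) / len(errors)
print(f"mean absolute error: {mean_abs_error:.1f} ms")

# Ordinary least-squares fit device = slope * video + intercept -
# the relationship a validation regression plot visualizes.
n = len(video_ms)
mx = sum(video_ms) / n
my = sum(device_ms) / n
slope = sum((x - mx) * (y - my) for x, y in zip(video_ms, device_ms)) / \
        sum((x - mx) ** 2 for x in video_ms)
intercept = my - slope * mx
print(f"slope: {slope:.3f}, intercept: {intercept:.1f} ms")
```

A slope near 1 with a small intercept and small mean absolute error is what you'd hope to see in a plot like the one attached.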
We’ve had a number of questions regarding accuracy & validation, and we want to be transparent in our approach, especially given the range of accuracy you noted in the devices on the market. I’ve been working on a summary of the research to share with our community, and will include the link in this thread when we publish.
We are confident in our algorithms and data, but we know that many of our users are doing their own analyses - which we fully support! I wanted to share how difficult the process can be and how much there is to consider when interpreting the results. I’d be happy to chat more about your results or any future validations you have planned; if you’d like to email me at firstname.lastname@example.org, we can discuss your work directly.