Annotation of foot-contact and foot-off events is the initial step in post-processing for most quantitative gait analysis workflows. If clean force-plate strikes are present, the events can be detected automatically. Otherwise, annotation of gait events is performed manually, since reliable automatic tools are not available. Automatic annotation methods have been proposed for normal gait, but they are usually based on heuristics over the coordinates and velocities of motion capture markers placed on the feet. These heuristics do not generalize to pathological gait because of the greater variability in the kinematics and anatomy of patients, as well as the presence of assistive devices. In this paper, we use a data-driven approach to predict foot-contact and foot-off events from kinematic and marker time series in children with normal and pathological gait. Through analysis of 9092 gait cycle measurements, we build a predictive model using Long Short-Term Memory (LSTM) artificial neural networks. The best-performing model identifies foot-contact and foot-off events with average errors of 10 and 13 milliseconds, respectively, outperforming popular heuristic-based approaches. We conclude that the accuracy of our approach is sufficient for most clinical and research applications in the pediatric population. Moreover, the LSTM architecture supports real-time prediction, opening applications in real-time control of active assistive devices, orthoses, or prostheses. We provide the model, usage examples, and the training code in an open-source package.
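The real-time property mentioned above follows from the recurrent structure of an LSTM: each frame of marker data updates a hidden state using only past context, so event probabilities can be emitted frame by frame as data streams in. The following minimal numpy sketch illustrates this streaming pattern with a single LSTM cell; all dimensions, weights, and the feature layout are hypothetical placeholders, not the paper's trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; the paper's actual feature set and layer sizes differ.
n_features = 12   # e.g. 3-D coordinates of 4 foot markers per frame
n_hidden = 32
n_events = 2      # foot-contact and foot-off probabilities

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Randomly initialized weights stand in for trained parameters.
W = rng.standard_normal((4 * n_hidden, n_features + n_hidden)) * 0.1
b = np.zeros(4 * n_hidden)
W_out = rng.standard_normal((n_events, n_hidden)) * 0.1
b_out = np.zeros(n_events)

def lstm_step(x, h, c):
    """One LSTM cell update for a single frame x."""
    z = W @ np.concatenate([x, h]) + b
    i, f, g, o = np.split(z, 4)            # input, forget, cell, output gates
    c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
    h = sigmoid(o) * np.tanh(c)
    return h, c

# Stream frames one at a time: only past context is needed,
# which is what makes real-time prediction possible.
h, c = np.zeros(n_hidden), np.zeros(n_hidden)
frames = rng.standard_normal((100, n_features))  # synthetic marker stream
probs = []
for x in frames:
    h, c = lstm_step(x, h, c)
    probs.append(sigmoid(W_out @ h + b_out))     # per-frame event probabilities
probs = np.array(probs)
print(probs.shape)  # one probability pair per incoming frame
```

In a deployed controller, each new motion-capture frame would trigger one `lstm_step` call, and a threshold on the output probabilities would mark foot-contact and foot-off events with a fixed, bounded latency.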
Bibliographical note
Funding Information:
The study was funded by the Mobilize Center, a National Institutes of Health Big Data to Knowledge (BD2K) Center of Excellence, supported through Grant U54EB020405.
© 2019 Kidziński et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.