Estimating Whole-Body Walking Motion from Inertial Measurement Units at Wrist and Heels Using Deep Learning
Yuji Kumano*,**, Suguru Kanoga**, Masataka Yamamoto*,***, Hiroshi Takemura*, and Mitsunori Tada**
*Graduate School of Science and Technology, Tokyo University of Science
2641 Yamazaki, Noda, Chiba 278-8510, Japan
**Artificial Intelligence Research Center, National Institute of Advanced Industrial Science and Technology (AIST)
***Graduate School of Advanced Science and Engineering, Hiroshima University
A recurrent-neural-network-based deep-learning model was developed to estimate the three-axis joint angles of a whole body with 17 bones during walking from three inertial measurement units (IMUs): one on the left wrist and one on each heel. The model takes as input the acceleration and angular velocity of the current frame and the previous 49 frames. The architecture comprises two hidden layers (two long short-term memory layers) followed by a dense layer. Performance was evaluated on the public National Institute of Advanced Industrial Science and Technology (AIST) Gait Database 2019; the root mean squared error of each joint angle was less than 12.28°. A comparison with the same model using IMUs at the pelvis and shanks showed that the proposed sensor placement offers a favorable balance between measurement accuracy and ease of use for whole-body motion capture. The model was more accurate than previous models that estimate general whole-body motion from six IMUs, although less accurate than a previous model that estimates only lower-limb motion during walking from three IMUs attached to the pelvis and shanks. Because the IMUs are attached to the left wrist and heels, whole-body motion can be captured easily using a smartwatch and smart shoes.
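As a rough illustration of the input and output format described in the abstract, the sliding windows (the current frame plus the previous 49 frames) and the per-angle root mean squared error can be sketched as follows. This is a minimal NumPy sketch, not the authors' implementation: the per-frame feature size of 18 assumes three IMUs each providing 3-axis acceleration and 3-axis angular velocity, the output size of 51 assumes 3-axis joint angles for 17 bones, and all function and variable names are hypothetical.

```python
import numpy as np

WINDOW = 50               # previous 49 frames + current frame
N_IMUS = 3                # left wrist, left heel, right heel
FEATS_PER_IMU = 6         # 3-axis acceleration + 3-axis angular velocity
N_JOINT_ANGLES = 17 * 3   # 3-axis joint angles for 17 bones

def make_windows(signals: np.ndarray) -> np.ndarray:
    """Stack sliding windows of length WINDOW over a (T, 18) signal array.

    Returns shape (T - WINDOW + 1, WINDOW, 18): one model input per
    frame, containing that frame and the 49 preceding ones.
    """
    n_frames, n_feats = signals.shape
    assert n_feats == N_IMUS * FEATS_PER_IMU
    return np.stack([signals[t:t + WINDOW]
                     for t in range(n_frames - WINDOW + 1)])

def rmse_per_angle(pred: np.ndarray, true: np.ndarray) -> np.ndarray:
    """Root mean squared error for each of the 51 joint angles,
    computed over frames; both arrays have shape (frames, 51)."""
    return np.sqrt(np.mean((pred - true) ** 2, axis=0))

# Example with synthetic IMU data: 200 frames of 18 features each.
x = np.random.randn(200, N_IMUS * FEATS_PER_IMU)
windows = make_windows(x)
print(windows.shape)  # (151, 50, 18)
```

Each 50-frame window would then be fed to the two LSTM layers, whose final hidden state passes through the dense layer to produce the 51 joint-angle outputs for the current frame.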
-  H. Hörder, I. Skoog, and K. Frändin, “Health-Related Quality of Life in Relation to Walking Habits and Fitness: A Population-based Study of 75-Year-Olds,” Qual. Life Res., Vol.22, No.6, pp. 1213-1223, 2013.
-  A. M. Aurand, J. S. Dufour, and W. S. Marras, “Accuracy Map of an Optical Motion Capture System with 42 or 21 Cameras in a Large Measurement Volume,” J. Biomech., Vol.58, pp. 237-240, 2017.
-  I. Hillel, E. Gazit, A. Nieuwboer, L. Avanzino, L. Rochester, A. Cereatti, U. D. Croce, M. O. Rikkert, B. R. Bloem, E. Pelosin, S. D. Din, P. Ginis, N. Giladi, A. Mirelman, and J. M. Hausdorff, “Is Every-Day Walking in Older Adults more Analogous to Dual-Task Walking or to Usual Walking? Elucidating the Gaps between Gait Performance in the Lab and during 24/7 Monitoring,” Eur. Rev. Aging Phys. Act., Vol.16, No.1, pp. 1-12, 2019.
-  M. Schepers, M. Giuberti, and G. Bellusci, “Xsens MVN: Consistent Tracking of Human Motion Using Inertial Sensing,” Xsens Technologies, Vol.1, No.8, 2018.
-  C. Z. Y. Choo, J. Y. Chow, and J. Komar, “Validation of the Perception Neuron System for Full-Body Motion Capture,” PLoS One, Vol.17, No.1, e0262730, 2022.
-  T. von Marcard, B. Rosenhahn, M. J. Black, and G. Pons-Moll, “Sparse Inertial Poser: Automatic 3D Human Pose Estimation from Sparse IMUs,” Eurogr. Symp. Geom. Process., Vol.36, No.2, pp. 349-360, 2017.
-  Y. Huang, M. Kaufmann, E. Aksan, M. J. Black, O. Hilliges, and G. Pons-Moll, “Deep Inertial Poser: Learning to Reconstruct Human Pose from Sparse Inertial Measurements in Real Time,” ACM Trans. Graph., Vol.37, No.6, pp. 1-15, 2018.
-  X. Yi, Y. Zhou, and F. Xu, “TransPose: Real-Time 3D Human Translation and Pose Estimation with Six Inertial Sensors,” ACM Trans. Graph., Vol.40, No.4, pp. 1-13, 2021.
-  M. Mundt, W. Thomsen, T. Witter, A. Koeppe, S. David, F. Bamer, W. Potthast, and B. Markert, “Prediction of Lower Limb Joint Angles and Moments during Gait Using Artificial Neural Networks,” Med. Biol. Eng. Comput., Vol.58, No.1, pp. 211-225, 2020.
-  M. Mundt, A. Koeppe, F. Bamer, S. David, and B. Markert, “Artificial Neural Networks in Motion Analysis–Applications of Unsupervised and Heuristic Feature Selection Techniques,” Sensors, Vol.20, No.16, 4581, 2020.
-  E. Dorschky, M. Nitschke, C. F. Martindale, A. J. van den Bogert, A. D. Koelewijn, and B. M. Eskofier, “CNN-Based Estimation of Sagittal Plane Walking and Running Biomechanics From Measured and Simulated Inertial Sensor Data,” Front. Bioeng. Biotechnol., Vol.8, pp. 1-14, 2020.
-  M. S. Renani, A. M. Eustace, C. A. Myers, and C. W. Clary, “The Use of Synthetic IMU Signals in the Training of Deep Learning Models Significantly Improves the Accuracy of Joint Kinematic Predictions,” Sensors, Vol.21, No.17, 5876, 2021.
-  Y. Kobayashi, N. Hida, K. Nakajima, M. Fujimoto, and M. Mochimaru, “AIST Gait Database 2019,” 2019. https://unit.aist.go.jp/harc/ExPART/GDB2019.html [Accessed October 5, 2022]
-  Y. Endo, M. Tada, and M. Mochimaru, “Dhaiba: Development of Virtual Ergonomic Assessment System with Human Models,” Proc. 3rd Int. Digital Human Modeling Symp., #58, 2014.
-  N. Magnenat-Thalmann and D. Thalmann, “Human Body Deformations Using Joint-Dependent Local Operators and Finite-Element Theory,” N. I. Badler, B. A. Barsky, and D. Zeltzer (Eds.), “Making Them Move: Mechanics, Control, and Animation of Articulated Figures,” Morgan Kaufmann Publishers Inc., pp. 243-262, 1991.
-  A. D. Young, M. J. Ling, and D. K. Arvind, “IMUSim: A Simulation Environment for Inertial Sensing Algorithm Design and Evaluation,” Proc. 10th ACM/IEEE Int. Conf. Inf. Process. Sens. Netw., pp. 199-210, 2011.
-  M. Loper, N. Mahmood, J. Romero, G. Pons-Moll, and M. J. Black, “SMPL: A Skinned Multi-Person Linear Model,” ACM Trans. Graph., Vol.34, No.6, 2015.
-  M. Schuster and K. K. Paliwal, “Bidirectional Recurrent Neural Networks,” IEEE Trans. Signal Process., Vol.45, No.11, pp. 2673-2681, 1997.
-  X. Du, R. Vasudevan, and M. Johnson-Roberson, “Bio-LSTM: A Biomechanically Inspired Recurrent Neural Network for 3-D Pedestrian Pose and Gait Prediction,” IEEE Robot. Autom. Lett., Vol.4, No.2, pp. 1501-1508, 2019.
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.