IJAT Vol.13 No.4 pp. 506-516 (2019)
doi: 10.20965/ijat.2019.p0506

Paper:

Riding Motion Capture System Using Inertial Measurement Units with Contact Constraints

Tsubasa Maruyama*,†, Mitsunori Tada**, and Haruki Toda**

*Human Augmentation Research Center, National Institute of Advanced Industrial Science and Technology (AIST)
6-2-3 Kashiwanoha, Kashiwa, Chiba 277-0882, Japan

†Corresponding author

**Artificial Intelligence Research Center, National Institute of Advanced Industrial Science and Technology (AIST), Tokyo, Japan

Received: November 27, 2018
Accepted: April 3, 2019
Published: July 5, 2019

Keywords: ergonomic design, motion capture, inertial measurement unit, digital human, human-machine interaction
Abstract

The measurement of human motion is an important aspect of ergonomic mobility design, in which the mobility product is evaluated based on human factors obtained by digital human (DH) technologies. Optical motion-capture (MoCap) systems have been widely used for measuring human motion in laboratories. However, it is generally difficult to measure human motion on mobility products in real-world scenarios, e.g., riding a bicycle on an outdoor slope, owing to unstable lighting conditions and camera arrangements. On the other hand, the inertial-measurement-unit (IMU)-based MoCap system does not require any optical devices, offering the potential to measure riding motion even in outdoor environments. In general, however, the estimated motion is not necessarily accurate, owing to errors inherent to the IMU itself, such as drift and calibration errors. Thus, it is difficult to apply the IMU-based system directly to riding motion estimation. In this study, we develop a new riding MoCap system using IMUs. The proposed system estimates product and human riding motions by combining the IMU orientations with contact constraints between the product and the DH, e.g., the DH hands in contact with the handles. The proposed system is demonstrated with a bicycle ergometer that includes handles, a seat, a backrest, and foot pedals, as in general mobility products. The system is further validated by comparing the estimated joint angles and positions with those of an optical MoCap system for three different subjects. The experiment reveals both the effectiveness and the limitations of the proposed system. It is confirmed that the proposed system improves the joint position estimation accuracy compared with a system using only IMUs. The angle estimation accuracy is also improved for joints near the contact points. However, it is observed that the angle accuracy decreases for a few joints. This is explained by the fact that the proposed system modifies the orientations of all body segments to satisfy the contact constraints, even if the orientations of a few joints are correct. It is further confirmed that the elapsed time of the proposed system is short enough for real-time applications.
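
Note: The following is a minimal, illustrative sketch of the core idea described in the abstract, not the authors' implementation. Segment orientations reported by IMUs are corrected by an optimization that keeps them close to the measured values while enforcing a contact constraint, here a hand fixed on a known handle position. The two-segment arm chain, segment lengths, penalty weight, and all numeric values are assumptions for illustration only; the paper's full-body formulation adjusts all body segments and handles multiple contacts (hands, seat, backrest, pedals).

import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation as R

# Assumed toy model: a 2-segment arm chain (upper arm, forearm).
LENGTHS = np.array([0.30, 0.25])      # segment lengths [m] (assumed)
SHOULDER = np.array([0.0, 0.0, 1.4])  # shoulder position in the world frame
HANDLE = np.array([0.35, 0.0, 1.1])   # known handle (contact) position

# Segment orientations reported by the IMUs, as rotation vectors (assumed input).
imu_rotvecs = np.array([[0.0, -0.8, 0.0],
                        [0.0, -0.4, 0.0]])

def hand_position(rotvecs):
    """Forward kinematics: accumulate each segment vector rotated by its orientation."""
    p = SHOULDER.copy()
    for rv, length in zip(rotvecs, LENGTHS):
        # Each segment points along +x in its local frame (modeling assumption).
        p = p + R.from_rotvec(rv).apply([length, 0.0, 0.0])
    return p

def cost(x):
    rotvecs = x.reshape(-1, 3)
    # Stay close to the IMU orientations (rotation-vector difference is a
    # simplification of a proper geodesic distance on rotations) ...
    imu_term = np.sum((rotvecs - imu_rotvecs) ** 2)
    # ... while satisfying the hand-on-handle contact constraint as a penalty.
    contact_term = np.sum((hand_position(rotvecs) - HANDLE) ** 2)
    return imu_term + 100.0 * contact_term  # penalty weight is an assumption

# A quasi-Newton solver is used here as a generic choice.
res = minimize(cost, imu_rotvecs.ravel(), method="L-BFGS-B")
corrected = res.x.reshape(-1, 3)
print("corrected hand position:", hand_position(corrected))

Running the sketch pulls the estimated hand onto the handle while deviating as little as possible from the raw IMU orientations, which is why, as the abstract notes, enforcing the contact can slightly degrade the angles of joints whose IMU readings were already correct.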

Cite this article as:
T. Maruyama, M. Tada, and H. Toda, “Riding Motion Capture System Using Inertial Measurement Units with Contact Constraints,” Int. J. Automation Technol., Vol.13 No.4, pp. 506-516, 2019.
