
IJAT Vol.19 No.4, pp. 678-690 (2025)
doi: 10.20965/ijat.2025.p0678

Research Paper:

Assembly Movement Analysis by Work Classification Using Motion Capture and Machine Learning

Ryuto Kawane*, Koki Karube*, Masao Sugi**, Tomohiro Nakada***, and Tetsuo Yamada*,†

*Department of Informatics, The University of Electro-Communications
1-5-1 Chofugaoka, Chofu, Tokyo 182-8585, Japan

†Corresponding author

**Department of Mechanical Engineering and Intelligent Systems, The University of Electro-Communications
Tokyo, Japan

***Department of English Communication, Bunkyo Gakuin University
Tokyo, Japan

Received: February 29, 2024
Accepted: April 8, 2025
Published: July 5, 2025

Keywords: motion tracking, work analysis, skill transfer, inexperienced and experienced workers, digital transformation
Abstract

Recently, the manufacturing industry has been digitizing skills through motion capture to address issues such as human resource development and skill transfer. However, the amount of body-movement data obtained from motion capture is enormous, and machine learning techniques are required for data mining. Elemental tasks, in which the entire work is divided into units of analysis called unit elements, are useful for conducting work analysis. This study proposes an assembly movement analysis method based on work classification using motion capture and machine learning. The differences between the motions of experienced and inexperienced workers were classified using motion capture and deep learning software, with respect to both the worker's experience level and the body parts involved.
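The classification step summarized above can be sketched in code. The following is a minimal illustration with synthetic data and a plain logistic-regression classifier rather than the deep learning software used in the study; the joint count, the summary features (per-joint mean speed and path length), and the assumption that experienced workers move with less frame-to-frame jitter are all hypothetical choices made for this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def joint_features(positions):
    """Summarize one capture: positions has shape (frames, joints, 3).
    Returns per-joint mean speed and total path length as a feature vector."""
    steps = np.linalg.norm(np.diff(positions, axis=0), axis=2)  # (frames-1, joints)
    return np.concatenate([steps.mean(axis=0), steps.sum(axis=0)])

def make_capture(experienced, frames=120, joints=5):
    # Hypothetical assumption: experienced workers show smaller
    # frame-to-frame jitter than inexperienced workers.
    jitter = 0.01 if experienced else 0.05
    drift = rng.normal(0.0, jitter, size=(frames, joints, 3))
    return np.cumsum(drift, axis=0)  # random-walk joint trajectories

# Build a labeled dataset: 30 experienced (1) and 30 inexperienced (0) captures
X = np.array([joint_features(make_capture(e)) for e in [True] * 30 + [False] * 30])
y = np.array([1] * 30 + [0] * 30)

# Standardize features, then fit logistic regression by gradient descent
X = (X - X.mean(axis=0)) / X.std(axis=0)
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probability "experienced"
    g = p - y                               # gradient of the log loss
    w -= 0.1 * (X.T @ g) / len(y)
    b -= 0.1 * g.mean()

acc = ((1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5) == y).mean()
```

Because the synthetic jitter levels separate the two groups cleanly, this sketch reaches near-perfect training accuracy; real motion-capture data, as the paper notes, requires far more capable models such as deep learning to mine.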

Cite this article as:
R. Kawane, K. Karube, M. Sugi, T. Nakada, and T. Yamada, “Assembly Movement Analysis by Work Classification Using Motion Capture and Machine Learning,” Int. J. Automation Technol., Vol.19 No.4, pp. 678-690, 2025.
