
Int. J. Automation Technol., Vol.19, No.5, pp. 733-740, 2025
doi: 10.20965/ijat.2025.p0733

Research Paper:

Inference of Cognitive Load When Understanding Mechanical Drawings by Electroencephalography for Skill Acquisition Interviews

Hideto Sairenchi, Hikaru Yokoyama, and Keiichi Nakamoto

Tokyo University of Agriculture and Technology
2-24-16 Naka-cho, Koganei, Tokyo 184-8588, Japan

Corresponding author

Received: February 28, 2025
Accepted: June 10, 2025
Published: September 5, 2025

Keywords: machining skill, process planning, cognitive load, electroencephalography, machine learning
Abstract

Process planning must be strongly standardized to achieve high-quality and efficient machining regardless of the operator's skill level. However, it is difficult to automate process planning by acquiring skills through interviews with skilled operators, owing to their complicated decision-making processes. By contrast, electroencephalography (EEG) has recently attracted considerable attention as a way to capture the brain activity underlying operators' decision-making. Therefore, this study aims to infer the cognitive load of operators by EEG when understanding mechanical drawings, in support of skill acquisition interviews. The EEG data were classified into frequency bands, such as alpha and beta, whose activity reflects the characteristics of cognitive load across tasks. The dominance of alpha over beta was visualized by calculating the power spectrum ratio of the EEG data. Multiple types of preliminary tasks with different difficulty levels were prepared to infer cognitive load. Subsequently, a machine-learning model based on the common spatial pattern method was constructed to infer cognitive load, which varied with the difficulty level, from the obtained time-series EEG data. After validating the cognitive load inference, the model was applied to EEG data obtained during the understanding of mechanical drawings to classify the difficulty levels. The inference results demonstrated the possibility of identifying time-varying cognitive load and of supporting interviews to acquire machining skills.
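The two signal-processing steps named in the abstract, the alpha/beta power spectrum ratio and common-spatial-pattern (CSP) filtering, can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: the band limits, the 250 Hz sampling rate in the usage note, and helper names such as `alpha_beta_ratio` and `csp_filters` are assumptions introduced here for clarity.

```python
# Illustrative sketch (not the paper's implementation): alpha/beta band-power
# ratio via Welch's PSD, and CSP spatial filters via a generalized
# eigenproblem. Band edges and function names are assumptions.
import numpy as np
from scipy.signal import welch
from scipy.linalg import eigh

ALPHA = (8.0, 13.0)   # alpha band, Hz (commonly used range)
BETA = (13.0, 30.0)   # beta band, Hz (commonly used range)

def band_power(x, fs, band):
    """Power of a single-channel signal x within a frequency band,
    integrated over Welch's power spectral density estimate."""
    f, psd = welch(x, fs=fs, nperseg=min(len(x), 2 * int(fs)))
    mask = (f >= band[0]) & (f < band[1])
    return np.trapz(psd[mask], f[mask])

def alpha_beta_ratio(x, fs):
    """Alpha/beta power ratio; higher values indicate alpha dominance,
    which the abstract associates with lower cognitive load."""
    return band_power(x, fs, ALPHA) / band_power(x, fs, BETA)

def csp_filters(trials_a, trials_b, n_filters=2):
    """CSP spatial filters from two classes of trials, each shaped
    (n_trials, n_channels, n_samples). Solves the generalized
    eigenproblem cov_a w = lambda (cov_a + cov_b) w and keeps the
    filters at both extremes, which maximize variance for one class
    while minimizing it for the other."""
    def mean_cov(trials):
        return np.mean([np.cov(t) for t in trials], axis=0)
    cov_a, cov_b = mean_cov(trials_a), mean_cov(trials_b)
    evals, evecs = eigh(cov_a, cov_a + cov_b)
    order = np.argsort(evals)
    picks = np.concatenate([order[:n_filters // 2], order[-(n_filters - n_filters // 2):]])
    return evecs[:, picks].T  # (n_filters, n_channels)
```

For example, a 10 Hz-dominated signal sampled at an assumed 250 Hz yields a ratio above 1, while a 20 Hz-dominated signal yields a ratio below 1; the log-variance of each trial projected through the CSP filters would then serve as features for the difficulty-level classifier.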

Cite this article as:
H. Sairenchi, H. Yokoyama, and K. Nakamoto, “Inference of Cognitive Load When Understanding Mechanical Drawings by Electroencephalography for Skill Acquisition Interviews,” Int. J. Automation Technol., Vol.19 No.5, pp. 733-740, 2025.
