Research Paper:
Augmented Reality-Based System for Skill Transfer to Assist Workpiece Fixturing on a Machining Center
Manato Miyata*, Masaaki Imahashi**, Masatoshi Itoh***, and Keiichi Nakamoto*,†

*Tokyo University of Agriculture and Technology
2-24-16 Naka-cho, Koganei, Tokyo 184-8588, Japan
†Corresponding author
**Imahashi Co., Ltd.
Hitachi, Japan
***Yamazaki Mazak Corporation
Oguchi, Japan
Machining operations require preparatory work, called a “setting operation,” such as workpiece fixturing. This operation affects both machining time and machining accuracy, and it depends strongly on the skill of the operator. Thus, to improve machining efficiency, it is desirable to transfer these skills to unskilled operators by extracting and generalizing the know-how involved in the setting operation. To acquire the skills of a skilled operator, eye tracking is a promising means of identifying where the operator directs attention by estimating gaze direction. In addition, augmented reality (AR) technology, which overlays virtual images on the real world, is a promising human-computer interaction technology for skill transfer in industry. Therefore, in this study, an AR-based assist system was developed to demonstrate workpiece fixturing on a machining center. The system incorporates skills acquired both from interviews based on cognitive task analysis and from the gaze data of a skilled operator. Workpiece fixturing on a machining center was assumed, and five types of jigs were prepared. The shapes and dimensions of the workpiece and the target were first detected. The removal area to be machined was then calculated as the difference between the workpiece and target shapes. Workpiece fixturing was determined according to the acquired skills, based on the shape and dimensions of the workpiece and the positional relationships between the jig and the removal area. Case studies confirmed that the system improved machining efficiency by reducing the setting operation time.
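To illustrate the removal-area computation described in the abstract, the following is a minimal sketch, not the authors' implementation: it takes the removal region as the Boolean difference between the workpiece stock and the target shape on a coarse voxel grid. The grid resolution, the toy shapes, and the function name `removal_voxels` are assumptions for demonstration only.

```python
import numpy as np

def removal_voxels(workpiece: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Voxels occupied by the stock but absent from the target, i.e., material to remove."""
    return workpiece & ~target

# Toy example (assumed geometry): a 20 x 20 x 20 voxel cube of stock,
# with the target shape having a 10 x 10 x 10 pocket machined away.
stock = np.ones((20, 20, 20), dtype=bool)    # workpiece shape: all material present
target = stock.copy()
target[5:15, 5:15, 10:20] = False            # pocket absent from the target shape
removal = removal_voxels(stock, target)
print(removal.sum(), "voxels to machine")    # -> 1000
```

In practice, the positional relationship between each candidate jig and this removal region could then be checked against the acquired fixturing rules; the voxel representation above is only one convenient way to express the shape difference.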
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.