
JRM Vol.34 No.6 pp. 1441-1450 (2022)
doi: 10.20965/jrm.2022.p1441

Paper:

Inspection of the Most Suitable Approach and Information Projection Method for Interactions in the Night Flight of a Projector-Mounted Drone

Ryosuke Kakiuchi, Dinh Tuan Tran, and Joo-Ho Lee

Graduate School of Information Science and Engineering, Ritsumeikan University
1-1-1 Noji-higashi, Kusatsu, Shiga 525-8577, Japan

Received: February 14, 2022
Accepted: August 19, 2022
Published: December 20, 2022
Keywords: drone, human-robot interaction, guard robot, projector
Abstract

Security is considered heavy labor owing to night shifts and long working hours. In recent years, demand for security guards has been increasing, and the resulting labor shortage has become a problem in Japan. To address these problems, robots have been used for security purposes; however, most of them are unable to interact with users or guards at night. In this study, a drone called the aerial ubiquitous display (AUD) is proposed to improve on existing robot-based security methods and to provide night security. The AUD enables human-drone interaction as well as night security. When the AUD interacts with a human, the drone must come close to the person and project information onto the ground. Therefore, this study investigated the optimal parameters for a projector-equipped drone approaching a human at night. In addition, by comparing these results with those of a daytime approach, we verified whether perception changes between daytime and nighttime. Furthermore, an experiment was conducted to investigate the types of projections that are most likely to capture a user's attention.
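When a projector-mounted drone approaches a person, its altitude and projector tilt determine where the image lands on the ground and how large it appears. The abstract does not give these formulas, so the following Python sketch is only an illustrative flat-ground approximation with hypothetical parameter names (altitude_m, pitch_deg, throw_ratio), not the method used in the paper:

```python
import math

def ground_projection(altitude_m: float, pitch_deg: float, throw_ratio: float):
    """Estimate where and how large a downward-tilted projector's image
    lands on flat ground below a hovering drone (illustrative only).

    altitude_m  -- drone height above the ground (m)
    pitch_deg   -- projector tilt below the horizontal (90 = straight down)
    throw_ratio -- projector throw ratio (throw distance / image width)
    """
    pitch = math.radians(pitch_deg)
    # Slant distance from the projector to the image center on the ground.
    throw_m = altitude_m / math.sin(pitch)
    # Horizontal offset of the image center from the point under the drone.
    offset_m = altitude_m / math.tan(pitch)
    # Approximate image width from the throw ratio (ignores keystone distortion).
    width_m = throw_m / throw_ratio
    return throw_m, offset_m, width_m

# Example: hovering at 3 m, projector tilted 60 degrees down, throw ratio 1.2.
throw, offset, width = ground_projection(3.0, 60.0, 1.2)
print(f"throw {throw:.2f} m, offset {offset:.2f} m, image width {width:.2f} m")
```

Under these assumed values, the image center lands about 1.7 m ahead of the point directly below the drone and is roughly 2.9 m wide; the sketch deliberately ignores keystone distortion and terrain slope for simplicity.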

Projection from the sky by drone

Cite this article as:
R. Kakiuchi, D. Tran, and J. Lee, “Inspection of the Most Suitable Approach and Information Projection Method for Interactions in the Night Flight of a Projector-Mounted Drone,” J. Robot. Mechatron., Vol.34 No.6, pp. 1441-1450, 2022.