IJAT Vol.16 No.3, pp. 286-295 (2022)
doi: 10.20965/ijat.2022.p0286

Technical Paper:

Improving Remote Spatial Understanding by Transmitting Spherical Images via Video Chat Applications

Kazuma Aoyama*1,*2,†, Kiyosu Maeda*3, Ryoko Ueoka*1,*4, Shigeo Makioka*5, Nobukazu Sakura*6, Kunihiko Nakashima*6, Michitaka Hirose*1, and Tomohiro Amemiya*2,*7

*1Research Center for Advanced Science and Technology, The University of Tokyo
4-6-1 Komaba, Meguro-ku, Tokyo 153-8904, Japan

†Corresponding author

*2Virtual Reality Educational Research Center, The University of Tokyo, Tokyo, Japan

*3Graduate School of Interdisciplinary Information Studies, The University of Tokyo, Tokyo, Japan

*4zeroinon Inc., Tokyo, Japan

*5Tokyo Office, DENSO CORPORATION, Tokyo, Japan

*6Machinery & Tools Division, DENSO CORPORATION, Agui, Japan

*7Graduate School of Information Science and Technology, The University of Tokyo, Tokyo, Japan

Received: October 29, 2021
Accepted: March 29, 2022
Published: May 5, 2022
Keywords: virtual reality, telepresence technology, spherical camera, look-around system, video chat application
Abstract

Manufacturing tasks are often performed by groups of engineers who gather and cooperate at a work site. Since the beginning of the COVID-19 pandemic, however, the movement and activities of groups of people have been restricted, especially in indoor spaces. Reducing travel by engineers also reduces the associated costs. Telepresence technology, studied in the field of virtual reality, is one way to reduce such travel: it allows users to engage with a site from a remote location as if they were present, so engineers could participate in a working group and cooperate with local manufacturing staff without physically traveling to the site. A variety of telepresence systems have been proposed; however, relatively few have been widely adopted compared with video chat applications, which have recently become established infrastructure in many companies. This is most likely because most proposed systems use robots, head-mounted displays, or dedicated multi-functional applications that engineers must first learn to use. One way to understand a remote space through a video chat application is to have a remote participant move the camera. In contrast, many VR social networking services use a viewing method in which users change their viewing direction directly on the computer screen. In this study, we first show that a system that allows users to rotate their viewing perspective on a laptop computer screen provides an easier understanding of a virtual space than a system that requires a remote person to move a webcam. Based on this result, we propose a system that allows users to look around a remote location on a laptop computer screen via a video chat application and an off-the-shelf spherical camera, and we evaluate its usefulness.
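
The core of such an on-screen look-around viewer is a fixed image-space remapping. The sketch below is a minimal illustration, not the authors' implementation: it assumes NumPy and OpenCV, the function name and parameters are ours, and it only renders the perspective view for a user-chosen yaw and pitch from one equirectangular frame of a spherical camera.

    # Minimal look-around sketch (assumed, not from the paper):
    # render a perspective view from one equirectangular frame.
    import numpy as np
    import cv2

    def look_around_view(equirect, yaw_deg, pitch_deg,
                         fov_deg=90.0, out_w=960, out_h=540):
        eq_h, eq_w = equirect.shape[:2]
        # Pinhole focal length (pixels) for the requested horizontal FOV.
        f = (out_w / 2.0) / np.tan(np.radians(fov_deg) / 2.0)

        # A viewing ray for every output pixel; the camera looks down +z,
        # with image y pointing down.
        u, v = np.meshgrid(np.arange(out_w), np.arange(out_h))
        rays = np.stack([(u - out_w / 2.0) / f,
                         (v - out_h / 2.0) / f,
                         np.ones_like(u, dtype=np.float64)], axis=-1)
        rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

        # Rotate the rays: pitch about the x-axis, then yaw about the y-axis.
        p, t = np.radians(pitch_deg), np.radians(yaw_deg)
        Rx = np.array([[1, 0, 0],
                       [0, np.cos(p), -np.sin(p)],
                       [0, np.sin(p), np.cos(p)]])
        Ry = np.array([[np.cos(t), 0, np.sin(t)],
                       [0, 1, 0],
                       [-np.sin(t), 0, np.cos(t)]])
        rays = rays @ (Ry @ Rx).T

        # Ray direction -> longitude/latitude -> equirectangular pixel.
        lon = np.arctan2(rays[..., 0], rays[..., 2])        # [-pi, pi]
        lat = np.arcsin(np.clip(rays[..., 1], -1.0, 1.0))   # [-pi/2, pi/2]
        map_x = np.mod((lon / (2 * np.pi) + 0.5) * eq_w, eq_w).astype(np.float32)
        map_y = np.clip((lat / np.pi + 0.5) * eq_h, 0, eq_h - 1).astype(np.float32)
        return cv2.remap(equirect, map_x, map_y, cv2.INTER_LINEAR)

Re-rendering successive frames as keyboard or mouse input updates yaw_deg and pitch_deg reproduces the on-screen look-around interaction; acquiring frames from the spherical camera and feeding the rendered view back into the video chat session are outside the scope of this sketch.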

Cite this article as:
K. Aoyama, K. Maeda, R. Ueoka, S. Makioka, N. Sakura, K. Nakashima, M. Hirose, and T. Amemiya, “Improving Remote Spatial Understanding by Transmitting Spherical Images via Video Chat Applications,” Int. J. Automation Technol., Vol.16 No.3, pp. 286-295, 2022.
References
  [1] L. Alem, F. Tecchia, and W. Huang, “HandsOnVideo: Towards a Gesture-based Mobile AR System for Remote Collaboration,” L. Alem and W. Huang (Eds.), “Recent Trends of Mobile Collaborative Augmented Reality Systems,” Springer, pp. 135-148, 2011.
  [2] W. Huang and L. Alem, “HandsinAir: a wearable system for remote collaboration on physical tasks,” Proc. of the 2013 Conf. on Computer Supported Cooperative Work Companion, pp. 153-156, 2013.
  [3] M. L. Chenechal, T. Duval, V. Gouranton, J. Royan, and B. Arnaldi, “Vishnu: virtual immersive support for helping users - an interaction paradigm for collaborative remote guiding in mixed reality,” Proc. of the IEEE 3rd VR Int. Workshop on Collaborative Virtual Environments, pp. 9-12, 2016.
  [4] W. Huang and L. Alem, “Supporting hand gestures in mobile remote collaboration: a usability evaluation,” Proc. of the 25th BCS Conf. on Human-Computer Interaction, British Computer Society, pp. 211-216, 2011.
  [5] W. Huang, L. Alem, F. Tecchia, and H. B. Duh, “Augmented 3D hands: a gesture-based mixed reality system for distributed collaboration,” J. Multimodal User Interfaces, Vol.12, pp. 77-89, 2018.
  [6] D. Anton, G. Kurillo, A. Y. Yang, and R. Bajcsy, “Augmented Telemedicine Platform for Real-Time Remote Medical Consultation,” L. Amsaleg, G. Guðmundsson, C. Gurrin, B. Jónsson, and S. Satoh (Eds.), “MultiMedia Modeling,” Springer, pp. 77-89, 2017.
  [7] F. Tecchia, L. Alem, and W. Huang, “3D helping hands: a gesture based MR system for remote collaboration,” Proc. of the 11th ACM SIGGRAPH Int. Conf. on Virtual-Reality Continuum and its Applications in Industry (VRCAI ’12), pp. 323-328, 2012.
  [8] S. D’Angelo and D. Gergle, “An eye for design: Gaze visualizations for remote collaborative work,” Proc. of the 2018 CHI Conf. on Human Factors in Computing Systems, Association for Computing Machinery, Paper 349, 2018.
  [9] L. Gao, H. Bai, W. He, M. Billinghurst, and R. W. Lindeman, “Real-time visual representations for mobile mixed reality remote collaboration,” SIGGRAPH Asia 2018 Virtual & Augmented Reality (SA ’18), Article 15, 2018.
  [10] T. Piumsomboon, Y. Lee, G. A. Lee, A. Dey, and M. Billinghurst, “Empathic Mixed Reality: Sharing What You Feel and Interacting with What You See,” Proc. of the 2017 Int. Symp. on Ubiquitous Virtual Reality (ISUVR), pp. 38-41, 2017.
  [11] J. Amores, X. Benavides, and P. Maes, “ShowMe: A Remote Collaboration System that Supports Immersive Gestural Communication,” Proc. of the 33rd Annual ACM Conf. Extended Abstracts on Human Factors in Computing Systems (CHI EA ’15), pp. 1343-1348, 2015.
  [12] S. Günther, S. Kratz, D. Avrahami, and M. Mühlhäuser, “Exploring Audio, Visual, and Tactile Cues for Synchronous Remote Assistance,” Proc. of the 11th PErvasive Technologies Related to Assistive Environments Conf. (PETRA ’18), pp. 339-344, 2018.
  [13] D. Aschenbrenner, M. Rojkov, F. Leutert, J. Verlinden, S. Lukosch, M. E. Latoschik, and K. Schilling, “Comparing Different Augmented Reality Support Applications for Cooperative Repair of an Industrial Robot,” Proc. of the 2018 IEEE Int. Symp. on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), pp. 69-74, 2018.
  [14] P. Wang, X. Bai, M. Billinghurst, S. Zhang, X. Zhang, S. Wang, W. He, Y. Yan, and H. Ji, “AR/MR Remote Collaboration on Physical Tasks: A Review,” Robotics and Computer-Integrated Manufacturing, Vol.72, 102071, 2021.
  [15] S. Tachi, K. Komoriya, K. Sawada, T. Nishiyama, T. Itoko, M. Kobayashi, and K. Inoue, “Telexistence cockpit for humanoid robot control,” Advanced Robotics, Vol.17, pp. 199-217, 2003.
  [16] H. Baier, M. Buss, F. Freyberger, and G. Schmidt, “Interactive stereo vision telepresence for correct communication of spatial geometry,” Advanced Robotics, Vol.17, pp. 219-233, 2003.
  [17] S. Nefti-Meziani, U. Manzoor, S. Davis, and S. K. Pupala, “3D perception from binocular vision for a low cost humanoid robot NAO,” Robotics and Autonomous Systems, Vol.68, pp. 129-139, 2015.
  [18] C. L. Fernando, M. Furukawa, T. Kurogi, K. Hirota, S. Kamuro, K. Sato, K. Minamizawa, and S. Tachi, “TELESAR V: TELExistence surrogate anthropomorphic robot,” ACM SIGGRAPH 2012 Emerging Technologies (SIGGRAPH ’12), Article 23, 2012.
  [19] C. L. Fernando et al., “Design of TELESAR V for transferring bodily consciousness in telexistence,” Proc. of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 5112-5118, 2012.
  [20] https://www.doublerobotics.com/ [Accessed October 1, 2021]
  [21] A. Nassani, L. Zhang, H. Bai, and M. Billinghurst, “ShowMeAround: Giving Virtual Tours Using Live 360 Video,” Extended Abstracts of the 2021 CHI Conf. on Human Factors in Computing Systems, Association for Computing Machinery, New York, NY, USA, 2021.
  [22] T. Amemiya, K. Aoyama, and M. Hirose, “TeleParallax: Low-motion-blur Stereoscopic System with Correct Interpupillary Distance for 3D Head Rotations,” Frontiers in Virtual Reality, Vol.2, 726285, 2021.
  [23] https://www.emailtooltester.com/en/blog/video-conferencing-market-share/ [Accessed October 1, 2021]
  [24] T. A. Ryan, “Significance tests for multiple comparison of proportions, variances, and other statistics,” Psychological Bulletin, Vol.57, pp. 318-328, 1960.
  [25] J. O. Wobbrock, L. Findlater, D. Gergle, and J. J. Higgins, “The aligned rank transform for nonparametric factorial analyses using only ANOVA procedures,” Proc. of the SIGCHI Conf. on Human Factors in Computing Systems, pp. 143-146, doi: 10.1145/1978942.1978963, 2011.
