
J. Robot. Mechatron. (JRM), Vol.31 No.5, pp. 657-670, 2019
doi: 10.20965/jrm.2019.p0657

Paper:

Three-Dimensional Aerial Image Interface, 3DAII

Takafumi Matsumaru, Asyifa Imanda Septiana, and Kazuki Horiuchi

Graduate School of Information, Production, and Systems, Waseda University
2-7 Hibikino, Wakamatsu-ku, Kitakyushu, Fukuoka 808-0135, Japan

Received:
February 24, 2018
Accepted:
July 31, 2019
Published:
October 20, 2019
Keywords:
three-dimensional object image, aerial projection display, direct interaction interface, pyramid reflector, parabolic mirrors
Abstract

In this paper, we introduce the three-dimensional aerial image interface, 3DAII. This interface reconstructs and aerially projects a three-dimensional object image, which can be observed simultaneously from various viewpoints or by multiple users with the naked eye. A pyramid reflector is used to reconstruct the object image, and a pair of parabolic mirrors is used to project the image in mid-air. A user can directly manipulate the three-dimensional object image by superimposing a hand-finger or a rod on the image. A motion capture sensor detects the user's hand-finger as it manipulates the projected image, and the system immediately produces a reaction such as deformation, displacement, or discoloration of the object image, together with sound effects. A performance test has been executed to confirm the functions of 3DAII. The execution time of end-tip positioning of a robotic arm has been compared among four operating devices: touchscreen, gamepad, joystick, and 3DAII. The results show the advantages of 3DAII: the movement direction and speed of the end-tip of the robotic arm can be instructed directly from the three-dimensional Euclidean vector output of 3DAII, so the end-tip can be moved intuitively in three-dimensional space. Therefore, 3DAII would be a promising alternative for intuitive spatial user interfaces, e.g., an operating device for aerial robots, a center console for automobiles, or a 3D modeling system. A survey has been conducted to evaluate comfort and fatigue based on ISO/TS 9241-411, and ease of learning and satisfaction based on the USE questionnaire. We have identified several challenges related to visibility, workspace, and sensory feedback to users that we would like to address in the future.
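To illustrate the control scheme described above, the following is a minimal sketch (not the authors' implementation) of how a three-dimensional Euclidean vector output from 3DAII could be mapped to an end-tip velocity command for a robotic arm. The sensor and arm interfaces (read_fingertip, command_end_tip_velocity), the image position, and all gains are hypothetical placeholders assumed for illustration only.

```python
# Hedged sketch: fingertip displacement from the aerial image center is mapped to
# an end-tip velocity (direction = displacement direction, speed ~ displacement magnitude).
import time
import numpy as np

IMAGE_CENTER = np.array([0.0, 0.0, 0.30])  # assumed aerial-image position in the sensor frame [m]
DEAD_ZONE = 0.01                           # ignore displacements below 1 cm (jitter)
MAX_SPEED = 0.05                           # clamp on commanded end-tip speed [m/s]
GAIN = 0.5                                 # displacement-to-speed gain [1/s]

def vector_to_velocity(fingertip):
    """Convert a fingertip position into an end-tip velocity command."""
    d = np.asarray(fingertip, dtype=float) - IMAGE_CENTER
    dist = np.linalg.norm(d)
    if dist < DEAD_ZONE:
        return np.zeros(3)
    speed = min(GAIN * dist, MAX_SPEED)
    return (d / dist) * speed

def control_loop(sensor, arm, period=0.02):
    """Poll the motion-capture sensor and stream velocity commands to the arm.
    `sensor.read_fingertip()` and `arm.command_end_tip_velocity()` are hypothetical APIs."""
    while True:
        tip = sensor.read_fingertip()          # 3D point, or None if no hand is detected
        if tip is not None:
            arm.command_end_tip_velocity(vector_to_velocity(tip))
        else:
            arm.command_end_tip_velocity(np.zeros(3))  # stop when the hand leaves the image
        time.sleep(period)
```

Under these assumptions, holding the fingertip near the image center leaves the arm at rest, while pushing the fingertip farther from the center drives the end-tip in that direction at a proportionally higher speed, which matches the direct, vector-based instruction described in the abstract.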

A user can directly manipulate the 3D object image: before/after pinching

Cite this article as:
T. Matsumaru, A. Septiana, and K. Horiuchi, “Three-Dimensional Aerial Image Interface, 3DAII,” J. Robot. Mechatron., Vol.31 No.5, pp. 657-670, 2019.
References
  1. [1] C. M. Brown, “Human-computer interface design guidelines,” Ablex Publishing, 1988.
  2. [2] F. Golshani, “TUI or GUI – It’s a Matter of Somatics,” IEEE MultiMedia, Vol.14, Issue 1, p. 104, doi: 10.1109/MMUL.2007.24, 2007.
  3. [3] T. Matsumaru, Y. Horiuchi, K. Akai, and Y. Ito, “Truly-Tender-Tailed Tag-Playing Robot Interface through Friendly Amusing Mobile Function,” J. Robot. Mechatron., Vol.22, No.3, pp. 301-307, doi: 10.20965/jrm.2010.p0301, 2010.
  4. [4] K. Hinckley, R. Pausch, J. C. Goble, and N. F. Kassell, “A Survey of Design Issues in Spatial Input,” Proc. of the 7th Annual ACM Symp. on User interface software and technology (UIST ’94), pp. 213-222, doi: 10.1145/192426.192501, 1994.
  5. [5] V. I. Pavlovic, R. Sharma, and T. S. Huang, “Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review,” IEEE Trans. on Pattern Analysis and Machine Intelligence, Vol.19, No.7, pp. 677-695, doi: 10.1109/34.598226, 1997.
  6. [6] T. Leyvand, C. Meekhof, Y.-C. Wei, J. Sun, and B. Guo, “Kinect Identity: Technology and Experience,” Computer, Vol.44, Issue 4, pp. 94-96, doi: 10.1109/MC.2011.114, 2011.
  7. [7] D. R. Olsen Jr. and T. Nielsen, “Laser Pointer Interaction,” Proc. of the SIGCHI Conf. on Human Factors in Computing Systems (CHI’01), pp. 17-22, doi: 10.1145/365024.365030, 2001.
  8. [8] J. A. Norling, “Anaglyph stereoscopy,” U.S. Patent US2135197A, 1937.
  9. [9] S. Volbracht, K. Shahrbabaki, G. Domik, and G. Fels, “Perspective viewing, Anaglyph stereo or Shutter glass stereo?,” Proc. of 1996 IEEE Symp. on Visual Languages (VL’96), pp. 192-193, doi: 10.1109/VL.1996.545287, 1996.
  10. [10] D. F. McAllister, “Display Technology: Stereo & 3D Display Technologies,” J. P. Hornak (Ed.), “Encyclopedia on Imaging Science and Technology,” pp. 1327-1344, John Wiley and Sons, Inc., doi: 10.1002/0471443395, 2001.
  11. [11] F. E. Ives, “A novel stereogram,” J. of the Franklin Institute, Vol.153, Issue 1, pp. 51-52, 1902.
  12. [12] Y. Ueda, K. Iwazaki, M. Shibasaki, Y. Mizushina, M. Furukawa, H. Nii, K. Minamizawa, and S. Tachi, “HaptoMIRAGE: mid-air autostereoscopic display for seamless interaction with mixed reality environments,” Proc. ACM SIGGRAPH 2014 Emerging Technologies (SIGGRAPH ’14), Article No.10, doi: 10.1145/2614066.2614093, 2014.
  13. [13] I. E. Sutherland, “A head-mounted three dimensional display,” Proc. 1968 Fall Joint Computer Conference (FJCC), pp. 757-764, doi: 10.1145/1476589.1476686, 1968.
  14. [14] O. Matoba and M. Tanaka, “Digital Holographic Measurement and Phase Reconstruction of 3D Object based on Wavefront Data,” 3D Research, Vol.2, Issue 3, pp. 1-7, doi: 10.1007/3DRes.03(2011)1, 2011.
  15. [15] T. Mishina, J. Arai, M. Okui, and F. Okano, “Electronic Holography for Real Objects Using Integral Photography,” B. Javidi, F. Okano, and J.-Y. Son (Eds.), “Three-dimensional Imaging, Visualization, and Display,” Springer-Verlag New York, pp. 389-416, doi: 10.1007/978-0-387-79335-1_20, 2009.
  16. [16] D. E. Smalley, Q. Y. J. Smithwick, V. M. Bove Jr., J. Barabas, and S. Jolly, “Anisotropic leaky-mode modulator for holographic video displays,” Nature, Vol.498, pp. 313-317, doi: 10.1038/nature12217, 2013.
  17. [17] T. Kakue, T. Nishitsuji, T. Kawashima, K. Suzuki, T. Shimobaba, and T. Ito, “Aerial projection of three-dimensional motion-picture by electro-holography and parabolic mirrors,” Scientific Reports, Vol.5, 11750, doi: 10.1038/srep11750, 2015.
  18. [18] K. Wakunami, P.-Y. Hsieh, R. Oi, T. Senoh, H. Sasaki, Y. Ichihashi, M. Okui, Y.-P. Huang, and K. Yamamoto, “Projection-type see-through holographic three-dimensional display,” Nature Communications, Vol.7, 12954, doi: 10.1038/ncomms12954, 2016.
  19. [19] L. Smoot, Q. Smithwick, and D. Reetz, “Volumetric Display Based on Vibrating Mylar Beam Splitter and LED Backlit LCD,” SIGGRAPH 2011 – Emerging Technologies, 2011.
  20. [20] D. Miyazaki, N. Akasaka, K. Okoda, Y. Maeda, and T. Mukai, “Floating three-dimensional display viewable from 360 degrees,” Proc. Stereoscopic Displays and Applications XXIII, 82881H, doi: 10.1117/12.907998, 2012.
  21. [21] Y. Ochiai, K. Kumagai, T. Hoshi, J. Rekimoto, S. Hasegawa, and Y. Hayasaki, “Fairy Lights in Femtoseconds: Aerial and Volumetric Graphics Rendered by Focused Femtosecond Laser Combined with Computational Holographic Fields,” ACM Trans. on Graphics (TOG), Vol.35, Issue 2, Article 17, doi: 10.1145/2850414, 2016.
  22. [22] O. Cakmakci and J. Rolland, “Head-Worn Displays: A Review,” J. of Display Technology, Vol.2, Issue 3, pp. 199-216, doi: 10.1109/JDT.2006.879846, 2006.
  23. [23] T. Hoshi, M. Takahashi, K. Nakatsuma, and H. Shinoda, “Touchable Holography,” Proc. ACM SIGGRAPH 2009 Emerging Technologies (SIGGRAPH 2009), Article No.23, doi: 10.1145/1597956.1597979, 2009.
  24. [24] T. Yoshida, K. Shimizu, T. Kurogi, S. Kamuro, K. Minamizawa, H. Nii, and S. Tachi, “RePro3D: Full-Parallax 3D Display with Haptic Feedback using Retro-Reflective Projection Technology,” 2011 IEEE Int. Symp. on VR Innovation (ISVRI), pp. 49-54, doi: 10.1109/ISVRI.2011.5759601, 2011.
  25. [25] O. Hilliges, D. Kim, S. Izadi, M. Weiss, and A. D. Wilson, “HoloDesk: Direct 3D Interactions with a Situated See-Through Display,” Proc. of the SIGCHI Conf. on Human Factors in Computing Systems (CHI’12), pp. 2421-2430, doi: 10.1145/2207676.2208405, 2012.
  26. [26] A. Yagi, M. Imura, Y. Kuroda, and O. Oshiro, “360-Degree Fog Projection Interactive Display,” Proc. SIGGRAPH Asia 2011 Emerging Technologies (SA’11), Article No.19, doi: 10.1145/2073370.2073388, 2011.
  27. [27] Y. Ochiai, T. Hoshi, and J. Rekimoto, “Pixie Dust: Graphics Generated by Levitated and Animated Objects in Computational Acoustic-Potential Field,” ACM Trans. on Graphics (TOG), Vol.33, No.4, Article 85, doi: 10.1145/2601097.2601118, 2014.
  28. [28] M. Jiono and T. Matsumaru, “Interactive Aerial Projection of 3D Hologram Object,” 2016 IEEE Int. Conf. on Robotics and Biomimetics (IEEE-ROBIO 2016), pp. 1930-1935, doi: 10.1109/ROBIO.2016.7866611, 2016.
  29. [29] A. Butler, O. Hilliges, S. Izadi, S. Hodges, D. Molyneaux, D. Kim, and D. Kong, “Vermeer: Direct Interaction with a 360 degree Viewable 3D Display,” Proc. of the 24th ACM Symp. on User Interface Software and Technology (UIST 2011), pp. 569-576, doi: 10.1145/2047196.2047271, 2011.
  30. [30] A. Jones, I. McDowall, H. Yamada, M. Bolas, and P. Debevec, “Rendering for an Interactive 360 Degree Light Field Display,” ACM Trans. on Graphics (TOG), Vol.26, Issue 3, Article No.40, doi: 10.1145/1276377.1276427, 2007.
  31. [31] C. H. Krah and M. Yousefpor, “Interactive three-dimensional display system,” U.S. Patent 2014/0111479 A1, 2014.
  32. [32] T. Matsumaru and M. Narita, “Calligraphy-Stroke Learning Support System Using Projector and Motion Sensor,” J. Adv. Comput. Intell. Intell. Inform., Vol.21, No.4, pp. 697-708, doi: 10.20965/jaciii.2017.p0697, 2017.
  33. [33] A. I. Septiana, M. Jiono, and T. Matsumaru, “Measuring Performance of Aerial Projection of 3D Hologram Object (3DHO),” 2017 IEEE Int. Conf. on Robotics and Biomimetics (IEEE-ROBIO 2017), pp. 2081-2086, doi: 10.1109/ROBIO.2017.8324726, 2017.
  34. [34] J.-F. Lapointe, P. Savard, and N. G. Vinson, “A comparative study of four input devices for desktop virtual walkthroughs,” Computers in Human Behavior, Vol.27, Issue 6, pp. 2186-2191, doi: 10.1016/j.chb.2011.06.014, 2011.
  35. [35] T. Matsumaru, “Development and Evaluation of Operational Interface Using Touch Screen for Remote Operation of Mobile Robot,” C. Ciufudean and L. Garcia (Eds.), “Advances in Robotics – Modeling, Control and Applications,” pp. 195-217, iConcept Press, 2013.
  36. [36] L. Zhang and T. Matsumaru, “Near-field Touch Interface Using Time-of-flight Camera,” J. Robot. Mechatron., Vol.28, No.5, pp. 759-775, doi: 10.20965/jrm.2016.p0759, 2016.
  37. [37] ISO/TS 9241-411, “Ergonomics of human-system interaction – Part 411: Evaluation methods for the design of physical input devices,” ISO Int. Organization for Standardization, 2012.
  38. [38] JIS Z8519, “Ergonomic requirements for office work with visual display terminals (VDTs) – Requirements for non-keyboard input devices,” JISC (Japanese Industrial Standards Committee) / JSA (Japanese Standards Association), 2007 (in Japanese).
  39. [39] G. A. V. Borg, “Psychophysical bases of perceived exertion,” Medicine and Science in Sports and Exercise, Vol.14, No.5, pp. 377-381, 1982.
  40. [40] A. M. Lund, “Measuring Usability with the USE Questionnaire,” STC Usability SIG Newsletter, Vol.8, No.2, pp. 3-6, 2001.
  41. [41] R. Likert, “A Technique for the Measurement of Attitudes,” Archives of Psychology, 140, 1932.
  42. [42] J. Payette, V. Hayward, C. Ramstein, and D. Bergeron, “Evaluation of a force-feedback (haptic) computer pointing device in zero gravity,” Proc. 5th Annual Symp. on Haptic Interfaces for Virtual Environments and Teleoperated Systems, ASME DSC, Vol.58, pp. 547-553, 1996.
  43. [43] J. T. Dennerlein and M. C. Yang, “Haptic Force-Feedback Devices for the Office Computer: Performance and Musculoskeletal Loading Issues,” Human Factors, Vol.43, No.2, pp. 278-286, doi: 10.1518/001872001775900850, 2001.
  44. [44] V. Hayward, O. R. Astley, M. Cruz-Hernandez, D. Grant, and G. Robles-De-La-Torre, “Haptic interfaces and devices,” Sensor Review, Vol.24, Issue 1, pp. 16-29, doi: 10.1108/02602280410515770, 2004.
  45. [45] G. Tholey, J. P. Desai, and A. E. Castellanos, “Force Feedback Plays a Significant Role in Minimally Invasive Surgery,” Annals of Surgery, Vol.241, No.1, pp. 102-109, doi: 10.1097/01.sla.0000149301.60553.1e, 2005.
  46. [46] J. C. Gwilliam, M. Mahvash, B. Vagvolgyi, A. Vacharat, D. D. Yuh, and A. M. Okamura, “Effects of Haptic and Graphical Force Feedback on Teleoperated Palpation,” IEEE Int. Conf. on Robotics and Automation, 2009 (ICRA’09), pp. 677-682, doi: 10.1109/ROBOT.2009.5152705, 2009.
  47. [47] F. Danieau, A. Lecuyer, P. Guillotel, J. Fleureau, N. Mollet, and M. Christie, “Enhancing Audiovisual Experience with Haptic Feedback: A Survey on HAV,” IEEE Trans. on Haptics, Vol.6, Issue 2, pp. 193-205, doi: 10.1109/TOH.2012.70, 2013.
  48. [48] E. Maggioni, E. Agostinelli, and M. Obrist, “Measuring the added value of haptic feedback,” Proc. of 2017 9th Int. Conf. on Quality of Multimedia Experience (QoMEX), pp. 1-6, doi: 10.1109/QoMEX.2017.7965670, 2017.
  49. [49] B. Long, S. A. Seah, T. Carter, and S. Subramanian, “Rendering Volumetric Haptic Shapes in Mid-Air using Ultrasound,” ACM Trans. on Graphics, Vol.33, Issue 6, Article No.181, doi: 10.1145/2661229.2661257, 2014.
  50. [50] C. T. Vi, D. Ablart, E. Gatti, C. Velasco, and M. Obrist, “Not just seeing, but also feeling art: Mid-air haptic experiences integrated in a multisensory art exhibition,” Int. J. of Human-Computer Studies, Vol.108, pp. 1-14, doi: 10.1016/j.ijhcs.2017.06.004, 2017.
  51. [51] O. Schneider, K. MacLean, C. Swindells, and K. Booth, “Haptic experience design: What hapticians do and where they need help,” Int. J. of Human-Computer Studies, Vol.107, pp. 5-21, doi: 10.1016/j.ijhcs.2017.04.004, 2017.
  52. [52] H. Ishii, “Tangible User Interfaces,” A. Sears and J. A. Jacko (Eds.), “The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications, Second Edition,” CRC Press, pp. 469-487, ISBN-10: 0805858709, 2007.
  53. [53] S. K. Saha, “Introduction to Robotics,” McGraw-Hill Education, 2014.
