
JRM Vol.34 No.5, pp. 912-935 (2022)
doi: 10.20965/jrm.2022.p0912

Review:

High-Speed Vision and its Applications Toward High-Speed Intelligent Systems

Masatoshi Ishikawa*,**

*Tokyo University of Science
1-3 Kagurazaka, Shinjuku-ku, Tokyo 162-8601, Japan

**The University of Tokyo
7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan

Received: July 25, 2022
Accepted: August 20, 2022
Published: October 20, 2022
Keywords: high-speed image processing, parallel processing vision, target tracking vision, high-speed visual feedback robot, dynamic projection mapping
Abstract

High-speed vision based on parallel processing is now a reality, and a variety of applications built on it have been proposed and implemented as high-speed intelligent systems. The basic goal of high-speed vision is to realize vision capabilities and systems that operate at the speeds intelligent systems require, so that intelligence runs at the speed inherently demanded by the application. This paper describes vision-chip and parallel image-processing architectures; outlines system architectures, image-processing algorithms, and related peripheral technologies; explains the concepts needed to configure high-speed intelligent systems, such as hierarchical parallel distributed architecture, parallel decomposition, orthogonal decomposition, dynamics matching, latency minimization, high-speed 3D shape measurement, active vision, tracking vision, dynamic compensation, and dynamic projection mapping; and systematically discusses a wide range of application systems.
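To make the tracking-vision and latency-minimization ideas above concrete, here is a minimal sketch (not the paper's implementation) of millisecond-order target tracking via image moments with a self-window: each frame, the centroid is computed from the zeroth and first moments inside a small search window around the previous centroid, so per-frame cost stays bounded regardless of image size. The function names, window size, and blob test data are illustrative assumptions.

```python
import numpy as np

def centroid_from_moments(img):
    """Target centroid from image moments of a binary image:
    m00 = sum(I), m10 = sum(x*I), m01 = sum(y*I); centroid = (m10/m00, m01/m00)."""
    ys, xs = np.nonzero(img)
    if len(xs) == 0:          # m00 == 0: no target pixels in the window
        return None
    return xs.mean(), ys.mean()

def track(frames, window=8):
    """Self-window tracking sketch: restrict the moment computation to a
    small window around the previous centroid so each frame is processed
    in (near-)constant time, as required for kHz-rate visual feedback."""
    cx = cy = None
    path = []
    for img in frames:
        if cx is None:        # first frame: search the whole image
            roi, ox, oy = img, 0, 0
        else:                 # later frames: search only near the last hit
            x0 = max(int(cx) - window, 0)
            y0 = max(int(cy) - window, 0)
            roi = img[y0:int(cy) + window + 1, x0:int(cx) + window + 1]
            ox, oy = x0, y0
        c = centroid_from_moments(roi)
        if c is not None:     # convert window-local centroid to image coords
            cx, cy = c[0] + ox, c[1] + oy
        path.append((cx, cy))
    return path
```

The design point mirrors the self-windowing idea: because the target moves little between frames at 1,000 fps, a tiny window suffices, which is what keeps latency low.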

Figure: Application systems using high-speed vision

Cite this article as:
M. Ishikawa, “High-Speed Vision and its Applications Toward High-Speed Intelligent Systems,” J. Robot. Mechatron., Vol.34 No.5, pp. 912-935, 2022.
