JRM Vol.25 No.1, pp. 53-59 (2013)
doi: 10.20965/jrm.2013.p0053

Paper:

Consideration of Scanning Line Density and Capture of Shape of Human Movement from 3D Laser Scanning Sensor Using Roundly Swinging Mechanism

Mitsuhiro Matsumoto* and Shin’ichi Yuta**

*Department of Control and Information Systems Engineering, Kurume National College of Technology, 1-1-1 Komorino, Kurume-shi, Fukuoka 830-8555, Japan

**Graduate School of Systems and Information Engineering, University of Tsukuba, 1-1-1 Tennoudai, Tsukuba-shi, Ibaraki 305-8573, Japan

Received: November 30, 2011
Accepted: March 23, 2012
Published: February 20, 2013
Keywords: scanning line density, 3D laser scanning sensor, roundly swinging mechanism
Abstract
A 3D SOKUIKI sensor (3D laser scanning sensor) with a roundly swinging mechanism can measure range over a belt-shaped area of a given vertical height and horizontal view angle, without converging points in the scan pattern and without twisting any signal cables. This makes it useful for observing people and for capturing the shape of human movement. We analyzed the line-to-line distance, i.e., the scanning line density, of this type of sensor. In one full scan period, the entire belt of directions is covered twice: once by positively inclined and once by negatively inclined scanning lines. The line-to-line distance varies with vertical height; the lines are dense at both vertical ends of the belt and sparse in the middle. As a result, the scanning density at the center front is 1/2.5 (40%) of what ideal vertical direction control would provide. Since such ideal vertical control of a range-measuring beam is not technically feasible at present, the density achieved by the roundly swinging mechanism can be considered reasonably good and useful in practice. A 3D SOKUIKI sensor using this mechanism can indeed capture the shape of human movement.
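The dense-at-the-ends, sparse-in-the-middle pattern described in the abstract follows from the swing geometry: at a fixed azimuth, the elevation of successive scan lines varies roughly sinusoidally with the swing phase, so the line spacing is proportional to the derivative of that sinusoid. The Python sketch below is a deliberately simplified one-axis illustration of this effect, not the paper's model; the belt half-height `A` and line count `n_lines` are arbitrary illustrative values. Note that this simplified model gives a mid-belt spacing about pi/2 (roughly 1.6x) the ideal uniform spacing; the 1/2.5 (40%) figure reported in the paper comes from its full analysis of the roundly swinging geometry.

```python
import numpy as np

A = np.radians(20.0)   # assumed half-height of the scanned belt (illustrative)
n_lines = 100          # scan lines per half swing period (illustrative)

# Swing phase at which each successive scan line crosses a fixed azimuth.
phase = np.linspace(-np.pi / 2, np.pi / 2, n_lines)
# Simplified model: elevation of each line varies sinusoidally with phase.
elev = A * np.sin(phase)

spacing = np.diff(elev)            # line-to-line distance as a function of height
uniform = 2 * A / (n_lines - 1)    # ideal linear sweep: constant spacing

mid = spacing[n_lines // 2 - 1]    # spacing near the middle of the belt
print(f"spacing at belt end   : {spacing[0]:.5f} rad (dense)")
print(f"spacing at mid-belt   : {mid:.5f} rad (sparse)")
print(f"ideal uniform spacing : {uniform:.5f} rad")
print(f"mid-belt lines are {mid / uniform:.2f}x farther apart than ideal")
```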
Cite this article as:
M. Matsumoto and S. Yuta, “Consideration of Scanning Line Density and Capture of Shape of Human Movement from 3D Laser Scanning Sensor Using Roundly Swinging Mechanism,” J. Robot. Mechatron., Vol.25 No.1, pp. 53-59, 2013.
