
J. Robot. Mechatron. Vol.22 No.4, pp. 430-438 (2010)
doi: 10.20965/jrm.2010.p0430

Paper:

Development of Deskwork Support System Using Pointing Gesture Interface

Masao Sugi*, Hisato Nakanishi**, Masataka Nishino***,
Yusuke Tamura**, Tamio Arai**, and Jun Ota**

*Tokyo University of Agriculture and Technology

**The University of Tokyo

***Honda Motor Co. Ltd.

Received: January 11, 2010
Accepted: April 13, 2010
Published: August 20, 2010
Keywords: attentive workbench (AWB), deskwork support system, gesture-based input interface
Abstract
The authors have proposed a deskwork support system called an “Attentive Workbench” (AWB), which uses a camera, a projector, and automatically moving trays to support the user both physically and informationally. This research intends to build an interface between the user and the AWB system based on pointing gestures. Considering the specific purposes of the AWB, a simple, reliable, and highly responsive interface is implemented. We demonstrate a deskwork support system with actual automatically moving trays that uses the proposed interface.
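The abstract does not detail the detection pipeline, but a pointing-gesture interface of this kind typically segments the hand in the overhead camera image, locates the fingertip, and maps it to desk coordinates. The following is a minimal illustrative sketch, not the authors' implementation: the function names, the top-down camera geometry, and the "fingertip = foreground pixel farthest from the user" heuristic are all our assumptions.

```python
import numpy as np

def detect_fingertip(mask: np.ndarray):
    """Return (row, col) of the topmost foreground pixel in a binary hand mask.

    Assumes a top-down camera with the user at the bottom image edge, so an
    extended fingertip is the foreground point farthest from the user
    (smallest row index). Returns None if the mask is empty.
    """
    ys, xs = np.nonzero(mask)          # row-major order: ys is nondecreasing
    if ys.size == 0:
        return None
    i = np.argmin(ys)                  # first pixel of the topmost row
    return int(ys[i]), int(xs[i])

def pixel_to_desk(pt, img_size, desk_size):
    """Linearly map a pixel position to desk coordinates (e.g. mm),
    assuming the camera view is calibrated to cover exactly the desk area."""
    r, c = pt
    h, w = img_size
    dh, dw = desk_size
    return r * dh / h, c * dw / w

# Toy example: a 'hand' blob whose extended tip sits at row 2, col 5.
mask = np.zeros((10, 10), dtype=bool)
mask[2, 5] = True          # fingertip
mask[3:8, 4:7] = True      # hand/palm region
tip = detect_fingertip(mask)
print(tip)                             # (2, 5)
print(pixel_to_desk(tip, (10, 10), (600, 900)))  # (120.0, 450.0)
```

A real system would obtain the mask from skin-color or background subtraction and smooth the detected point over time for responsiveness; this sketch only shows the coordinate logic.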
Cite this article as:
M. Sugi, H. Nakanishi, M. Nishino, Y. Tamura, T. Arai, and J. Ota, “Development of Deskwork Support System Using Pointing Gesture Interface,” J. Robot. Mechatron., Vol.22 No.4, pp. 430-438, 2010.
References
[1] T. Sato, Y. Nishida, and H. Mizoguchi, “Robotic room: symbiosis with human through behavior media,” Robotics and Autonomous Systems, Vol.18, pp. 185-194, 1996.
[2] R. Brooks, “The intelligent room project,” Proc. of the 2nd Int. Cognitive Technology Conf., pp. 271-278, 1997.
[3] P. Wellner, “Interacting with paper on the DigitalDesk,” Communications of the ACM, Vol.36, No.7, pp. 87-96, 1993.
[4] B. Ullmer and H. Ishii, “The metaDESK: models and prototypes for tangible user interfaces,” Proc. of the 10th Annual ACM Symposium on User Interface Software and Technology, pp. 223-232, 1997.
[5] K. Oka, Y. Sato, and H. Koike, “Real-time fingertip tracking and gesture recognition,” IEEE Computer Graphics and Applications, Vol.22, No.6, pp. 64-71, 2002.
[6] M. Sugi, M. Nikaido, Y. Tamura, J. Ota, T. Arai, K. Kotani, K. Takamasu, S. Shin, H. Suzuki, and Y. Sato, “Motion Control of Self-Moving Trays for Human Supporting Production Cell ‘Attentive Workbench’,” Proc. of the 2005 IEEE Int. Conf. on Robotics and Automation, pp. 4091-4096, 2005.
[7] K. Kotani, K. Takamasu, Y. Ashkenazy, H. E. Stanley, and Y. Yamamoto, “Model for cardiorespiratory synchronization in humans,” Physical Review E, Vol.65, 051923, pp. 1-9, 2002.
[8] M. Sugi, I. Matsumura, Y. Tamura, M. Nikaido, J. Ota, T. Arai, K. Kotani, K. Takamasu, H. Suzuki, A. Yamamoto, Y. Sato, S. Shin, and F. Kimura, “Quantitative Evaluation of Automatic Parts Delivery in ‘Attentive Workbench’ Supporting Workers in Cell Production,” J. of Robotics and Mechatronics, Vol.21, No.1, pp. 135-145, 2009.
[9] Y. Tamura, M. Sugi, J. Ota, and T. Arai, “Deskwork Support System Based on the Estimation of Human Intention,” Proc. of the 13th IEEE Int. Workshop on Robot and Human Interactive Communication (RO-MAN 2004), pp. 413-418, 2004.
[10] M. Kavrakli, M. Taylor, and A. Trapeznikov, “Designing in Virtual Reality (DesIRe): A Gesture-Based Interface,” Proc. of the 2nd Int. Conf. on Digital Interactive Media in Entertainment and Arts, pp. 131-136, 2007.
[11] A. F. Abate, M. De Marsico, S. Levialdi, V. Mastronardi, S. Ricciardi, and G. Tortora, “Gesture Based Interface for Crime Scene Analysis: A Proposal,” Proc. of the Int. Conf. on Computational Science and Its Applications, Part II, pp. 143-154, 2008.
[12] B. A. Sawyer, “Magnetic Positioning Device,” U.S. Patent 3,457,482, 1969.
[13] X. Chen, K. Takamasu, and M. Nikaidou, “Evaluation of thrust force and positioning accuracy of a new linear motor,” Proc. of the 6th Int. Symposium on Measurement Technology and Intelligent Instruments, p. 126, 2003.
[14] M. Erdmann and T. Lozano-Pérez, “On Multiple Moving Objects,” Algorithmica, Vol.2, pp. 477-521, 1987.
[15] K. Tsukada and M. Yasumura, “Ubi-Finger: Gesture Input Device for Mobile Use,” Proc. of the 5th Asia Pacific Conf. on Computer Human Interaction (APCHI 2002), Vol.1, pp. 388-400, 2002.
[16] K. Irie, N. Wakamura, and K. Umeda, “Construction of an Intelligent Room Based on Gesture Recognition – Operation of Electric Appliances with Hand Gestures –,” Proc. of the 2004 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 193-198, 2004.
