Attentive Deskwork Support System
Yusuke Tamura, Masao Sugi, Tamio Arai, and Jun Ota
The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan
We propose an attentive deskwork support system that quickly delivers required objects to people working at desks. To this end, we propose methods for understanding a user’s request for support. To detect the existence of a request, the system exploits the characteristics of the user’s hand and eye movements and recognizes hand-reaching motions. It then understands the content of the request by integrating sensory and contextual information with a probabilistic model. Finally, the system determines a delivery point by predicting the user’s hand movement and delivers the required objects using self-moving trays. Experiments are conducted to evaluate the usefulness of the proposed system.
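The probabilistic integration of sensory and contextual information described above can be sketched as a simple Bayesian fusion over candidate objects. The paper’s actual model is not reproduced here; the candidate objects, the conditional-independence assumption, and all probability values below are illustrative assumptions only.

```python
def posterior_over_objects(prior, likelihoods):
    """Combine a contextual prior over candidate objects with per-cue
    likelihoods, assuming the cues are conditionally independent given
    the object (a naive-Bayes-style fusion, used here as an assumption)."""
    post = list(prior)
    for lk in likelihoods:
        post = [p * l for p, l in zip(post, lk)]
    total = sum(post)
    return [p / total for p in post]

# Three hypothetical candidate objects on the desk: pen, scissors, stapler.
prior   = [1 / 3, 1 / 3, 1 / 3]  # contextual prior, e.g. from task history
gaze_lk = [0.7, 0.2, 0.1]        # P(observed gaze direction | object)
hand_lk = [0.6, 0.3, 0.1]        # P(observed reaching direction | object)

posterior = posterior_over_objects(prior, [gaze_lk, hand_lk])
# The object with the highest posterior is taken as the requested one.
requested = posterior.index(max(posterior))
```

Under these illustrative numbers, both cues favor the first object, so the fused posterior concentrates on it; in the real system the likelihood models would be learned from or calibrated against actual hand- and eye-movement data.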
-  Y. Tamura, M. Sugi, T. Arai, and J. Ota, “Estimation of user’s request for attentive deskwork support system,” Cutting Edge Robotics 2009, 2010 (in press).
-  S. Kajikawa, T. Okino, K. Ohba, and H. Inooka, “Motion planning for hand-over between human and robot,” Proc. of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 193-199, 1995.
-  A. Agah and K. Tanie, “Human interaction with a service robot: Mobile-manipulator handing over an object to a human,” Proc. of the IEEE Int. Conf. on Robotics and Automation, pp. 575-580, 1997.
-  T. Sato, T. Harada, and T. Mori, “Environment-type robot system “robotic room” featured by behavior media, behavior contents, and behavior adaptation,” IEEE/ASME Trans. on Mechatronics, Vol.9, No.3, pp. 529-534, 2004.
-  B. A. Sawyer, “Magnetic positioning device,” US Patent 3,457,482, 1969.
-  J. L. Dallaway and R. D. Jackson, “The user interface for interactive robotic workstations,” Proc. of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 1682-1686, 1994.
-  S. Ishii, S. Tanaka, and F. Hiramatsu, “Meal assistance robot for severely handicapped people,” Proc. of the IEEE Int. Conf. on Robotics and Automation, pp. 1308-1313, 1995.
-  R. Cipolla and N. J. Hollinghurst, “Human-robot interface by pointing with uncalibrated stereo vision,” Image and Vision Computing, Vol.14, pp. 171-178, 1996.
-  Y. Tamura, M. Sugi, T. Arai, and J. Ota, “Target identification through human pointing gesture based on human-adaptive approach,” J. of Robotics and Mechatronics, Vol.20, No.4, pp. 515-525, 2008.
-  J. R. Millán, F. Renkens, J. Mouriño, and W. Gerstner, “Noninvasive brain-actuated control of a mobile robot by human EEG,” IEEE Trans. on Biomedical Engineering, Vol.51, No.6, pp. 1026-1033, 2004.
-  O. Fukuda, T. Tsuji, M. Kaneko, and A. Otsuka, “A human-assisting manipulator teleoperated by EMG signals and arm motions,” IEEE Trans. on Robotics and Automation, Vol.19, No.2, pp. 210-222, 2003.
-  C. Prablanc, J. F. Echallier, E. Komilis, and M. Jeannerod, “Optimal response of eye and hand motor system in pointing a visual target,” Biological Cybernetics, Vol.35, pp. 113-124, 1979.
-  T. Flash and N. Hogan, “The coordination of arm movements: An experimentally confirmed mathematical model,” The J. of Neuroscience, Vol.5, No.7, pp. 1688-1703, 1985.
-  K. Oka, Y. Sato, and H. Koike, “Real-time fingertip tracking and gesture recognition,” IEEE Computer Graphics and Applications, Vol.22, No.6, pp. 64-71, 2002.
-  V. Raghavan, J. Molineros, and R. Sharma, “Interactive evaluation of assembly sequences using augmented reality,” IEEE Trans. on Robotics and Automation, Vol.15, No.3, pp. 435-449, 1999.
-  P. Wellner, “Interacting with paper on the DigitalDesk,” Communications of the ACM, Vol.36, No.7, pp. 87-96, 1993.
-  H. Koike, Y. Sato, and Y. Kobayashi, “Integrating paper and digital information on EnhancedDesk: A method for realtime finger tracking on an augmented desk system,” ACM Trans. on Computer-Human Interaction, Vol.8, No.4, pp. 307-322, 2001.
-  B. Leibe, T. Starner, W. Ribarsky, Z. Wartell, D. Krum, J. Weeks, B. Singletary, and L. Hodges, “Toward spontaneous interaction with the perceptive workbench,” IEEE Computer Graphics and Applications, Vol.20, No.6, pp. 54-65, 2000.
-  J. Rekimoto, “SmartSkin: An infrastructure for freehand manipulation on interactive surfaces,” Proc. of the SIGCHI Conf. on Human Factors in Computing Systems, pp. 113-120, 2002.
-  M. Topping, “An overview of the development of Handy 1, a rehabilitation robot to assist the severely disabled,” J. of Intelligent and Robotic Systems, Vol.34, pp. 253-263, 2002.
-  M. Sugi, I. Matsumura, Y. Tamura, J. Ota, and T. Arai, “Quantitative evaluation of human supporting production system “Attentive Workbench”,” Proc. of the IEEE Conf. on Automation Science and Engineering, pp. 531-535, 2007.
-  Y. Kado, T. Kamoda, Y. Yoshiike, P. R. De Silva, and M. Okada, “Sociable Dining Table: The effectiveness of a “KonKon” interface for reciprocal adaptation,” Proc. of the 5th ACM/IEEE Int. Conf. on Human-Robot Interaction, pp. 105-106, 2010.
-  P. P. Maglio, R. Barrett, C. S. Campbell, and T. Selker, “SUITOR: An attentive information system,” Proc. of the Int. Conf. on Intelligent User Interfaces, pp. 169-176, 2000.
-  P. P. Maglio, T. Matlock, C. S. Campbell, S. Zhai, and B. A. Smith, “Gaze and speech in attentive user interfaces,” Lecture Notes in Computer Science, Springer, Vol.1948, pp. 1-7, 2000.
-  P. P. Maglio and C. S. Campbell, “Attentive agents,” Communications of the ACM, Vol.46, No.3, pp. 47-51, 2003.
-  R. Vertegaal, “Designing attentive interfaces,” Proc. of the 2002 Symposium on Eye Tracking Research and Applications, pp. 23-30, 2002.
-  R. Vertegaal, “Attentive user interfaces,” Communications of the ACM, Vol.46, No.3, pp. 31-33, 2003.
-  S. Zhai, “What’s in the eyes for attentive input,” Communications of the ACM, Vol.46, No.3, pp. 34-39, 2003.
-  J. S. Bradbury, J. S. Shell, and C. B. Knowles, “Hands on cooking: Towards an attentive kitchen,” Extended Abstracts on Human Factors in Computing Systems, pp. 996-997, 2003.
-  T. Selker, “Visual attentive interfaces,” BT Technology J., Vol.22, No.4, pp. 146-150, 2004.
-  D. Chen and R. Vertegaal, “Using mental load for managing interruptions in physiologically attentive user interfaces,” Extended Abstracts on Human Factors in Computing Systems, pp. 1513-1516, 2004.
-  V. Novak, C. Sandor, and G. Klinker, “An AR workbench for experimenting with attentive user interfaces,” Proc. of the IEEE/ACM Int. Symposium on Mixed and Augmented Reality, pp. 284-285, 2004.
-  R. J. K. Jacob, “What you look at is what you get: Eye movement-based interaction techniques,” Proc. of the SIGCHI Conf. on Human Factors in Computing Systems, pp. 11-18, 1990.
-  S. Zhai, C. Morimoto, and S. Ihde, “Manual and gaze input cascaded (MAGIC) pointing,” Proc. of the SIGCHI Conf. on Human Factors in Computing Systems, pp. 246-253, 1999.
-  A. Monden, K. Matsumoto, and M. Yamato, “Evaluation of gaze-added target selection methods suitable for general GUIs,” Int. J. of Computer Applications in Technology, Vol.24, No.1, pp. 17-24, 2005.
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.