JRM Vol.28 No.1 pp. 61-68
doi: 10.20965/jrm.2016.p0061


Cloud/Crowd Sensing System for Annotating Users Perception

Wataru Mito and Masahiro Matsunaga

SECOM Co., Ltd.
SECOM SC Center, 8-10-16 Shimorenjaku, Mitaka, Tokyo 181-8528, Japan

Received: July 16, 2015
Accepted: December 27, 2015
Published: February 20, 2016
Keywords: annotation, interaction sensing, perception, life log

[Figure: Overview of cloud/crowd sensing system]

Reducing the burden of life support services has been studied in preparation for the coming ultra-aging society. However, highly advanced life support service systems often suffer from low accessibility: when accessibility is low, users have difficulty forecasting the system's behavior and feel uneasy. In this paper, a cloud/crowd sensing system is proposed. Triggered by monitoring results from the sensors used in a life support service system, a character agent of the proposed system engages users in dialogue and acquires information about their subjective views. A prototype of the cloud/crowd sensing system is described and evaluated in the paper. Applying the proposed sensing system to a life support system could relieve the anxiety users feel because of low accessibility.

Cite this article as:
W. Mito and M. Matsunaga, “Cloud/Crowd Sensing System for Annotating Users Perception,” J. Robot. Mechatron., Vol.28, No.1, pp. 61-68, 2016.
References:
  1. [1] K. Hashimoto, F. Saito, T. Yamamoto, and K. Ikeda, “A field study of the human support robot in the home environment,” 2013 IEEE Workshop on Advanced Robotics and Its Social Impacts, pp. 143-150, 2013.
  2. [2] H. Iwata and S. Sugano, “Design of human symbiotic robot TWENDY-ONE,” 2009 IEEE Int. Conf. on Robotics and Automation, pp. 580-596, 2009.
  3. [3] M. Quigley, K. Conley, B. Gerkey, J. Faust, T. Foote, J. Leibs, R. Wheeler, and A. Y. Ng, “ROS: an open-source Robot Operating System,” ICRA Workshop on Open Source Software, 2009.
  4. [4] Y. Nakamura, S. Muto, Y. Maeda, M. Mizukawa, M. Motegi, and Y. Takashima, “Proposal of Framework Based on 4W1H and Properties of Robots and Objects for Development of Physical Service System,” J. of Robotics and Mechatronics, Vol.26, No.6, pp. 758-771, 2014.
  5. [5] E. Mestheneos, “New Ambient Assistive Technologies: The Users’ Perspectives,” Handbook of Ambient Assisted Living, IOS Press, pp. 749-762, 2013.
  6. [6] S. Yamada, “Designing ‘Ma’ between human and robot,” TDU Press, pp. 39-40, 2007.
  7. [7] K. Takeshi, S. Hiroshi, and Y. Hidekazu, “Fundamental Study on Emotion Estimation Using Dynamic Recognition of Facial Expression,” Human Interface: Proc. of the Symposium on Human Interface, Vol.14, pp. 77-82, 1998.
  8. [8] W. Qiongqiong and K. Shinoda, “A Regression Approach to Emotion Estimation in Spontaneous Speech,” Proc. of the Autumn Meeting of the Acoustical Society of Japan, pp. 87-88, 2013.
  9. [9] M. Hayashi, M. Kanbara, N. Ukita, and N. Hagita, “Life-logging Framework for Collecting Location and Purpose Information by Virtual Agent on Smart Phone,” IEICE Technical Report, Vol.114, No.32, pp. 57-62, 2014.
  10. [10] J. Tennenbaum, E. Sohar, R. Adar, T. Gilat, and D. Yaski, “The physiological significance of the cumulative discomfort index (Cum DI),” Harefuah, Vol.60, pp. 315-319, 1961.

