JACIII Vol.15 No.6 pp. 731-736
doi: 10.20965/jaciii.2011.p0731


Proposal of Wearable Multiremote Controller Using Head-Tracking

Kohei Miyata*, Yuhki Kitazono**, Shiyuan Yang*,
and Seiichi Serikawa*

*Kyushu Institute of Technology, 1-1 Sensui-cho, Tobata-ku, Kitakyushu-city, Fukuoka 804-8550, Japan

**Kitakyushu National College of Technology, 5-20-1 Shii, Kokuraminami-ku, Kitakyushu-city, Fukuoka 802-0985, Japan

Received: December 20, 2010
Accepted: May 20, 2011
Published: August 20, 2011
Keywords: head mounted display, remote control, head tracking
Recently, various remote controllers have been developed. However, operating a remote controller is very difficult for physically handicapped persons who cannot move their hands. The wearable multiremote controller we propose uses head tracking with a camera and a head-mounted display. It is portable, can operate household appliances, and is controlled intuitively through head movements.
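The head-tracking control described above can be sketched as a simple mapping from per-frame head displacement to a menu command. This is a minimal illustration, not the paper's implementation: it assumes the head position (x, y) has already been extracted from each camera frame (e.g., by marker tracking), and the dead-zone threshold and command names are hypothetical.

```python
# Hypothetical sketch: map per-frame head displacement to a command for a
# wearable multiremote controller. Assumes head position (x, y) in image
# coordinates is already available per frame; threshold is an assumed value.

DEAD_ZONE = 10  # pixels of head motion treated as jitter (assumed value)

def head_motion_to_command(prev, curr, dead_zone=DEAD_ZONE):
    """Return 'up', 'down', 'left', 'right', or 'none' from two head positions."""
    dx = curr[0] - prev[0]
    dy = curr[1] - prev[1]
    if max(abs(dx), abs(dy)) < dead_zone:
        return "none"          # small motions are ignored as jitter
    if abs(dx) >= abs(dy):     # the dominant axis decides the command
        return "right" if dx > 0 else "left"
    # image y grows downward, so positive dy means the head moved down
    return "down" if dy > 0 else "up"

if __name__ == "__main__":
    print(head_motion_to_command((100, 100), (140, 105)))  # right
    print(head_motion_to_command((100, 100), (102, 103)))  # none
```

In a real system the per-frame head position would come from an optical-flow or marker-tracking stage, and the resulting command would drive a menu shown on the head-mounted display.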
Cite this article as:
K. Miyata, Y. Kitazono, S. Yang, and S. Serikawa, “Proposal of Wearable Multiremote Controller Using Head-Tracking,” J. Adv. Comput. Intell. Intell. Inform., Vol.15 No.6, pp. 731-736, 2011.

