JACIII Vol.14 No.7 pp. 840-851
doi: 10.20965/jaciii.2010.p0840


Behavior Generation and Evaluation of Negotiation Agent Based on Negotiation Dialogue Instances

Daisuke Katagami*, Yusuke Ikeda**, and Katsumi Nitta**

*Department of Applied Computer Science, Faculty of Engineering, Tokyo Polytechnic University, 1583 Iiyama, Atsugi, Kanagawa 243-0297, Japan

**Interdisciplinary Graduate School of Science and Engineering, Tokyo Institute of Technology, 4259 Nagatsuta, Midori-ku, Yokohama, Kanagawa 226-8502, Japan

Received: April 18, 2010
Accepted: August 8, 2010
Published: November 20, 2010
Keywords: negotiation dialog, negotiation agent, gesture generation

This study focuses on gestures in negotiation dialogs. By analyzing the relationship between situations and gestures, we propose a method that enables agents to perform appropriate human-like gestures, and we evaluate whether an agent's gestures can give an impression similar to that given by a human being. We collected negotiation dialogs to study common human gestures, analyzed gesture frequency in different situations, and extracted high-frequency gestures, building an agent gesture module based on these characteristics. Using a questionnaire, we evaluated the impressions made by the gestures of human users and agents, confirming that the agent expresses the same state of mind as a human being by generating an appropriately human-like gesture.

Cite this article as:
Daisuke Katagami, Yusuke Ikeda, and Katsumi Nitta, “Behavior Generation and Evaluation of Negotiation Agent Based on Negotiation Dialogue Instances,” J. Adv. Comput. Intell. Intell. Inform., Vol.14, No.7, pp. 840-851, 2010.
References:
  [1] R. Fisher and D. Shapiro, “Beyond Reason: Using Emotions as You Negotiate,” Penguin, 2006.
  [2] T. Tanaka, Y. Yasumura, D. Katagami, and K. Nitta, “Case Based Online Training Support System for ADR Mediator,” ICAIL 2005 Workshop on Artificial Intelligence and Legal Education, pp. 22-27, 2005.
  [3] T. Tanaka, D. Katagami, and K. Nitta, “Advice Agent for Online Mediator Education,” AAMAS-06 Int. Workshop on Agent-Based Systems for Human Learning (ABSHL), pp. 43-48, 2006.
  [4] T. Tanaka, N. Maeda, D. Katagami, and K. Nitta, “Case Based Utterance Generating for an Argument Agent,” 5th Workshop on Knowledge and Reasoning in Practical Dialogue Systems (in conjunction with IJCAI 2007), pp. 38-41, 2007.
  [5] T. Tanaka, N. Maeda, D. Katagami, and K. Nitta, “Characterized Argument Agent for Training Partner,” New Frontiers in Artificial Intelligence: JSAI 2007 Conf. and Workshops Revised Selected Papers, Lecture Notes in Artificial Intelligence, Vol.4914, pp. 377-389, Springer, 2008.
  [6] M. Yuasa, Y. Yasumura, and K. Nitta, “Negotiation Support Tool Using Emotional Factors,” IFSA-NAFIPS 2001 Conf. Proc., 2001.
  [7] M. Chen, D. Katagami, and K. Nitta, “Let’s Play Catch in Words: Online Negotiation System with a Sense of Presence Based on Haptic Interaction,” IEEE/WIC/ACM Int. Joint Conf. on Web Intelligence and Intelligent Agent Technology, Vol.3, pp. 357-360, 2009.
  [8] M. Bono and K. Takanashi, “What Is Necessary in Analyses of Multi-Party Interaction?,” J. of the Japanese Society for Artificial Intelligence, Vol.22, No.5, pp. 703-710, 2007. (in Japanese)
  [9] L. Gordon and E. K. Gregory, “Negotiation in electronic commerce: Integrating negotiation support and software agent technologies,” 29th Atlantic School of Business, 1999.
  [10] K. Gregory and N. Sunil, “Supporting international negotiation with a www-based system,” IIASA, IR-97-49, 1997.
  [11] K. Gregory and N. Sunil, “Negotiation and the web: User’s perceptions and acceptance,” IIASA, IR-98-002, 1998.
  [12] S. Von-Wuu, “Agent negotiation under uncertainty and risk,” PRIMA 2000, pp. 31-45, 2000.
  [13] F. Shaheen, W. Michael, and N. R. Jennings, “The influence of information on negotiation equilibrium,” AAMAS-2002, 2002.
  [14] C. Mudgal and U. Vassileva, “Bilateral negotiation with incomplete and uncertain information: a decision-theoretic approach using a model of the opponent,” Cooperative Information Agents, pp. 107-118, 2000.
  [15] A. Mehrabian, “Silent Messages,” Wadsworth, Belmont, California, 1971.
  [16] A. Kendon, “Do Gestures Communicate?: A Review,” Research on Language and Social Interaction, Vol.27, No.3, pp. 175-200, 1994.
  [17] W. Rogers, “The Contribution of Kinesic Illustrators towards the Comprehension of Verbal Behavior within Utterances,” Human Communication Research, Vol.5, pp. 54-62, 1978.
  [18] K. W. Berger and G. R. Popelka, “Extra-facial Gestures in Relation to Speech-reading,” J. of Communication Disorders, Vol.3, pp. 302-308, 1971.
  [19] A. Kendon, “Gesture: Visible Action as Utterance,” Cambridge University Press, Cambridge, U.K., 2004.
  [20]
  [21] H. G. Wallbott, “Hand movement quality: A neglected aspect of nonverbal behavior in clinical judgment and person perception,” J. of Clinical Psychology, Vol.41, pp. 345-359, 1985.
  [22] M. Kipp, “ANVIL – a generic annotation tool for multimodal dialogue,” EUROSPEECH-2001, pp. 1367-1370, 2001.
  [23] M. Hayashi, “Machine Production of TV Program from Script – A Proposal of TVML,” Annual Conf. of the Institute of Image Information and Television Engineers, S4-3, pp. 589-592, 1996. (in Japanese)
  [24] S. Descamps and M. Ishizuka, “MPML: a markup language for controlling the behavior of life-like characters,” J. of Visual Languages & Computing, Vol.15, No.2, pp. 183-203, 2004.
  [25] K. Manos, T. Panayiotopoulos, and G. Katsionis, “Virtual Director: Visualization of Simple Scenarios,” 2nd Hellenic Conf. on Artificial Intelligence, SETN, 2002.
  [26] J. Raskin, “The Human Interface,” Addison-Wesley ACM Press, 2000.
  [27] T. Kurokawa, “Nonverbal Interface,” Ohmsha, 1994. (in Japanese)

