Title: Anthropomorphic awareness of partner robot to user's situation based on gaze and speech detection
Authors: Tomoko Yonezawa; Hirotake Yamazoe; Akira Utsumi; Shinji Abe
Addresses: ATR Intelligent Robotics and Communication Laboratories, 2-2-2 Hikaridai, Seika, Soraku, 619-0288 Kyoto, Japan (all authors)
Abstract: This paper introduces a daily-partner robot that is aware of the user's situation through gaze and utterance detection. For appropriate anthropomorphic interaction, the robot should talk to the user at the proper time, without interrupting her/his task. The proposed robot 1) estimates the user's context (the target of her/his speech) by detecting her/his gaze and utterances, 2) expresses its need to speak through silent gaze turns towards the user and the object of joint attention (speech-implying behaviour) and 3) delivers its message when the user talks to the robot. Based on preliminary results showing that humans are sufficiently sensitive to the robot's speech-implying behaviours, we evaluate the proposed behavioural model. The results show that this crossmodal awareness supports respectful communication: silent behaviours effectively convey the robot's intention to speak and draw the user's attention without disturbing the user's ongoing task.
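The abstract outlines a three-step behavioural model but gives no implementation details. As a rough illustration only, the following sketch organises those three steps as a simple state machine; all class, method, and label names here are hypothetical and not taken from the paper.

```python
from enum import Enum, auto


class State(Enum):
    IDLE = auto()      # user works on a task; robot stays silent
    IMPLYING = auto()  # robot signals its wish to speak with silent gaze turns
    SPEAKING = auto()  # user has addressed the robot; message is delivered


class PartnerRobot:
    """Toy controller sketching the three-step crossmodal awareness model."""

    def __init__(self):
        self.state = State.IDLE
        self.pending_message = None

    def queue_message(self, message):
        # The message waits until the user shows availability.
        self.pending_message = message

    def update(self, gaze_target, utterance_detected):
        # gaze_target: 'robot', 'object', or 'elsewhere' (from gaze tracking)
        # utterance_detected: True when speech detection fires
        if self.state is State.IDLE and self.pending_message is not None:
            # Step 2: speech-implying behaviour -- silent gaze turns
            # between the user and the object of joint attention.
            self.gaze_turn(['user', 'object'])
            self.state = State.IMPLYING
        elif self.state is State.IMPLYING:
            # Steps 1 and 3: gaze plus utterance indicate the robot is
            # the target of the user's speech, so deliver the message.
            if gaze_target == 'robot' and utterance_detected:
                self.state = State.SPEAKING
                self.say(self.pending_message)
                self.pending_message = None
                self.state = State.IDLE

    def gaze_turn(self, targets):
        print(f"[gesture] silent gaze turns towards {targets}")

    def say(self, message):
        print(f"[speech] {message}")


# Example: the robot waits until the user looks at it and speaks.
robot = PartnerRobot()
robot.queue_message("You have a new message.")
robot.update(gaze_target='elsewhere', utterance_detected=False)  # starts implying
robot.update(gaze_target='object', utterance_detected=False)     # user still busy
robot.update(gaze_target='robot', utterance_detected=True)       # message delivered
```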
Keywords: anthropomorphic awareness; crossmodal awareness; anthropomorphic behaviour; stuffed toys; toy robots; wide area gaze tracking; utterance detection; gaze detection; speech detection; joint attention; behavioural modelling; intention to speak; silent communication; human-robot interaction.
DOI: 10.1504/IJAACS.2012.044782
International Journal of Autonomous and Adaptive Communications Systems, 2012 Vol.5 No.1, pp.18 - 38
Published online: 05 Dec 2014