Robots that express emotion elicit better human teaching

ABSTRACT
Does the emotional content of a robot's speech affect how people teach it? In this experiment, participants were asked to demonstrate several "dances" for a robot to learn. Participants moved their bodies in response to instructions displayed on a screen behind the robot. Meanwhile, the robot faced the participant and appeared to emulate the participant's movements. After each demonstration, the robot received an accuracy score, and the participant chose whether or not to demonstrate that dance again. Regardless of the participant's input, however, the robot's dancing and the scores it received were arranged in advance and held constant across all participants. The only variation between groups was what the robot said in response to its scores. Participants saw one of three conditions: appropriate emotional responses, often-inappropriate emotional responses, or apathetic responses. Participants who taught the robot with appropriate emotional responses demonstrated the dances, on average, significantly more frequently and significantly more accurately than participants in the other two conditions.