Engagement Detection During Deictic References in Human-Robot Interaction

  • Conference paper

Social Robotics (ICSR 2016)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 9979)

Included in the conference series: International Conference on Social Robotics (ICSR)

Abstract

Humans are typically skilled interaction partners who detect even small problems during an interaction. In contrast, interactive robot systems often lack the basic capability to sense the engagement of their interaction partners and to maintain common ground. This becomes even more problematic when humanoid robots with human-like behavior are used, as they raise high expectations about their cognitive capabilities. This paper contributes an approach, based on time series alignment, for analyzing human engagement during object references in an explanation scenario. An experimental guide scenario in a smart home environment was used to collect a training and test dataset in which the engagement classification was carried out by human operators. The experiments performed on this dataset give deeper insights into the presented task and motivate an incremental, mixed-modality approach to engagement classification. While some of the results rely on external sensors, they offer an outlook on the requirements and possibilities for HRI scenarios with next-generation social robots.
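
The core technique named in the abstract, time series alignment, is not detailed on this page. As a rough illustration only, the sketch below shows one common way such a classifier can be built: dynamic time warping (DTW) over head-pose feature sequences, with nearest-prototype matching against operator-labelled examples. The function names, the yaw/pitch features, and the choice of plain DTW are assumptions made for this sketch, not the authors' implementation.

```python
# Sketch: DTW-based nearest-prototype engagement classification.
# Everything here (feature choice, prototypes, names) is illustrative.
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Dynamic-time-warping distance between two (T, D) feature sequences,
    e.g. head yaw/pitch sampled while the robot points at an object."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # local frame distance
            cost[i, j] = d + min(cost[i - 1, j],      # advance in a only
                                 cost[i, j - 1],      # advance in b only
                                 cost[i - 1, j - 1])  # align the two frames
    return float(cost[n, m])

def classify_engagement(sample, prototypes, labels):
    """Return the label of the operator-annotated prototype sequence whose
    DTW alignment cost to the observed sequence is lowest."""
    dists = [dtw_distance(sample, p) for p in prototypes]
    return labels[int(np.argmin(dists))]

# Toy usage with synthetic 2-D (yaw, pitch) trajectories:
rng = np.random.default_rng(0)
engaged = rng.normal(0.0, 0.05, size=(40, 2))                # steady gaze
distracted = np.cumsum(rng.normal(0, 0.2, (40, 2)), axis=0)  # wandering
print(classify_engagement(engaged + 0.01,
                          [engaged, distracted],
                          ["engaged", "distracted"]))
```

An incremental variant, as motivated in the abstract, could run the same alignment over a growing prefix of the observed sequence so that a tentative engagement label is available before the reference ends.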

Notes

  1. VFOA is short for visual focus of attention.
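
The page does not elaborate on how VFOA is obtained, but a widely used approximation infers it from head pose: a person is assumed to attend to the target whose direction lies within a small angular tolerance of the head direction. The sketch below illustrates that generic idea only; the function name, the 15-degree threshold, and the geometry are assumptions, not the paper's method.

```python
# Illustrative sketch: approximating VFOA from head pose by checking the
# angle between the head direction and the direction to each candidate
# target. The 15-degree tolerance is an assumed, tunable parameter.
import numpy as np

def vfoa_target(head_pos, head_dir, targets, max_angle_deg=15.0):
    """Return the index of the attended target, or None if no target
    lies within the angular tolerance of the head direction."""
    head_dir = np.asarray(head_dir, dtype=float)
    head_dir /= np.linalg.norm(head_dir)
    best, best_angle = None, np.deg2rad(max_angle_deg)
    for i, t in enumerate(targets):
        to_target = np.asarray(t, dtype=float) - head_pos
        to_target /= np.linalg.norm(to_target)
        angle = np.arccos(np.clip(head_dir @ to_target, -1.0, 1.0))
        if angle < best_angle:
            best, best_angle = i, angle
    return best

# Toy usage: a person at the origin looking roughly along the x-axis
# attends to the object on the x-axis, not the one off to the side.
print(vfoa_target(np.zeros(3), [1.0, 0.02, 0.0],
                  [[2.0, 0.0, 0.0], [0.0, 2.0, 0.0]]))  # -> 0
```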

Acknowledgments

The authors acknowledge the financial support from the Cluster of Excellence Cognitive Interaction Technology CITEC (EXC 277), Bielefeld University and the Volkswagen Foundation (Dilthey Fellowship Interaction & Space, K. Pitsch).

Author information

Correspondence to Timo Dankert.

Copyright information

© 2016 Springer International Publishing AG

About this paper

Cite this paper

Dankert, T., Goerlich, M., Wrede, S., Gehle, R., Pitsch, K. (2016). Engagement Detection During Deictic References in Human-Robot Interaction. In: Agah, A., Cabibihan, J.-J., Howard, A., Salichs, M., He, H. (eds) Social Robotics. ICSR 2016. Lecture Notes in Computer Science (LNAI), vol. 9979. Springer, Cham. https://doi.org/10.1007/978-3-319-47437-3_91

  • DOI: https://doi.org/10.1007/978-3-319-47437-3_91

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-47436-6

  • Online ISBN: 978-3-319-47437-3

  • eBook Packages: Computer Science, Computer Science (R0)
