Abstract
Various kinds of audio/video communication systems have been developed and are widely available today. Ideally, these systems would offer an environment for remote communication equivalent to a face-to-face meeting. However, two critical issues remain: the weak sense of presence of remote participants, and the weak sense of relationship with them. This study proposes an active display interface for remote communication called ARM-COMS (ARm-supported eMbodied COmmunication Monitor System) to tackle these issues. The concept of an active display is to enhance the presence of a digital object shown on the screen through physical movement of the display itself. ARM-COMS builds on this concept: it enhances the presence of a remote person by mimicking that person's physical movement with the active display. In addition to this basic feature, ARM-COMS expresses the relationship with the remote person through entrainment control of the active display. This research investigates how users interact with the active display through eye-tracking and motion-tracking experiments, gaining insight into user behavior in order to assess the effectiveness of ARM-COMS. This paper presents initial results on entrainment control and discusses future work.
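The paper does not give the control law for the display's mimicking behavior, but the idea of following a remote participant's head motion with a smooth, slightly delayed response (rather than a rigid copy) can be illustrated with a minimal first-order tracking sketch. Everything here is an assumption for illustration: the function name, the gain, and the ±30° mechanical limit are hypothetical, not taken from ARM-COMS.

```python
def track_head_pose(target_yaw_deg, current_yaw_deg, gain=0.3, limit_deg=30.0):
    """One control step for a hypothetical active-display actuator.

    Moves the display yaw a fraction (`gain`) of the way toward the remote
    participant's measured head yaw. The first-order lag produces smooth,
    slightly delayed following, which is the flavor of 'entrained' motion
    the abstract describes, rather than an instantaneous mirror.
    """
    error = target_yaw_deg - current_yaw_deg
    next_yaw = current_yaw_deg + gain * error
    # Clamp to the arm's (hypothetical) mechanical range of +/-30 degrees.
    return max(-limit_deg, min(limit_deg, next_yaw))

# Simulate the remote participant turning their head to 20 degrees:
# the display converges toward 20 over successive control steps.
yaw = 0.0
trace = []
for _ in range(10):
    yaw = track_head_pose(20.0, yaw)
    trace.append(round(yaw, 2))
```

In a real system the target angle would come from a face/head-pose tracker and the output would drive the arm's servos; the gain would be tuned so the motion reads as responsive but not jittery.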
Acknowledgements
This work was supported by JSPS KAKENHI Grant Numbers JP16K00274 and JP26280077. The author would like to thank Hiroki KIMACHI, all members of the Collaborative Engineering Labs at Tokushima University, and the Center for Technical Support of Tokushima University for their cooperation in conducting the experiments.
Copyright information
© 2018 Springer International Publishing AG
Cite this paper
Ito, T., Watanabe, T. (2018). Eye-Tracking Analysis of User Behavior with an Active Display Interface. In: Chung, W., Shin, C. (eds) Advances in Affective and Pleasurable Design. AHFE 2017. Advances in Intelligent Systems and Computing, vol 585. Springer, Cham. https://doi.org/10.1007/978-3-319-60495-4_8
Print ISBN: 978-3-319-60494-7
Online ISBN: 978-3-319-60495-4