Robotics and Autonomous Systems
Volume 58, Issue 10, 31 October 2010, Pages 1159–1176

A survey of Tactile Human–Robot Interactions

https://doi.org/10.1016/j.robot.2010.07.002

Abstract

Robots come into physical contact with humans in both experimental and operational settings. Many potential factors motivate the detection of human contact, ranging from safe robot operation around humans to robot behaviors that depend on human guidance. This article presents a review of current research within the field of Tactile Human–Robot Interactions (Tactile HRI), where physical contact from a human is detected by a robot during the execution or development of robot behaviors. Approaches are presented from two viewpoints: the types of physical interactions that occur between the human and robot, and the types of sensors used to detect these interactions. We contribute a structure for the categorization of Tactile HRI research within each viewpoint. Tactile sensing techniques are grouped into three categories, according to what covers the sensors: (i) a hard shell, (ii) a flexible substrate or (iii) no covering. Three categories of physical HRI are likewise identified, consisting of contact that (i) interferes with robot behavior execution, (ii) contributes to behavior execution and (iii) contributes to behavior development. We populate each category with the current literature, and furthermore identify the state-of-the-art within categories and promising areas for future research.

Research highlights

  • Review of recent work in the field of Tactile Human–Robot Interactions (Tactile HRI).
  • Literature examined from two viewpoints: tactile sensing and human–robot interaction.
  • Within each viewpoint, a framework by which to categorize approaches is defined.
  • Identification of the state-of-the-art and open areas for future research.

Introduction

Robots and humans come into physical contact under a variety of circumstances. For years robots have operated around humans within industrial and scientific settings, and today their presence within the home and general society grows ever more common. During robot operation, physical contact with a human might be expected or unexpected, and might enhance or interfere with the execution of a robot behavior.

Accordingly, there are many motivations for detecting human touch within robot applications. This article provides an overview of current research within the field of Tactile Human–Robot Interactions (Tactile HRI): that is, of robot applications that detect and reason about, and perhaps even depend on, the touch of a human. The field of tactile HRI represents the intersection of two independent research areas within robotics: the detection of tactile feedback, and interactions between humans and robots. In this review, we consider the field of tactile HRI from each of these perspectives.

Tactile detection within robotics is a very broad field, with applications ranging from strictly industrial settings to social interactions with humans. Research within the area of tactile feedback aims to improve both the quality and the interpretation of sensor data. Improvements in tactile sensing are measured according to the following criteria:

  • Data quality. Evaluated according to detection sensitivity (range and resolution), noise (derived from the environment and other sensors) and physical robustness.

  • Signal interpretation. Evaluated according to computational expense (time and storage), measurement accuracy and (occasionally) the sophistication of the extracted information.

For a review of tactile sensor devices and the algorithms that interpret their signals, we refer the reader to Nicholls and Lee [1] and Lee [2]; for a higher-level overview of advances in tactile robotics, to Howe [3]; for a more recent overview of grasping and manipulation approaches, and the hardware that such approaches typically employ, to Tegin and Wikander [4]; and for a current survey of tactile sensing for robotics, that additionally identifies design hints for sensor development, to Dahiya et al. [5]. In this review we will address only tactile sensing as used to detect human contact within HRI applications.

From the standpoint of tactile HRI detection, a variety of sensing approaches are taken to detect human touch. Approaches differ in the sensor devices and data analysis techniques used for contact detection, as well as in the types of touch that can be identified. Many applications build custom sensors able to perform an assortment of measurements, by using multiple sensor devices with different detection targets. Interest in tactile sensing goes beyond the binary detection of human contact, and a variety of measurement data is extracted from these sensors. This data includes, but is not limited to: contact presence, location, area and duration; force magnitude, orientation and moment; vibration; and temperature. Note, however, that at a minimum contact is always detected, and sensor devices that do not detect contact directly (e.g. potentiometers) are always paired with devices that do. The sensor devices employed for the detection of human touch include, but are not limited to: force/torque sensors, Force-Sensing-Resistors (FSR), contact sensors, electric field sensors, capacitive sensing arrays, resistive sensing arrays, cameras, temperature sensors, potentiometers, photoreflectors, touchpads and strain-gauge sensors.
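To ground this list, the following minimal sketch collects these measurement types into a single record. It is illustrative only: the field names, types and units are our own assumptions, not an interface drawn from any surveyed system.

    # Illustrative sketch only: a record collecting the measurement types named
    # above. Field names and units are assumptions, not a standard tactile API.
    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class TactileContact:
        present: bool                                           # binary contact detection (the minimum always available)
        location: Optional[Tuple[float, float]] = None          # position on the sensing surface
        area_cm2: Optional[float] = None                        # contact area
        duration_s: Optional[float] = None                      # time since contact onset
        force_n: Optional[Tuple[float, float, float]] = None    # force vector (magnitude and orientation)
        moment_nm: Optional[Tuple[float, float, float]] = None  # moment about the contact point
        vibration_hz: Optional[float] = None                    # e.g. dominant vibration frequency
        temperature_c: Optional[float] = None                   # surface temperature

Which fields a given system fills in depends on its sensor devices; as noted above, a device that cannot detect contact directly would populate only its own fields and be paired with one that sets the contact flag.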

The field of Human–Robot Interaction (HRI) studies the interactions between robots and humans. The area is investigated for many reasons: to develop new techniques for knowledge transfer from human to robot, to design effective tools for human control of a robot, and in anticipation of the growing presence of robots within general society, to name a few. Moreover, the area is investigated from a variety of viewpoints, ranging from the human-centric, for example human-friendly interfaces for robot control, to the robot-centric, for example human guidance in the completion of a robot task. For the reader unfamiliar with HRI, we point to the following works for introductory information. An overview of HRI theory may be found in [6], with a focus on physical interactions provided in [7]. Dautenhahn and Werry [8] discuss existing techniques, Steinfeld et al. [9] potential metrics, and Yanco and Drury [10] a proposed taxonomy for the analysis of HRI applications. This review will focus exclusively on HRI applications with a human touch element.

From the standpoint of HRI and robot behaviors, the detection of tactile interactions with a human is primarily motivated by at least one of the following considerations:

  • Safe operation around humans. The robot possibly interacts with a human during behavior execution, perhaps unexpectedly (e.g. unintended collisions).

  • A necessary element of behavior execution. The robot definitely, and necessarily, interacts with a human during behavior execution. The human might guide (e.g. indicate behavior selection) or be a partner in (e.g. human–robot team tasks) the execution, or the human–robot contact might be the entire point of the behavior (e.g. robot-assisted touch therapy).

  • A necessary element for behavior development. The robot depends on tactile contact from a human while building, refining or adapting a behavior.

Approaches motivated by each of these considerations will be presented within this review.

This review presents the field of tactile HRI from two viewpoints. We contribute a structure for the categorization of techniques within each viewpoint, and furthermore place the current literature within this structure.

The first viewpoint, presented in Section 2, considers approaches within the context of tactile sensing. Three major tactile classes are identified, according to the sensors used for the detection of human touch within the recent literature. This section presents works at a technical level, providing sensor device details.

The second viewpoint, presented in Section 3, considers approaches from the stance of physical human–robot interactions. Three major HRI classes are identified, according to the physical interactions seen in the literature to date. This section presents works at a fairly high level.

A summary of the current state-of-the-art within the field of tactile HRI is then provided in Section 4, along with a discussion connecting tactile sensing techniques and HRI applications. Open areas for future research are identified, and in the final section we conclude.

Tactile sensor feedback

This section describes the mechanisms through which tactile feedback is detected and used within tactile HRI applications. We broadly organize the section according to details of the sensor setup used to detect the tactile feedback. Part of our discussion is devoted to sensor skins: mechanisms that combine multiple sensor devices with some sort of covering that takes on the shape of the robot body, with the result of continuous sensor coverage over the region covered by the skin.
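As a concrete illustration of how skin-style readings are commonly processed, the sketch below thresholds a patch of taxel (tactile element) values and takes the pressure-weighted centroid as the contact location estimate. The array shape, threshold and units are assumptions made for illustration, not a method drawn from any particular surveyed system.

    # Hedged sketch: contact localization on a tactile-skin patch via the
    # pressure-weighted centroid of taxels above a noise threshold.
    import numpy as np

    def locate_contact(taxels, threshold=0.1):
        """Return ((row, col) centroid, total activation), or None if no contact."""
        active = np.where(taxels > threshold, taxels, 0.0)
        total = active.sum()
        if total == 0.0:
            return None  # no taxel registers contact
        rows, cols = np.indices(taxels.shape)
        centroid = ((rows * active).sum() / total, (cols * active).sum() / total)
        return centroid, total

    # Example: a press near the upper-left corner of a 4x4 patch.
    patch = np.zeros((4, 4))
    patch[0:2, 0:2] = [[0.8, 0.5], [0.4, 0.2]]
    print(locate_contact(patch))

The centroid gives a contact location at sub-taxel resolution, and the total activation serves as a rough proxy for force magnitude; real skins discussed in this section add calibration, filtering and mappings from array coordinates to positions on the robot body.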

Physical human–robot interactions

This section presents the different forms taken by physical human–robot interactions within the current tactile HRI literature. We classify the majority of these HRI forms into three categories. The first consists of physical interactions that interfere with the execution of a robot behavior (Section 3.1); the most common motivation for detecting such interactions is to enable safe robot operation in the presence of humans. The second consists of interactions that contribute to the execution of a robot behavior (Section 3.2), and the third of interactions that contribute to the development of robot behaviors (Section 3.3).

Discussion

This section concludes our review of the tactile HRI literature to date. We begin by providing a summary of current trends within tactile HRI, identifying approaches which we consider to define the intersection of advanced tactile detection and HRI work within the field. We then highlight open areas for future research, and conclude by delineating those topics not covered in this review.

Conclusion

We have presented a review of current work within the field of Tactile Human–Robot Interactions (Tactile HRI). The detection of human touch can be important for safe robot operation around humans, and furthermore may contribute to robot behavior execution, for example in human-guided motion, as well as to robot behavior development, for example as a tactile reward within a learning framework. We have addressed the topic of tactile HRI from the viewpoints of two independent research lines within robotics: tactile sensing and human–robot interaction.

Acknowledgements

The research leading to these results has received funding from the European Community’s Seventh Framework Programme FP7/2007-2013–Challenge 2–Cognitive Systems, Interaction, Robotics under grant agreement no. [231500]-[ROBOSKIN].

References (66)

  • P.A. Schmidt et al., A sensor for dynamic tactile information with applications in human–robot interaction and object exploration, Robotics and Autonomous Systems (2006).
  • H. Nicholls et al., A survey of robot tactile sensing technology, The International Journal of Robotics Research (1989).
  • M. Lee, Tactile sensing: new directions, new challenges, The International Journal of Robotics Research (2000).
  • R.D. Howe, Tactile sensing and control of robotic manipulation, Journal of Advanced Robotics (2003).
  • J. Tegin et al., Tactile sensing in intelligent robotic manipulation, Industrial Robot: An International Journal (2005).
  • R.S. Dahiya et al., Tactile sensing — from humans to humanoids, IEEE Transactions on Robotics (2010).
  • J. Scholtz, Theory and evaluation of human robot interactions, in: Proceedings of the 36th Annual Hawaii International...
  • A.D. Santis et al., An atlas of physical human–robot interaction, Mechanism and Machine Theory (2008).
  • K. Dautenhahn, I. Werry, A quantitative technique for analysing robot–human interactions, in: Proceedings of the...
  • A. Steinfeld, T. Fong, D. Kaber, M. Lewis, J. Scholtz, A. Schultz, M. Goodrich, Common metrics for human–robot...
  • H. Yanco, J. Drury, Classifying human–robot interaction: an updated taxonomy, in: IEEE International Conference on...
  • H. Iwata, H. Hoshino, T. Morita, S. Sugano, Human-humanoid physical interaction realizing force following and task...
  • M. Frigola, A. Casals, J. Amat, Human–robot interaction based on a sensitive bumper skin, in: Proceedings of the...
  • S. Koo, J.G. Lim, D.-S. Kwon, Online touch behavior recognition of hard-cover robot using temporal decision tree...
  • H. Iwata et al., Human–robot-contact-state identification based on tactile recognition, IEEE Transactions on Industrial Electronics (2005).
  • N. Mitsunaga, T. Miyashita, H. Ishiguro, K. Kogure, N. Hagita, Robovie-IV: a communication robot interacting with...
  • H. Ishiguro et al., Robovie: an interactive humanoid robot, Industrial Robot: An International Journal (2001).
  • T. Kanda et al., Development and evaluation of interactive humanoid robots, Proceedings of the IEEE, Special Issue on Human Interactive Robot for Psychological Enrichment (2004).
  • T. Miyashita et al., Haptic communication between humans and robots.
  • T. Tajika, T. Miyashita, H. Ishiguro, N. Hagita, Reducing influence of robot’s motion on tactile sensor based on...
  • T. Minato, Y. Yoshikawa, T. Noda, S. Ikemoto, H. Ishiguro, M. Asada, CB2: a child robot with biomimetic body for...
  • T. Minato et al., Development of an android robot for studying human–robot interaction.
  • D. Matsui, T. Minato, K.F. MacDorman, H. Ishiguro, Generating natural motion in an android by mapping human motion, in:...
  • S. Nishio, H. Ishiguro, N. Hagita, Geminoid: teleoperated android of an existing person, in: A.C. de Pina Filho (Ed.),...
  • T. Mukai et al., Development of the tactile sensor system of a human-interactive robot RI-MAN, IEEE Transactions on Robotics (2008).
  • T. Yoshikai, M. Hayashi, Y. Ishizaka, T. Sagisaka, M. Inaba, Behavior integration for whole-body close interactions by...
  • Y. Ohmura, Y. Kuniyoshi, A. Nagakubo, Conformable and scalable tactile sensor skin for curved surfaces, in: Proceedings...
  • Y. Ohmura, Y. Kuniyoshi, Humanoid robot which can lift a 30 kg box by whole body contact and tactile feedback, in:...
  • K. Wada, T. Shibata, Social effects of robot therapy in a care house — change of social network of the residents for...
  • T. Shibata, Ubiquitous surface tactile sensor, in: First IEEE Technical Exhibition Based Conference on Robotics and...
  • W.D. Stiehl, J. Lieberman, C. Breazeal, L. Basel, L. Lalla, M. Wolf, The design of the Huggable: a therapeutic robotic...
  • J.-J. Aucouturier et al., Cheek to chip: dancing robots and AI’s future, IEEE Intelligent Systems (2008).
  • S. Yohanan, K. MacLean, A tool to study affective touch: goals & design of the haptic creature, in: Proceedings of ACM...

    Brenna D. Argall is a postdoctoral fellow in the Learning Algorithms and Systems Laboratory (LASA) at the Swiss Federal Institute of Technology in Lausanne (EPFL). She received her Ph.D. in Robotics (2009) from the Robotics Institute at Carnegie Mellon University, as well as her M.S. in Robotics (2006) and B.S. in Mathematics (2002). Prior to graduate school, she held a Computational Biology position in the Laboratory of Brain and Cognition at the National Institutes of Health, while investigating visualization techniques for neural fMRI data. Her research interests focus upon machine learning techniques to develop and improve robot control systems, under the guidance of a human teacher.

    Aude G. Billard is Associate Professor and head of the LASA Laboratory at the School of Engineering at the Swiss Federal Institute of Technology in Lausanne (EPFL). Prior to this, she was Research Assistant Professor at the Department of Computer Sciences at the University of Southern California, where she retains an adjunct faculty position to this day. Aude Billard received a B.Sc. (1994) and M.Sc. (1995) in Physics from EPFL, with specialization in Particle Physics at the European Center for Nuclear Research (CERN), an M.Sc. in Knowledge-based Systems (1996) and a Ph.D. in Artificial Intelligence (1998) from the Department of Artificial Intelligence at the University of Edinburgh. Her research interests focus on machine learning tools to support robot learning through human guidance. This extends also to research on complementary topics, including machine vision and its use in human–machine interaction and computational neuroscience to develop models of learning in humans.
