A survey of Tactile Human–Robot Interactions
Research highlights
- Review of recent work in the field of Tactile Human–Robot Interactions (Tactile HRI).
- Literature examined from two viewpoints: tactile sensing and human–robot interaction.
- Within each viewpoint, a framework by which to categorize approaches is defined.
- Identification of the state-of-the-art and open areas for future research.
Introduction
Robots and humans come into physical contact under a variety of circumstances. For years robots have operated around humans within industrial and scientific settings, and today their presence within the home and general society is ever more common. During robot operation, physical contact with a human might be expected or unexpected, and might enhance or interfere with the execution of a robot behavior.
Accordingly there are many motivations for detecting human touch within robot applications. This article provides an overview of current research within the field of Tactile Human–Robot Interactions (Tactile HRI): that is, of robot applications that detect and reason about, perhaps even depend on, the touch of a human. The field of tactile HRI represents the intersection of two independent research areas within robotics: the detection of tactile feedback, and interactions between humans and robots. In this review, we consider the field of tactile HRI from each of these perspectives.
Tactile detection within robotics is a broad field, with applications ranging from strictly industrial settings to social interactions with humans. Research within the area of tactile feedback aims to improve both the quality and the interpretation of sensor data. Improvements in tactile sensing are measured according to the following criteria:
- Data quality. Evaluated according to detection sensitivity (range and resolution), noise (derived from the environment and other sensors) and physical robustness.
- Signal interpretation. Evaluated according to computational expense (time and storage) and measurement accuracy, and (occasionally) the sophistication of the extracted information.
From the standpoint of tactile HRI detection, a variety of sensing approaches are taken in order to detect human touch. Approaches differ in the sensor devices and data-analysis techniques used for contact detection, as well as in the types of touch that can be identified. Many applications build custom sensors able to perform an assortment of measurements by combining multiple sensor devices with different detection targets. Interest in tactile sensing goes beyond the binary detection of human contact, and a variety of measurement data is extracted from these sensors. This data includes, but is not limited to: contact presence, location, area and duration; force magnitude, orientation and moment; vibration; and temperature. Note however that at a minimum contact is always detected, and sensor devices which do not detect contact directly (e.g. potentiometers) are always paired with devices that do. The sensor devices employed for the detection of human touch include, but are not limited to: force/torque sensors, Force-Sensing Resistors (FSRs), contact sensors, electric field sensors, capacitive sensing arrays, resistive sensing arrays, cameras, temperature sensors, potentiometers, photoreflectors, touchpads and strain-gauge sensors.
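To make the fusion idea above concrete, the following is a minimal sketch of how a touch-detection layer might aggregate the measurement channels listed. All names (`ContactEvent`, `fuse`) and the fusion rule are hypothetical illustrations, not drawn from any system surveyed here; note how only contact presence is mandatory, while every other channel is optional and device-dependent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ContactEvent:
    """One detected human-touch event (hypothetical record format).

    Only contact presence is always available; the remaining channels
    depend on which sensor devices the platform actually carries.
    """
    detected: bool                                   # contact presence
    location: Optional[Tuple[float, float]] = None   # (u, v) on sensor surface
    area_cm2: Optional[float] = None                 # contact area
    duration_s: Optional[float] = None               # contact duration
    force_n: Optional[float] = None                  # force magnitude
    temperature_c: Optional[float] = None            # if a thermal device exists

def fuse(fsr_force: Optional[float], capacitive_hit: bool) -> ContactEvent:
    """Toy fusion rule: a capacitive array confirms presence, while an
    FSR (when present) additionally supplies the force magnitude."""
    present = capacitive_hit or (fsr_force or 0.0) > 0.0
    return ContactEvent(detected=present, force_n=fsr_force)
```

The design choice mirrors the pairing described in the text: a device that cannot detect contact on its own only ever contributes auxiliary channels to an event whose presence is established by a contact-detecting device.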
The field of Human–Robot Interaction (HRI) is a research area that studies the interactions between robots and humans. The area is investigated for many reasons, such as to develop new techniques for knowledge transfer from human to robot, to design effective tools for human control of a robot, and in anticipation of the growing presence of robots within general society, to name a few. Moreover, the area is investigated from a variety of viewpoints, ranging from those which are human-centric, for example human-friendly interfaces for robot control, to those which are robot-centric, for example human guidance in the completion of a robot task. For the reader unfamiliar with HRI, we point to the following works for introductory information. An overview of HRI theory may be found in [6], with a focus on physical interactions being provided in [7]. Dautenhahn and Werry [8] discuss existing techniques, Steinfeld et al. [9] potential metrics, and Yanco and Drury [10] a proposed taxonomy for the analysis of HRI applications. This review will focus exclusively on HRI applications with a human touch element.
From the standpoint of HRI and robot behaviors, the detection of tactile interactions with a human is primarily motivated by at least one of the following considerations:
- Safe operation around humans. The robot possibly interacts with a human during behavior execution, perhaps unexpectedly (e.g. unintended collisions).
- A necessary element of behavior execution. The robot definitely, and necessarily, interacts with a human during behavior execution. The human might guide (e.g. indicate behavior selection) or be a partner in (e.g. human–robot team tasks) the execution, or the human–robot contact might be the entire point of the behavior (e.g. robot-assisted touch therapy).
- A necessary element of behavior development. The robot depends on tactile contact from a human while building, refining or adapting a behavior.
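The three motivations above can be read as a routing decision that a robot controller makes for each detected touch. The sketch below illustrates that reading; the names (`TouchRole`, `classify_touch`) and the two boolean flags are assumptions for illustration only — a real system would infer them from task state and contact context rather than receive them directly.

```python
from enum import Enum, auto

class TouchRole(Enum):
    SAFETY = auto()       # unexpected contact: trigger a safe-operation response
    EXECUTION = auto()    # expected contact: part of executing the behavior
    DEVELOPMENT = auto()  # contact used to build, refine or adapt a behavior

def classify_touch(expected: bool, learning_mode: bool) -> TouchRole:
    """Route a detected touch to one of the three motivations.

    A touch received while the robot is building or adapting a behavior
    counts as development input; otherwise expectedness separates
    execution-relevant contact from a potential safety event.
    """
    if learning_mode:
        return TouchRole.DEVELOPMENT
    return TouchRole.EXECUTION if expected else TouchRole.SAFETY
```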
This review presents the field of tactile HRI from two viewpoints. We contribute a structure for the categorization of techniques within each viewpoint, and furthermore place the current literature within this structure.
The first viewpoint, presented in Section 2, considers approaches within the context of tactile sensing. Three major tactile classes are identified, according to those sensors used for the detection of human touch within the recent literature. This section presents works at a technical level, providing sensor device details.
The second viewpoint, presented in Section 3, considers approaches from the stance of physical human–robot interactions. Three major HRI classes are identified, according to the physical interactions seen in the literature to date. This section presents works at a higher level than Section 2, without sensor-device detail.
A summary of the current state-of-the-art within the field of tactile HRI is then provided in Section 4, along with a discussion connecting tactile sensing techniques and HRI applications. Open areas for future research are identified, and in the final section we conclude.
Section snippets
Tactile sensor feedback
This section describes the mechanisms through which tactile feedback is detected and used within tactile HRI applications. We broadly organize the section according to details of the sensor setup used to detect the tactile feedback. Part of our discussion will be devoted to sensor skins, or mechanisms that combine multiple sensor devices with some sort of covering that takes on the shape of the robot body, with the result of continuous sensor coverage over the region covered by the skin. This…
Physical human–robot interactions
This section presents the different forms taken by physical human–robot interactions within the current tactile HRI literature. We classify the majority of these HRI forms into three categories. The first category comprises physical interactions that interfere with the execution of a robot behavior (Section 3.1). The most common motivation for detecting such interactions is to enable safe robot operation in the presence of humans. The second category comprises interactions that are an intended part of robot…
Discussion
This section concludes our review of the tactile HRI literature to date. We begin by providing a summary of current trends within tactile HRI, identifying approaches which we consider to define the intersection of advanced tactile detection and HRI work within the field. We then highlight open areas for future research, and conclude by delineating those topics not covered in this review.
Conclusion
We have presented a review of current work within the field of Tactile Human–Robot Interactions (Tactile HRI). The detection of human touch can be important for safe robot operation around humans and furthermore may contribute to robot behavior execution, for example in human-guided motion, as well as to robot behavior development, for example a tactile reward within a learning framework. We have addressed the topic of tactile HRI from the viewpoints of two independent research lines within…
Acknowledgements
The research leading to these results has received funding from the European Community’s Seventh Framework Programme FP7/2007-2013, Challenge 2 (Cognitive Systems, Interaction, Robotics), under grant agreement no. 231500 (ROBOSKIN).
References (66)
- et al., A sensor for dynamic tactile information with applications in human–robot interaction and object exploration, Robotics and Autonomous Systems (2006).
- et al., A survey of robot tactile sensing technology, The International Journal of Robotics Research (1989).
- Tactile sensing: new directions, new challenges, The International Journal of Robotics Research (2000).
- Tactile sensing and control of robotic manipulation, Journal of Advanced Robotics (2003).
- et al., Tactile sensing in intelligent robotic manipulation, Industrial Robot: An International Journal (2005).
- et al., Tactile sensing — from humans to humanoids, Transactions on Robotics (2010).
- J. Scholtz, Theory and evaluation of human robot interactions, in: Proceedings of the 36th Annual Hawaii International...
- et al., An atlas of physical human–robot interaction, Mechanism and Machine Theory (2008).
- K. Dautenhahn, I. Werry, A quantitative technique for analysing robot–human interactions, in: Proceedings of the...
- A. Steinfeld, T. Fong, D. Kaber, M. Lewis, J. Scholtz, A. Schultz, M. Goodrich, Common metrics for human–robot...
- Human–robot-contact-state identification based on tactile recognition, IEEE Transactions on Industrial Electronics.
- Robovie: an interactive humanoid robot, Industrial Robot: An International Journal.
- Development and evaluation of interactive humanoid robots.
- Human Interactive Robot for Psychological Enrichment, Proceedings of the IEEE.
- Haptic communication between humans and robots.
- Development of an android robot for studying human–robot interaction.
- Development of the tactile sensor system of a human-interactive robot RI-MAN, IEEE Transactions on Robotics.
- Cheek to chip: dancing robots and AI’s future, IEEE Intelligent Systems.
Brenna D. Argall is a postdoctoral fellow in the Learning Algorithms and Systems Laboratory (LASA) at the Swiss Federal Institute of Technology in Lausanne (EPFL). She received her Ph.D. in Robotics (2009) from the Robotics Institute at Carnegie Mellon University, as well as her M.S. in Robotics (2006) and B.S. in Mathematics (2002). Prior to graduate school, she held a Computational Biology position in the Laboratory of Brain and Cognition at the National Institutes of Health, while investigating visualization techniques for neural fMRI data. Her research interests focus upon machine learning techniques to develop and improve robot control systems, under the guidance of a human teacher.
Aude G. Billard is Associate Professor and head of the LASA Laboratory at the School of Engineering at the Swiss Federal Institute of Technology in Lausanne (EPFL). Prior to this, she was Research Assistant Professor at the Department of Computer Sciences at the University of Southern California, where she retains an adjunct faculty position to this day. Aude Billard received a B.Sc. (1994) and M.Sc. (1995) in Physics from EPFL, with specialization in Particle Physics at the European Center for Nuclear Research (CERN), an M.Sc. in Knowledge-based Systems (1996) and a Ph.D. in Artificial Intelligence (1998) from the Department of Artificial Intelligence at the University of Edinburgh. Her research interests focus on machine learning tools to support robot learning through human guidance. This extends also to research on complementary topics, including machine vision and its use in human–machine interaction and computational neuroscience to develop models of learning in humans.