Attention Allocation for Multi-modal Perception of Human-friendly Robot Partners

https://doi.org/10.3182/20130811-5-US-2037.00054

Abstract

This paper proposes a method of attention allocation for multi-modal perception of human-friendly robot partners, based on the various sensors built into a smartphone. First, we propose a human and object detection method using octagonal templates based on evolutionary robot vision. Next, we propose an integration method that estimates human behaviors from the color-image-based human detection, using a multi-layered spiking neural network fed with the time series of human and object positions. Furthermore, we propose a method of attention allocation based on the time series of human behavior recognition. Finally, we present several experimental results of the proposed method and discuss future directions for this research.
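As a rough illustration of the spiking-network idea mentioned above, the sketch below shows a single leaky integrate-and-fire neuron driven by a time series of human-object distances. This is not the paper's multi-layered model; the neuron dynamics, parameters, and the proximity encoding are all simplified assumptions for illustration only.

```python
# Illustrative sketch (not the authors' model): one leaky
# integrate-and-fire neuron driven by a proximity signal derived
# from the time series of human-object distance. All parameters
# (threshold, decay, encoding) are hypothetical.

def lif_spikes(inputs, threshold=1.0, decay=0.9):
    """Return a 0/1 spike train for a leaky integrate-and-fire neuron."""
    potential = 0.0
    spikes = []
    for x in inputs:
        potential = decay * potential + x  # leaky integration of input
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0  # reset membrane potential after firing
        else:
            spikes.append(0)
    return spikes

# Example: a human approaching an object; encode distance as
# proximity = 1 / (1 + distance), so closer means stronger input.
distances = [5.0, 3.0, 2.0, 1.0, 0.5, 0.2]
proximity = [1.0 / (1.0 + d) for d in distances]
print(lif_spikes(proximity))  # neuron fires as the human gets close
```

In a layered network, such spike trains from many neurons (each tuned to a different spatial relation) could be combined to classify behaviors such as "approaching" or "holding", which is the role the time series of positions plays in the integration method above.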

Keywords

Robot Partners
Multi-modal Perception
Computational Intelligence
Human Robot Interaction
Robot Vision
