Title :
Learning relational object categories using behavioral exploration and multimodal perception
Author :
Sinapov, Jivko ; Schenck, Connor ; Stoytchev, Alexander
Author_Institution :
Dev. Robot. Lab., Iowa State Univ., Ames, IA, USA
Date :
May 31 – June 7, 2014
Abstract :
This paper proposes a framework for learning human-provided category labels that describe individual objects, pairwise object relationships, and groups of objects. The framework was evaluated in an experiment in which a robot interactively explored 36 objects that varied by color, weight, and contents. The proposed method allowed the robot to learn not only categories describing individual objects, but also categories describing pairs and groups of objects, with high recognition accuracy. Furthermore, by grounding the category representations in its own sensorimotor repertoire, the robot was able to estimate how similar two categories are in terms of the behaviors and sensory modalities used to recognize them. Finally, this grounded similarity measure enabled the robot to boost its recognition performance when learning a new category by relating it to a set of familiar categories.
Keywords :
image colour analysis; learning (artificial intelligence); object recognition; robot vision; behavioral exploration; category representation; human-provided category label learning; individual objects; multimodal perception; object color; object content; object groups; object weight; pairwise object relationships; recognition performance; relational object category learning; robot interactive exploration; sensorimotor repertoire; sensory modality; similarity measure; Context; Feature extraction; Image color analysis; Robot sensing systems; Vectors; Visualization;
Conference_Title :
2014 IEEE International Conference on Robotics and Automation (ICRA)
Conference_Location :
Hong Kong
DOI :
10.1109/ICRA.2014.6907696