DocumentCode
2270477
Title
Implicit mapping of the peripersonal space of a humanoid robot
Author
Antonelli, Marco ; Chinellato, Eris ; Del Pobil, Angel P.
Author_Institution
Robotic Intell. Lab., Jaume I Univ., Castellón de la Plana, Spain
fYear
2011
fDate
11-15 April 2011
Firstpage
1
Lastpage
8
Abstract
In this work, taking inspiration from primate visuomotor mechanisms, a humanoid robot builds a sensorimotor map of the environment that is configured and trained through gazing and reaching movements. The map is accessed and modified by two types of information, retinotopic (visual) and proprioceptive (eye and arm movements), and constitutes both knowledge of the environment and a sensorimotor code for performing movements and evaluating their outcomes. By performing direct and inverse transformations between stereo vision, oculomotor, and joint-space representations, the robot learns to perform gazing and reaching movements, which are in turn employed to update the sensorimotor knowledge of the environment. Thus, the robot keeps learning during its normal behavior, interacting with the world and contextually updating its representation of the world itself. Such a representation is never made explicit; rather, it constitutes a visuomotor awareness of space that emerges through the agent's interaction with its surroundings.
Keywords
humanoid robots; robot vision; stereo image processing; humanoid robot; joint-space representation; oculomotor; peripersonal space; primate visuomotor mechanisms; proprioceptive information; retinotopic information; sensorimotor code; sensorimotor knowledge; stereo vision; visuomotor awareness; Head; Joints; Neurons; Robot kinematics; Robot sensing systems; Visualization
fLanguage
English
Publisher
IEEE
Conference_Titel
Computational Intelligence, Cognitive Algorithms, Mind, and Brain (CCMB), 2011 IEEE Symposium on
Conference_Location
Paris
Print_ISBN
978-1-4244-9890-1
Type
conf
DOI
10.1109/CCMB.2011.5952119
Filename
5952119
Link To Document