DocumentCode
3369646
Title
Fast and accurate humanoid robot navigation guided by stereovision
Author
Chen, Guodong ; Xie, Ming ; Xia, Zeyang ; Sun, Lining ; Ji, Junhong ; Du, Zhijiang ; Wang, Lei
Author_Institution
State Key Lab. of Robot. & Syst., Harbin Inst. of Technol., Harbin, China
fYear
2009
fDate
9-12 Aug. 2009
Firstpage
1910
Lastpage
1915
Abstract
Stair climbing and moving-object grasping both require high-precision feedback of feature coordinates. This paper describes how to process the information acquired from the stereovision system quickly and accurately, and how to use that data to compensate for the progressive error accumulated by the humanoid robot. Every step in the vision processing chain, from camera calibration to image processing to stereo matching, plays an important role in achieving high precision, and low processing time lets the robot adapt quickly to changes in its surroundings, especially in a dynamic environment. Two common humanoid robot tasks are chosen as experiments to verify the effectiveness of the proposed methods: the stair-climbing experiment verifies high precision over long distances, with the image information used to compensate for the progressive error caused by long-distance walking, and the moving-object grasping experiment verifies the short processing time.
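The abstract outlines a pipeline running from camera calibration through image processing to stereo matching, ending in metric 3-D feature coordinates. The following minimal Python/OpenCV sketch shows one common way such a pipeline is assembled; the calibration dictionary (intrinsics K1/K2, distortion d1/d2, extrinsics R/T) and all matcher parameters are illustrative assumptions, not the authors' actual implementation or settings.

import numpy as np
import cv2

def stereo_to_3d(left_gray, right_gray, calib):
    """Recover metric 3-D points from a calibrated stereo pair (illustrative only)."""
    img_size = left_gray.shape[::-1]  # (width, height) expected by OpenCV

    # Rectify both views so epipolar lines become image rows, reducing
    # stereo matching to a 1-D search along each row.
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(
        calib["K1"], calib["d1"], calib["K2"], calib["d2"],
        img_size, calib["R"], calib["T"])
    map_l = cv2.initUndistortRectifyMap(calib["K1"], calib["d1"], R1, P1,
                                        img_size, cv2.CV_32FC1)
    map_r = cv2.initUndistortRectifyMap(calib["K2"], calib["d2"], R2, P2,
                                        img_size, cv2.CV_32FC1)
    rect_l = cv2.remap(left_gray, *map_l, cv2.INTER_LINEAR)
    rect_r = cv2.remap(right_gray, *map_r, cv2.INTER_LINEAR)

    # Dense semi-global block matching yields a disparity map; a real system
    # may instead match only the sparse features of interest (e.g. stair edges).
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
    disparity = matcher.compute(rect_l, rect_r).astype(np.float32) / 16.0

    # Reproject disparity to metric 3-D coordinates using the Q matrix
    # produced by rectification; these feed the robot's motion planning.
    return cv2.reprojectImageTo3D(disparity, Q)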
Keywords
humanoid robots; materials handling; mobile robots; motion control; path planning; robot dynamics; robot vision; stereo image processing; dynamic environment; humanoid robot navigation; moving object grasping; stereo vision system; Calibration; Cameras; Feedback; Humanoid robots; Image processing; Information processing; Legged locomotion; Navigation; Robot kinematics; Robot vision systems; Camera calibration; Feature extraction
fLanguage
English
Publisher
ieee
Conference_Titel
2009 International Conference on Mechatronics and Automation (ICMA 2009)
Conference_Location
Changchun
Print_ISBN
978-1-4244-2692-8
Electronic_ISBN
978-1-4244-2693-5
Type
conf
DOI
10.1109/ICMA.2009.5246533
Filename
5246533
Link To Document