DocumentCode :
1608569
Title :
Fusing laser and image data for 3D perceived space representations
Author :
Bourbakis, N. ; Andel, Rich
Author_Institution :
Center for Intelligent Syst., Binghamton Univ., NY, USA
fYear :
1997
Firstpage :
50
Lastpage :
58
Abstract :
In this paper, a fusion technique is presented for the 3-D representation of space. The technique is based on a fusion process that combines laser range data and segmented image data with human expertise related to the surrounding environment. In particular, the range data may contain noise due to reflections on sloped surfaces or long-distance open corridors; the proposed fusion approach removes this noise by applying human expertise about 3-D environments. The method could be used by an autonomous robot in an unknown environment, by an inspection machine in a complex manufacturing environment, or by a visual navigation device for blind people. Additional applications include correcting measurement deficiencies in a laser scan and providing a true-color, 3-D perceived shape representation of a given object in a modeling environment.
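The abstract only sketches the fusion process, so the following is a minimal, hypothetical Python illustration rather than the authors' algorithm: it registers a laser range map onto a segmented image grid, assigns each segment a consensus (median) range, and flags readings that deviate strongly from that consensus as reflection noise. The function name, the outlier threshold, and the synthetic data are assumptions made purely for illustration.

# Minimal sketch (not the paper's algorithm): fuse a laser range scan with a
# segmented image by replacing range readings that disagree strongly with
# their segment's median, a crude stand-in for noise removal on reflective
# sloped surfaces. Shapes, thresholds, and test data are illustrative only.

import numpy as np

def fuse_range_with_segments(range_map, segment_labels, outlier_factor=3.0):
    """Return (fused_range, noise_mask).

    range_map      : 2D float array of per-pixel laser range readings,
                     assumed already registered to the image grid.
    segment_labels : 2D int array of segment labels from image segmentation.
    outlier_factor : readings farther than outlier_factor * MAD from the
                     segment median are treated as noise and replaced.
    """
    fused = range_map.copy()
    noise_mask = np.zeros(range_map.shape, dtype=bool)

    for label in np.unique(segment_labels):
        in_segment = segment_labels == label
        readings = range_map[in_segment]

        median = np.median(readings)
        mad = np.median(np.abs(readings - median)) + 1e-6  # robust spread

        outliers = np.abs(range_map - median) > outlier_factor * mad
        bad = in_segment & outliers

        noise_mask |= bad
        fused[bad] = median  # fall back to the segment consensus

    return fused, noise_mask

if __name__ == "__main__":
    # Synthetic example: two segments with a few corrupted range readings.
    rng = np.random.default_rng(0)
    labels = np.zeros((32, 32), dtype=int)
    labels[:, 16:] = 1                      # left / right segments
    ranges = np.where(labels == 0, 2.0, 5.0) + rng.normal(0, 0.02, (32, 32))
    ranges[3, 3] = 40.0                     # simulated reflection spike
    ranges[10, 20] = 0.1                    # simulated dropout

    fused, noise = fuse_range_with_segments(ranges, labels)
    print("noisy readings detected:", int(noise.sum()))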
Keywords :
image representation; image segmentation; noise; sensor fusion; 3D environments; 3D perceived shape representation; 3D perceived space representations; autonomous robot; complex manufacturing environment; fusion process; human expertise; image data; inspection machine; laser data; laser range data; modeling environment; segmented image data; sloped surfaces; visual navigation device; Acoustic reflection; Humans; Image segmentation; Inspection; Laser fusion; Laser noise; Optical reflection; Robots; Shape measurement; Working environment noise;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the Ninth IEEE International Conference on Tools with Artificial Intelligence, 1997
Conference_Location :
Newport Beach, CA
ISSN :
1082-3409
Print_ISBN :
0-8186-8203-5
Type :
conf
DOI :
10.1109/TAI.1997.632236
Filename :
632236