DocumentCode
3764404
Title
2D LiDAR and camera fusion in 3D modeling of indoor environment
Author
Juan Li;Xiang He;Jia Li
Author_Institution
Department of Electrical and Computer Engineering, Oakland University, Rochester, MI 48309, U.S.A.
fYear
2015
fDate
6/1/2015
Firstpage
379
Lastpage
383
Abstract
Detailed 3D modeling of indoor scenes has become an important topic in many research fields. It can provide extensive information about the environment and support various location-based services, such as interactive gaming and indoor navigation. This paper presents an indoor scene construction approach using a 2D line-scan LiDAR and an entry-level digital camera. Both devices are mounted rigidly on a robotic servo, which sweeps vertically to cover the third dimension. Fiducial-target-based extrinsic calibration is applied to acquire the transformation matrix between the LiDAR and the camera. Based on this transformation matrix, we perform registration to fuse the color images from the camera with the 3D point cloud from the LiDAR. The whole system costs much less than setups using a 3D LiDAR and an omnidirectional camera. Using pre-calculated transformation matrices in registration, instead of feature-extraction techniques such as SIFT or SURF, yields better fusion results and lower computational complexity. Experiments carried out in an office-building environment show promising results for our approach.
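The registration step the abstract describes can be sketched as a standard pinhole projection: each LiDAR point is mapped into the camera frame with the pre-calibrated extrinsic transform, projected onto the image plane with the intrinsic matrix, and colored by the pixel it lands on. The function below is a minimal illustrative sketch, not the paper's implementation; the matrix names (`T_cam_lidar`, `K`) and all example values are assumptions.

```python
import numpy as np

def colorize_points(points_lidar, T_cam_lidar, K, image):
    """Assign an RGB color to each LiDAR point by projecting it into the image.

    points_lidar : (N, 3) points in the LiDAR frame
    T_cam_lidar  : (4, 4) extrinsic transform, LiDAR frame -> camera frame
    K            : (3, 3) camera intrinsic matrix
    image        : (H, W, 3) color image

    Returns an (N, 3) array of colors; points behind the camera or
    projecting outside the image get NaN.
    """
    n = points_lidar.shape[0]
    homog = np.hstack([points_lidar, np.ones((n, 1))])    # (N, 4) homogeneous
    pts_cam = (T_cam_lidar @ homog.T).T[:, :3]            # points in camera frame
    colors = np.full((n, 3), np.nan)
    in_front = pts_cam[:, 2] > 0                          # keep points ahead of camera
    uvw = (K @ pts_cam[in_front].T).T                     # perspective projection
    uv = (uvw[:, :2] / uvw[:, 2:3]).round().astype(int)   # pixel coordinates (u, v)
    h, w = image.shape[:2]
    ok = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    idx = np.flatnonzero(in_front)[ok]
    colors[idx] = image[uv[ok, 1], uv[ok, 0]]             # sample image at (v, u)
    return colors
```

Because the transform is fixed by calibration, this lookup is a few matrix products per scan, which is the source of the lower computational cost compared with feature-matching approaches such as SIFT or SURF.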
Keywords
"Three-dimensional displays","Laser radar","Cameras","Calibration","Robot sensing systems","Servomotors","Solid modeling"
Publisher
ieee
Conference_Titel
2015 National Aerospace and Electronics Conference (NAECON)
Electronic_ISBN
2379-2027
Type
conf
DOI
10.1109/NAECON.2015.7443100
Filename
7443100
Link To Document