Title :
A Method for Judicious Fusion of Inconsistent Multiple Sensor Data
Author :
Kumar, Manish ; Garg, Devendra P. ; Zachery, Randy A.
Author_Institution :
Dept. of Mech. Eng. & Mater. Sci., Duke Univ., Durham, NC
Date :
5/1/2007
Abstract :
One of the major problems in sensor fusion is that sensors frequently provide spurious observations that are difficult to predict and model. Spurious measurements must be identified and eliminated, since their incorporation in the fusion pool may lead to inaccurate estimation. This paper presents a unified sensor fusion strategy based on a modified Bayesian approach that automatically identifies inconsistency in sensor measurements so that spurious measurements can be eliminated from the data fusion process. The proposed method adds a term to the commonly used Bayesian formulation: an estimate of the probability that the data are not spurious, conditioned on the measured data and the unknown value of the true state. When two measurements are fused, this term has the effect of increasing the variance of the posterior distribution when the measurement from one sensor is inconsistent with respect to the other. The increase or decrease in variance can be quantified using the information-theoretic measure of entropy. The proposed strategy was verified through extensive computations on simulated data from three sensors. A comparison was made between two fusion schemes: centralized fusion, in which data obtained from all sensors were fused simultaneously, and a decentralized or sequential Bayesian scheme, which proved useful for identifying and eliminating spurious data from the fusion process. The simulations verified that the proposed strategy was able to identify spurious sensor measurements and eliminate them from the fusion process, thus leading to a better overall estimate of the true state. The proposed strategy was also validated with experiments performed using stereo vision cameras, an infrared proximity sensor, and a laser proximity sensor; the information from these three sensing sources was fused to obtain an occupancy profile of the robotic workspace.
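The abstract does not give the exact form of the added term, so the following is only a minimal sketch of the general idea for two scalar Gaussian measurements: standard inverse-variance fusion, a consistency weight standing in for the probability that the data are not spurious, posterior-variance inflation when the sensors disagree, and an entropy check of the result. The function names, the Mahalanobis-style weight, and the sigma_gate parameter are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def fuse_two(z1, var1, z2, var2, sigma_gate=3.0):
    """Sketch: fuse two scalar Gaussian measurements, inflating the
    posterior variance when they are mutually inconsistent. The weight
    w is a stand-in for the paper's added Bayesian term (probability
    that the data are not spurious)."""
    # Standard Bayesian (inverse-variance) fusion of consistent data
    var_fused = 1.0 / (1.0 / var1 + 1.0 / var2)
    z_fused = var_fused * (z1 / var1 + z2 / var2)

    # Mahalanobis-style consistency measure between the two measurements
    d2 = (z1 - z2) ** 2 / (var1 + var2)
    w = np.exp(-0.5 * d2 / sigma_gate**2)  # ~1 when consistent, -> 0 when not

    # Inflate the posterior variance when the measurements disagree,
    # mimicking the reported effect of the added term
    var_post = var_fused / max(w, 1e-12)
    return z_fused, var_post

def gaussian_entropy(var):
    """Differential entropy of a 1-D Gaussian, used here to compare
    how much information the fused estimate actually carries."""
    return 0.5 * np.log(2.0 * np.pi * np.e * var)

# Consistent pair: low posterior variance and low entropy
z, v = fuse_two(1.00, 0.04, 1.05, 0.04)
print(z, v, gaussian_entropy(v))

# Inconsistent pair: inflated variance and higher entropy flag
# one of the measurements as potentially spurious
z, v = fuse_two(1.00, 0.04, 3.00, 0.04)
print(z, v, gaussian_entropy(v))
```

In a sequential (decentralized) scheme of the kind compared in the paper, a fusion step of this sort would be applied measurement by measurement, and an entropy increase at a given step could be used as the signal to drop that measurement from the pool.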
Keywords :
Bayes methods; entropy; sensor fusion; centralized data fusion; decentralized Bayesian scheme; inconsistent multiple sensor data; infrared proximity sensor; laser proximity sensor; modified Bayesian approach; sensor data fusion process; sensor measurements; sequential Bayesian scheme; stereo vision cameras; unified sensor fusion strategy; Bayesian methods; Cameras; Computational modeling; Entropy; Infrared sensors; Predictive models; Robot vision systems; Sensor fusion; State estimation; Stereo vision; Bayesian approach; decentralized fusion; sensor fusion; sequential fusion; spurious data;
Journal_Title :
IEEE Sensors Journal
DOI :
10.1109/JSEN.2007.894905