Title :
Towards autonomous image fusion
Author :
Hossny, M. ; Nahavandi, S. ; Creighton, D. ; Bhatti, A.
Author_Institution :
Centre for Intell. Syst. Res., Deakin Univ., Geelong, VIC, Australia
Abstract :
Mobile robots provide great assistance in hazardous environments such as nuclear cores, battlefields, natural disasters, and even at the nano-level of human cells. These robots are usually equipped with a wide variety of sensors in order to collect data and guide their navigation. Whether a single robot operates all sensors or a swarm of cooperating robots each operates its own specialised sensors, the captured data can be too large to transfer under the limited resources (e.g. bandwidth, battery, processing, and response time) available in hazardous environments. Therefore, local computations have to be carried out on board the swarming robots to assess the worthiness of captured data and the capacity of fused information in a given spatial dimension, as well as to select a proper combination of fusion algorithms and metrics. This paper introduces the concepts of Type-I and Type-II fusion errors, fusion capacity, and fusion worthiness. Together, these concepts form the ladder leading to autonomous fusion systems.
Keywords :
image fusion; mobile robots; robot vision; sensors; autonomous image fusion; battlefields; fusion capacity; fusion worthiness; hazardous environments; human cells; natural disasters; nuclear cores; response time; spatial dimension; swarming robots; Type-I fusion errors; Type-II fusion errors; Entropy; Histograms; Measurement; Pixel; Robots; Autonomous; Fusion Capacity; Type I; Type II;
Conference_Titel :
2010 11th International Conference on Control, Automation, Robotics & Vision (ICARCV)
Conference_Location :
Singapore
Print_ISBN :
978-1-4244-7814-9
DOI :
10.1109/ICARCV.2010.5707343