DocumentCode :
740079
Title :
Challenges and Opportunities of Multimodality and Data Fusion in Remote Sensing
Author :
Dalla Mura, M. ; Prasad, S. ; Pacifici, F. ; Gamba, P. ; Chanussot, J. ; Benediktsson, J.A.
Author_Institution :
GIPSA-Lab., Univ. Grenoble Alpes, Grenoble, France
Volume :
103
Issue :
9
fYear :
2015
Firstpage :
1585
Lastpage :
1601
Abstract :
Remote sensing is one of the most common ways to extract relevant information about Earth and our environment. Remote sensing acquisitions can be performed by both active (synthetic aperture radar, LiDAR) and passive (optical and thermal range, multispectral and hyperspectral) devices. Depending on the sensor, a variety of information about the Earth's surface can be obtained. The data acquired by these sensors can provide information about the structure (optical, synthetic aperture radar), elevation (LiDAR), and material content (multispectral and hyperspectral) of the objects in the image. Considered together, their complementarity can be helpful for characterizing land use (urban analysis, precision agriculture) and detecting damage (e.g., in natural disasters such as floods, hurricanes, earthquakes, and oil spills at sea), and can give insight into the potential exploitation of resources (oil fields, minerals). In addition, repeated acquisitions of a scene at different times allow one to monitor natural resources and environmental variables (vegetation phenology, snow cover), anthropogenic effects (urban sprawl, deforestation), and climate change (desertification, coastal erosion), among others. In this paper, we sketch the current opportunities and challenges related to the exploitation of multimodal data for Earth observation. This is done by leveraging the outcomes of the Data Fusion Contests organized by the IEEE Geoscience and Remote Sensing Society since 2006. We report on the outcomes of these contests, presenting the multimodal data sets made available to the community each year, the targeted applications, and an analysis of the submitted methods and results: How was multimodality considered and integrated in the processing chain? What improvements and new opportunities did the fusion offer? What objectives were addressed, and what solutions were reported? And from this, what will be the next challenges?
Keywords :
data acquisition; geophysical image processing; hyperspectral imaging; image fusion; optical radar; remote sensing by laser beam; remote sensing by radar; synthetic aperture radar; Earth information extraction; Earth observation; Earth surface; IEEE Geoscience and Remote Sensing Society; LiDAR; anthropological effect; climate change; coastal erosion; damage detection; data fusion; deforestation; desertification; earthquake; environmental information extraction; environmental variable; flood; hurricane; hyperspectral imaging; land use characterization; method analysis; mineral exploitation; multimodal data exploitation; multispectral imaging; natural disaster; natural resource monitoring; oil field exploitation; oil spill; optical radar; optical range; passive devices; precision agriculture; remote sensing acquisition; remote sensing multimodality; resource exploitation; snow cover; synthetic aperture radar; thermal range; urban analysis; urban sprawl; vegetation phenology; Data integration; Laser radar; Multimodal sensors; Optical sensors; Remote sensing; Spatial resolution; Synthetic aperture radar; Change detection (CD); classification; data fusion (DF); pansharpening; remote sensing;
fLanguage :
English
Journal_Title :
Proceedings of the IEEE
Publisher :
IEEE
ISSN :
0018-9219
Type :
jour
DOI :
10.1109/JPROC.2015.2462751
Filename :
7194740