A method of image data fusion is proposed that applies a stationary, i.e. redundant, wavelet decomposition to merge multispectral and panchromatic images. Pairs of subbands with corresponding frequency content are locally merged when a correlation condition is satisfied, and the fused image is produced by taking the inverse transform. A local correlation coefficient between the low-frequency subband of the high-resolution image and the corresponding subband of the up-sampled multispectral image guides the fusion process. The experimental results show that the method incorporates contextual spatial information and reduces possible over-enhancement of the multispectral data in image regions with high spectral content, such as urban areas and other highly textured regions.
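The idea of piloting detail injection with a local correlation coefficient can be sketched as follows. This is a simplified illustration, not the authors' implementation: a single-level redundant decomposition (lowpass by moving average, highpass as the residual) stands in for the stationary wavelet transform, and the window size `size` and threshold `thr` are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_corr(a, b, size=7):
    # Windowed correlation coefficient between two images.
    ma, mb = uniform_filter(a, size), uniform_filter(b, size)
    va = uniform_filter(a * a, size) - ma * ma
    vb = uniform_filter(b * b, size) - mb * mb
    cov = uniform_filter(a * b, size) - ma * mb
    return cov / np.sqrt(np.maximum(va * vb, 1e-12))

def fuse(ms_up, pan, size=7, thr=0.5):
    # Single-level redundant decomposition: lowpass = moving average,
    # highpass = residual (a stand-in for the paper's stationary wavelet).
    low_ms, low_pan = uniform_filter(ms_up, size), uniform_filter(pan, size)
    high_ms, high_pan = ms_up - low_ms, pan - low_pan
    # The local correlation between low-frequency bands pilots the fusion:
    # panchromatic detail is injected only where the bands agree locally,
    # which limits over-enhancement in highly textured (e.g. urban) regions.
    rho = local_corr(low_pan, low_ms, size)
    fused_high = np.where(rho > thr, high_pan, high_ms)
    return low_ms + fused_high
```

Here `ms_up` is one band of the multispectral image up-sampled to the panchromatic grid; the same procedure would be applied band by band.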
Product record not validated
Product record under review by the validation staff
|Title:||Context-driven image fusion of multispectral and panchromatic data based on a redundant wavelet representation|
|Citation:||Garzelli, A., & Soldati, F. (2001). Context-driven image fusion of multispectral and panchromatic data based on a redundant wavelet representation. In Proc. IEEE/ISPRS Joint Workshop on Remote Sensing and Data Fusion over Urban Areas (pp. 122-126). IEEE.|
|Appears in collections:||4.1 Conference proceedings contribution|
Files in this product: