Remote Sensing Image Fusion Based on Optimized Dictionary Learning
doi: 10.11999/JEIT180263 cstr: 32379.14.JEIT180263
-
1. College of Data Science, Taiyuan University of Technology, Taiyuan 030024, China
2. College of Electrical and Power Engineering, Taiyuan University of Technology, Taiyuan 030024, China
3. College of Information and Computer, Taiyuan University of Technology, Taiyuan 030024, China
-
Abstract: To improve the fusion quality of panchromatic and multispectral images, a remote sensing image fusion method based on optimized dictionary learning is proposed. First, image blocks from a standard image database are taken as training samples and clustered with K-means; clusters that are large and highly self-similar are moderately pruned to reduce the number of training samples and improve training efficiency. The pruned samples are then used to train a universal dictionary, in which similar dictionary atoms and rarely used dictionary atoms are marked. Next, the marked atoms are replaced by normalized panchromatic image blocks that differ most from the original sparse model, yielding an adaptive dictionary. The adaptive dictionary is used to sparsely represent both the intensity component obtained from the IHS transform of the multispectral image and the source panchromatic image. For each image block, the modulus-maximum entry of its sparse coefficient vector is separated out as the maximal sparse coefficient, and the remaining entries are called residual sparse coefficients. The two parts are fused with different rules so as to preserve more spectral information and spatial detail, and the inverse IHS transform finally produces the fused image. Experimental results show that, compared with traditional methods, the proposed method yields fused images with better subjective visual quality and superior objective evaluation metrics.
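The sample-pruning step can be pictured with the following NumPy sketch: training patches are clustered by a small k-means loop, and any cluster larger than a size cap is randomly thinned, discarding surplus near-duplicate patches. The cluster count, cap, and random thinning criterion are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def prune_training_patches(patches, k=8, max_per_cluster=200, iters=20, seed=0):
    """Cluster image patches with k-means and cap each cluster's size,
    discarding surplus highly similar patches to shrink the training set.
    Returns the sorted indices of the patches that are kept."""
    rng = np.random.default_rng(seed)
    X = patches.reshape(len(patches), -1).astype(float)
    # initialize centroids from randomly chosen patches
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each patch to its nearest centroid (squared Euclidean distance)
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centers[j] = members.mean(0)
    keep = []
    for j in range(k):
        idx = np.flatnonzero(labels == j)
        if len(idx) > max_per_cluster:
            # oversized cluster: keep a random subset, drop near-duplicates
            idx = rng.choice(idx, size=max_per_cluster, replace=False)
        keep.extend(idx.tolist())
    return np.sort(np.array(keep))
```

The retained patches would then be passed to K-SVD (or a similar learner) to obtain the universal dictionary.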
-
Key words:
- Remote sensing image fusion /
- K-means clustering /
- Adaptive dictionary /
- Sparse representation /
- Fusion rule
-
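The coefficient-splitting and fusion step described in the abstract can be sketched as follows. The specific rules used here, choose-max for the modulus-maximum coefficients and l1-activity-weighted averaging for the residual coefficients, are plausible stand-ins, since the paper's exact fusion rules are not given in this excerpt.

```python
import numpy as np

def split_coeffs(alpha):
    """Separate the modulus-maximum entry of a sparse coefficient vector
    from the remaining (residual) entries."""
    peak = np.zeros_like(alpha)
    rest = alpha.copy()
    i = np.argmax(np.abs(alpha))
    peak[i] = alpha[i]
    rest[i] = 0.0
    return peak, rest

def fuse_block(alpha_i, alpha_p):
    """Fuse the sparse codes of an intensity block and a panchromatic block.
    Peak and residual coefficients get different rules, as in the abstract."""
    pi, ri = split_coeffs(alpha_i)
    pp, rp = split_coeffs(alpha_p)
    # peak coefficients: choose-max keeps the strongest structure (spatial detail)
    peak = pi if np.abs(pi).max() >= np.abs(pp).max() else pp
    # residual coefficients: activity-weighted average preserves spectral content
    wi, wp = np.abs(ri).sum(), np.abs(rp).sum()
    w = wi / (wi + wp) if wi + wp > 0 else 0.5
    rest = w * ri + (1 - w) * rp
    return peak + rest
```

Reconstructing each fused block as the adaptive dictionary times the fused code, then applying the inverse IHS transform, would complete the pipeline.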
Table 3  Dictionary training error at different sampling rates

| Algorithm | 10% | 30% | 50% | 70% | 90% |
|---|---|---|---|---|---|
| K-SVD    | 0.636 | 0.614 | 0.448 | 0.022 | 0.022 |
| Proposed | 0.036 | 0.022 | 0.021 | 0.019 | 0.018 |
Table 2  Dictionary training time at different sampling rates (s)

| Algorithm | 10% | 30% | 50% | 70% | 90% |
|---|---|---|---|---|---|
| K-SVD    | 579.8  | 1280.5 | 1923.3 | 3235.4 | 6577.2 |
| Proposed | 1562.4 | 2281.7 | 2936.4 | 4240.6 | 7578.9 |
Table 4  Performance comparison of fusion results on the Road image

| Fusion method | ERGAS | RASE | SAM | UIQI |
|---|---|---|---|---|
| IHS      | 9.7446 | 24.5692 | 1.1642 | 0.3469 |
| NIHS     | 5.1226 | 33.7954 | 0.3466 | 0.8387 |
| NSST     | 6.6566 | 16.6560 | 1.0277 | 0.7262 |
| SR       | 8.2873 | 28.3787 | 1.1011 | 0.5979 |
| Proposed | 5.0543 | 12.9461 | 0.6602 | 0.7925 |
Table 5  Performance comparison of fusion results on the City image

| Fusion method | ERGAS | RASE | SAM | UIQI |
|---|---|---|---|---|
| IHS      | 13.1711 | 34.7595 | 2.7537 | 0.5705 |
| NIHS     | 6.2005  | 16.9826 | 1.5020 | 0.8981 |
| NSST     | 8.5364  | 23.4545 | 2.2660 | 0.8229 |
| SR       | 9.1183  | 46.6352 | 1.7720 | 0.7655 |
| Proposed | 6.0461  | 18.8073 | 1.4300 | 0.8673 |
Table 6  Performance comparison of fusion results on the River image

| Fusion method | ERGAS | RASE | SAM | UIQI |
|---|---|---|---|---|
| IHS      | 8.1003 | 14.5947 | 4.9814 | 0.7037 |
| NIHS     | 4.3391 | 12.7904 | 1.1056 | 0.9197 |
| NSST     | 5.6170 | 17.5402 | 3.1750 | 0.8637 |
| SR       | 9.3164 | 75.8740 | 4.6461 | 0.8020 |
| Proposed | 4.2291 | 9.6131  | 2.9010 | 0.9307 |
Table 7  Performance comparison of fusion results on the Fighter image

| Fusion method | ERGAS | RASE | SAM | UIQI |
|---|---|---|---|---|
| IHS      | 1.4788 | 4.2170 | 0.2381 | 0.9830 |
| NIHS     | 1.4199 | 4.0977 | 0.2392 | 0.9850 |
| NSST     | 1.5954 | 6.0820 | 0.3091 | 0.9834 |
| SR       | 2.3366 | 7.0861 | 0.3709 | 0.9508 |
| Proposed | 0.9564 | 2.7291 | 0.1570 | 0.9929 |
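The tables score fusion quality with ERGAS, RASE, and SAM (lower is better) and UIQI (closer to 1 is better). As a reference point, ERGAS can be computed as in the sketch below; the 1/4 pan-to-multispectral resolution ratio is a typical default, assumed here rather than taken from this excerpt.

```python
import numpy as np

def ergas(fused, reference, ratio=0.25):
    """ERGAS (relative dimensionless global error in synthesis); lower is better.
    `ratio` is the pan-to-multispectral pixel-size ratio (commonly 1/4).
    Images are (H, W, bands) arrays; the last axis indexes spectral bands."""
    fused = fused.astype(float)
    reference = reference.astype(float)
    terms = []
    for b in range(reference.shape[-1]):
        # per-band RMSE normalized by the band's mean radiance
        rmse = np.sqrt(np.mean((fused[..., b] - reference[..., b]) ** 2))
        terms.append((rmse / reference[..., b].mean()) ** 2)
    return 100.0 * ratio * np.sqrt(np.mean(terms))
```

A perfect fusion (fused equal to the reference) gives ERGAS of 0, which is why smaller values in the tables indicate better spectral fidelity.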