Research News
Paper by doctoral student Jiang Menghui and Shen Huanfeng published in IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING
Published: 2022-09-02 15:31:24     Posted by: Yi Zhen

Title: Deep-Learning-Based Spatio-Temporal-Spectral Integrated Fusion of Heterogeneous Remote Sensing Images

Authors: Jiang, MH (Jiang, Menghui); Shen, HF (Shen, Huanfeng); Li, J (Li, Jie)

Source publication: IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, Volume: 60, Article number: 5410915, DOI: 10.1109/TGRS.2022.3188998, Year: 2022

Abstract: It is a challenging task to integrate the spatial, temporal, and spectral information of multisource remote sensing images, especially in the case of heterogeneous images. To this end, for the first time, this article proposes a heterogeneous integrated framework based on a novel deep residual cycle generative adversarial network (GAN). The proposed network consists of a forward fusion part and a backward degeneration feedback part. The forward part generates the desired fusion result from the various observations; the backward degeneration feedback part considers the imaging degradation process and regenerates the observations inversely from the fusion result. The heterogeneous integrated fusion framework supported by the proposed network can simultaneously merge the complementary spatial, temporal, and spectral information of multisource heterogeneous observations to achieve heterogeneous spatiospectral fusion, spatiotemporal fusion, and heterogeneous spatiotemporal-spectral fusion. Furthermore, the proposed heterogeneous integrated fusion framework can be leveraged to relieve the two bottlenecks of land-cover change and thick cloud cover. Thus, the inapparent and unobserved variation trends of surface features, which are caused by low-resolution imaging and cloud contamination, can be detected and reconstructed well. Images from several different remote sensing satellites, i.e., the Moderate Resolution Imaging Spectroradiometer (MODIS), Landsat 8, Sentinel-1, and Sentinel-2, were utilized in the experiments conducted in this study, and both the qualitative and quantitative evaluations confirmed the effectiveness of the proposed image fusion method.
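The forward-fusion / backward-degradation cycle described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' network: the GAN generators are replaced by toy deterministic operators, and all function names (`forward_fuse`, `degrade_spatial`, `cycle_loss`) are hypothetical. The point is only the cycle-consistency idea: degrade the fused result back through an imaging model and compare it with the original coarse observation.

```python
import numpy as np

rng = np.random.default_rng(0)

def degrade_spatial(hr):
    """Toy imaging-degradation model: 2x2 block averaging to a coarse grid."""
    h, w = hr.shape
    return hr.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample(lr):
    """Nearest-neighbour upsampling back to the fine grid."""
    return np.kron(lr, np.ones((2, 2)))

def forward_fuse(obs_fine, obs_coarse):
    """Toy stand-in for the forward (generator) part: inject the coarse
    observation's low-frequency content into the fine observation."""
    return obs_fine + upsample(obs_coarse - degrade_spatial(obs_fine))

def cycle_loss(fused, obs_coarse):
    """Backward feedback part: regenerate the coarse observation from the
    fused result via the degradation model and measure the L1 mismatch."""
    return np.abs(degrade_spatial(fused) - obs_coarse).mean()

# Synthetic observations: one fine-resolution, one coarse-resolution image.
truth = rng.normal(size=(8, 8))
obs_fine = truth + 0.05 * rng.normal(size=(8, 8))
obs_coarse = degrade_spatial(truth)

fused = forward_fuse(obs_fine, obs_coarse)
print(cycle_loss(fused, obs_coarse))  # near zero: the cycle is consistent
```

In the paper the forward and backward mappings are learned GAN generators rather than fixed operators, and the cycle term is one loss among several; the sketch only shows why feeding the fusion result back through a degradation model constrains it to stay consistent with the observations.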

Author keywords: Generators; Remote sensing; Spatial resolution; Generative adversarial networks; Feature extraction; Image fusion; Optical sensors; Deep residual cycle generative adversarial network (GAN); heterogeneous integrated framework; land-cover change; thick cloud cover

Addresses: [Jiang, Menghui; Shen, Huanfeng] Wuhan Univ, Sch Resource & Environm Sci, Wuhan 430079, Peoples R China.

[Shen, Huanfeng] Wuhan Univ, Collaborat Innovat Ctr Geospatial Technol, Wuhan 430079, Peoples R China.

[Li, Jie] Wuhan Univ, Sch Geodesy & Geomat, Wuhan 430079, Peoples R China.

Corresponding author address: Shen, HF (corresponding author), Wuhan Univ, Sch Resource & Environm Sci, Wuhan 430079, Peoples R China.

Shen, HF (corresponding author), Wuhan Univ, Collaborat Innovat Ctr Geospatial Technol, Wuhan 430079, Peoples R China.

E-mail addresses: jiangmenghui@whu.edu.cn; shenhf@whu.edu.cn; aaronleecool@whu.edu.cn

Impact factor: 8.125
