Research Achievements

Paper by Shen Huanfeng published in IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING

Posted: 2021-03-05 08:36:59     Posted by: Yi Zhen

Title: SAR Image Despeckling Employing a Recursive Deep CNN Prior

Authors: Shen, HF (Shen, Huanfeng); Zhou, CX (Zhou, Chenxia); Li, J (Li, Jie); Yuan, QQ (Yuan, Qiangqiang)

Source: IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, Volume 59, Issue 1, Pages 273-286. DOI: 10.1109/TGRS.2020.2993319. Published: JAN 2021

Abstract: Synthetic aperture radar (SAR) images are inherently affected by speckle noise, for which deep learning-based methods have shown good potential. However, the deep learning-based methods proposed to date directly map low-quality images to high-quality images, and they are unable to characterize the priors for all kinds of speckled images. The variational method is a classic model optimization approach that establishes the relationship between the clean and noisy images from the perspective of a probability distribution. Therefore, in this article, we propose the recursive deep convolutional neural network (CNN) prior model for SAR image despeckling (SAR-RDCP). First, the data-fitting term and regularization term of the SAR variational model are decoupled into two subproblems, i.e., a data-fitting block and a deep CNN prior block. The gradient descent algorithm is then used to solve the data-fitting block, and a predenoising residual channel attention network based on dilated convolution is used for the deep CNN prior block, combined with end-to-end iterative optimization training. In the experiments undertaken in this study, the proposed model was compared with several state-of-the-art despeckling methods, obtaining better results in both the quantitative and qualitative evaluations.
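The decoupling described in the abstract follows the general plug-and-play pattern: alternate a gradient step on the data-fitting term with a learned denoising step. The sketch below illustrates that iteration structure only; the quadratic data term and the box-blur "prior" are placeholder assumptions (the paper's actual method uses a SAR-specific variational data term and a trained residual channel attention network), and the function names are hypothetical.

```python
import numpy as np

def box_blur(x, k=3):
    """Placeholder for the deep CNN prior block: a simple local mean filter."""
    pad = k // 2
    xp = np.pad(x, pad, mode="edge")
    out = np.zeros_like(x)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = xp[i:i + k, j:j + k].mean()
    return out

def despeckle(y, n_iter=5, step=0.5):
    """Alternate a gradient-descent step on a (placeholder) quadratic
    data-fitting term ||x - y||^2 with a prior/denoising step, mirroring
    the two decoupled subproblems described in the abstract."""
    x = y.copy()
    for _ in range(n_iter):
        x = x - step * (x - y)   # data-fitting block: one gradient step
        x = box_blur(x)          # deep CNN prior block (placeholder denoiser)
    return x
```

In the actual SAR-RDCP model the two blocks are unrolled for a fixed number of recursions and the CNN weights are trained end-to-end through the whole iteration, rather than being a fixed filter as in this sketch.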

Accession Number: WOS:000603079000022

Language: English

Document Type: Article

Author Keywords: Synthetic aperture radar; Machine learning; Optimization; Speckle; Image restoration; Learning systems; Transforms; Convolutional neural network (CNN); despeckling gain (DG) loss; residual channel attention; synthetic aperture radar (SAR) image despeckling; variation

Addresses: [Shen, Huanfeng; Zhou, Chenxia] Wuhan Univ, Sch Resource & Environm Sci, Wuhan 430079, Peoples R China.

[Shen, Huanfeng; Yuan, Qiangqiang] Wuhan Univ, Collaborat Innovat Ctr Geospatial Technol, Wuhan 430079, Peoples R China.

[Li, Jie] Wuhan Univ, Sch Geodesy & Geomat, Wuhan 430079, Peoples R China.

[Yuan, Qiangqiang] Sch Geodesy & Geomat, Wuhan 430079, Peoples R China.

Corresponding Author Address: Li, J (corresponding author), Wuhan Univ, Sch Geodesy & Geomat, Wuhan 430079, Peoples R China.

E-mail Addresses: shenhf@whu.edu.cn; zhoucx31@whu.edu.cn; aaronleecool@whu.edu.cn; yqiang86@gmail.com

Impact Factor: 5.855

