A Bayesian Approach to Active Self-Paced Deep Learning for SAR Automatic Target Recognition / Ren, Haohao; Yu, Xuelian; Bruzzone, Lorenzo; Zhang, Yukun; Zou, Lin; Wang, Xuegang. - In: IEEE GEOSCIENCE AND REMOTE SENSING LETTERS. - ISSN 1545-598X. - 19:(2022), pp. 1-5. [10.1109/lgrs.2020.3036585]

A Bayesian Approach to Active Self-Paced Deep Learning for SAR Automatic Target Recognition

Bruzzone, Lorenzo
2022-01-01

Abstract

Deep learning has attracted intensive attention in synthetic aperture radar (SAR) automatic target recognition (ATR). Usually, a considerable number of labeled samples is necessary to train a deep model with good generalization capability. However, the process of sample labeling is time-consuming and costly. This letter proposes an active self-paced deep learning (ASPDL) approach to SAR ATR. In a nutshell, we first introduce Bayesian inference into the optimization of the deep model parameters, aiming to learn a robust classification model when only a limited number of labeled samples is available. Next, a cost-effective sample selection strategy is presented to iteratively and actively select informative samples from a pool of unlabeled samples for labeling. Concretely, high-confidence samples are selected in a self-paced learning (SPL) manner and automatically pseudo-labeled with the current classification model, whereas low-confidence samples are chosen through an active learning strategy and manually labeled. Finally, we update the model parameters by minimizing a dual-loss function on a new training set constructed by combining the newly labeled samples with the original ones. Experiments on the moving and stationary target acquisition and recognition (MSTAR) benchmark data demonstrate that the proposed method achieves better classification accuracy with relatively few labeled samples than several state-of-the-art methods.
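
As an illustration only (this is not the authors' code; the confidence threshold, entropy criterion, and query budget below are hypothetical choices), the sample-selection step described in the abstract could look roughly as follows in Python: high-confidence pool samples are pseudo-labeled in a self-paced fashion with the current model, while the most uncertain samples are forwarded to a human annotator; both would then be merged with the existing training set before the dual-loss update.

import numpy as np

def select_samples(probs, high_conf=0.95, query_budget=10):
    """probs: (N, C) softmax outputs of the current model on the unlabeled pool."""
    confidence = probs.max(axis=1)                           # top-class probability
    entropy = -(probs * np.log(probs + 1e-12)).sum(axis=1)   # predictive uncertainty

    # Self-paced step: trust the model on easy, high-confidence samples
    # and pseudo-label them with its current predictions.
    pseudo_idx = np.where(confidence >= high_conf)[0]
    pseudo_labels = probs[pseudo_idx].argmax(axis=1)

    # Active-learning step: send the most uncertain samples to a human oracle.
    query_idx = np.argsort(-entropy)[:query_budget]
    return pseudo_idx, pseudo_labels, query_idx

# Toy usage with synthetic "predictions" for a 10-class problem (e.g., MSTAR).
rng = np.random.default_rng(0)
logits = 4.0 * rng.normal(size=(100, 10))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
pseudo_idx, pseudo_labels, query_idx = select_samples(probs)
print(len(pseudo_idx), "pseudo-labeled,", len(query_idx), "queried for manual labels")

The actual ASPDL method additionally relies on Bayesian inference over the network parameters when deciding which samples to trust; this sketch only mirrors the overall control flow of the selection loop.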
2022
Ren, Haohao; Yu, Xuelian; Bruzzone, Lorenzo; Zhang, Yukun; Zou, Lin; Wang, Xuegang
Files in this product:

File: A_Bayesian_Approach_to_Active_Self-Paced_Deep_Learning_for_SAR_Automatic_Target_Recognition.pdf
Access: Archive managers only
Type: Publisher's version (publisher's layout)
License: All rights reserved
Size: 1.59 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/401516
Citations
  • PMC: ND
  • Scopus: 13
  • Web of Science (ISI): 7