
Novel Class Discovery in Semantic Segmentation

Zhong, Z.; Sebe, N.
2022-01-01

Abstract

We introduce a new setting of Novel Class Discovery in Semantic Segmentation (NCDSS), which aims at segmenting unlabeled images containing new classes given prior knowledge from a labeled set of disjoint classes. In contrast to existing approaches that look at novel class discovery in image classification, we focus on the more challenging semantic segmentation. In NCDSS, we need to distinguish the objects and background, and to handle the existence of multiple classes within an image, which increases the difficulty in using the unlabeled data. To tackle this new setting, we leverage the labeled base data and a saliency model to coarsely cluster novel classes for model training in our basic framework. Additionally, we propose the Entropy-based Uncertainty Modeling and Self-training (EUMS) framework to overcome noisy pseudo-labels, further improving the model performance on the novel classes. Our EUMS utilizes an entropy ranking technique and a dynamic reassignment to distill clean labels, thereby making full use of the noisy data via self-supervised learning. We build the NCDSS benchmark on the PASCAL-5i and COCO-20i datasets. Extensive experiments demonstrate the feasibility of the basic framework (achieving an average mIoU of 49.81% on PASCAL-5i) and the effectiveness of the EUMS framework (outperforming the basic framework by 9.28% mIoU on PASCAL-5i).
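The entropy-ranking step described above can be pictured with a minimal sketch, assuming a segmentation model that outputs per-pixel softmax probability maps for the unlabeled images. The function names (prediction_entropy, split_by_entropy) and the clean/noisy split ratio are illustrative assumptions and are not taken from the paper's released code.

import numpy as np

def prediction_entropy(prob_map):
    """Mean per-pixel entropy of a softmax probability map of shape (C, H, W)."""
    eps = 1e-12
    pixel_entropy = -np.sum(prob_map * np.log(prob_map + eps), axis=0)  # (H, W)
    return float(pixel_entropy.mean())

def split_by_entropy(prob_maps, clean_ratio=0.5):
    """Rank images by prediction entropy and split them into a low-entropy
    (treated as clean pseudo-labels) and a high-entropy (noisy) subset."""
    entropies = np.array([prediction_entropy(p) for p in prob_maps])
    order = np.argsort(entropies)            # ascending: most confident images first
    n_clean = int(len(order) * clean_ratio)  # illustrative fixed split ratio
    return order[:n_clean], order[n_clean:]

In this sketch, the low-entropy subset would supervise the model with its pseudo-labels, while the high-entropy subset would be exploited through self-supervised objectives, in the spirit of the abstract's description of EUMS.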
2022
Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
New York
IEEE Computer Society
978-1-6654-6946-3
Zhao, Y.; Zhong, Z.; Sebe, N.; Lee, G. H.
Novel Class Discovery in Semantic Segmentation / Zhao, Y.; Zhong, Z.; Sebe, N.; Lee, G. H. - 2022:(2022), pp. 4330-4339. (Paper presented at the 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2022, held in New Orleans, LA, USA, 18-24 June 2022) [10.1109/CVPR52688.2022.00430].
Files in this record:

File: Zhao_Novel_Class_Discovery_in_Semantic_Segmentation_CVPR_2022_paper.pdf
Access: open access
Type: Publisher's version (Publisher's layout)
License: All rights reserved
Size: 2.34 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/361269
Citations
  • PMC: ND
  • Scopus: 14
  • Web of Science (ISI): 2