
Deep micro-dictionary learning and coding network

H. Tang; H. Wei; W. Xiao; W. Wang; D. Xu; Y. Yan; N. Sebe
2019

Abstract

In this paper, we propose a novel Deep Micro-Dictionary Learning and Coding Network (DDLCN). DDLCN has most of the standard deep learning layers (pooling, fully-connected, input/output, etc.), but the main difference is that the fundamental convolutional layers are replaced by novel compound dictionary learning and coding layers. The dictionary learning layer learns an over-complete dictionary for the input training data. At the deep coding layer, a locality constraint is added to guarantee that the activated dictionary bases are close to each other. Next, the activated dictionary atoms are assembled and passed to the next compound dictionary learning and coding layer. In this way, the activated atoms in the first layer can be represented by the deeper atoms in the second dictionary. Intuitively, the second dictionary is designed to learn the fine-grained components that are shared among the input dictionary atoms, yielding a more informative and discriminative low-level representation of the dictionary atoms. We empirically compare the proposed DDLCN with several dictionary learning methods and deep learning architectures. The experimental results on four popular benchmark datasets demonstrate that the proposed DDLCN achieves competitive results compared with state-of-the-art approaches.
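The locality-constrained coding step described in the abstract resembles locality-constrained linear coding: each input is reconstructed from only its k nearest dictionary atoms, with coefficients summing to one. A minimal sketch of that idea (function names, the regularization term, and parameters are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def locality_constrained_code(x, D, k=5):
    """Encode x over the k nearest atoms of dictionary D.

    x: (d,) input vector; D: (K, d) dictionary of K atoms.
    Returns a (K,)-sparse code whose nonzero weights sum to one.
    """
    # Locality constraint: keep only the k atoms closest to x.
    dists = np.linalg.norm(D - x, axis=1)
    idx = np.argsort(dists)[:k]
    B = D[idx]                      # (k, d) local base

    # Solve the shift-invariant least-squares system with a
    # sum-to-one constraint (small ridge term for stability;
    # the 1e-6 factor is an assumed value).
    z = B - x                       # center atoms on the input
    C = z @ z.T                     # (k, k) local covariance
    C += np.eye(k) * 1e-6 * np.trace(C)
    w = np.linalg.solve(C, np.ones(k))
    w /= w.sum()                    # enforce sum-to-one

    code = np.zeros(D.shape[0])
    code[idx] = w
    return code
```

In a two-layer setting like the one the abstract describes, the activated atoms selected here could themselves be re-encoded over a second, deeper dictionary in the same way.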
IEEE Winter Conference on Applications of Computer Vision (WACV)
New York
IEEE
978-1-7281-1975-5
Tang, H.; Wei, H.; Xiao, W.; Wang, W.; Xu, D.; Yan, Y.; Sebe, N.
Deep micro-dictionary learning and coding network / Tang, H.; Wei, H.; Xiao, W.; Wang, W.; Xu, D.; Yan, Y.; Sebe, N. - (2019), pp. 386-395. (Paper presented at the IEEE Winter Conference on Applications of Computer Vision, held in Hawaii, January 8-10, 2019) [10.1109/WACV.2019.00047].
Files in this record:
File: 08658671.pdf (access restricted to archive managers)
Type: Publisher's version (publisher's layout)
License: All rights reserved
Size: 4.87 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/250749
Citations
  • Scopus: 6
  • Web of Science: 6