Abstracting Sketches Through Simple Primitives / Alaniz, S.; Mancini, M.; Dutta, A.; Marcos, D.; Akata, Z. - 13689:(2022), pp. 396-412. (Paper presented at the 17th European Conference on Computer Vision, ECCV 2022, held in Tel Aviv, Israel, 23–27 October 2022) [10.1007/978-3-031-19818-2_23].

Abstracting Sketches Through Simple Primitives

Mancini, M.
2022-01-01

Abstract

Humans show a high level of abstraction capability in games that require quickly communicating object information: they decompose the message content into multiple parts and communicate them through an interpretable protocol. Toward equipping machines with such capabilities, we propose the Primitive-based Sketch Abstraction task, where the goal is to represent sketches using a fixed set of drawing primitives under a communication budget. To solve this task, our Primitive-Matching Network (PMN) learns interpretable abstractions of a sketch in a self-supervised manner. Specifically, PMN maps each stroke of a sketch to its most similar primitive in a given set, predicting an affine transformation that aligns the selected primitive to the target stroke. We learn this stroke-to-primitive mapping end-to-end with a distance-transform loss that is minimal when the original sketch is precisely reconstructed with the predicted primitives. Given a communication budget, our PMN abstraction empirically achieves the highest performance on sketch recognition and sketch-based image retrieval, while at the same time being highly interpretable. This opens up new possibilities for sketch analysis, such as comparing sketches by extracting the most relevant primitives that define an object category. Code is available at https://github.com/ExplainableML/sketch-primitives.
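To make the stroke-to-primitive matching described above concrete, the following is a minimal, hypothetical PyTorch sketch. It is not the authors' released implementation (see the linked repository for that): the names (chamfer_distance, AffineAligner, match_stroke), the architecture, and the use of a symmetric Chamfer distance as a stand-in for the paper's distance-transform loss are all assumptions. Like the paper's loss, the Chamfer distance is minimal when the transformed primitive overlaps the target stroke.

import torch
import torch.nn as nn

def chamfer_distance(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Symmetric Chamfer distance between point sets of shape (N, 2) and (M, 2).
    Used here as a differentiable proxy for the paper's distance-transform loss."""
    d = torch.cdist(a, b)  # (N, M) pairwise Euclidean distances
    return d.min(dim=1).values.mean() + d.min(dim=0).values.mean()

class AffineAligner(nn.Module):
    """Hypothetical regressor: predicts a 2x3 affine transform that aligns
    a primitive (resampled to n_points 2D points) to a target stroke."""
    def __init__(self, n_points: int, hidden: int = 64):
        super().__init__()
        # Stroke and primitive are each flattened to 2*n_points values.
        self.net = nn.Sequential(
            nn.Linear(4 * n_points, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 6),
        )

    def forward(self, stroke: torch.Tensor, primitive: torch.Tensor) -> torch.Tensor:
        x = torch.cat([stroke.flatten(), primitive.flatten()])
        theta = self.net(x).view(2, 3)  # affine matrix [A | t]
        # Apply the predicted affine transform to the primitive points.
        return primitive @ theta[:, :2].T + theta[:, 2]

def match_stroke(stroke, primitives, aligner):
    """Return (loss, index, warped primitive) of the best-matching primitive,
    i.e. the one whose aligned version reconstructs the stroke most precisely."""
    best = None
    for i, prim in enumerate(primitives):
        warped = aligner(stroke, prim)
        loss = chamfer_distance(warped, stroke)
        if best is None or loss < best[0]:
            best = (loss, i, warped)
    return best

In this sketch, training would minimize the reconstruction loss of the selected primitives end-to-end, so the aligner learns to warp primitives onto strokes without stroke-level supervision, mirroring the self-supervised setup the abstract describes.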
2022
Computer Vision – ECCV 2022: 17th European Conference
Cham, Switzerland
Springer Science and Business Media Deutschland GmbH
978-3-031-19817-5
978-3-031-19818-2
Alaniz, S.; Mancini, M.; Dutta, A.; Marcos, D.; Akata, Z.
Files in this item:
File: 136890392.pdf (open access)
Type: Refereed author's manuscript (post-print)
License: All rights reserved
Size: 2.1 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/400763
Citations
  • PMC: ND
  • Scopus: 2
  • Web of Science (ISI): 0