
Neuro-Symbolic Constraint Programming for Structured Prediction / Dragone, Paolo; Teso, Stefano; Passerini, Andrea. - 2986:(2021), pp. 6-14. (Paper presented at the NeSy workshop, held virtually, 25-27 October 2021).

Neuro-Symbolic Constraint Programming for Structured Prediction

Dragone, Paolo; Teso, Stefano; Passerini, Andrea
2021-01-01

Abstract

We propose Nester, a method for injecting neural networks into constrained structured predictors. Nester first uses a neural network to compute an initial prediction that may or may not satisfy the constraints, and then applies a constraint-based structured predictor to refine the raw predictions according to hard and soft constraints. Nester combines the advantages of its two components: the network can learn complex representations from low-level data, while the constraint program on top reasons about the high-level properties and requirements of the prediction task. An empirical evaluation on handwritten equation recognition shows that Nester achieves better performance than either component in isolation, especially when training examples are scarce, while scaling to more complex problems than other neuro-programming approaches. Nester proves especially useful for reducing errors at the semantic level of the problem, which is particularly challenging for neural network architectures.
2021
NeSy’21: 15th International Workshop on Neural-Symbolic Learning and Reasoning
online
CEUR Workshop Proceedings
Dragone, Paolo; Teso, Stefano; Passerini, Andrea
Files in this item:

deep_pyconstruct___nesy21.pdf
  Access: open access
  Type: Publisher's version (Publisher's layout)
  License: Creative Commons
  Size: 839.23 kB
  Format: Adobe PDF (View/Open)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/364929
Citations
  • Scopus: 4
  • Web of Science: 0