Betz, P; Niepert, M; Minervini, P; Stuckenschmidt, H; (2021) Backpropagating through Markov Logic Networks. In: Proceedings of the 15th International Workshop on Neural-Symbolic Learning and Reasoning as part of the 1st International Joint Conference on Learning & Reasoning (IJCLR 2021). (pp. 67-81). CEUR: Virtual.
Abstract
We integrate Markov Logic Networks with deep learning architectures operating on high-dimensional and noisy feature inputs. Instead of relaxing the discrete components into smooth functions, we propose an approach that allows us to backpropagate through standard statistical relational learning components using perturbation-based differentiation. The resulting hybrid models are shown to outperform models relying solely on deep learning-based function fitting. We find that using noise perturbations is required to allow the proposed hybrid models to robustly learn from the training data.
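The record does not include code, but the general idea of treating a discrete inference step as a black box and estimating its gradient with noise perturbations can be illustrated with a short sketch. The snippet below is a minimal illustration only, assuming a PyTorch setting and a toy stand-in for MLN MAP inference; the names `toy_map_solver` and `PerturbedMAP` are hypothetical, and this is not the authors' implementation.

```python
# Sketch: perturbation-based differentiation through a discrete component,
# using a Gaussian-smoothing gradient estimator with antithetic noise.
# The MLN MAP inference step is replaced by a toy per-variable solver.
import torch


def toy_map_solver(scores: torch.Tensor) -> torch.Tensor:
    """Toy stand-in for MLN MAP inference: independent binary variables,
    each set to 1 iff its score is positive."""
    return (scores > 0).float()


class PerturbedMAP(torch.autograd.Function):
    """Discrete MAP in the forward pass; perturbation-based gradient
    estimate in the backward pass."""

    @staticmethod
    def forward(ctx, scores, noise_scale, n_samples):
        ctx.save_for_backward(scores)
        ctx.noise_scale = noise_scale
        ctx.n_samples = n_samples
        return toy_map_solver(scores)

    @staticmethod
    def backward(ctx, grad_output):
        (scores,) = ctx.saved_tensors
        sigma, n = ctx.noise_scale, ctx.n_samples
        grads = torch.zeros_like(scores)
        for _ in range(n):
            noise = sigma * torch.randn_like(scores)
            # Antithetic perturbations of the scores fed to the discrete solver.
            y_plus = toy_map_solver(scores + noise)
            y_minus = toy_map_solver(scores - noise)
            grads += (y_plus - y_minus) * noise / (2.0 * sigma ** 2)
        grads /= n
        # No gradients for the two hyperparameters.
        return grad_output * grads, None, None


if __name__ == "__main__":
    # Scores would normally come from a neural network over noisy features.
    scores = torch.randn(5, requires_grad=True)
    y = PerturbedMAP.apply(scores, 1.0, 10)  # discrete output, still usable in backprop
    loss = ((y - torch.ones_like(y)) ** 2).sum()
    loss.backward()
    print("MAP state:", y)
    print("estimated gradient:", scores.grad)
```

The estimator smooths the discrete solver with Gaussian noise, which is one way to realize the role of noise perturbations mentioned in the abstract; replacing `toy_map_solver` with an actual statistical relational inference routine would reflect the hybrid setup the abstract describes.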
| Type: | Proceedings paper |
| --- | --- |
| Title: | Backpropagating through Markov Logic Networks |
| Event: | NeSy 2021: 15th International Workshop on Neural-Symbolic Learning and Reasoning as part of the 1st International Joint Conference on Learning & Reasoning (IJCLR 2021) |
| Open access status: | An open access version is available from UCL Discovery |
| Publisher version: | http://ceur-ws.org/Vol-2986/ |
| Language: | English |
| Additional information: | © 2021 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0). |
| Keywords: | Machine Learning, Reasoning, Markov Logic, Discrete-continuous learning |
| UCL classification: | UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Engineering Science > Dept of Computer Science |
| URI: | https://discovery.ucl.ac.uk/id/eprint/10137934 |