AKBC 2021

Neural Concept Formation in Knowledge Graphs

Agnieszka Dobrowolska, antonio vergari, Pasquale Minervini

Keywords: knowledge graphs, link prediction, concepts

TLDR: We learn novel concepts in KGs, use them to augment the KG, and train more accurate neural link prediction models.


Abstract: In this work, we investigate how to learn novel concepts in Knowledge Graphs (KGs) in a principled way, and how to effectively exploit them to produce more accurate neural link prediction models. Specifically, we show how concept membership relationships learned via unsupervised clustering of entities can be reified and used to augment a KG. In a thorough set of experiments, we confirm that neural link predictors trained on these augmented KGs, or in a joint Expectation-Maximization iterative scheme, can generalize better and produce more accurate predictions for infrequent relationships. For instance, our method yields relative improvements of up to 8.6% MRR on WN18RR for rare predicates, and up to 82% in small-data regimes, where the model has access to just a small subset of the training triples. Furthermore, our proposed models are able to learn meaningful concepts.
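To make the idea in the abstract concrete, below is a minimal sketch (not the authors' implementation) of the concept-reification step: entities are clustered in an unsupervised way in embedding space, each cluster is reified as a new concept entity, and membership triples are added to the KG before training a neural link predictor. The relation name "memberOf", the concept identifiers "concept_k", and the helper function name are hypothetical labels chosen for illustration.

```python
# Sketch of the KG-augmentation idea from the abstract, under stated assumptions:
# cluster entity embeddings into concepts, reify concept membership as extra
# triples, and hand the augmented KG to whatever link predictor you train next.

import numpy as np
from sklearn.cluster import KMeans


def augment_kg_with_concepts(triples, entity_embeddings, entity_ids,
                             n_concepts=50, seed=0):
    """Return the original triples plus (entity, memberOf, concept_k) triples."""
    # Unsupervised clustering of entities in embedding space.
    km = KMeans(n_clusters=n_concepts, random_state=seed, n_init=10)
    cluster_ids = km.fit_predict(entity_embeddings)

    # Reify each cluster as a new concept entity and add membership triples.
    concept_triples = [
        (entity_ids[i], "memberOf", f"concept_{c}")
        for i, c in enumerate(cluster_ids)
    ]
    return list(triples) + concept_triples


if __name__ == "__main__":
    # Toy example: random vectors stand in for embeddings learned by a link predictor.
    rng = np.random.default_rng(0)
    entity_ids = [f"e{i}" for i in range(100)]
    embeddings = rng.normal(size=(100, 16))
    triples = [("e0", "hypernym", "e1"), ("e2", "part_of", "e3")]

    augmented = augment_kg_with_concepts(triples, embeddings, entity_ids, n_concepts=5)
    print(len(augmented), "triples after augmentation")  # original + one membership triple per entity
```

In the joint Expectation-Maximization variant described in the abstract, a step like this would alternate with link-predictor training: the clustering is refreshed from the current entity embeddings, and the predictor is then retrained on the re-augmented KG.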



© 2021 AKBC Organization Committee