Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

Inducing relational knowledge from BERT

Bouraoui, Zied, Camacho Collados, Jose and Schockaert, Steven 2019. Inducing relational knowledge from BERT. Presented at: Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20), New York, NY, USA, 7-12 February 2020.

Full text: PDF - Accepted Post-Print Version (186 kB)

Abstract

One of the most remarkable properties of word embeddings is that they capture certain types of semantic and syntactic relationships. Recently, pre-trained language models such as BERT have achieved groundbreaking results across a wide range of Natural Language Processing tasks. However, it is unclear to what extent such models capture relational knowledge beyond what is already captured by standard word embeddings. To explore this question, we propose a methodology for distilling relational knowledge from a pre-trained language model. Starting from a few seed instances of a given relation, we first use a large text corpus to find sentences that are likely to express this relation. We then use a subset of these extracted sentences as templates. Finally, we fine-tune a language model to predict whether a given word pair is likely to be an instance of some relation, when given an instantiated template for that relation as input.
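To make the final step of this pipeline concrete, the sketch below shows one plausible way to fine-tune BERT as a binary relation classifier, as the abstract describes. This is not the authors' code: the template sentence, the toy word pairs, and all hyperparameters are illustrative assumptions; in the paper, templates are mined automatically from a large corpus starting from seed pairs.

```python
# Minimal sketch (assumptions, not the authors' implementation):
# instantiate a mined relation template with candidate word pairs and
# fine-tune BERT to classify whether each pair instantiates the relation.
import torch
from transformers import BertTokenizerFast, BertForSequenceClassification

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Hypothetical template for a "capital of" relation; the paper extracts
# such sentences automatically from a corpus using seed instances.
template = "{head} is the capital city of {tail}."

# Toy training pairs: (head, tail, label), label 1 = relation holds.
pairs = [("Paris", "France", 1), ("Tokyo", "Japan", 1), ("Paris", "Japan", 0)]

texts = [template.format(head=h, tail=t) for h, t, _ in pairs]
labels = torch.tensor([y for _, _, y in pairs])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few illustrative training steps
    out = model(**batch, labels=labels)
    out.loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# At inference, instantiate the template with an unseen pair and take
# the argmax over the two classes to decide whether the relation holds.
model.eval()
with torch.no_grad():
    test = tokenizer(
        [template.format(head="Berlin", tail="Germany")], return_tensors="pt"
    )
    pred = model(**test).logits.argmax(dim=-1).item()
```

In practice, the method would train one such classifier per relation over many mined templates and negative pairs; the single hard-coded template here only illustrates the input format the abstract describes.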

Item Type: Conference or Workshop Item (Paper)
Date Type: Submission
Status: In Press
Schools: Computer Science & Informatics
Date of First Compliant Deposit: 26 March 2020
Date of Acceptance: 12 February 2020
Last Modified: 30 Apr 2020 14:00
URI: http://orca.cf.ac.uk/id/eprint/127433
