Siamese BERT Model with Adversarial Training for Relation Classification
2020
Relation classification, the task of identifying semantic relations between entities in plain text, is an important Natural Language Processing (NLP) task and one of the basic steps in constructing a knowledge graph. Most existing state-of-the-art methods are based primarily on Convolutional Neural Networks (CNN) or Long Short-Term Memory (LSTM) networks. Recently, pre-trained Bidirectional Encoder Representations from Transformers (BERT) models have been applied successfully to sequence labeling and many NLP classification tasks. Relation classification differs in that it must attend not only to the sentence but also to the entity pair. In this paper, a Siamese BERT model with Adversarial Training (SBERT-AT) is proposed for relation classification. First, the features of the entities and the sentence are extracted separately to improve the performance of relation classification. Second, adversarial training is applied to the SBERT architecture to improve robustness. Finally, experimental results on real-world datasets demonstrate significant improvement over other methods.
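The abstract does not specify how the adversarial training is implemented. A common choice for adversarial training on text models is a Fast Gradient Method (FGM) style perturbation of the embeddings, r = ε · g / ‖g‖, added before a second forward pass. The sketch below illustrates only that perturbation step with NumPy; the function name `fgm_perturbation` and the ε value are illustrative assumptions, not details from the paper.

```python
import numpy as np

def fgm_perturbation(grad, epsilon=0.5):
    """FGM-style adversarial perturbation (an assumption about the method):
    scale the loss gradient w.r.t. the embeddings to have L2 norm epsilon,
    keeping its direction. r = epsilon * g / ||g||."""
    norm = np.linalg.norm(grad)
    if norm == 0.0:
        # zero gradient: no meaningful adversarial direction
        return np.zeros_like(grad)
    return epsilon * grad / norm

# toy gradient of the loss w.r.t. one embedding vector
g = np.array([3.0, 4.0])        # ||g|| = 5
r = fgm_perturbation(g, epsilon=0.5)
# r points along g and has norm 0.5, i.e. r = [0.3, 0.4]
```

In a full training loop, the perturbed embeddings (embedding + r) would be fed through the model a second time and the adversarial loss added to the clean loss before the parameter update.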