Improving Relation Extraction via Joint Coding Using BiLSTM and DCNN.

2021 
Neural network methods based on distant supervision have been widely used in studies of relation extraction; however, a traditional convolutional neural network cannot effectively capture the dependency relationships and structured information between words in a sentence. To address this problem, we propose a novel approach to improve relation extraction. Specifically, we first apply a BiLSTM-based model to encode sentences; the resulting feature vectors are then fed into a one-dimensional dilated convolutional neural network to extract the relation. Finally, a sentence-level attention mechanism is used to reduce the noise caused by the mislabeling problem of distant supervision. Our approach has been evaluated on the real-world dataset NYT10 and compared with a wide range of baselines. Experimental results show that (1) our approach improves the performance of distantly supervised neural relation extraction, and (2) the proposed approach achieves outstanding results on the dataset.
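The abstract describes a three-stage pipeline: a BiLSTM sentence encoder, a one-dimensional dilated convolution over the encoded sequence, and sentence-level attention over a bag of sentences to suppress mislabeled instances. Below is a minimal PyTorch sketch of such a pipeline under typical assumptions; the class and parameter names (BiLSTMDilatedCNNExtractor, rel_query, conv_channels, num_relations, the dilation rate) are illustrative choices and not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiLSTMDilatedCNNExtractor(nn.Module):
    """Illustrative sketch: BiLSTM encoder -> 1-D dilated CNN ->
    sentence-level (selective) attention over a distant-supervision bag."""

    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128,
                 conv_channels=230, num_relations=53, dilation=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # BiLSTM encoder: models dependencies between words in the sentence
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # One-dimensional dilated convolution over the BiLSTM features
        self.dconv = nn.Conv1d(2 * hidden_dim, conv_channels, kernel_size=3,
                               dilation=dilation, padding=dilation)
        # Query vector for sentence-level attention (assumed form)
        self.rel_query = nn.Parameter(torch.randn(conv_channels))
        self.classifier = nn.Linear(conv_channels, num_relations)

    def encode_sentence(self, token_ids):
        # token_ids: (num_sentences, seq_len)
        emb = self.embedding(token_ids)             # (S, L, E)
        h, _ = self.bilstm(emb)                     # (S, L, 2H)
        c = self.dconv(h.transpose(1, 2))           # (S, C, L)
        s, _ = torch.max(torch.tanh(c), dim=2)      # max-pool over time -> (S, C)
        return s

    def forward(self, bag_token_ids):
        # bag_token_ids: all sentences for one entity pair, shape (S, L)
        s = self.encode_sentence(bag_token_ids)         # (S, C)
        # Sentence-level attention down-weights likely mislabeled sentences
        scores = s @ self.rel_query                     # (S,)
        alpha = F.softmax(scores, dim=0)                # (S,)
        bag_repr = (alpha.unsqueeze(1) * s).sum(dim=0)  # (C,)
        return self.classifier(bag_repr)                # (num_relations,)


# Toy usage: a bag of 4 sentences of length 20 over a small vocabulary.
model = BiLSTMDilatedCNNExtractor(vocab_size=5000)
bag = torch.randint(1, 5000, (4, 20))
logits = model(bag)
print(logits.shape)  # torch.Size([53])
```

In this sketch the dilated convolution keeps the sequence length unchanged (padding equals the dilation rate for a kernel of size 3), so the max-pooled sentence vector aggregates features over the full sentence before attention is applied at the bag level.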