A BERT-based One-Pass Multi-Task Model for Clinical Temporal Relation Extraction.
2020
Recently, BERT has achieved state-of-the-art performance in temporal relation extraction from clinical electronic medical record (EMR) text. However, the current approach is inefficient, as it requires multiple passes through each input sequence. We extend a recently proposed one-pass model for relation classification to a one-pass model for relation extraction. We augment this framework by introducing global embeddings to help with long-distance relation inference, and by using multi-task learning to increase model performance and generalizability. Our proposed model produces results on par with the state of the art in temporal relation extraction on the THYME corpus and is much “greener” in computational cost.
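The efficiency gain described above comes from encoding each input sequence once and scoring every candidate entity pair from the shared token representations, rather than re-encoding the sequence per pair. The following is a minimal sketch of that one-pass idea only, not the paper's actual architecture: the encoder is a random NumPy stand-in for BERT, and the entity positions, hidden size, and relation labels are all hypothetical.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a BERT encoder: one call returns one
# hidden vector per token. We count calls to show the cost savings.
ENCODER_CALLS = 0
def encode(tokens):
    global ENCODER_CALLS
    ENCODER_CALLS += 1
    return rng.normal(size=(len(tokens), 8))  # (seq_len, hidden_size)

def one_pass_extract(tokens, entity_positions, W):
    """Score all candidate entity pairs from a single encoder pass."""
    H = encode(tokens)  # one pass shared by every candidate pair
    scores = {}
    for i, j in itertools.combinations(entity_positions, 2):
        pair_rep = np.concatenate([H[i], H[j]])  # head/tail token reps
        scores[(i, j)] = pair_rep @ W            # linear relation scorer
    return scores

tokens = "the scan showed a lesion before treatment began".split()
entities = [1, 4, 6]                  # hypothetical event/time token indices
W = rng.normal(size=(16, 3))          # 3 hypothetical relation labels

scores = one_pass_extract(tokens, entities, W)
print(ENCODER_CALLS, len(scores))     # 1 encoder pass, 3 candidate pairs
```

A per-pair baseline would instead call the encoder once for each of the three candidate pairs, so the saving grows quadratically with the number of annotated events and times in a sequence.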