Symmetric Regularization based BERT for Pair-wise Semantic Reasoning
2020
The ability to perform semantic reasoning over sentence pairs is essential for many natural language understanding tasks, e.g., natural language inference and machine reading comprehension. A recent significant improvement in these tasks comes from BERT. As reported, the next sentence prediction (NSP) objective in BERT is of great significance for downstream problems with sentence-pair input. Despite its effectiveness, NSP still lacks the essential signal to distinguish between entailment and shallow correlation. To remedy this, we propose to augment the NSP task into a multi-class categorization task that includes previous sentence prediction (PSP). This task encourages the model to learn subtle semantics, thereby improving its ability of semantic understanding. Furthermore, by using a smoothing technique, the scopes of NSP and PSP are expanded to a broader range that includes close but non-successive sentences. This simple method yields remarkable improvement over vanilla BERT. Our method consistently improves performance on NLI and MRC benchmarks by a large margin, including the challenging HANS dataset.
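The following is a minimal sketch of how such an augmented pair-sampling scheme could be set up for pre-training; the three-way label set, window size, and sampling ratios are illustrative assumptions, not details taken from the paper.

    # Sketch of building pair-wise training labels for an NSP/PSP-style
    # multi-class sentence-prediction task. The exact label scheme, sampling
    # ratios, and "smoothing" window below are assumptions for illustration.
    import random
    from typing import List, Tuple

    LABELS = {"next": 0, "previous": 1, "random": 2}  # hypothetical 3-way label set
    WINDOW = 2  # assumed window: close but non-successive sentences also count

    def make_pair(doc: List[str], corpus: List[List[str]], i: int) -> Tuple[str, str, int]:
        """Return (sentence_a, sentence_b, label) for the anchor sentence doc[i]."""
        anchor = doc[i]
        choice = random.random()
        if choice < 1 / 3 and i + 1 < len(doc):
            # NSP-style positive: a sentence within WINDOW positions after the anchor
            j = min(i + random.randint(1, WINDOW), len(doc) - 1)
            return anchor, doc[j], LABELS["next"]
        if choice < 2 / 3 and i - 1 >= 0:
            # PSP-style positive: a sentence within WINDOW positions before the anchor
            j = max(i - random.randint(1, WINDOW), 0)
            return anchor, doc[j], LABELS["previous"]
        # Negative: a sentence drawn from a different document
        other = random.choice([d for d in corpus if d is not doc])
        return anchor, random.choice(other), LABELS["random"]

The resulting labels would feed a multi-class classification head over the [CLS] representation, in place of BERT's binary NSP head.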