Upgrading CRFs to JRFs and its Benefits to Sequence Modeling and Labeling
2020
Two important sequence tasks are sequence modeling and sequence labeling. Sequence modeling involves determining the probabilities of sequences, e.g. language modeling. It remains difficult to improve language modeling with additional relevant tags, e.g. part-of-speech (POS) tags. For sequence labeling, it is worthwhile to explore task-dependent semi-supervised learning to leverage a mix of labeled and unlabeled data, beyond pre-training. In this paper, we propose to upgrade conditional random fields (CRFs) into a joint generative model of observation and label sequences, called joint random fields (JRFs). Specifically, we use the potential function of the original CRF to define the joint distribution over observation and label sequences. This development from CRFs to JRFs benefits both modeling and labeling of sequence data, as shown in our experiments. For example, the JRF model (using POS tags) outperforms traditional language models and avoids the need to produce hypothesized labels with a standalone POS tagger. For sequence labeling, task-dependent semi-supervised learning by JRFs consistently outperforms the CRF baseline and self-training on POS tagging, chunking and NER.
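To illustrate the construction described in the abstract, the following is a minimal sketch (not taken from the paper itself) of how a CRF's potential function can be reused to define a joint model. Here $\phi_\theta(x, y)$ denotes the CRF potential over an observation sequence $x$ and a label sequence $y$; the specific parameterization is an assumption for illustration.

$$
\text{CRF (conditional):}\quad
p_\theta(y \mid x) \;=\; \frac{\exp\!\big(\phi_\theta(x, y)\big)}{\sum_{y'} \exp\!\big(\phi_\theta(x, y')\big)}
$$

$$
\text{JRF (joint):}\quad
p_\theta(x, y) \;=\; \frac{\exp\!\big(\phi_\theta(x, y)\big)}{Z(\theta)},
\qquad
Z(\theta) \;=\; \sum_{x'} \sum_{y'} \exp\!\big(\phi_\theta(x', y')\big)
$$

Under this sketch, the same potential $\phi_\theta$ that scores label sequences in the CRF now defines an unnormalized joint density, so the model can assign probabilities to observation sequences (via $p_\theta(x) = \sum_y p_\theta(x, y)$, useful for language modeling) and can be trained on unlabeled data by marginalizing over $y$, which is what enables the task-dependent semi-supervised learning mentioned above.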