NEWS Article Summarization with Pretrained Transformer

2021 
Pretrained language models have shown tremendous improvement in many NLP applications, including text summarization. The Text-to-Text Transfer Transformer (T5) and Bidirectional Encoder Representations from Transformers (BERT) are among the most recent pretrained language models and are widely applied in NLP research. In this paper we show how T5 and BERT can be applied to the text summarization task and used as tools for both abstractive and extractive summary generation. Our hypothesis is that T5 outperforms BART and a transformer developed from scratch. To test this hypothesis, we used our dataset containing more than 80K news articles and their summaries. This dataset has been evaluated using BART, the Text-to-Text Transfer Transformer (T5), a model obtained through transfer learning over T5, and an encoder-decoder based model developed from scratch. The results show that T5 gives better results than the other three models used for testing.
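
The abstract describes applying a pretrained T5 model to abstractive news summarization. The snippet below is a minimal sketch of how such a model can be used for that task, assuming the Hugging Face Transformers library and a public t5-base checkpoint; the checkpoint name, task prefix, and generation settings are illustrative assumptions, not the authors' exact setup.

```python
# Minimal sketch (not the authors' code): abstractive summarization with a
# pretrained T5 checkpoint via the Hugging Face Transformers library.
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_name = "t5-base"  # assumption: any public T5 checkpoint could be used
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

article = "..."  # a news article to summarize

# T5 expects a task prefix; "summarize: " selects its summarization behavior.
inputs = tokenizer(
    "summarize: " + article,
    return_tensors="pt",
    max_length=512,
    truncation=True,
)

# Beam search generation; length limits and penalties are illustrative values.
summary_ids = model.generate(
    inputs["input_ids"],
    num_beams=4,
    max_length=128,
    min_length=30,
    length_penalty=2.0,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

A transfer-learning variant, as the abstract mentions, would further fine-tune such a checkpoint on the article-summary pairs before generation.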