Abstraction Based Bengali Text Summarization Using Bi-directional Attentive Recurrent Neural Networks

2021 
Text summarization is recognized as an important problem in deep learning and natural language processing (NLP). Summarizing massive amounts of text correctly and quickly is critical, and doing so without machine assistance is both time-consuming and expensive. We set out to build a model that generates fluent, effective, human-like summaries of Bengali text. The model uses bidirectional RNNs with LSTM cells in both the encoder and decoder layers, together with an attention mechanism to improve the results. We followed the sequence-to-sequence architecture widely used in machine translation, and employed a pre-trained word-embedding file created specifically for Bengali NLP research. We aimed to keep the training loss as low as possible and to build a useful, human-like text summarizer. After the experiments, our model predicts quite meaningful and fluent summaries with a training loss of 0.007.
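The attention mechanism mentioned above can be illustrated with a minimal sketch. This is not the authors' code; it is a generic NumPy implementation of Bahdanau-style additive attention over bidirectional encoder states, the kind of scoring commonly used in attentive sequence-to-sequence summarizers. All dimensions and parameter names here are hypothetical.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def additive_attention(enc_states, dec_state, W_enc, W_dec, v):
    """Additive (Bahdanau-style) attention — an illustrative sketch.

    enc_states: (T, 2H) bidirectional encoder outputs (forward + backward)
    dec_state:  (D,)    current decoder hidden state
    W_enc: (A, 2H), W_dec: (A, D), v: (A,)  learned projections (random here)
    """
    # score_t = v . tanh(W_enc h_t + W_dec s) for each encoder step t
    scores = np.tanh(enc_states @ W_enc.T + dec_state @ W_dec.T) @ v  # (T,)
    weights = softmax(scores)        # attention distribution over the T steps
    context = weights @ enc_states   # (2H,) weighted sum of encoder states
    return weights, context

# Tiny example with random parameters (hypothetical sizes).
rng = np.random.default_rng(0)
T, H, D, A = 5, 4, 6, 8            # steps, encoder dim, decoder dim, attn dim
enc = rng.normal(size=(T, 2 * H))
dec = rng.normal(size=(D,))
weights, context = additive_attention(
    enc, dec,
    rng.normal(size=(A, 2 * H)),
    rng.normal(size=(A, D)),
    rng.normal(size=(A,)),
)
print(weights.sum(), context.shape)  # weights sum to 1; context is (2H,)
```

At each decoding step the decoder would concatenate this context vector with its own state before predicting the next summary token; the weights indicate which input positions the model attends to.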