Generating Summaries Using a Sequence-to-Sequence Model

2020 
With the advent of the information-explosion era, automatic text summarization has attracted growing interest in natural language processing research. Generating text summaries still faces several problems. To address vanishing gradients, this paper exploits the strength of LSTM in modeling long-distance dependencies in sequence data, allowing the model to attend to more informative content. To further improve performance, an attention mechanism is introduced to address insufficient semantic understanding and disfluent output. The seq2seq method based on LSTM and an attention mechanism designed in this paper handles these problems in summary generation and achieves better summarization quality, and it can be applied to text-extraction tasks.
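The attention mechanism the abstract refers to can be sketched as follows. This is a minimal illustration, not the paper's implementation: dot-product scoring is an assumption (the paper does not specify its scoring function), and all names are hypothetical. At each decoding step, the decoder state scores every encoder hidden state, the scores are normalized with softmax, and the weighted sum of encoder states forms a context vector.

```python
import math

def softmax(scores):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attention_context(encoder_states, decoder_state):
    """Dot-product attention (illustrative choice): weight each encoder
    hidden state by its relevance to the current decoder state, then
    blend them into a single context vector."""
    scores = [dot(h, decoder_state) for h in encoder_states]
    weights = softmax(scores)                     # sum to 1 over source positions
    hidden = len(decoder_state)
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states))
               for i in range(hidden)]
    return weights, context

# Toy example: 3 source positions, hidden size 2.
enc = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
dec = [1.0, 0.0]
weights, context = attention_context(enc, dec)
```

In a full seq2seq model, the context vector is concatenated with the decoder state before predicting the next summary token, which is how attention supplies source-side information that a fixed-length encoding would lose.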