Variational Auto-Encoder for text generation

2020 
Many different methods for text generation have been introduced in the past. The recurrent neural network language model (RNNLM) is a powerful and scalable approach to unsupervised generative modeling of text. We extend the RNNLM and propose the Variational Auto-Encoder Recurrent Neural Network Language Model (VAE-RNNLM), which is designed to explicitly capture global features of a text in a continuous latent variable. Maximum likelihood learning in such a model presents an intractable inference problem. The VAE-RNNLM circumvents this difficulty by adopting recent advances in variational inference, yielding a practical training technique for powerful neural network generative models with latent variables. In this paper, we apply the VAE-RNNLM to text generation and achieve good performance.
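The abstract does not include code; the following is a minimal sketch of the kind of architecture it describes, written in PyTorch under our own assumptions: a single-layer GRU encoder and decoder, a diagonal-Gaussian latent variable trained with the reparameterization trick, and the standard negative ELBO (reconstruction cross-entropy plus a KL term against a standard normal prior). All module names, dimensions, and hyperparameters are illustrative, not taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class VAERNNLM(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, latent_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # The encoder's final hidden state parameterizes the posterior q(z|x).
        self.to_mu = nn.Linear(hidden_dim, latent_dim)
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)
        # The sampled latent code z initializes the decoder's hidden state,
        # so z carries the "global" sentence features the abstract mentions.
        self.z_to_hidden = nn.Linear(latent_dim, hidden_dim)
        self.decoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        # tokens: LongTensor of shape (batch, seq_len), seq_len >= 2
        emb = self.embed(tokens)                       # (B, T, E)
        _, h = self.encoder(emb)                       # h: (1, B, H)
        mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
        # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I).
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        h0 = torch.tanh(self.z_to_hidden(z)).unsqueeze(0)
        # Teacher forcing: predict token t+1 from tokens up to t.
        dec_out, _ = self.decoder(emb[:, :-1], h0)
        logits = self.out(dec_out)
        recon = F.cross_entropy(
            logits.reshape(-1, logits.size(-1)),
            tokens[:, 1:].reshape(-1),
        )
        # Closed-form KL(q(z|x) || N(0, I)) for diagonal Gaussians.
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return recon + kl                              # negative ELBO

model = VAERNNLM(vocab_size=10000)
batch = torch.randint(0, 10000, (8, 20))  # dummy token IDs for illustration
loss = model(batch)
loss.backward()

In practice, VAE language models of this kind are usually trained with additional tricks such as KL-cost annealing or word dropout to keep a strong autoregressive decoder from ignoring the latent code; the sketch above omits these for brevity.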