Denoising AutoEncoder Based Delete and Generate Approach for Text Style Transfer

2021 
Text style transfer is the task of rewriting sentences in a target style while preserving their semantics as much as possible. In this work, we study a two-step text style transfer method for non-parallel datasets. In the first step, style-relevant words are detected and deleted from the sentences in the source-style corpus. In the second step, the remaining style-devoid content is fed into a natural language generation model to produce sentences in the target style. The model consists of a style encoder and a pre-trained Denoising AutoEncoder (DAE): the former extracts style features from each style corpus, while the latter reconstructs the source sentences during training and generates sentences in the target style from the given content during inference. We conduct experiments on two text sentiment transfer datasets and compare our method comprehensively with other relevant methods across several evaluation aspects. The results show that our method outperforms the others in sentence fluency and achieves a good trade-off between content preservation and style transfer intensity. Its superior performance on the Caption dataset illustrates the method's potential advantage when training data are limited.
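
The abstract does not specify how style-relevant words are detected in the first step. The sketch below illustrates one common way such a deletion step can be realized, using a frequency-ratio (salience) score computed over the two style corpora; the function names, toy corpora, and threshold are assumptions made for illustration and are not taken from the paper.

```python
from collections import Counter

def style_word_salience(source_corpus, target_corpus, smoothing=1.0):
    """Score how strongly each word is associated with the source style.

    salience(w) = (count_src(w) + s) / (count_tgt(w) + s); words that occur
    far more often in the source-style corpus than in the target-style
    corpus are treated as style-relevant (illustrative heuristic).
    """
    src_counts = Counter(w for sent in source_corpus for w in sent.split())
    tgt_counts = Counter(w for sent in target_corpus for w in sent.split())
    return {w: (src_counts[w] + smoothing) / (tgt_counts[w] + smoothing)
            for w in src_counts}

def delete_style_words(sentence, salience, threshold):
    """Step 1: remove style-relevant words, keeping the style-devoid content."""
    return " ".join(w for w in sentence.split()
                    if salience.get(w, 0.0) < threshold)

# Toy sentiment corpora (hypothetical examples, not the paper's datasets).
negative = ["the food was terrible and the staff was rude",
            "slow service and bland food"]
positive = ["the food was great and the staff was friendly",
            "great service and delicious food"]

salience = style_word_salience(negative, positive)
content = delete_style_words("the food was terrible and the staff was rude",
                             salience, threshold=1.75)
print(content)  # -> "the food was and the staff was"
```

The remaining content string would then be passed, together with a target-style embedding from the style encoder, to the pre-trained DAE, which fills in target-style wording during generation; that second step depends on the paper's model and is not reproduced here.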