Semi-Supervised Text Simplification with Back-Translation and Asymmetric Denoising Autoencoders

Authors

  • Yanbin Zhao, Shanghai Jiao Tong University
  • Lu Chen, Shanghai Jiao Tong University
  • Zhi Chen, Shanghai Jiao Tong University
  • Kai Yu, Shanghai Jiao Tong University

DOI:

https://doi.org/10.1609/aaai.v34i05.6515

Abstract

Text simplification (TS) rephrases long sentences into simplified variants while preserving their inherent semantics. Traditional sequence-to-sequence models rely heavily on the quantity and quality of parallel sentences, which limits their applicability across languages and domains. This work investigates how to leverage large amounts of unpaired corpora for the TS task. We adopt the back-translation architecture from unsupervised neural machine translation (NMT), including denoising autoencoders for language modeling and automatic generation of parallel data through iterative back-translation. However, it is non-trivial to generate appropriate complex-simple pairs if we directly treat the simple and complex corpora as two different languages, since the two types of sentences are quite similar and it is hard for the model to capture the characteristics that distinguish them. To tackle this problem, we propose asymmetric denoising methods for sentences of different complexity. When modeling simple and complex sentences with autoencoders, we introduce different types of noise into the training process. This method significantly improves simplification performance. Our model can be trained in both unsupervised and semi-supervised manners. Automatic and human evaluations show that our unsupervised model outperforms previous systems, and that with limited supervision it performs competitively with multiple state-of-the-art simplification systems.
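
The abstract does not specify which noise types are applied to each corpus. The Python sketch below illustrates one plausible asymmetric setup under stated assumptions: complex sentences receive standard word-drop and local-shuffle noise, while simple sentences additionally receive substitution noise drawn from a hypothetical simple-to-complex word table. The function names, parameters, and noise choices here are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of asymmetric denoising for two monolingual corpora.
# The drop/shuffle and substitution noise types are assumptions for
# demonstration; the paper's exact configuration is not given in the abstract.
import random
from typing import Dict, List


def drop_and_shuffle(tokens: List[str], p_drop: float = 0.1, k: int = 3) -> List[str]:
    """Common denoising-autoencoder noise: randomly drop words and
    locally shuffle the remaining ones within a window of size k."""
    kept = [t for t in tokens if random.random() > p_drop] or tokens[:1]
    keys = [i + random.uniform(0, k) for i in range(len(kept))]
    return [t for _, t in sorted(zip(keys, kept))]


def substitute(tokens: List[str], table: Dict[str, str], p_sub: float = 0.3) -> List[str]:
    """Replace words using a (hypothetical) simple-to-complex substitution
    table with probability p_sub, so the decoder learns to map back to
    simpler wording when reconstructing the original sentence."""
    return [table.get(t, t) if random.random() < p_sub else t for t in tokens]


def make_noisers(sub_table: Dict[str, str]):
    """Asymmetric setup: a different noise function per corpus, so each
    autoencoder is exposed to the characteristics of its sentence type."""
    return {
        "complex": lambda s: drop_and_shuffle(s),
        "simple": lambda s: drop_and_shuffle(substitute(s, sub_table)),
    }


if __name__ == "__main__":
    noisers = make_noisers({"use": "utilize", "help": "facilitate"})
    sentence = "we use simple words to help readers".split()
    noisy = noisers["simple"](sentence)
    # The denoising autoencoder would then be trained to reconstruct
    # `sentence` from `noisy`; back-translation supplies pseudo-parallel pairs.
    print(noisy)
```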

Published

2020-04-03

How to Cite

Zhao, Y., Chen, L., Chen, Z., & Yu, K. (2020). Semi-Supervised Text Simplification with Back-Translation and Asymmetric Denoising Autoencoders. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 9668-9675. https://doi.org/10.1609/aaai.v34i05.6515

Section

AAAI Technical Track: Natural Language Processing