Cross-lingual Machine Reading Comprehension with Language Branch Knowledge Distillation

Junhao Liu, Linjun Shou, Jian Pei, Ming Gong, Min Yang, Daxin Jiang


Abstract
Cross-lingual Machine Reading Comprehension (CLMRC) remains a challenging problem due to the lack of large-scale annotated datasets in low-resource languages, such as Arabic, Hindi, and Vietnamese. Many previous approaches use translation data, obtained by translating from a rich-resource language such as English to low-resource languages, as auxiliary supervision. However, how to effectively leverage translation data and reduce the impact of noise introduced by translation remains difficult. In this paper, we tackle this challenge and enhance cross-lingual transfer performance with a novel augmentation approach named Language Branch Machine Reading Comprehension (LBMRC). A language branch is a group of passages in one single language paired with questions in all target languages. Based on LBMRC, we train multiple machine reading comprehension (MRC) models, each proficient in a single language. Then, we devise a multilingual distillation approach that amalgamates the knowledge of the multiple language branch models into a single model for all target languages. Combining LBMRC with multilingual distillation makes the model more robust to translation noise and therefore improves its cross-lingual ability. Meanwhile, the resulting single multilingual model applies to all target languages, which saves the cost of training, inference, and maintenance compared to keeping multiple models. Extensive experiments on two CLMRC benchmarks clearly show the effectiveness of our proposed method.
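The multilingual distillation step described above, amalgamating several language-branch teachers into one student, can be illustrated with a minimal multi-teacher knowledge-distillation loss. This is a generic sketch, not the paper's exact objective: the function name, the averaging of teacher distributions, and the temperature value are assumptions for illustration.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Numerically stable softmax with temperature scaling.
    z = np.asarray(logits, dtype=float) / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def multi_teacher_kd_loss(student_logits, teacher_logits_list, temperature=2.0):
    # Hypothetical sketch: average the softened distributions of the
    # language-branch teachers, then take the soft cross-entropy with the
    # student's softened distribution. The paper's actual distillation
    # objective may weight or combine teachers differently.
    p_teacher = np.mean(
        [softmax(t, temperature) for t in teacher_logits_list], axis=0
    )
    log_p_student = np.log(softmax(student_logits, temperature))
    return float(-(p_teacher * log_p_student).sum(axis=-1).mean())
```

A student whose answer-span logits agree with the teachers incurs a lower loss than one that diverges, which is the signal used to transfer the teachers' language-specific knowledge into the single multilingual model.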
Anthology ID:
2020.coling-main.244
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
2710–2721
URL:
https://aclanthology.org/2020.coling-main.244
DOI:
10.18653/v1/2020.coling-main.244
Cite (ACL):
Junhao Liu, Linjun Shou, Jian Pei, Ming Gong, Min Yang, and Daxin Jiang. 2020. Cross-lingual Machine Reading Comprehension with Language Branch Knowledge Distillation. In Proceedings of the 28th International Conference on Computational Linguistics, pages 2710–2721, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Cross-lingual Machine Reading Comprehension with Language Branch Knowledge Distillation (Liu et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.244.pdf
Data:
SQuAD, XQuAD