◼︎Bibliographic Information
Machel Reid, Junjie Hu, Graham Neubig, and Yutaka Matsuo. "AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages". In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021.
【Authors】Machel Reid (The University of Tokyo), Junjie Hu (Carnegie Mellon University), Graham Neubig (Carnegie Mellon University), Yutaka Matsuo (The University of Tokyo)
【Title】AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages
◼︎Overview
Reproducible benchmarks are crucial in driving progress of machine translation research. However, existing machine translation benchmarks have been mostly limited to high-resource or well-represented languages. Despite an increasing interest in low-resource machine translation, there are no standardized reproducible benchmarks for many African languages, many of which are used by millions of speakers but have less digitized textual data. To tackle these challenges, we propose AfroMT, the first standardized, clean, and reproducible machine translation benchmark for eight widely spoken African languages. We also develop a suite of analysis tools for system diagnosis taking into account the unique properties of these languages. Furthermore, we explore the newly considered case of low-resource focused pretraining and develop two novel data augmentation-based strategies, leveraging word-level alignment information and pseudo-monolingual data for pretraining multilingual sequence-to-sequence models. We demonstrate significant improvements when pretraining on 11 languages, with gains of up to 2 BLEU points over strong baselines. We also show gains of up to 12 BLEU points over cross-lingual transfer baselines in data-constrained scenarios. All code and pretrained models are released as further steps towards larger reproducible benchmarks for African languages.
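To make the alignment-based augmentation strategy concrete, below is a minimal sketch of the general idea: given a bitext pair and word-level alignments (e.g. produced by an off-the-shelf aligner such as fast_align), some source words are swapped for their aligned target words, yielding code-switched input for multilingual sequence-to-sequence pretraining. The function and variable names here are illustrative assumptions, not taken from the AfroMT codebase.

```python
import random

def align_augment(src_tokens, tgt_tokens, alignments, swap_prob=0.15, seed=0):
    """Replace aligned source words with their target-side translations.

    alignments: list of (src_idx, tgt_idx) pairs, e.g. parsed from the
    Pharaoh "i-j" format that word aligners commonly emit.
    swap_prob: probability of substituting each aligned source word.
    """
    rng = random.Random(seed)
    augmented = list(src_tokens)
    for src_idx, tgt_idx in alignments:
        # Stochastically inject the aligned target word, producing
        # code-switched text for denoising-style pretraining.
        if rng.random() < swap_prob:
            augmented[src_idx] = tgt_tokens[tgt_idx]
    return augmented

# Hypothetical example: an English-Swahili pair with partial alignments.
src = "the children are reading books".split()
tgt = "watoto wanasoma vitabu".split()
links = [(1, 0), (3, 1), (4, 2)]
print(" ".join(align_augment(src, tgt, links, swap_prob=0.5)))
```

The pseudo-monolingual strategy mentioned in the abstract is complementary: rather than mixing languages within a sentence, it expands the pool of target-language text available for pretraining, which matters precisely because these languages have little digitized data.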