Research

Publications

  • A Few Thousand Translations Go a Long Way! Leveraging Pre-trained Models for African News Translation

    David Ifeoluwa Adelani, Jesujoba Oluwadara Alabi, Angela Fan, Julia Kreutzer, Xiaoyu Shen, Machel Reid, Dana Ruiter, Dietrich Klakow, Peter Nabende, Ernie Chang, Tajuddeen Gwadabe, Freshia Sackey, Bonaventure F. P. Dossou, Chris Chinenye Emezue, Colin Leong, Michael Beukman, Shamsuddeen Hassan Muhammad, Guyo Dub Jarso, Oreen Yousuf, Andre Niyongabo Rubungo, Gilles HACHEME, Eric Peter Wairagala, Muhammad Umair Nasir, Benjamin Ayoade Ajibade, Tunde Oluwaseyi Ajayi, Yvonne Wambui Gitau, Jade Abbott, Mohamed Ahmed, Millicent Ochieng, Anuoluwapo Aremu, Perez Ogayo, Jonathan Mukiibi, Fatoumata Ouoba Kabore, Godson Koffi KALIPE, Derguene Mbaye, Allahsera Auguste Tapo, Victoire Memdjokam Koagne, Edwin Munkoh-Buabeng, Valencia Wagner, Idris Abdulmumin, Ayodele Awokoya

    The 2022 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2022). July 2022. Association for Computational Linguistics.

  • PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining

    Machel Reid and Mikel Artetxe.

    7th Workshop on Representation Learning for NLP (non-archival), ACL 2022.

  • MIRU Excellent Paper Award, 25th Meeting on Image Recognition and Understanding (MIRU2022): Pixel vs. Object: A Study on Optimal Image Representations for Change Captioning

    Kento Doi, Ryuhei Hamaguchi, Yusuke Iwasawa, Masaki Onishi, Yutaka Matsuo and Ken Sakurada.

  • Best Paper Award: On the Impact of Data Augmentation on Downstream Performance in Natural Language Processing

    Itsuki Okimura, Machel Reid, Makoto Kawano and Yutaka Matsuo.

    Proceedings of the Third Workshop on Insights from Negative Results in NLP, Online and Dublin, Ireland. Association for Computational Linguistics, 2022.

  • AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages

    Machel Reid, Junjie Hu, Graham Neubig and Yutaka Matsuo

    The 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021). November 2021. Association for Computational Linguistics.

  • Subformer: Exploring Weight Sharing for Parameter Efficiency in Generative Transformers

    Machel Reid, Edison Marrese-Taylor and Yutaka Matsuo.

    Findings of the 2021 Conference on Empirical Methods in Natural Language Processing (Findings of EMNLP 2021). Association for Computational Linguistics.

  • LEWIS: Levenshtein Editing for Unsupervised Text Style Transfer

    Machel Reid and Victor Zhong

    Findings of the Association for Computational Linguistics: The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021).

  • Variational Inference for Learning Representations of Natural Language Edits

    Edison Marrese-Taylor, Machel Reid and Yutaka Matsuo.

    The Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21). February 2021.

  • DORi: Discovering Object Relationships for Moment Localization of a Natural Language Query in a Video

    Cristian Rodriguez-Opazo, Edison Marrese-Taylor, Basura Fernando, Hongdong Li and Stephen Gould

    The IEEE Winter Conference on Applications of Computer Vision (WACV). January 2021.

  • ICMU Best Poster Award: Fast Spatial Twitter Search Method Using Location Adaptive Range Query

    Keiichi Ochiai, Daisuke Torii, Yusuke Fukazawa and Yutaka Matsuo