Research

International Conferences

    • A Few Thousand Translations Go a Long Way! Leveraging Pre-trained Models for African News Translation

      David Ifeoluwa Adelani, Jesujoba Oluwadara Alabi, Angela Fan, Julia Kreutzer, Xiaoyu Shen, Machel Reid, Dana Ruiter, Dietrich Klakow, Peter Nabende, Ernie Chang, Tajuddeen Gwadabe, Freshia Sackey, Bonaventure F. P. Dossou, Chris Chinenye Emezue, Colin Leong, Michael Beukman, Shamsuddeen Hassan Muhammad, Guyo Dub Jarso, Oreen Yousuf, Andre Niyongabo Rubungo, Gilles Hacheme, Eric Peter Wairagala, Muhammad Umair Nasir, Benjamin Ayoade Ajibade, Tunde Oluwaseyi Ajayi, Yvonne Wambui Gitau, Jade Abbott, Mohamed Ahmed, Millicent Ochieng, Anuoluwapo Aremu, Perez Ogayo, Jonathan Mukiibi, Fatoumata Ouoba Kabore, Godson Koffi Kalipe, Derguene Mbaye, Allahsera Auguste Tapo, Victoire Memdjokam Koagne, Edwin Munkoh-Buabeng, Valencia Wagner, Idris Abdulmumin, Ayodele Awokoya.

      The 2022 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2022). July 2022. Association for Computational Linguistics.

    • PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining

      Machel Reid and Mikel Artetxe.

      7th Workshop on Representation Learning for NLP (non-archival), ACL 2022.

    • Generalized Decision Transformer for Offline Hindsight Information Matching

      Hiroki Furuta, Yutaka Matsuo, and Shixiang Shane Gu.

      International Conference on Learning Representations 2022 (ICLR2022, Spotlight).

    • Improving the Robustness to Variations of Objects and Instructions with a Neuro-Symbolic Approach for Interactive Instruction Following

      Kazutoshi Shinoda, Yuki Takezawa, Masahiro Suzuki, Yusuke Iwasawa, Yutaka Matsuo.

      Workshop on Novel Ideas in Learning-to-Learn through Interaction, EMNLP 2021.

    • Test-Time Classifier Adjustment Module for Model-Agnostic Domain Generalization

      Yusuke Iwasawa and Yutaka Matsuo.

      Advances in Neural Information Processing Systems 2021 (NeurIPS2021, Spotlight). December 2021.

    • Co-Adaptation of Algorithmic and Implementational Innovations in Inference-based Deep Reinforcement Learning

      Hiroki Furuta, Tadashi Kozuno, Tatsuya Matsushima, Yutaka Matsuo, and Shixiang Shane Gu.

      Advances in Neural Information Processing Systems 2021 (NeurIPS2021). December 2021.

    • AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages

      Machel Reid, Junjie Hu, Graham Neubig and Yutaka Matsuo.

      The 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021). November 2021. Association for Computational Linguistics.

    • Subformer: Exploring Weight Sharing for Parameter Efficiency in Generative Transformers

      Machel Reid, Edison Marrese-Taylor and Yutaka Matsuo.

      Findings of The 2021 Conference on Empirical Methods in Natural Language Processing (Findings of EMNLP 2021). Association for Computational Linguistics. 

    • Alignment-free Object-level Scene Change Detection using Deep Object Matching

      Kento Doi, Ryuhei Hamaguchi, Yusuke Iwasawa, Masaki Onishi, Yutaka Matsuo, Ken Sakurada.

      IEEE/RSJ International Conference on Intelligent Robots and Systems 2022 (IROS2022).

    • Policy Information Capacity: Information-Theoretic Measure for Task Complexity in Deep Reinforcement Learning

      Hiroki Furuta, Tatsuya Matsushima, Tadashi Kozuno, Yutaka Matsuo, Sergey Levine, Ofir Nachum, and Shixiang Shane Gu.

      International Conference on Machine Learning 2021 (ICML2021).