Research
Publications
-
Automatic Extraction of Neural Projection Information Using LLMs and Database Construction for BRA-Driven Development
堀口 維里優, 芦原 佑太, 山川 宏
The 33rd Annual Conference of the Japanese Neural Network Society
-
Proposing a Roadmap for Realizing Emotionally Rich Interaction with Large Language Models: Focusing on Amygdala-Based Mechanisms of Fear and Motivation
大森 隆司, 田和辻 可昌, 宮本 竜也, 芦原 佑太, 荒川 直哉, 山川 宏
The 37th Annual Conference of the Japanese Society for Artificial Intelligence (JSAI 2023)
-
DiffusER: Diffusion via Edit-based Reconstruction
Machel Reid, Vincent Josua Hellendoorn, Graham Neubig
International Conference on Learning Representations (ICLR 2023)
-
Large Language Models are Zero-Shot Reasoners
Takeshi Kojima, Shixiang Shane Gu, Machel Reid, Yutaka Matsuo, Yusuke Iwasawa
Neural Information Processing Systems (NeurIPS 2022)
-
Indexing of Financial and Economic Reports Using an LSTM Model
山本裕樹, 落合桂一, 鈴木雅大, 松尾豊
IPSJ Transactions on Digital Practices (2022)
-
On the Impact of Data Augmentation on Downstream Performance in Natural Language Processing
Itsuki Okimura, Machel Reid, Makoto Kawano and Yutaka Matsuo
Proceedings of the Third Workshop on Insights from Negative Results in NLP, Online and Dublin, Ireland. Association for Computational Linguistics. Best Paper Award
-
A Few Thousand Translations Go a Long Way! Leveraging Pre-trained Models for African News Translation
David Ifeoluwa Adelani, Jesujoba Oluwadara Alabi, Angela Fan, Julia Kreutzer, Xiaoyu Shen, Machel Reid, Dana Ruiter, Dietrich Klakow, Peter Nabende, Ernie Chang, Tajuddeen Gwadabe, Freshia Sackey, Bonaventure F. P. Dossou, Chris Chinenye Emezue, Colin Leong, Michael Beukman, Shamsuddeen Hassan Muhammad, Guyo Dub Jarso, Oreen Yousuf, Andre Niyongabo Rubungo, Gilles Hacheme, Eric Peter Wairagala, Muhammad Umair Nasir, Benjamin Ayoade Ajibade, Tunde Oluwaseyi Ajayi, Yvonne Wambui Gitau, Jade Abbott, Mohamed Ahmed, Millicent Ochieng, Anuoluwapo Aremu, Perez Ogayo, Jonathan Mukiibi, Fatoumata Ouoba Kabore, Godson Koffi Kalipe, Derguene Mbaye, Allahsera Auguste Tapo, Victoire Memdjokam Koagne, Edwin Munkoh-Buabeng, Valencia Wagner, Idris Abdulmumin, Ayodele Awokoya
The 2022 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2022). July 2022. Association for Computational Linguistics
-
PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining
Machel Reid and Mikel Artetxe
7th Workshop on Representation Learning for NLP (non-archival), ACL 2022.
-
AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages
Machel Reid, Junjie Hu, Graham Neubig and Yutaka Matsuo
The 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021). November 2021. Association for Computational Linguistics.
-
Subformer: Exploring Weight Sharing for Parameter Efficiency in Generative Transformers
Machel Reid, Edison Marrese-Taylor and Yutaka Matsuo
Findings of The 2021 Conference on Empirical Methods in Natural Language Processing (Findings of EMNLP 2021). Association for Computational Linguistics.