Research


Publications


  • The 19th Symposium of the Young Researchers' Association for NLP (YANS 2024), Sponsor Award (Hitachi, Ltd. Award): “Adverse Effects of an Increased Number of Instructions on the Instruction-Following Performance of Large Language Models”

    Takeshi Kojima

  • Slender-Mamba: Fully Quantized Mamba From Head to Toe

    Zhenxuan Yu, Takeshi Kojima, Yutaka Matsuo, Yusuke Iwasawa

    Proceedings of the 31st International Conference on Computational Linguistics (COLING 2025).

  • Geometric-Averaged Preference Optimization for Soft Preference Labels

    Hiroki Furuta, Kuang-Huei Lee, Shixiang Shane Gu, Yutaka Matsuo, Aleksandra Faust, Heiga Zen, Izzeddin Gur

    Advances in Neural Information Processing Systems 37 (NeurIPS 2024)

  • Which Programming Language and What Features at Pre-training Stage Affect Downstream Logical Inference Performance?

    Fumiya Uchiyama, Takeshi Kojima, Andrew Gambardella, Qi Cao, Yusuke Iwasawa, Yutaka Matsuo

    The 2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024)

  • Suspicion-Agent: Playing Imperfect Information Games with Theory of Mind Aware GPT-4

    Jiaxian Guo*, Bo Yang*, Paul Yoo, Yuchen Lin, Yutaka Matsuo, Yusuke Iwasawa

    AAAI RL+LLM, 2024 (Oral).

  • Decoupling Noise and Toxic Parameters for Language Model Detoxification by Task Vector Merging

    Yongmin Kim, Takeshi Kojima, Yusuke Iwasawa, Yutaka Matsuo

    First Conference on Language Modeling (COLM 2024)

  • Language Models Do Hard Arithmetic Tasks Easily and Hardly Do Easy Arithmetic Tasks

    Andrew Gambardella, Yusuke Iwasawa, Yutaka Matsuo

    Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (ACL 2024)

  • KG-Rank: Enhancing Large Language Models for Medical QA with Knowledge Graphs and Ranking Techniques

    Edison Marrese-Taylor

    Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (ACL 2024)

  • Improving Low-resource Asian Machine Translation using Bilingual Lexical Resources

    Francis Zheng, Edison Marrese-Taylor

    Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (ACL 2024)

  • On the Multilingual Ability of Decoder-based Pre-trained Language Models: Finding and Controlling Language-Specific Neurons

    Takeshi Kojima, Itsuki Okimura, Yusuke Iwasawa, Hitomi Yanaka, Yutaka Matsuo

    The Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL-HLT 2024)