Research

  Publications

    • AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages

      Machel Reid, Junjie Hu, Graham Neubig and Yutaka Matsuo

      The 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021). November 2021. Association for Computational Linguistics.

    • Subformer: Exploring Weight Sharing for Parameter Efficiency in Generative Transformers

      Machel Reid, Edison Marrese-Taylor and Yutaka Matsuo

      Findings of the 2021 Conference on Empirical Methods in Natural Language Processing (Findings of EMNLP 2021). Association for Computational Linguistics.

    • Information-theoretic regularization for learning global features by sequential VAE

      Kei Akuzawa, Yusuke Iwasawa, Yutaka Matsuo

      Machine Learning (2021)

    • The whole brain architecture approach: Accelerating the development of artificial general intelligence by referring to the brain

      Hiroshi Yamakawa

      Neural Networks (2021)

    • Bypassing combinatorial explosions in equivalence structure extraction

      Seiya Sato & Hiroshi Yamakawa

      Knowledge and Information Systems (2021)

    • Time-Sequential Variational Conditional Auto-encoders for Recommendation

      Jun Hozumi, Yusuke Iwasawa, Yutaka Matsuo

      Transactions of the Japanese Society for Artificial Intelligence, Vol. 36, No. 3 (2021)

    • Out-of-distribution Detection Using Joint Probability between Class and Geometric Transformation

      Hirono Okamoto, Masahiro Suzuki, Yutaka Matsuo

      IPSJ Journal (Journal of the Information Processing Society of Japan), Vol. 62, No. 7, pp. 1382-1392 (2021)

    • Semi-supervised Out-of-distribution Detection Using Output of Intermediate Layer in Deep Neural Networks

      Hirono Okamoto, Masahiro Suzuki, Yutaka Matsuo

      IPSJ Journal (Journal of the Information Processing Society of Japan), Vol. 62, No. 4, pp. 1142-1151 (2021)

    • Alignment-free Object-level Scene Change Detection using Deep Object Matching

      Kento Doi, Ryuhei Hamaguchi, Yusuke Iwasawa, Masaki Onishi, Yutaka Matsuo, Ken Sakurada

      IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2022)

    • Policy Information Capacity: Information-Theoretic Measure for Task Complexity in Deep Reinforcement Learning

      Hiroki Furuta, Tatsuya Matsushima, Tadashi Kozuno, Yutaka Matsuo, Sergey Levine, Ofir Nachum, and Shixiang Shane Gu

      International Conference on Machine Learning 2021 (ICML 2021).