■Bibliographic Information
Gouki Minegishi, Hiroki Furuta, Shohei Taniguchi, Yusuke Iwasawa, Yutaka Matsuo. “In-Context Meta Learning Induces Multi-Phase Circuit Emergence”. International Conference on Machine Learning (ICML).
■Overview
Transformer-based language models exhibit In-Context Learning (ICL), where predictions are made adaptively based on context. While prior work links induction heads to ICL through phase transitions, this can only account for ICL when the answer is included within the context. However, an important property of practical ICL in large language models is the ability to meta-learn how to solve tasks from context, rather than just copying answers from the context. In this paper, we experimentally clarify how such meta-learning ability is acquired by analyzing the dynamics of the model's circuit during training, extending the copy task from previous research to an In-Context Meta Learning setting. Interestingly, in this setting, we find that there are multiple phases in the learning process and that a unique circuit emerges in each phase, contrasting with the single-phase transition in induction heads. The emergence of such circuits can be related to several phenomena known in large language models, and our analysis leads to a deeper understanding of the source of the Transformer's ICL ability.
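To make the distinction from the plain copy task concrete, below is a minimal sketch (our own illustration, not the paper's released code) of what an In-Context Meta Learning version of the copy task could look like: the context holds several (input, output) pairs generated by a latent task mapping, and the model must infer that mapping to answer a query whose correct output never appears verbatim in the context. All names and sizes here (VOCAB, NUM_TASKS, make_sequence) are hypothetical choices for illustration.

import random

VOCAB = list(range(10))   # toy input tokens (illustrative size)
NUM_TASKS = 3             # number of latent tasks to meta-learn over

# Each "task" is a random bijection over the vocabulary; which task is
# active can only be identified from the in-context example pairs.
random.seed(0)
TASKS = []
for _ in range(NUM_TASKS):
    outputs = VOCAB[:]
    random.shuffle(outputs)
    TASKS.append(dict(zip(VOCAB, outputs)))

def make_sequence(num_examples: int = 4):
    """Build one sequence: k example pairs under a latent task, then a query.

    Unlike the plain copy task, the correct answer task[query] is NOT one of
    the outputs already shown in context, so it cannot be solved by copying;
    the model must apply the inferred mapping to a fresh input.
    """
    task = random.choice(TASKS)
    shown = random.sample(VOCAB, num_examples + 1)
    examples, query = shown[:-1], shown[-1]
    context = [(x, task[x]) for x in examples]
    return context, query, task[query]

context, query, answer = make_sequence()
print("context pairs:", context)   # e.g. [(6, 0), (9, 5), ...]
print("query:", query, "-> expected answer:", answer)

In the plain copy task, an induction head suffices (find the earlier occurrence of the query token and copy what followed it); in the setting sketched above, that strategy fails by construction, which is what makes it a probe for meta-learning rather than copying.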