[SRW] Second Language Acquisition of Neural Language Models
Miyu Oba, Tatsuki Kuribayashi, Hiroki Ouchi, Taro Watanabe
Student Research Workshop (SRW) Paper
Session 6: Student Research Workshop (Poster)
Conference Room: Frontenac Ballroom and Queen's Quay
Conference Time: July 12, 09:00-10:30 (EDT) (America/Toronto)
Global Time: July 12, Session 6 (13:00-14:30 UTC)
TLDR:
We train bilingual LMs in a scenario resembling human second language (L2) acquisition and find that first language (L1) pretraining accelerates linguistic generalization in the L2, that transfer configurations (e.g., the choice of L1, the presence of parallel texts) matter substantially, and that the models' L2 acquisition is not human-like in certain respects.
Abstract:
With the success of neural language models (LMs), their language acquisition has attracted much attention. Whereas previous work has explored the first language (L1) acquisition of LMs, this work sheds light on their second language (L2) acquisition. Specifically, we trained bilingual LMs under a scenario similar to human L2 acquisition and analyzed their cross-lingual transfer from linguistic perspectives.
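The page does not include the paper's code, so the following is only a minimal sketch of such a sequential L1-then-L2 training scenario. The GPT-2 architecture, the shared tokenizer, the hyperparameters, and the corpus file names (`l1_corpus.txt`, `l2_corpus.txt`) are all illustrative assumptions, not the paper's actual setup.

```python
# Sketch: train a causal LM on an L1 corpus first, then continue on an L2
# corpus, mimicking sequential bilingual exposure. Assumes the Hugging Face
# Transformers and Datasets APIs; all concrete settings are hypothetical.
from transformers import (AutoTokenizer, DataCollatorForLanguageModeling,
                          GPT2Config, GPT2LMHeadModel, Trainer,
                          TrainingArguments)
from datasets import load_dataset

# A single shared tokenizer for both languages is a simplifying assumption.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

def lm_dataset(path):
    """Load a plain-text corpus and tokenize it for next-token prediction."""
    raw = load_dataset("text", data_files={"train": path})["train"]
    return raw.map(
        lambda ex: tokenizer(ex["text"], truncation=True, max_length=128),
        batched=True, remove_columns=["text"])

def train_stage(model, dataset, out_dir):
    """One training stage (causal LM objective) on one language's corpus."""
    args = TrainingArguments(output_dir=out_dir, num_train_epochs=1,
                             per_device_train_batch_size=8, report_to=[])
    trainer = Trainer(
        model=model, args=args, train_dataset=dataset,
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False))
    trainer.train()
    trainer.save_model(out_dir)  # keep this stage's weights reloadable
    return model

# Stage 1: "L1 acquisition" from random initialization.
model = GPT2LMHeadModel(GPT2Config())
model = train_stage(model, lm_dataset("l1_corpus.txt"), "ckpt_l1")

# Stage 2: "L2 exposure", continuing from the L1-pretrained weights
# rather than training an L2 model from scratch.
model = train_stage(model, lm_dataset("l2_corpus.txt"), "ckpt_l1_then_l2")
```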
Our experiments demonstrated that L1 pretraining accelerated the models' linguistic generalization in the L2, and that the language transfer configuration (e.g., the choice of L1, the presence of parallel texts) substantially affected their generalization. The results also highlight that their L2 acquisition is not human-like in certain respects.
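Linguistic generalization in LMs is commonly probed with minimal pairs of grammatical and ungrammatical sentences (e.g., BLiMP-style acceptability judgments), where the model should assign higher probability to the grammatical member. The page does not show the paper's exact evaluation protocol, so the sketch below, including the sentence pair and the checkpoint path carried over from the training sketch, is an assumption rather than the authors' method.

```python
# Sketch: minimal-pair acceptability scoring with a causal LM.
# Checkpoint path and the example pair are hypothetical.
import torch
from transformers import AutoTokenizer, GPT2LMHeadModel

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("ckpt_l1_then_l2")
model.eval()

def sentence_log_prob(sentence):
    """Total log-probability the LM assigns to a sentence."""
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        out = model(ids, labels=ids)
    # out.loss is the mean negative log-likelihood over the predicted
    # tokens, so multiply by the prediction count to get a total log-prob.
    return -out.loss.item() * (ids.size(1) - 1)

# A hypothetical English (L2) minimal pair for subject-verb agreement.
good = "The cats on the mat are sleeping."
bad = "The cats on the mat is sleeping."
print("prefers grammatical:", sentence_log_prob(good) > sentence_log_prob(bad))
```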