Towards Zero-Shot Multilingual Transfer for Code-Switched Responses
Ting-Wei Wu, Changsheng Zhao, Ernie Chang, Yangyang Shi, Pierce I-Jen Chuang, Vikas Chandra, Biing Juang
Main: Multilingualism and Cross-Lingual NLP (Oral Paper)
Session 1: Multilingualism and Cross-Lingual NLP (Oral)
Conference Room: Pier 4&5
Conference Time: July 10, 11:00-12:30 (EDT) (America/Toronto)
Global Time: July 10, Session 1 (15:00-16:30 UTC)
Keywords:
cross-lingual transfer
Languages:
Chinese, Spanish, Indonesian
TLDR:
Recent task-oriented dialog systems have had great success in building English-based personal assistants, but extending these systems to a global audience is challenging due to the need for annotated data in the target language. An alternative approach is to leverage existing data in a high-resource...
Abstract:
Recent task-oriented dialog systems have had great success in building English-based personal assistants, but extending these systems to a global audience is challenging due to the need for annotated data in the target language. An alternative approach is to leverage existing data in a high-resource language to enable cross-lingual transfer in low-resource language models. However, this type of transfer has not been widely explored in natural language response generation. In this research, we investigate the use of state-of-the-art multilingual models such as mBART and T5 to facilitate zero-shot and few-shot transfer of code-switched responses. We propose a new adapter-based framework that allows for efficient transfer by learning task-specific representations and encapsulating source- and target-language representations. Our framework successfully transfers language knowledge even when only a limited target-language corpus is available. We present both quantitative and qualitative analyses to evaluate the effectiveness of our approach.
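The abstract describes the adapter-based framework only at a high level. As a rough illustration, the sketch below shows a generic bottleneck adapter of the kind commonly inserted into the layers of a frozen multilingual backbone such as mBART or T5; the module name, dimensions, and placement are illustrative assumptions, not the authors' exact design.

```python
# Minimal sketch of a bottleneck adapter (down-projection, nonlinearity,
# up-projection, residual connection). Hidden and bottleneck sizes are
# illustrative assumptions, not values from the paper.
import torch
import torch.nn as nn

class Adapter(nn.Module):
    def __init__(self, hidden_size: int = 768, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)  # compress hidden states
        self.up = nn.Linear(bottleneck, hidden_size)    # project back up
        self.act = nn.ReLU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Residual connection keeps the frozen backbone's representation intact
        return hidden_states + self.up(self.act(self.down(hidden_states)))
```

In a typical adapter-based transfer setup, the multilingual backbone stays frozen and only small per-language or per-task modules like this are trained, which is why a new target language can be added with little or no annotated data in that language.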