T6: Retrieval-based Language Models and Applications

Akari Asai, Danqi Chen, Sewon Min, Zexuan Zhong

Abstract: Retrieval-based language models (LMs) have shown impressive performance on diverse NLP tasks. In this tutorial, we will provide a comprehensive and coherent overview of recent advances in retrieval-based LMs. We will start with preliminaries covering the foundations of LMs (e.g., masked LMs, autoregressive LMs) and retrieval systems (e.g., nearest-neighbor search). We will then detail recent progress in retrieval-based models, focusing on their model architectures and learning approaches, and show how retrieval-based LMs are adapted to downstream applications and extended to multilingual and multi-modal settings. Finally, we will conclude with a hands-on exercise showcasing the effectiveness of retrieval-based LMs.
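As a minimal illustration of the nearest-neighbor search mentioned above, the sketch below retrieves the passages whose embeddings score highest against a query embedding under cosine similarity. The function name, the toy vectors, and the choice of cosine similarity are illustrative assumptions, not part of the tutorial materials; production retrieval-based LMs typically use approximate nearest-neighbor indexes over learned dense embeddings.

```python
import numpy as np

def nearest_neighbors(query, corpus, k=2):
    # Score by cosine similarity: normalize rows, then take dot products.
    corpus = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
    query = query / np.linalg.norm(query)
    scores = corpus @ query
    # Indices of the k highest-scoring corpus vectors, best first.
    return np.argsort(-scores)[:k]

# Toy corpus of four 2-d "passage" embeddings (hypothetical values).
corpus = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7], [-1.0, 0.0]])
query = np.array([0.9, 0.1])
print(nearest_neighbors(query, corpus))  # -> [0 2]
```

This exhaustive scan is exact but linear in corpus size; the large-scale systems covered in the tutorial replace it with approximate indexes for efficiency.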

Time: Sunday, 14:00
Event: T6: Retrieval-based Language Models and Applications
Hosts: Akari Asai, Danqi Chen, Sewon Min, Zexuan Zhong