Tell2Design: A Dataset for Language-Guided Floor Plan Generation

Sicong Leng, Yang Zhou, Mohammed Haroon Dupty, Wee Sun Lee, Sam C Joyce, Wei Lu

Main Track: Resources and Evaluation (Oral Paper)

Session 4: Resources and Evaluation (Oral)
Conference Room: Metropolitan East
Conference Time: July 11, 11:00-12:30 EDT (America/Toronto)
Global Time: July 11, Session 4 (15:00-16:30 UTC)
Keywords: NLP datasets
TLDR: We consider the task of generating designs directly from natural language descriptions, taking floor plan generation as the initial research area.
Abstract: We consider the task of generating designs directly from natural language descriptions, taking floor plan generation as the initial research area. Language-conditional generative models have recently been very successful at producing high-quality artistic images. However, designs must satisfy constraints that do not arise in artistic image generation, particularly spatial and relational constraints. We make multiple contributions to initiate research on this task. First, we introduce a novel dataset, Tell2Design (T2D), which contains more than 80k floor plan designs paired with natural language instructions. Second, we propose a Sequence-to-Sequence model that can serve as a strong baseline for future research. Third, we benchmark this task with several text-conditional image generation models. We conclude by conducting human evaluations on the generated samples and providing an analysis of human performance. We hope our contributions will advance research on language-guided design generation.
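The abstract frames floor plan generation as a sequence-to-sequence problem: a natural language instruction goes in, and a structured design comes out. Below is a minimal sketch of that framing, not the authors' implementation: the T5 checkpoint and the "name x y w h" layout serialization are assumptions introduced here for illustration; the paper's actual model and output format are described in the full text.

```python
# A minimal sketch (assumptions noted above, not the authors' code) of framing
# floor plan generation as sequence-to-sequence learning: a language
# instruction is encoded, and a serialized room layout is decoded.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

instruction = (
    "The master bedroom is in the north-east corner, about 150 square feet, "
    "next to a bathroom; the living room faces south."
)

# Hypothetical target serialization: each room as "name x y w h" tokens.
target = "master bedroom 120 20 40 35 | bathroom 95 20 25 20 | living room 40 80 60 50"

inputs = tokenizer(instruction, return_tensors="pt")
labels = tokenizer(target, return_tensors="pt").input_ids

# One training step minimizes cross-entropy over the serialized layout tokens.
loss = model(**inputs, labels=labels).loss

# At inference time, the layout sequence is decoded and parsed back into boxes.
generated = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```

Serializing rooms as coordinate tokens is one plausible way to expose spatial and relational constraints to a text-to-text model, in contrast to the benchmarked text-conditional image generation baselines, which render the plan directly as pixels.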