Dynamic Structured Neural Topic Model with Self-Attention Mechanism
Nozomu Miyamoto, Masaru Isonuma, Sho Takase, Junichiro Mori, Ichiro Sakata
Findings Paper: Information Retrieval and Text Mining
Session 1: Information Retrieval and Text Mining (Virtual Poster)
Conference Room: Pier 7&8
Conference Time: July 10, 11:00-12:30 (EDT) (America/Toronto)
Global Time: July 10, Session 1 (15:00-16:30 UTC)
Spotlight Session: Spotlight - Metropolitan East (Spotlight)
Conference Room: Metropolitan East
Conference Time: July 10, 19:00-21:00 (EDT) (America/Toronto)
Global Time: July 10, Spotlight Session (23:00-01:00 UTC)
Abstract:
This study presents a dynamic structured neural topic model, which can handle the time-series development of topics while capturing their dependencies.
Our model captures the topic branching and merging processes by modeling topic dependencies based on a self-attention mechanism.
Additionally, we introduce citation regularization, which induces attention weights to represent citation relations by modeling text and citations jointly.
Our model outperforms a prior dynamic embedded topic model in terms of perplexity and coherence, while maintaining sufficient diversity across topics.
Furthermore, we confirm that our model can potentially predict emerging topics from academic literature.
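The two mechanisms the abstract describes, self-attention over topics across time steps and a citation regularizer that nudges attention weights toward citation relations, could be sketched roughly as follows. This is an illustrative reconstruction, not the paper's actual model: all names, shapes, and the specific cross-entropy form of the regularizer are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def topic_attention(topics_t, topics_prev):
    """Scaled dot-product attention from current topics (queries) to
    previous-step topics (keys/values). Rows of `weights` can be read as
    dependency strengths: a row spread over several previous topics
    suggests merging; a previous topic attended to by several current
    topics suggests branching. Illustrative only."""
    d = topics_t.shape[-1]
    scores = topics_t @ topics_prev.T / np.sqrt(d)   # (K_t, K_prev)
    weights = softmax(scores, axis=-1)               # each row sums to 1
    mixed = weights @ topics_prev                    # dependency-weighted mix
    return weights, mixed

def citation_regularizer(weights, citation_counts, eps=1e-12):
    """Hypothetical regularizer: cross-entropy pushing attention weights
    toward the row-normalized citation count matrix, so that attention
    weights come to reflect citation relations."""
    row_sums = np.clip(citation_counts.sum(axis=1, keepdims=True), eps, None)
    target = citation_counts / row_sums
    return -(target * np.log(weights + eps)).sum(axis=1).mean()

rng = np.random.default_rng(0)
K, D = 4, 16                       # 4 topics, 16-dim embeddings (assumed)
prev = rng.normal(size=(K, D))     # topic embeddings at time t-1
curr = rng.normal(size=(K, D))     # topic embeddings at time t
W, mixed = topic_attention(curr, prev)
cites = rng.integers(0, 5, size=(K, K)).astype(float)
reg = citation_regularizer(W, cites)
print(W.shape, np.allclose(W.sum(axis=1), 1.0))  # (4, 4) True
```

In a full model this regularizer would be added to the topic model's objective with a weighting coefficient, so that gradients jointly fit the text and the citation structure.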