Tutorial 4: Beyond the Bag of Words: Text Analysis with Contextualized Topic Models


NLP and CSS 201: Beyond the Basics

Most topic models still use Bag-of-Words (BoW) document representations as input. These representations, however, disregard the syntactic and semantic relationships among the words in a document, the two main linguistic avenues to coherent text. Recently, pre-trained contextualized embeddings, which map a sentence to a vector representation, have enabled exciting new results in several NLP tasks. Contextualized Topic Models (CTM) combine contextualized embeddings with neural topic models to improve the quality of the learned topics. Moreover, using multilingual embeddings allows the model to learn topics in one language and predict them for documents in unseen languages, thus addressing the task of zero-shot cross-lingual topic modeling.
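To make the core idea concrete, here is a minimal NumPy sketch of the architecture the abstract describes: a pre-trained sentence embedding (simulated here with random numbers) is fed to an encoder that produces a document-topic distribution, which a topic-word matrix then decodes into a distribution over the vocabulary. All weights, dimensions, and names below are illustrative assumptions, not the actual CTM implementation; in the real model the embedding comes from a pre-trained encoder and the weights are learned jointly.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab_size = 20  # size of the BoW vocabulary (toy value)
embed_dim = 8    # dimensionality of the contextualized embedding (toy value)
n_topics = 3     # number of topics

# Pretend contextualized sentence embedding; a real CTM would obtain this
# from a pre-trained (possibly multilingual) sentence encoder.
sentence_embedding = rng.normal(size=embed_dim)

# Encoder: map the embedding to topic logits (weights are random here;
# in the actual model they are learned during training).
W_enc = rng.normal(size=(embed_dim, n_topics))
topic_logits = sentence_embedding @ W_enc

# Document-topic distribution via softmax.
theta = np.exp(topic_logits - topic_logits.max())
theta /= theta.sum()

# Decoder: a topic-word matrix reconstructs a distribution over the vocabulary.
beta = rng.normal(size=(n_topics, vocab_size))
word_logits = theta @ beta
word_probs = np.exp(word_logits - word_logits.max())
word_probs /= word_probs.sum()

print(theta.shape)
print(round(float(word_probs.sum()), 6))
```

Because the encoder consumes only the language-agnostic sentence embedding, the same trained model can assign topic distributions to documents in languages never seen during training, which is what enables the zero-shot cross-lingual setting.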

Author: Silvia Terragni, Professor

Duration: 59:45