Enhancing Topic Modeling Through Embedding Learning Strategies
Abstract
In Natural Language Processing (NLP), topic modeling is crucial for uncovering latent thematic patterns in textual data. Recent advances combine traditional topic modeling with word embeddings, most notably in the Embedded Topic Model (ETM). This thesis explores embedding learning strategies within topic modeling to improve the ETM and related models, investigating more efficient variational inference, advanced word embedding techniques, and methods for better topic interpretability. Practical implications for document classification, content recommendation, and summarization are evaluated, and the scalability challenges of handling large textual corpora are addressed. The integration of textual data with other modalities is also explored. Overall, this work aims to enhance topic modeling through embedding learning strategies, bridging the gap between theory and practice in NLP.
License
©2025 Jahangirnagar University Journal of Electronics and Computer Science. All rights reserved. However, permission is granted to quote from any article of the journal, and to photocopy any part or all of an article for education and/or research purposes, to individuals, institutions, and libraries, with an appropriate citation in the references and/or customary acknowledgement of the journal.