I constantly see Latent Dirichlet Allocation (LDA) as the go-to technique for topic modelling. It performs okay-ish, but as a bag-of-words model it ignores word order and context, and (subjectively) it seems outdated. Has anyone combined something like an LSTM with LDA to retain sentence-level information? What other neural-net approaches could be a good fit for topic modelling?
If you’re looking for inspiration, check out Chris Moody’s work on lda2vec.
Saw the paper "Topic Modeling in Embedding Spaces" (Dieng et al.), which proposes the Embedded Topic Model (ETM). It keeps the LDA-style generative story but represents each topic as a point in word-embedding space, so topics and words share the same continuous representation.
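To make the ETM idea concrete: each topic gets its own embedding vector, and that topic's distribution over the vocabulary is a softmax over inner products between the topic embedding and the word embeddings. Here's a minimal numpy sketch of just that decoder step, with random (not learned) embeddings and toy sizes; in the actual model the embeddings and topic proportions are fit with variational inference.

```python
import numpy as np

rng = np.random.default_rng(0)

V, L, K = 1000, 50, 10   # toy sizes: vocab, embedding dim, number of topics

rho = rng.normal(size=(V, L))    # word embeddings (random here, learned in the paper)
alpha = rng.normal(size=(K, L))  # one embedding per topic

# topic-word distributions: softmax of topic/word embedding inner products
logits = alpha @ rho.T                                    # (K, V)
beta = np.exp(logits - logits.max(axis=1, keepdims=True))
beta /= beta.sum(axis=1, keepdims=True)                   # each row sums to 1

# a document mixes topics via its proportions theta, giving word probabilities
theta = rng.dirichlet(np.ones(K))
word_probs = theta @ beta                                 # (V,)
```

The payoff is that similar words get similar probability under a topic automatically, because probability is driven by embedding geometry rather than independent per-word parameters.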