Model-based learning vs learning functions

Lately I have been looking at probabilistic graphical models, which attempt to learn the parameters of a probability distribution from data and then use that learned distribution to make predictions. A good example of this is LDA (Latent Dirichlet Allocation), which is unsupervised. I also came across related models like Gaussian mixture models, which, given enough components, can approximate essentially any density, much as deep networks can approximate any function. The obvious question is: why are we not seeing more of this model-based learning integrated with "function learning", which is what deep learning is all about? There are older papers that apply Bernoulli mixture models to digit/image recognition :slight_smile: http://users.dsic.upv.es/~ajuan/research/2004/Juan04_08b.pdf
I wonder why these techniques are not being used together with current deep learning.
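To make the GMM-as-density-estimator point concrete, here is a minimal sketch using scikit-learn's `GaussianMixture` (my choice of library and toy data, not from the paper above): fit a mixture to samples with EM, then evaluate the learned log-density on new points and sample from the generative model.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Toy data: samples drawn from two well-separated Gaussian clusters
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=-2.0, scale=0.5, size=(500, 1)),
    rng.normal(loc=3.0, scale=1.0, size=(500, 1)),
])

# Fit a 2-component Gaussian mixture by expectation-maximization
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(X)

# Unlike a pure function approximator, the fitted model is an explicit
# density: we can evaluate log p(x) on new points, inspect the learned
# parameters, and draw new samples from it.
new_points = np.array([[-2.0], [0.5], [3.0]])
print("log p(x):", gmm.score_samples(new_points))
print("means:", gmm.means_.ravel())
print("weights:", gmm.weights_)

# Sampling from the generative model
samples, component_labels = gmm.sample(10)
```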

There is an article in the ACM about this same theme:

Thanks
Anirban