Hinton's Capsule Networks

Hi,

This might be of interest to state-of-the-art deep learners…

Geoffrey Hinton’s Capsule Networks are said to contain recursive layers…

Best regards,
Aki Rehn

5 Likes

I have mixed feelings about this presenter - on one hand, he makes quite a few useful observations on DNNs in general that I wasn’t aware of before, so that was great.

On the other hand, either he rushes ahead too quickly or doesn't realize he is misrepresenting things, as there were instances where I thought 'hey, this is not how it is'. I wish I could remember exactly where I felt that way, though…

Anyhow, he intrigued me and definitely has a good way of describing things, so I am willing to investigate more of his talks and see how I feel about them :slight_smile:

Thx for sharing - I am sure many people on these forums will find this interesting.

5 Likes

I think the capsule networks are really interesting and probably the next big thing in deep learning, if we can find a way to train them faster.

I also wrote a post about my understanding of the intuition behind them: https://medium.com/@pechyonkin/understanding-hintons-capsule-networks-part-i-intuition-b4b559d1159b
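To make one piece of the intuition concrete: a capsule outputs a vector whose length is read as the probability that the entity is present, and the squash nonlinearity is what keeps that length between 0 and 1. A rough PyTorch sketch (the shapes and names here are just for illustration, not taken from the post):

```python
import torch

def squash(s, dim=-1, eps=1e-8):
    """Scale vectors so their length lies in (0, 1) while keeping their direction."""
    squared_norm = (s ** 2).sum(dim=dim, keepdim=True)
    scale = squared_norm / (1.0 + squared_norm)
    return scale * s / torch.sqrt(squared_norm + eps)

# Example: a batch of 10 capsules, each with an 8-dimensional pose vector.
v = squash(torch.randn(10, 8))
print(v.norm(dim=-1))  # every length is now strictly below 1
```

Long vectors keep their direction (the pose) and end up with a length close to 1, while short vectors shrink toward 0, which is what lets the length act like a probability.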

16 Likes

I also felt like he rushed through some parts, but his other videos are nice. He was even an instructor for Udacity's introductory deep learning class.

3 Likes

Hi maxim.pechyonkin, I've read your post on Medium: thanks a lot, what a great job.
After reading your post, Siraj's video (just this one) doesn't look that great any more, and Hinton's papers have become more understandable to me.
I'm looking forward to the next part.

3 Likes

Great post, thank you!

1 Like

Yes, Siraj has a certain style. A little showy and entertaining, but the content is usually solid - given the time constraints, he tends to rush over the details.

Worst case, he raises the level of interest in deep learning while maintaining a reasonable technical standard. Obviously he knows more than I do at the moment, so respect.

Good post by Maxim. Thanks!

1 Like

@jeremy
I think this will be a major advance for applications in medical imaging. Most classification/regression (diagnostic) problems in this domain need positional/rotational spatial relations between features to perform optimally. We have decent results with CNNs in medical imaging, but Hinton is right (as usual); this is not the way our brain works.

Every day (hmmm, every second) as a radiologist, I analyze local features in an image, but I need to place them in context and in relation to the positions of other features in the image to make a valid decision.

It will be very interesting to follow (or lead) the applied implementations in the next months/years.

5 Likes

I can’t use Siraj’s presentations at all. I have to go back to Hinton’s course and redo the theory on my own, starting with backprop and verifying code with hand-derived gradients on simple numerical examples. The bigger problem is that you get graded on where and how you spend your time.

They don’t have enough experience to develop topics in depth or to solve a particular problem in a POC, because they don’t have 10+ years of experience at work or in research. It seems to be more of a web-programmer/hacker attitude: learn from web posts. This is OK, and I am truly grateful for a ~$500 class that does so, but there are times when this can get you into trouble. There is a huge gap between when Jeremy/Vincent/Sebastian present vs. anyone else. When Udacity presents topics, it is more buyer beware: it may be presented in a format that is incomplete/incorrect or fails to mention bigger issues. And if you repeat that presentation in a similar style, then people with more experience may think less of you… and reduce the opportunities available to you in the future…

1 Like

This is the peer-reviewed paper, Dynamic Routing Between Capsules (Sara Sabour, Nicholas Frosst, Geoffrey E. Hinton).
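For anyone who wants a feel for the routing-by-agreement procedure the paper describes, here is a rough PyTorch sketch (variable names and example sizes are illustrative; the paper's Procedure 1 is the authoritative version):

```python
import torch
import torch.nn.functional as F

def squash(s, dim=-1, eps=1e-8):
    # The paper's length-normalizing nonlinearity, included so the snippet runs standalone.
    n2 = (s ** 2).sum(dim=dim, keepdim=True)
    return n2 / (1.0 + n2) * s / torch.sqrt(n2 + eps)

def dynamic_routing(u_hat, num_iterations=3):
    """Route prediction vectors from lower- to higher-level capsules.

    u_hat: [batch, n_lower, n_upper, dim_upper] prediction vectors
           (already produced by the learned transformation matrices).
    """
    b = torch.zeros(u_hat.shape[:3], device=u_hat.device)    # routing logits
    for _ in range(num_iterations):
        c = F.softmax(b, dim=2)                       # coupling coefficients per lower capsule
        s = (c.unsqueeze(-1) * u_hat).sum(dim=1)      # weighted sum -> [batch, n_upper, dim_upper]
        v = squash(s)                                 # higher-level capsule outputs
        b = b + (u_hat * v.unsqueeze(1)).sum(dim=-1)  # raise logits where predictions agree with outputs
    return v

out = dynamic_routing(torch.randn(4, 1152, 10, 16))   # MNIST-like sizes from the paper
print(out.shape)                                       # torch.Size([4, 10, 16])
```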

1 Like

Here is a clean and succinct implementation for MNIST (using PyTorch).

3 Likes

I am hopeful too. Performance-wise, we still need to watch out. I will be discussing them at length in my upcoming Deep Learning book, which I've recently started writing. /T

@alexandrecc @jeremy I think testing it on medical data could be a much better place for it to shine (MNIST isn’t really the most difficult dataset out there). So I made a basic Kaggle Kernel (https://www.kaggle.com/kmader/capsulenet-on-lung-nodules/) for lung nodule classification, where the data is downsampled to roughly MNIST size (from 64x64 -> 32x32). Obviously larger medical problems would be interesting, but CapsNet is so painfully slow, even on a GPU, that a toy-ish problem is all that can really be done for now. The kernel is forked from the MNIST one, which worked correctly (as figuring out whether things are working correctly on CT data is never easy).
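For reference, the downsampling step is roughly this kind of operation (a sketch with made-up shapes, not the kernel's exact preprocessing):

```python
import torch
import torch.nn.functional as F

# Pretend batch of single-channel 64x64 nodule patches (values and shape are made up).
patches = torch.rand(8, 1, 64, 64)

# Downsample 64x64 -> 32x32 so the patches fit an MNIST-sized CapsNet input.
small = F.interpolate(patches, size=(32, 32), mode='bilinear', align_corners=False)
print(small.shape)  # torch.Size([8, 1, 32, 32])
```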

5 Likes

64 x 64 gives 5 nice crops of 32 x 32 :slight_smile: Four from the corners and one from the middle. What would be really cool would be to pretrain the capsule network on some large corpus of 64 x 64 images and then run it with fine-tuning against that lung nodules dataset.

Well, but then again, running it against that dataset is already very cool :slight_smile: Do benchmarks for it exist? Fingers crossed for the project, and hoping you do a write-up on what you learn and what the outcomes are :slight_smile:
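Something like this would do the cropping (the input shape and crop size here are assumptions; torchvision's FiveCrop transform does the same for PIL images):

```python
import torch

def five_crops(img, size=32):
    """Four corner crops plus one center crop from a [H, W] tensor (here 64x64 -> five 32x32 crops)."""
    h, w = img.shape[-2:]
    top_left     = img[..., :size, :size]
    top_right    = img[..., :size, w - size:]
    bottom_left  = img[..., h - size:, :size]
    bottom_right = img[..., h - size:, w - size:]
    i, j = (h - size) // 2, (w - size) // 2
    center = img[..., i:i + size, j:j + size]
    return torch.stack([top_left, top_right, bottom_left, bottom_right, center])

crops = five_crops(torch.rand(64, 64))
print(crops.shape)  # torch.Size([5, 32, 32])
```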

Great kernel, @kmader, for giving this a test on medical images! I’ll try to run your code locally this weekend on my DL computer.

I disagree that a 28x28 patch that needs 1 hour of training can’t be useful in the short term. If the inference phase is under 10-15 seconds, pre-analyzing all nodules can be done in a decent amount of time.

My humble domain-expert intuition: lung nodule analysis on CT doesn’t need much spatial/rotational information relative to other features to be effective. But many, many other problems in radiology do need this kind of information.

Totally agree with you. Udacity presents topics on more of a buyer-beware basis.

This is my implementation of CapsNet in PyTorch: https://github.com/acburigo/CapsNet

I hope this turns out to be useful to someone.

3 Likes

Great post!
Waiting for the final, fourth post.

1 Like

I have been very busy recently, but I will try to publish the last part in the coming weeks. I am glad that many people found my explanations useful.

1 Like

I’ve created the awesome-capsule-networks repository to compile the many excellent resources related to capsule networks into one list.

Let me know if you are aware of resources that should be added to the list!

3 Likes