Lesson 7 In-Class Discussion

He can reply here.

Do you need dropout if you use batchnorm?

They do different things. Dropout is a regularization technique. BatchNorm has some regularization component but mostly allows for faster convergence.
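
As a rough illustration of how the two are typically combined in practice (a minimal PyTorch sketch; the layer sizes are just placeholders):

```python
import torch.nn as nn

# BatchNorm speeds up / stabilises training; Dropout adds explicit regularization.
# They are usually used together rather than one replacing the other.
block = nn.Sequential(
    nn.Linear(512, 256),
    nn.BatchNorm1d(256),   # normalise activations -> faster convergence
    nn.ReLU(),
    nn.Dropout(p=0.5),     # randomly zero activations -> regularization
    nn.Linear(256, 10),
)
```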

1 Like

@yinterian Can BatchNorm be the first layer? Would that save me from normalising the data?

2 Likes

Did Jeremy answer this?

He said to normalize anyways.
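
If you do want to try BatchNorm as the first layer anyway, it would look something like this (a sketch only; it standardises per mini-batch, so normalising the inputs up front is still the advice above):

```python
import torch.nn as nn

# BatchNorm2d over the 3 input channels as the very first layer.
# It learns per-channel scale/shift from batch statistics, but the
# advice above still applies: normalise the input data as usual.
model = nn.Sequential(
    nn.BatchNorm2d(3),
    nn.Conv2d(3, 64, kernel_size=3, padding=1),
    nn.ReLU(),
    # ... rest of the network
)
```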

1 Like

Why are they called bottleneck layers?
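
They "bottleneck" the channel dimension: a 1x1 conv squeezes the channels down before the 3x3 conv, then another 1x1 conv expands them back out, which keeps the parameter count manageable. A rough sketch of a ResNet-style bottleneck block (channel sizes are illustrative, and the residual skip connection is omitted for brevity):

```python
import torch.nn as nn

# ResNet-style bottleneck: 1x1 conv squeezes channels, the 3x3 conv works in the
# narrow space, then a 1x1 conv expands back out (hence the "bottleneck" shape).
bottleneck = nn.Sequential(
    nn.Conv2d(256, 64, kernel_size=1),            # squeeze 256 -> 64 channels
    nn.BatchNorm2d(64), nn.ReLU(),
    nn.Conv2d(64, 64, kernel_size=3, padding=1),  # cheap 3x3 in the narrow space
    nn.BatchNorm2d(64), nn.ReLU(),
    nn.Conv2d(64, 256, kernel_size=1),            # expand 64 -> 256 channels
    nn.BatchNorm2d(256),
)
```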

For high-resolution images, what sort of ResNet architecture is useful: a higher number of blocks or a lower number of blocks?

It is more about how much data you have. More layers means more parameters.

@jeremy can you please end Part 1 by spending 5 minutes talking about your journey into deep learning? It will give us a learning model to follow.

Also, can you please talk about how you keep up with the large volume of research coming out, and how you sift through it to find what really matters to practitioners? Any workflow you would recommend?

10 Likes

What’s CAM again?

Class Activation Maps
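
Roughly: the final conv feature maps are weighted by the classifier weights for the class you care about, giving a heatmap of where the network is "looking". A minimal sketch (assumes global average pooling followed by a single Linear classifier; the names are illustrative):

```python
import torch

def class_activation_map(feats, classifier_weight, class_idx):
    # feats: final conv feature maps, shape [1, C, H, W]
    # classifier_weight: Linear layer weight, shape [num_classes, C]
    w = classifier_weight[class_idx]              # [C]
    cam = torch.einsum('chw,c->hw', feats[0], w)  # weighted sum over channels
    cam = cam - cam.min()
    cam = cam / (cam.max() + 1e-8)                # normalise to [0, 1] for display
    return cam
```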

2 Likes

Is that like looking for something on an X-Ray?

This is exactly what I wanted to know like 1 week ago! lol

4 Likes

How do we attend Part 2? Will there be an application process again?

Everybody!

Yes. Like p1v2

2 Likes

yeah, I think we’re all in LOL

I want to attend Part 2 v2.

1 Like

Yay!

Congratulations to all those who stuck with it till the end.

#thankyou @jeremy

18 Likes