Lesson 2 In-Class Discussion

So precompute=True means it just loads the weights for the earlier layers (everything except the top) from the stored ResNet model, and we are not fine-tuning them?

1 Like

I tried it in Keras; it failed because of the single channel! Do you have a working sample for that?

1 Like

Regarding precompute=True: is my understanding correct that when this is done,

  • the library will take the non-augmented dataset
  • pass it through the neural network with pretrained weights already loaded
  • note down the values of the activation of each neuron
  • and save it to disk?
4 Likes

And a follow-up to that: how does it make use of these precomputed activations during training?
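A minimal NumPy sketch of both ideas (this is not the fastai library's actual code, and every name in it is made up for illustration): the frozen pretrained body is run once over the non-augmented data, its activations are cached to disk, and training then only updates the small top layer against that cache, so the expensive body is never re-run.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen pretrained body: a fixed random projection + ReLU.
W_frozen = rng.normal(size=(784, 64))

def frozen_body(x):
    """The expensive, frozen part of the network (weights never change)."""
    return np.maximum(x @ W_frozen, 0)

X = rng.normal(size=(1000, 784))        # non-augmented dataset
y = rng.integers(0, 2, size=1000)       # binary labels

# precompute=True: pass the data through the body ONCE and cache the activations.
features = frozen_body(X)
np.save("/tmp/precomputed_acts.npy", features)

# Training then only touches the small top layer, reading from the cache:
features = np.load("/tmp/precomputed_acts.npy")
w_top = np.zeros(64)
for epoch in range(5):
    preds = 1 / (1 + np.exp(-(features @ w_top)))   # logistic top layer
    grad = features.T @ (preds - y) / len(y)
    w_top -= 0.1 * grad                             # body is never re-run
```

Note one consequence visible in the sketch: because the cached activations come from the non-augmented data, data augmentation has no effect while precompute=True.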

It can be done, but most libraries assume that you have a 3-channel image. You can convert your single channel into 3 channels by copying the channel 3 times.
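The channel-copying trick above is a one-liner with NumPy (array names here are just illustrative):

```python
import numpy as np

# A single-channel (grayscale) image: height x width
gray = np.random.rand(28, 28).astype(np.float32)

# Stack the same channel three times to get an H x W x 3 "RGB" image
rgb = np.stack([gray, gray, gray], axis=-1)

print(rgb.shape)  # (28, 28, 3)
```

All three channels are identical copies, so no information is added; the shape just matches what a pretrained 3-channel model expects.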

1 Like

Is the cyclic learning rate applied per minibatch or per epoch?

Learning rates are adjusted every epoch.
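For concreteness, here is a sketch of a cyclical schedule in the style of cosine annealing with warm restarts (SGDR). The function name and default values are ours, not the library's; the schedule is computed per iteration, so it can be sampled at whatever granularity the training loop uses. It also shows a cycle_mult-style multiplier that makes each successive cycle longer.

```python
import math

def sgdr_lr(iteration, lr_max=0.1, lr_min=0.001, cycle_len=100, cycle_mult=2):
    """Cosine annealing with warm restarts (SGDR-style sketch).

    Each cycle decays from lr_max down to lr_min; after a restart,
    the next cycle is cycle_mult times longer than the previous one.
    """
    i, length = iteration, cycle_len
    while i >= length:          # locate the cycle this iteration falls in
        i -= length
        length *= cycle_mult
    frac = i / length           # position within the current cycle, in [0, 1)
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(math.pi * frac))

# With cycle_len=100 and cycle_mult=2, restarts happen at
# iterations 100, 300, 700, ... (each cycle twice as long as the last).
```

The restart is the "jump": the learning rate snaps back to lr_max, which can kick the optimizer out of a narrow minimum.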

How does manual annealing work? By stopping the training and starting it again with a lower learning rate? Is there a way to do it online? If training is being stopped and restarted, how are the learnt parameters saved?
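One common answer is a checkpoint: save the parameters to disk, then reload them and continue with a lower learning rate. A toy sketch (the train function and file path are hypothetical, and the "model" is a single parameter being driven toward 0):

```python
import pickle

def train(params, lr, steps):
    """Hypothetical training loop: gradient descent on 0.5 * w**2."""
    for _ in range(steps):
        params["w"] -= lr * params["w"]   # gradient of 0.5*w^2 is w
    return params

params = {"w": 1.0}

# Stage 1: train with a high learning rate, then stop and checkpoint.
params = train(params, lr=0.1, steps=50)
with open("/tmp/checkpoint.pkl", "wb") as f:
    pickle.dump(params, f)

# Stage 2: "restart" by reloading the checkpoint and annealing the LR.
with open("/tmp/checkpoint.pkl", "rb") as f:
    params = pickle.load(f)
params = train(params, lr=0.01, steps=50)
```

Nothing is lost across the restart because the learnt parameters live in the checkpoint file; the optimizer simply resumes from them with the new learning rate.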

That’s what I originally thought here too, but then how does it apply here when we only train for 1 epoch?

How do you fix/define the amplitude of the jump?

What if the loss surface is a plateau and the cyclic learning rate is not enough to get us out of it? How should we update the learning rate? And importantly, how do we know if we are on a plateau?

How many cycles do we use? Is there a problem with too many cycles, or with too large a cycle_mult?

1 Like

Is there a difference between a batch and a mini-batch?

1 Like

They are the same thing.

5 Likes

What if the last cycle ends up in a narrower minimum?

You may overfit.

1 Like

I have heard that if the accuracy isn't increasing, you should look at the rank-5 accuracy as well… would that apply here too?
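For reference, rank-5 (top-5) accuracy counts a prediction as correct when the true label appears among the model's five highest-scoring classes. A small NumPy sketch (the function name is ours):

```python
import numpy as np

def rank5_accuracy(scores, labels):
    """Fraction of examples whose true label is among the 5 top-scoring classes.

    scores: (n_examples, n_classes) array of class scores.
    labels: (n_examples,) array of true class indices.
    """
    top5 = np.argsort(scores, axis=1)[:, -5:]        # indices of the 5 best classes
    hits = (top5 == labels[:, None]).any(axis=1)
    return hits.mean()

# Each example scores its own class highest, so rank-5 accuracy is perfect:
print(rank5_accuracy(np.eye(10), np.arange(10)))  # 1.0
```

Rank-5 can keep improving even while rank-1 accuracy is flat, which is why it is a useful secondary signal.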

@yinterian Could you please share augmentation techniques for non-image data?

4 Likes

Does learn.save save the params while training is running?

We save them at the end, when we are done.