Setting of learning rate


(Kimmo Ojala) #1

I found it confusing that in lesson 1 the learning rate is set as follows: vgg.model.optimizer.lr = 0.01

When trying this, I noticed that it only works after vgg.finetune(batches). If I try to run it directly after creating an instance of the Vgg16 class, I get: AttributeError: ‘Sequential’ object has no attribute ‘optimizer’

This is confusing because, if I understand correctly, when an instance of Vgg16 is created, the model is set to be Sequential (model = self.model = Sequential()), and vgg.model.optimizer.lr seems to imply that the Sequential model has an optimizer attribute.

How does this work? Why is it possible to set the learning rate only after running vgg.finetune?
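A likely explanation: in Keras, a model's optimizer attribute is only created when compile() is called, and Vgg16's finetune() is what calls compile() internally. Here is a minimal sketch (not real Keras, just toy stand-in classes illustrating the pattern) of why the attribute appears only after compiling:

```python
# Toy stand-ins for Keras classes, only to illustrate when
# the `optimizer` attribute comes into existence.
class Optimizer:
    def __init__(self, lr):
        self.lr = lr

class Sequential:
    def __init__(self):
        self.layers = []  # note: no `optimizer` attribute yet

    def compile(self, optimizer):
        # The optimizer is attached to the model only here.
        self.optimizer = optimizer

model = Sequential()

# Before compile(): same AttributeError as with a fresh Vgg16 instance.
try:
    model.optimizer.lr = 0.01
except AttributeError:
    print("no optimizer yet")

# vgg.finetune(batches) ends up calling compile(), after which
# the attribute exists and the learning rate can be changed.
model.compile(Optimizer(lr=0.001))
model.optimizer.lr = 0.01
```

So the Sequential class itself does not define an optimizer attribute; it is added to the instance at compile time, which in lesson 1 happens inside finetune().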


(Irshad Muhammad) #2

Are you sure you want to start from Part 1 v1 of the course? Do you have a special reason to learn the material taught in that part?

  • It uses Python 2.7, which a large number of libraries have started to drop support for.
  • It mainly uses Theano as the backend, which has been dead for a long time.
  • It uses Keras 1.2.2, which has since been updated to the major 2.0 revision with major changes to the API.

I would recommend starting from Part 1 v2.


(Matthew Kleinsmith) #3

Adding to irshaduetian’s recommendation:

Here’s a link to part 1 v2: Welcome to Part 1 (v2)

And here’s Jeremy’s quote about switching:


(Kimmo Ojala) #4

Thanks for the tip. I’ll switch to Part 1 v2!