Not sure if this is entirely correct, but with keras 1.x we did the following to set the learning rate:
model.optimizer.lr.set_value(0.01)
With keras 2.x this is no longer possible! The changes run deep: I believe model.compile no longer resets the model, so it can be used to change learning parameters (optimizer, etc.) on the fly, while to actually reset the model I think you need to recreate it.
This seems quite strange to me, but I decided to write about it since, if correct, it will likely trip up people who are watching the lectures and using keras 2. There were also notebooks using python3 (and keras 2?) published here on the forum, but from a brief first glance I don’t think they use the new API just yet.
So either I am terribly wrong, or this might be quite useful information to share.
Will continue to experiment with the new API. Thus far I am finding the documentation a bit sparse on certain details, and my reading through the source code is not going very fast either. But I will share if I find anything else of interest and would welcome comments from people experienced with keras 2.
Hello radek,
you should be able to change the learning rate to x for a model y using:
import keras.backend as K
K.set_value(y.optimizer.lr, x)
This currently seems to be the safest way (see the related comment at the bottom of the Keras issue https://github.com/fchollet/keras/issues/898).
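Putting that together, here is a minimal runnable sketch. The toy model and the concrete rates are my own illustration, not from the thread, and this assumes a Keras 2 era install where the SGD argument is spelled lr (later Keras/TensorFlow versions renamed it to learning_rate):

```python
import keras.backend as K
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD

# Toy model purely for illustration.
model = Sequential([Dense(1, input_shape=(4,))])
model.compile(optimizer=SGD(lr=0.1), loss='mse')

# Read the current learning rate, then change it in place
# without recompiling or rebuilding the model.
old_lr = float(K.get_value(model.optimizer.lr))
K.set_value(model.optimizer.lr, 0.01)
new_lr = float(K.get_value(model.optimizer.lr))
print(old_lr, new_lr)
```

Because set_value only writes the backend variable holding the rate, any accumulated optimizer state (momentum, etc.) is left untouched, which is exactly why this is safer than recompiling.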
Regarding the API changes in Keras 2, I think checking the Keras 2 docs is often enough for applying the required changes. When in doubt, it is also very helpful to check the API interfaces in the Keras 2/Keras 1 source code.
Thank you for providing a link to the source. I am running into all sorts of issues and will check out your repository. For instance, the most recent issue I am facing:
C:\Users\redact\Downloads\fast.ai\deeplearning1\nbs\vgg16.py:213: UserWarning: The semantics of the Keras 2 argument steps_per_epoch is not the same as the Keras 1 argument samples_per_epoch. steps_per_epoch is the number of batches to draw from the generator at each epoch. Basically steps_per_epoch = samples_per_epoch/batch_size. Similarly nb_val_samples->validation_steps and val_samples->steps arguments have changed. Update your method calls accordingly.
validation_data=val_batches, nb_val_samples=val_batches.nb_sample)
C:\Users\redact\Downloads\fast.ai\deeplearning1\nbs\vgg16.py:213: UserWarning: Update your fit_generator call to the Keras 2 API: fit_generator(<keras.pre..., validation_data=<keras.pre..., steps_per_epoch=0, epochs=1, validation_steps=8)
validation_data=val_batches, nb_val_samples=val_batches.nb_sample)
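To make the warning above concrete, here is a small sketch of the Keras 1 to Keras 2 argument conversion it describes: sample counts become batch (step) counts. The helper name and the example numbers are my own illustration, not part of Keras:

```python
import math

def keras1_to_keras2_steps(samples_per_epoch, nb_val_samples, batch_size):
    """Convert Keras 1 sample counts into Keras 2 step (batch) counts.

    Keras 2's fit_generator expects steps_per_epoch and validation_steps,
    i.e. how many batches to draw per epoch, not how many samples.
    Rounding up ensures every sample is seen at least once per epoch.
    """
    steps_per_epoch = math.ceil(samples_per_epoch / batch_size)
    validation_steps = math.ceil(nb_val_samples / batch_size)
    return steps_per_epoch, validation_steps

# e.g. 23000 training and 2000 validation images with batches of 64:
steps, val_steps = keras1_to_keras2_steps(23000, 2000, 64)
print(steps, val_steps)  # -> 360 32
```

Note that the steps_per_epoch=0 in the warning above is a symptom of this mismatch: passing old Keras 1 keyword arguments leaves the new step counts unset or miscomputed.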
The update from @eljas on Apr 9 in the Keras 2 Released thread helped until this roadblock.
Update: using lesson1.ipynb, utils.py and vgg16.py from @Robi’s repository (see above), I was able to run lesson1 on my CPU laptop; it took about 90 minutes.
I think I managed to keep all modules at their currently available versions. Even after I thought I had all the modules, I still needed to upgrade anaconda from within conda as: