Passing multiple callbacks in Keras (EarlyStopping, ModelCheckpoint, LearningRateScheduler)

@jeremy I want to pass LearningRateScheduler and ModelCheckpoint in callbacks for two reasons:

  1. I want to use the best model instead of the last epoch's result.
  2. At some point my validation loss stops decreasing, so I want to lower the learning rate, which LearningRateScheduler can do.

Please help me with this.

Suggestion: use ReduceLROnPlateau and ModelCheckpoint. You just need to set the variable to monitor ('val_loss' should be fine), besides the filename and the reduction factor.
The Keras documentation is quite self-explanatory :slight_smile:
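For reference, passing multiple callbacks is just a matter of putting them in one list — internally Keras calls each callback in the list at the end of every epoch. Here is a minimal pure-Python sketch of that dispatch (the stub classes `FakeReduceLR`/`FakeCheckpoint` and the loop are made up for illustration; the real classes live in `keras.callbacks`):

```python
# Sketch of how a list of callbacks is dispatched each epoch.
# These stubs only imitate the Keras callback protocol.

class Callback:
    def on_epoch_end(self, epoch, logs=None):
        pass

class FakeReduceLR(Callback):
    def __init__(self):
        self.calls = []
    def on_epoch_end(self, epoch, logs=None):
        self.calls.append(("reduce_lr", epoch))

class FakeCheckpoint(Callback):
    def __init__(self):
        self.calls = []
    def on_epoch_end(self, epoch, logs=None):
        self.calls.append(("checkpoint", epoch))

def run_training(callbacks, epochs=2):
    # Keras does essentially this inside fit()/fit_generator():
    for epoch in range(epochs):
        logs = {"val_loss": 1.0 / (epoch + 1)}
        for cb in callbacks:          # every callback in the list fires
            cb.on_epoch_end(epoch, logs)

reduce_lr, checkpoint = FakeReduceLR(), FakeCheckpoint()
run_training([reduce_lr, checkpoint])  # like callbacks=[reduce_lr, checkpoint]
print(reduce_lr.calls)                 # [('reduce_lr', 0), ('reduce_lr', 1)]
print(checkpoint.calls)                # [('checkpoint', 0), ('checkpoint', 1)]
```

So there is no special machinery needed on your side: build the list, hand it to `fit`/`fit_generator` via the `callbacks` argument, and each one gets invoked in order.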


@DavideBoschetto thanks! But the thing I want to know is how to pass multiple callbacks, like ReduceLROnPlateau and ModelCheckpoint, in the callbacks parameter. For example:

reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.2,
                              patience=5, min_lr=0.001)
chckpoint = ModelCheckpoint(filepath, monitor='val_loss', verbose=0, save_best_only=False,
                            save_weights_only=False, mode='auto', period=1)
callbacks = [reduce_lr, chckpoint]

This gives an error.

Please tell me the right solution.

The syntax of the 'fit' call seems to be correct.
Please share more details, including the error message you got.

You didn’t say what the error was but for what it’s worth, Keras currently gives an error when using the ModelCheckpoint callback if I’ve manually set the learning rate. ¯\_(ツ)_/¯

What error does it give you?

For me the error is that lr is not a tensor but a float. This happens when I do something like model.optimizer.lr = 0.01. I'm just wondering if something similar happens with @noskill's error.
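To illustrate why that assignment breaks things, here is a toy reproduction (not Keras itself — `SharedVariable` is a made-up stand-in for a Theano shared variable, and `get_value` mirrors the check in the Theano backend):

```python
# Toy reproduction of the backend check: a shared variable has a
# .get_value() method, while a bare Python float does not.

class SharedVariable:
    """Stand-in for a Theano shared variable holding the learning rate."""
    def __init__(self, value):
        self._value = value
    def get_value(self):
        return self._value
    def set_value(self, value):
        self._value = value

def get_value(x):
    # Same duck-typing check the backend performs.
    if not hasattr(x, "get_value"):
        raise TypeError("get_value() can only be called on a variable. "
                        "If you have an expression instead, use eval().")
    return x.get_value()

lr_variable = SharedVariable(0.01)
print(get_value(lr_variable))   # 0.01 -- works: lr is still a variable

try:
    get_value(0.01)             # what happens after `optimizer.lr = 0.01`
except TypeError as e:
    print("TypeError:", e)
```

Once the lr attribute has been rebound to a plain float, any callback that reads it through the backend (as ReduceLROnPlateau does every epoch) hits exactly this TypeError.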

Please try this to set the learning rate:

K.set_value(model.optimizer.lr, 1e-5)


Thanks. :smiley: I’m still curious to see what @noskill’s error was, though. :wink:

@surmenok @machinethink here's the relevant code and the complete error message, and sorry for the late reply -

from keras.callbacks import ReduceLROnPlateau, ModelCheckpoint

lr = ReduceLROnPlateau(monitor='val_loss', factor=0.4, patience=3, min_lr=0.00001)
chck = ModelCheckpoint('VGG16-transferlearning.h5', monitor='val_acc',
                       save_best_only=True, save_weights_only=True)

model.fit_generator(train_datagen.flow(x_train, y_train, batch_size=batch_size),
                    validation_data=valid_datagen.flow(x_valid, y_valid, batch_size=batch_size*2),
                    nb_val_samples=x_valid.shape[0],
                    samples_per_epoch=x_train.shape[0]//5,
                    nb_epoch=epochs,
                    callbacks=[chck, lr]
                    )

TypeError                                 Traceback (most recent call last)
<ipython-input> in <module>()
      5     samples_per_epoch=x_train.shape[0]//5,
      6     nb_epoch=epochs,
----> 7     callbacks=[chck, lr]
      8     )

/home/nbuser/.local/lib/python2.7/site-packages/keras/engine/training.pyc in fit_generator(self, generator, samples_per_epoch, nb_epoch, verbose, callbacks, validation_data, nb_val_samples, class_weight, max_q_size, nb_worker, pickle_safe, initial_epoch)
   1598                     epoch_logs['val_' + l] = o
-> 1600             callbacks.on_epoch_end(epoch, epoch_logs)
   1601             epoch += 1
   1602             if callback_model.stop_training:

/home/nbuser/.local/lib/python2.7/site-packages/keras/callbacks.pyc in on_epoch_end(self, epoch, logs)
     74         logs = logs or {}
     75         for callback in self.callbacks:
---> 76             callback.on_epoch_end(epoch, logs)
     78     def on_batch_begin(self, batch, logs=None):

/home/nbuser/.local/lib/python2.7/site-packages/keras/callbacks.pyc in on_epoch_end(self, epoch, logs)
    750     def on_epoch_end(self, epoch, logs=None):
    751         logs = logs or {}
--> 752         logs['lr'] = K.get_value(self.model.optimizer.lr)
    753         current = logs.get(self.monitor)
    754         if current is None:

/home/nbuser/.local/lib/python2.7/site-packages/keras/backend/theano_backend.pyc in get_value(x)
    907 def get_value(x):
    908     if not hasattr(x, 'get_value'):
--> 909         raise TypeError('get_value() can only be called on a variable. '
    910                         'If you have an expression instead, use eval().')
    911     return x.get_value()

TypeError: get_value() can only be called on a variable. If you have an expression instead, use eval().

I just want to monitor val_loss here and reduce the lr automatically if val_loss doesn't decrease for 3 epochs. At the same time I want to save the best model weights via the ModelCheckpoint callback. That's my objective.

Do you have any code that explicitly changes the learning rate? E.g. code like this: model.optimizer.lr = 0.01

If yes, then that could be the issue, because the code of ReduceLROnPlateau expects model.optimizer.lr to be an instance of the Variable class. If you have such code, try to replace it with a call to the set_value function, like this:

K.set_value(model.optimizer.lr, 1e-5)
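The key difference is that set_value updates the variable in place, while plain assignment rebinds the attribute to a float. A toy sketch of the two behaviours (`SharedVariable` and `ToyOptimizer` are made-up stand-ins for a Theano shared variable and `model.optimizer`; the in-place call plays the role of `K.set_value`):

```python
# Toy illustration (not Keras itself) of assignment vs. set_value.

class SharedVariable:
    """Stand-in for a Theano shared variable holding the learning rate."""
    def __init__(self, value):
        self._value = value
    def get_value(self):
        return self._value
    def set_value(self, value):
        self._value = value

class ToyOptimizer:
    """Stand-in for model.optimizer: lr starts out as a variable."""
    def __init__(self, lr):
        self.lr = SharedVariable(lr)

# Direct assignment rebinds .lr to a plain float, so callbacks that later
# read it through the backend hit the TypeError shown above:
broken = ToyOptimizer(0.01)
broken.lr = 1e-5
print(hasattr(broken.lr, "get_value"))   # False -- no longer a variable

# set_value mutates the existing variable, so it stays a variable:
fixed = ToyOptimizer(0.01)
fixed.lr.set_value(1e-5)                 # the K.set_value-style fix
print(hasattr(fixed.lr, "get_value"), fixed.lr.get_value())   # True 1e-05
```

That is why switching to K.set_value makes ReduceLROnPlateau (which reads the lr back every epoch) work again.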

It worked. Thanks @surmenok.