@surmenok @machinethink here’s the relevant code and the complete error message; sorry for the late reply.
from keras.callbacks import ReduceLROnPlateau, ModelCheckpoint

lr = ReduceLROnPlateau(monitor='val_loss', factor=0.4, patience=3, min_lr=0.00001)
chck = ModelCheckpoint('VGG16-transferlearning.h5', monitor='val_acc', save_best_only=True, save_weights_only=True)

history = model.fit_generator(
    train_datagen.flow(x_train, y_train, batch_size=batch_size),
    validation_data=valid_datagen.flow(x_valid, y_valid, batch_size=batch_size * 2),
    nb_val_samples=x_valid.shape[0],
    samples_per_epoch=x_train.shape[0],
    nb_epoch=epochs,
    callbacks=[chck, lr]
)
TypeError                                 Traceback (most recent call last)
in ()
      5     samples_per_epoch=x_train.shape[0]//5,
      6     nb_epoch=epochs,
----> 7     callbacks=[chck,lr]
      8 )

/home/nbuser/.local/lib/python2.7/site-packages/keras/engine/training.pyc in fit_generator(self, generator, samples_per_epoch, nb_epoch, verbose, callbacks, validation_data, nb_val_samples, class_weight, max_q_size, nb_worker, pickle_safe, initial_epoch)
   1598                     epoch_logs['val_' + l] = o
   1599
--> 1600             callbacks.on_epoch_end(epoch, epoch_logs)
   1601             epoch += 1
   1602             if callback_model.stop_training:

/home/nbuser/.local/lib/python2.7/site-packages/keras/callbacks.pyc in on_epoch_end(self, epoch, logs)
     74         logs = logs or {}
     75         for callback in self.callbacks:
---> 76             callback.on_epoch_end(epoch, logs)
     77
     78     def on_batch_begin(self, batch, logs=None):

/home/nbuser/.local/lib/python2.7/site-packages/keras/callbacks.pyc in on_epoch_end(self, epoch, logs)
    750     def on_epoch_end(self, epoch, logs=None):
    751         logs = logs or {}
--> 752         logs['lr'] = K.get_value(self.model.optimizer.lr)
    753         current = logs.get(self.monitor)
    754         if current is None:

/home/nbuser/.local/lib/python2.7/site-packages/keras/backend/theano_backend.pyc in get_value(x)
    907 def get_value(x):
    908     if not hasattr(x, 'get_value'):
--> 909         raise TypeError('get_value() can only be called on a variable. '
    910                         'If you have an expression instead, use eval().')
    911     return x.get_value()

TypeError: get_value() can only be called on a variable. If you have an expression instead, use eval().
My objective here is simple: monitor val_loss and reduce the learning rate automatically if val_loss doesn’t decrease for 3 epochs, while at the same time saving the best model weights via the ModelCheckpoint callback.
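To make the intent concrete, here is a minimal plain-Python sketch of the reduce-on-plateau behaviour I expect from the callback (this is an illustrative mock I wrote, not the Keras implementation; the class name PlateauReducer is made up):

```python
class PlateauReducer:
    """Mock of the behaviour I want: if the monitored loss fails to improve
    for `patience` consecutive epochs, multiply the learning rate by `factor`,
    never going below `min_lr`."""

    def __init__(self, factor=0.4, patience=3, min_lr=1e-5):
        self.factor = factor
        self.patience = patience
        self.min_lr = min_lr
        self.best = float('inf')  # best val_loss seen so far
        self.wait = 0             # epochs since the last improvement

    def on_epoch_end(self, val_loss, lr):
        if val_loss < self.best:
            self.best = val_loss
            self.wait = 0
        else:
            self.wait += 1
            if self.wait >= self.patience:
                lr = max(lr * self.factor, self.min_lr)
                self.wait = 0
        return lr


lr = 0.001
reducer = PlateauReducer(factor=0.4, patience=3, min_lr=1e-5)
# val_loss improves for two epochs, then stalls for three:
for val_loss in [1.0, 0.9, 0.9, 0.9, 0.9]:
    lr = reducer.on_epoch_end(val_loss, lr)
print(lr)  # reduced once: 0.001 * 0.4
```

The weight-saving side is orthogonal: on each epoch end, compare val_acc against the best so far and overwrite the checkpoint file only when it improves, which is what save_best_only=True asks ModelCheckpoint to do.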