Hello Nicolas, I looked at your notebook to avoid the memory problem in Lesson 2 caused by get_data(). It's great that you tried it in a different notebook; it would have saved me a day's effort if I had seen it earlier. I stumbled upon the same approach you took, namely calling model.predict_generator().
Is there any reason why you set shuffle=False when calling get_batches()?
I get 97% validation accuracy as expected with shuffle=False. But when I call get_batches() with shuffle=True, both my training and validation accuracy drop to around 50%.
I have no clue why this happens. My understanding is that shuffle=True on the training data should add randomness and help the model train to a better accuracy, but instead the accuracy drops.
I've quoted the code here for a better understanding of the problem:
batch_size = 64
train_batches = get_batches(trainpath, batch_size=batch_size, shuffle=False)
valid_batches = get_batches(validpath, batch_size=batch_size, shuffle=False)

def onehot(x): return np.array(OneHotEncoder().fit_transform(x.reshape(-1, 1)).todense())

train_labels = onehot(train_batches.classes)
valid_labels = onehot(valid_batches.classes)

vgg = Vgg16()
model = vgg.model
train_features = model.predict_generator(train_batches, val_samples=train_batches.N)
valid_features = model.predict_generator(valid_batches, val_samples=valid_batches.N)

lm = Sequential([Dense(2, activation='softmax', input_shape=(1000,))])
lm.compile(optimizer=RMSprop(lr=0.1), loss='categorical_crossentropy', metrics=['accuracy'])
lm.fit(train_features, train_labels, batch_size=batch_size, nb_epoch=3,
       validation_data=(valid_features, valid_labels))
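For reference, the onehot helper just turns the integer class labels into one-hot vectors. A minimal standalone sketch of what it produces (sklearn's OneHotEncoder, with made-up labels in place of batches.classes):

```python
import numpy as np
from sklearn.preprocessing import OneHotEncoder

# Same helper as above: integer labels -> dense one-hot matrix
def onehot(x):
    return np.array(OneHotEncoder().fit_transform(x.reshape(-1, 1)).todense())

labels = np.array([0, 1, 1, 0])   # made-up class indices, like batches.classes
encoded = onehot(labels)
print(encoded)
# [[1. 0.]
#  [0. 1.]
#  [0. 1.]
#  [1. 0.]]
```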
After training for 3 epochs I get training and validation accuracy of around 97%.
Then I restarted the kernel, set shuffle=True, and re-ran the model:
train_batches = get_batches(trainpath, batch_size=batch_size, shuffle=True)
valid_batches = get_batches(validpath, batch_size=batch_size, shuffle=True)
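To make the setup concrete, here is a toy sketch (plain numpy, fake data) of how the features and labels end up paired in lm.fit. Note that batches.classes always lists labels in directory order, while with shuffle=True the generator yields samples, and hence predict_generator returns feature rows, in a shuffled order:

```python
import numpy as np

rng = np.random.RandomState(42)

# Fake per-sample labels in directory order (what batches.classes gives)
classes = np.array([0] * 50 + [1] * 50)

# With shuffle=True the generator yields samples in a random order,
# so the rows returned by predict_generator follow that order instead
yield_order = rng.permutation(len(classes))
labels_of_returned_features = classes[yield_order]

# Fraction of rows whose true class still matches the label it gets
# paired with in fit(train_features, train_labels)
agreement = (labels_of_returned_features == classes).mean()
print(agreement)  # roughly 0.5 for a balanced two-class set
```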
Any suggestions on why there is such a drop in accuracy?