After fiddling around with the Vgg16 model provided by Jeremy, I came to the conclusion that I was just turning the knobs without really understanding the nuts and bolts. So I figured the best way to learn would be to build my own model in Keras for the cats & dogs redux competition.
My first model looks like this:
model = Sequential([
    Flatten(input_shape=(3,224,224)),
    Dense(100),
    Activation('relu'),
    Dense(2),
    Activation('softmax')
])
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
The result:
model.fit_generator(generator=train_batches,
                    samples_per_epoch=train_batches.nb_sample,
                    validation_data=val_batches,
                    nb_val_samples=val_batches.nb_sample,
                    nb_epoch=1)
Epoch 1/1
22500/22500 [==============================] - 267s - loss: 8.0619 - acc: 0.4998 - val_loss: 8.2267 - val_acc: 0.4896
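One thing I noticed about that loss value: 8.0619 is almost exactly half of -ln(1e-7) ≈ 16.12. Assuming Keras's default epsilon of 1e-7 (used to clip predicted probabilities before taking the log in the crossentropy), that would mean the model is predicting one class with near-total confidence and is simply wrong on about half the samples, which matches the 0.50 accuracy:

```python
import math

# Keras clips predicted probabilities to [eps, 1 - eps] before taking
# the log in categorical_crossentropy; eps defaults to 1e-7.
eps = 1e-7
max_loss = -math.log(eps)   # loss for a fully confident wrong prediction
print(max_loss)             # ~16.118
print(0.5 * max_loss)       # ~8.059, very close to my observed 8.0619
```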
So I tried another simple model, this time with a convolutional layer.
model = Sequential([
    Convolution2D(64, 3, 3, input_shape=(3,224,224)),
    Activation('relu'),
    Flatten(),
    Dense(2),
    Activation('softmax')
])
Again, the same thing: accuracy 0.5. During training the accuracy always hovers around 0.5 and doesn't improve, and training for more epochs doesn't help.
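As an aside, the Flatten() in this second model feeds an enormous Dense layer. A quick count (assuming the Keras 1.x default border_mode='valid', so a 3x3 convolution on a 224x224 input gives 222x222 feature maps):

```python
# Convolution2D(64, 3, 3) with the default border_mode='valid' on a
# 3x224x224 input gives 64 feature maps of 222x222 (224 - 3 + 1 = 222).
features = 64 * 222 * 222        # size of the Flatten() output
dense_params = features * 2 + 2  # weights plus biases in Dense(2)
print(features)                  # 3154176
print(dense_params)              # 6308354
```

So the final layer alone has over six million parameters, in case that matters for the diagnosis.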
I also tried the vgg-like convnet example from the Keras Getting started documentation ( https://keras.io/getting-started/sequential-model-guide/#examples ).
Here I see the same effect: during training the accuracy hovers around 0.5 and never improves.
I tried changing the optimizer and changing the learning rate, but I always get a similar result: none of the models ever improve, and they all converge to an accuracy of 0.5. So clearly I must be doing something wrong, but I cannot figure out what.
Anybody have any clue?
Tim