Adjusting learning rates and cleaning up data not working as I hoped!

You don’t need to put `predict` in a loop.

I didn’t quite understand exactly what you want to do, but here are some options:

  1. If you are afraid of making your model worse, save the weights first (`learn.save`).

  2. If you want to keep training your model on another dataset, save the weights, instantiate a new learner with the other dataset, load the weights, and continue training.

  3. If you want to predict on a single image, do (more or less) the following:

```python
img = open_image('/path/to/your_image.jpg')
losses = img.predict(learn)  # learn is your learner
prediction = learn.data.classes[losses.argmax()]

print(prediction)
```
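To make the last step concrete, here is a minimal, framework-free sketch of what the `classes[losses.argmax()]` mapping does; the scores and class names below are made up purely for illustration:

```python
# Hypothetical per-class scores, as a model might output for one image
scores = [0.1, 0.7, 0.2]
classes = ['cat', 'dog', 'horse']  # stand-in for learn.data.classes

# argmax: index of the highest score
best_idx = max(range(len(scores)), key=lambda i: scores[i])
prediction = classes[best_idx]

print(prediction)  # dog
```

The predicted label is simply the class name at the position of the highest score.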

For more interesting things, look at: How to get an empty ConvLearner for single image prediction?

TIP: You keep doing too few epochs. Way too few: in fact, you are underfitting both before and after fine-tuning.
