Carvana-unet prediction for single image

I am following the carvana-unet notebook and repeating it with a different dataset (https://www.cityscapes-dataset.com). I am running into issues when creating a prediction method. Here’s what I currently have:

sz = 128

def get_data(sz):
    tfms = tfms_from_model(f, sz)    # f is the model architecture (e.g. resnet34)
    return ImageClassifierData.from_arrays(PATH, ims, ims, tfms=tfms)

data = get_data(sz)

predict = ConvLearner.pretrained(f, data, precompute=False)
predict.load('128urn-car-0')         # load the weights saved during training
predict.model.eval()
trn_tfms, val_tfms = tfms_from_model(f, sz)

fileName = im_names[0]
im = np.array(open_image(fileName))
im_transformed = val_tfms(im)        # apply the validation transforms to a single image
plt.imshow(im)

pr = predict.predict_array(im_transformed[None])   # add a batch dimension before predicting
print(pr)

This was based on this post.

This is the error I get: RuntimeError: Error(s) in loading state_dict for Sequential.


RuntimeError: Error(s) in loading state_dict for Sequential.

This is a common trap in PyTorch. My guess is that the error comes from this line: predict.load('128urn-car-0').

If that’s the case, the problem occurs between serializing (saving) and deserializing (loading) the model weights or state dict. A similar problem has been raised on the PyTorch forum.

Two possible causes:

  • PyTorch serialization is handled by pickle by default, which makes it sensitive to your Python module structure (the directory layout of your *.py module files).
  • You saved the model state dict together with the model architecture (graph) and later changed your model definition.

Have a look at the PyTorch serialization guide to understand the different serialization semantics.
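
To illustrate the second cause, here is a minimal sketch in plain PyTorch (the SmallNet module is hypothetical, just for the example) showing the difference between pickling the whole model object and saving only the state dict, and then loading the state dict into a freshly constructed model:

import torch
import torch.nn as nn

# Hypothetical model used only for this sketch
class SmallNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())

    def forward(self, x):
        return self.layers(x)

model = SmallNet()

# Fragile: pickles the whole model object, so loading it later depends on the
# original module/directory structure still being importable under the same names
torch.save(model, 'model_whole.pt')

# More robust: save only the weights (state dict) ...
torch.save(model.state_dict(), 'model_state.pt')

# ... and load them into a freshly constructed model with the same architecture
model2 = SmallNet()
model2.load_state_dict(torch.load('model_state.pt'))
model2.eval()

If the architecture you build at load time does not match the one used when saving (for example, a different head), load_state_dict raises exactly the kind of “Error(s) in loading state_dict” message shown above.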

What I explained above is mostly what I learned from this SO post (which, strangely, is hard to find via Google search).

I had this same error when using the SSD model. The issue was that the model data object has to be recreated in exactly the same way as when the model was trained. So if there is a custom head, you must add that same custom head to the pretrained ConvLearner when you load the model, along with the same transforms and everything else.
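
Something like this, for example (a sketch against the old fastai 0.7 API; custom_head_definition is a placeholder for whatever head was actually used at training time):

# Rebuild the learner exactly as it was built for training,
# including the same custom head, before loading the weights
f = resnet34                                  # same backbone architecture as at training time
data = get_data(sz)                           # same model data construction as at training time

custom_head = custom_head_definition          # placeholder: the exact head used for training
learn = ConvLearner.pretrained(f, data, custom_head=custom_head, precompute=False)

learn.load('128urn-car-0')                    # now the state dict keys and shapes match
learn.model.eval()

If the head, the cut point, or any extra layers differ from the training-time setup, the saved weights will not line up with the new model and load will fail with the same state_dict error.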