Failing to Generate Unet Image

Clearly, outputting a 1x3 vector cannot be right, but I have no idea how to generate an image from the model code below. What am I doing wrong when I instantiate the model, and what has to change? Thank you very much!

arch = models.resnet34
bs,size=32,128

data_path = os.getcwd() + '/data'
data = (ImageImageList.from_folder(data_path).split_by_rand_pct(0.1, seed=42)
.label_from_func(lambda x: data_path/x)
.transform(get_transforms(), size=size, tfm_y=True)
.databunch(bs=1).normalize(imagenet_stats, do_y=True))
data.c = 3

wd = 1e-3
learn_gen = unet_learner(data, arch, wd=wd, loss_func=F.l1_loss, callback_fns=LossMetrics,
blur=True, norm_type=NormType.Weight)

model_path = os.getcwd() + '/model'
state = torch.load(model_path + '/Gen_Test.pth', map_location=torch.device('cpu'))
state = state['model']
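Before assuming the load worked, it's worth checking whether the checkpoint's keys actually match the model being built: load_state_dict(..., strict=False) silently skips every mismatched key, so a model can appear to load while keeping its random init. A small stdlib sketch (key_mismatch is a hypothetical helper; the key names in the comment are made up for illustration):

```python
def key_mismatch(model_keys, ckpt_keys):
    """Compare a model's state_dict() keys against a checkpoint's keys.

    strict=False silently drops both kinds of mismatch, so large
    'missing'/'unexpected' sets mean the checkpoint belongs to a
    different architecture than the model it is loaded into.
    """
    model_keys, ckpt_keys = set(model_keys), set(ckpt_keys)
    missing = sorted(model_keys - ckpt_keys)     # params left at random init
    unexpected = sorted(ckpt_keys - model_keys)  # saved weights never used
    return missing, unexpected

# usage against the real objects would be:
# missing, unexpected = key_mismatch(model.state_dict(), state)
```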

I AM SURE THE ERROR IS HERE:

body = create_body(models.resnet34, True, None)
data_classes=data.c
nf = callbacks.hooks.num_features_model(body) * 2
head = create_head(nf, data_classes, None, ps=0.5, bn_final=False)
model = nn.Sequential(body, head)

model.load_state_dict(state, strict=False)
model.eval()
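If the checkpoint was saved from the unet_learner above, the likely fix is to load it into the learner's own DynamicUnet (learn_gen.model) rather than building a fresh create_body/create_head classifier, whose pooling head collapses everything to a 1x3 vector by construction. A sketch, assuming fastai v1 and the variables defined earlier (load_generator_weights is a hypothetical helper):

```python
def load_generator_weights(learn, checkpoint):
    """Load a saved generator state dict into the learner's own model.

    Using strict=True (the default) makes a key mismatch fail loudly
    instead of silently keeping random weights.
    """
    # fastai checkpoints saved via learn.save() nest weights under 'model'
    state = checkpoint["model"] if "model" in checkpoint else checkpoint
    learn.model.load_state_dict(state)
    learn.model.eval()
    return learn.model

# with the post's objects (assumed), instead of the nn.Sequential above:
# model = load_generator_weights(learn_gen,
#                                torch.load(model_path + '/Gen_Test.pth',
#                                           map_location='cpu'))
```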

Prediction:

fn = data.valid_ds.x.items[0]
img = open_image(fn)

img_path = os.getcwd() + '/data' + '/ZenH_02658_f_24_i_nf_nc_hp_2014_1_e0_nl_o.jpg'
image = Image.open(img_path)
image = ToTensor()(image).unsqueeze(0) # unsqueeze to add artificial first dimension
x = model(image)
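One more thing worth checking: the databunch normalizes with imagenet_stats (and do_y=True), but this prediction path feeds a raw ToTensor image into the model, at the original size rather than size=128. The input likely needs the same per-channel normalization (and the output the inverse) to look right. A NumPy sketch using the standard ImageNet statistics:

```python
import numpy as np

# standard ImageNet per-channel statistics (the values behind imagenet_stats)
IMAGENET_MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
IMAGENET_STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def normalize_chw(x):
    """Normalize a (3, H, W) array in [0, 1] with ImageNet stats."""
    return (x - IMAGENET_MEAN[:, None, None]) / IMAGENET_STD[:, None, None]

def denormalize_chw(x):
    """Invert normalize_chw, e.g. on the model's output before display."""
    return x * IMAGENET_STD[:, None, None] + IMAGENET_MEAN[:, None, None]
```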

to_pil = transforms.ToPILImage()
fig=plt.figure(figsize=(10,10))
image = to_pil(x)
plt.imshow(image)
plt.show()
exit()
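Also, ToPILImage expects a 3-D (C, H, W) tensor, while x here is 4-D with a batch dimension, so to_pil(x) will fail even once the model outputs an image. A NumPy sketch of the squeeze/clamp/convert steps (assuming output values roughly in [0, 1] after any de-normalization):

```python
import numpy as np

def output_to_uint8_hwc(x):
    """Turn a (1, 3, H, W) float array in ~[0, 1] into an (H, W, 3) uint8 array."""
    chw = np.clip(x[0], 0.0, 1.0)       # drop the batch dim, clamp to valid range
    hwc = np.transpose(chw, (1, 2, 0))  # CHW -> HWC, the layout PIL expects
    return (hwc * 255).round().astype(np.uint8)

# then display with:
# from PIL import Image
# Image.fromarray(output_to_uint8_hwc(x.detach().numpy())).show()
```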

Can you post the full stack trace for the error? Also, are you able to run data.show_batch()?