Part 2 Lesson 13 Wiki

(Sam) #191

It has been a while, so I do not remember exactly, but I think I changed the definition of actn_loss2:

I changed the line out = V(sf.features)

to out = V(sf.features, requires_grad=True)

The model did not converge. See if it does for you.
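For context, the pattern under discussion is the notebook's forward-hook loss: a hook object stores a layer's output, and the loss compares it to a fixed target. The sketch below is a minimal stand-in, assuming the shape of the lesson code (SaveFeatures, actn_loss2); the tiny net replaces the VGG trunk, and targ replaces the precomputed content activations. The key point is that the hooked output must stay inside the autograd graph for gradients to reach the input image.

```python
import torch
import torch.nn.functional as F
from torch import nn

class SaveFeatures:
    """Forward hook that stores a layer's output, keeping it in the autograd graph."""
    def __init__(self, module):
        self.hook = module.register_forward_hook(self.hook_fn)
    def hook_fn(self, module, inp, out):
        self.features = out          # keep the live tensor; do not detach or re-wrap
    def close(self):
        self.hook.remove()

# Tiny stand-in for the VGG trunk used in the notebook
net = nn.Sequential(nn.Conv2d(3, 4, 3, padding=1), nn.ReLU(),
                    nn.Conv2d(4, 4, 3, padding=1))
sf = SaveFeatures(net[1])

targ = torch.randn(1, 4, 8, 8)       # fixed target activations (no grad needed)

def actn_loss2(x):
    net(x)                            # forward pass fills sf.features via the hook
    return F.mse_loss(sf.features, targ) * 1000

img = torch.randn(1, 3, 8, 8, requires_grad=True)
loss = actn_loss2(img)
loss.backward()
assert img.grad is not None           # graph intact: gradients reach the image
```

If the hook output is detached (or wrapped into a fresh tensor) before the loss, the backward pass has nothing to flow through, which is the class of error discussed below.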

(James_Ying) #192

I found the reason.
Changing one line in fastai/dl2

will fix:
RuntimeError: element 0 of variables does not require grad and does not have a grad_fn



(Sam) #193

When I make that change locally,
I get an error at
optimizer = optim.LBFGS([opt_img_v], lr=0.5)

ValueError: can’t optimize a non-leaf Tensor
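This error comes from a check PyTorch optimizers make: every parameter they are given must be a leaf tensor (one created directly by the user, not the result of an operation). A minimal demonstration, independent of the notebook:

```python
import torch
from torch import optim

w = torch.randn(3, requires_grad=True)   # leaf: created directly by the user
y = w * 2                                # non-leaf: the result of an operation

opt = optim.LBFGS([w], lr=0.5)           # fine: w is a leaf

msg = ""
try:
    optim.LBFGS([y], lr=0.5)             # rejected by the optimizer
except ValueError as e:
    msg = str(e)

assert w.is_leaf and not y.is_leaf
assert "non-leaf" in msg                 # "can't optimize a non-leaf Tensor"
```

So if opt_img_v was produced by some transformation of another tensor that requires grad, it stops being a leaf and LBFGS refuses it.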

Instead, in the definition of actn_loss2 I changed the line
out = V(sf.features)
to out = torch.tensor(sf.features, requires_grad=True)

Now the model converges nicely.

I also changed a line in the definition of style_loss:
outs = [V(o.features) for o in sfs] to outs = [torch.tensor(o.features, requires_grad=True) for o in sfs]

and the same line in the definition of comb_loss:
outs = [V(o.features) for o in sfs] to outs = [torch.tensor(o.features, requires_grad=True) for o in sfs]
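A side note on why this change makes LBFGS happy: torch.tensor(existing_tensor, requires_grad=True) copies the data and detaches it from any prior graph, so the result is a fresh leaf. Current PyTorch recommends spelling the same thing as clone().detach().requires_grad_(True). A small sketch of the behavior:

```python
import torch

x = torch.randn(2, 2, requires_grad=True)
feat = x * 3                                   # non-leaf: lives inside x's graph

# Detached copy -> a fresh leaf tensor, independent of x's graph
out = feat.clone().detach().requires_grad_(True)
assert out.is_leaf and out.requires_grad
assert out.grad_fn is None                     # no history carried over from feat

out.sum().backward()
assert x.grad is None                          # gradients do NOT flow back to x
```

The last assertion is the trade-off: a detached copy is a valid leaf, but it also cuts the gradient path back to whatever produced the original features.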

(James_Ying) #194

Thanks for that.
Running comb_loss now gives no error,
but the earlier sections (1. Style transfer, 2. Forward hook, 3. Style match) will still raise errors.

But only opt_img_v needs requires_grad=True;
all the weights in every layer should stay fixed.

There are three images:
0 the noise image
1 the original image (the content image)
2 the style image
We need to freeze the whole neural network, but update the noise image every step,
so that the noise image comes to look similar to both the original image and the style image.
A normal network updates its weights, but here we update the noise image (image 0) instead.
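The setup described above can be sketched end to end. This is a minimal, self-contained stand-in (a tiny frozen network and random images, not the notebook's VGG): all weights have requires_grad=False, the image is the only leaf handed to LBFGS, and each step moves the image toward the fixed target activations.

```python
import torch
import torch.nn.functional as F
from torch import nn, optim

torch.manual_seed(0)

# Tiny stand-in network; all weights frozen, as in style transfer
net = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())
for p in net.parameters():
    p.requires_grad_(False)

content = torch.randn(1, 3, 16, 16)
with torch.no_grad():
    target = net(content)                    # fixed content activations

# Image 0: the noise image, the only tensor we optimize
opt_img = torch.randn(1, 3, 16, 16, requires_grad=True)
optimizer = optim.LBFGS([opt_img], lr=0.5)

def closure():
    optimizer.zero_grad()
    loss = F.mse_loss(net(opt_img), target)
    loss.backward()                          # gradient lands on opt_img only
    return loss

before = F.mse_loss(net(opt_img), target).item()
for _ in range(3):
    optimizer.step(closure)
after = F.mse_loss(net(opt_img), target).item()
assert after < before                        # the noise image moved toward the content
```

The real notebook adds a style term (Gram-matrix losses over several layers) to the same loop, but the mechanics are identical: frozen weights, one trainable image.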

outs is the list of activations you get by running the input image through the network and reading the output at each of the block_ends layers.
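That multi-layer collection can be sketched with forward hooks. Here block_ends and the tiny network are hypothetical placeholders for the notebook's chosen VGG block boundaries; one hook per layer fills a dict, and outs is read out after a single forward pass.

```python
import torch
from torch import nn

# Hypothetical stand-in network; block_ends = indices of the layers to tap
net = nn.Sequential(nn.Conv2d(3, 4, 3, padding=1), nn.ReLU(),
                    nn.Conv2d(4, 8, 3, padding=1), nn.ReLU())
block_ends = [1, 3]

acts = {}
def make_hook(i):
    def hook(module, inp, out):
        acts[i] = out                        # record this layer's activations
    return hook

hooks = [net[i].register_forward_hook(make_hook(i)) for i in block_ends]

x = torch.randn(1, 3, 8, 8)
net(x)                                       # one forward pass fills all hooks
outs = [acts[i] for i in block_ends]         # one activation tensor per block end

assert [o.shape[1] for o in outs] == [4, 8]
for h in hooks:
    h.remove()                               # clean up when done
```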

What does your image look like? Can you show me your result?

(Sam) #195

The images look good, similar to what Jeremy has in his notebook.

(Phil Weslow) #196

I am having the same issue.

I have downloaded the data to '…/fastai/courses/dl2/data0/datasets/cyclegan/horse2zebra'. Since that did not work, I also placed a copy of /data0/datasets/... in my home directory.

I can’t seem to figure out what’s causing the hang-up.