In lesson 7, when setting the size of our data, the text says "We'll begin with a small size and use gradual resizing." How does that gradual resizing work?
I've been able to generate decent images at the default 64x64 size, and I'd like to scale up to 128x128. Can I somehow append layers to the existing network while keeping the old weights, basically hacking together a ProGAN?
Thank you in advance!
iirc he's talking about gradually increasing the size of the images, as covered in previous lessons.
you train until the network is getting close to overfitting, then create a new DataBunch with larger images and train some more. at that point it's like transfer learning on new data with a pre-trained network. you don't change the size of the net at all.
if you go back and have a look at the lesson again you should find that’s what he’s doing.
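to make the idea concrete, here's a minimal sketch of that loop. the helper names (`make_databunch`, the tiny `Learner` class) are made up stand-ins, not fastai's actual API; the point is just that you swap the data on the same learner and keep training:

```python
# Hypothetical sketch of progressive resizing (helper names are made up):
# train at a small size first, then swap a larger-image DataBunch into
# the same learner and keep training with the weights you already have.

def make_databunch(size):
    """Stand-in for creating a DataBunch at a given image size."""
    return f"databunch({size}x{size})"

class Learner:
    """Minimal stand-in for a fastai-style learner."""
    def __init__(self, data):
        self.data = data
        self.log = []

    def fit(self, epochs):
        # record what data we trained on, and for how long
        self.log.append((self.data, epochs))

learn = Learner(make_databunch(64))
learn.fit(10)                      # train until close to overfitting

learn.data = make_databunch(128)   # same net, bigger images
learn.fit(5)                       # like transfer learning on new data

print(learn.log)
```

the network itself never changes shape; only the data pipeline does.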
Thanks for your reply! The issue is that with GANs the image size is baked into the network when you create the basic_critic and basic_generator. The number of layers is a function of image size, so passing data with a different image size kicks up an error.
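To illustrate what I mean about depth depending on size: a DCGAN-style critic typically keeps stacking stride-2 conv blocks until the feature map is down to some small size. This is a rough sketch of that layer-count logic, not the exact fastai source:

```python
# Illustrative only: how a DCGAN-style critic (like fastai's basic_critic)
# roughly decides its depth. Each stride-2 conv block halves the feature
# map, and blocks are added until it reaches some small final size.

def n_downsample_blocks(size, final_size=4):
    """Number of stride-2 blocks needed to shrink `size` to `final_size`."""
    n = 0
    while size > final_size:
        size //= 2
        n += 1
    return n

print(n_downsample_blocks(64))   # 64 -> 32 -> 16 -> 8 -> 4
print(n_downsample_blocks(128))  # 128 -> 64 -> 32 -> 16 -> 8 -> 4
```

So a critic built for 64x64 has one fewer block than one built for 128x128, which is why feeding it differently sized data errors out.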
i don’t see any gradual resizing in lesson 7 or in the notes.
not sure if you're still interested, but i'm actually playing with the lesson 7 super-res gan notebook for something and i realise now what you're talking about. i've just noticed that right at the bottom, after training the gan (gen & critic together) for 40 epochs, he literally just changes the data in the learner to a larger size and trains some more:
```
lr = 1e-4
```
I thought this was a no-no with GANs tbh; I thought the input/output size was fixed from the beginning. I assume this is the transfer-learning trick you use to "start again" with a pre-trained network, one pre-trained on your own data rather than just ImageNet.
I can't remember tbh, it's been a while since I watched lesson 7. If that's what it is, then I've just been a dumb-ass and trained all the way through on 256px images. I could have started smaller to speed things up at the beginning and bumped the size up towards the end.
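Back-of-the-envelope on what that would have saved: per-image pixel count (a rough proxy for per-epoch conv cost) grows quadratically with image size, so the early epochs at 64px would have been far cheaper than at 256px:

```python
# Rough cost comparison: pixels per image grow with the square of the
# side length, so each 256px epoch touches many more pixels than a
# 64px one. (A crude proxy; real wall-clock cost depends on the model.)

def pixel_ratio(small, large):
    """How many times more pixels a `large`-sized image has."""
    return (large * large) / (small * small)

print(pixel_ratio(64, 256))   # 256^2 / 64^2 = 16.0
```

So roughly 16x more per-image work at 256px than at 64px, which is why starting small and resizing up is attractive.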