Converting a fp16 model to fp32 for a GAN?

It looks like the fastai GANLearner does not support fp16. Is it possible to first train the generator and critic in fp16 to speed up training, and then convert them to fp32 for the GAN portion of the training? I feel like this could really speed everything up.

I feel like this shouldn't be impossible, yet with all of my Google skills I can't seem to find an answer.

model = convert_network(model, dtype=torch.float32)

I found this while searching around on the web; it's from the fastai fp16 callback. Would this be able to convert a model to fp32 so it can then be trained in a GAN in fastai?
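For what it's worth, the underlying operation is just a dtype cast on the model's parameters. Here is a minimal sketch in plain PyTorch (not the fastai helper itself, and with a toy model rather than a real generator) showing that casting back to fp32 is straightforward:

```python
import torch
import torch.nn as nn

# Toy stand-in for a generator/critic that was trained in fp16.
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
model = model.half()
assert next(model.parameters()).dtype == torch.float16

# Cast every parameter and buffer back to fp32. This is the same idea as
# convert_network(model, dtype=torch.float32), minus any fastai-specific
# handling (e.g. of batchnorm layers).
model = model.float()
print(next(model.parameters()).dtype)  # torch.float32
```

After this cast the model expects fp32 inputs again, so it can be dropped into a normal fp32 training loop.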

You should be able to just do learn.to_fp32()

Oh darn, really? I will give it a shot. So I will load the fp16 models, call to_fp32 on them, and then load them into the GAN for training.
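The full round trip can be sketched in plain PyTorch (this is roughly what the learner does under the hood when you call `learn.to_fp32()`; the module and buffer here are illustrative stand-ins for a real checkpoint file and generator):

```python
import io
import torch
import torch.nn as nn

# Pretend this generator was trained in fp16 and checkpointed.
generator = nn.Sequential(nn.Linear(4, 4)).half()
buffer = io.BytesIO()                        # stands in for a .pth file on disk
torch.save(generator.state_dict(), buffer)

# Reload the fp16 weights into a fresh fp16 model...
buffer.seek(0)
restored = nn.Sequential(nn.Linear(4, 4)).half()
restored.load_state_dict(torch.load(buffer))

# ...then cast to fp32 before handing it to the GAN training stage.
restored = restored.float()
out = restored(torch.randn(2, 4))            # plain fp32 input now works
print(out.dtype)  # torch.float32
```

The key point is that the cast happens after loading the fp16 weights and before the model goes into the fp32 GAN learner.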

Yes this should work…