It looks like the fastai GAN learner does not support fp16. Is it possible to first train the generator and critic at fp16 to speed up training and then convert them to fp32 for the GAN portion of the training? I feel like this could really speed everything up.
I feel like this shouldn't be impossible, yet with all of my Google skills I can't seem to find an answer.
model = convert_network(model, dtype=torch.float32)
I found this searching around on the web. Would this convert a model to fp32 so it can then be trained in a GAN in fastai? I found it in the fastai fp16 callback.
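For what it's worth, plain PyTorch can also do this conversion without any fastai helper: calling `.float()` on a module casts all of its parameters and buffers to fp32. Here is a minimal sketch (the tiny `nn.Sequential` generator is just a hypothetical stand-in for a real trained model):

```python
import torch
import torch.nn as nn

# Hypothetical tiny generator standing in for your trained model.
gen = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 8))

# Suppose it was trained at half precision...
gen = gen.half()
assert next(gen.parameters()).dtype == torch.float16

# ...convert every parameter and buffer back to fp32
# before handing it to the GAN portion of training.
gen = gen.float()
print(next(gen.parameters()).dtype)  # torch.float32
```

If the models live inside a `Learner`, I believe `learn.to_fp32()` does the equivalent conversion, but I'd double-check that against the version of fastai you're running.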