Chapter 5 to_fp16 problem

I was trying to replicate the notebook from chapter 5 (pet breeds classifier), but when using .to_fp16() I got these results:

The model isn't learning anything, but I don't get any error. Has this happened to anybody so far?
Does anybody know how I could solve this?
thank you in advance

I wasn’t able to replicate your results. Are you following the notebook line by line? Does the error persist without FP16? What’s the output of dls.show_batch()?

Yes, I am following the notebook line by line and everything works fine until I try to use .to_fp16() on resnet50.
Before trying to_fp16, the results were very similar to the book's.

I will try to run the notebook again to see if something is different, thank you.


Can you check which version of fastai you are using? If it is earlier than 2.2, to_fp16() uses fastai's own implementation of mixed precision; since that version the library defaults to the PyTorch/NVIDIA native one: https://github.com/fastai/fastai/issues/3127
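If you want to check that cutoff programmatically in your notebook, here is a minimal sketch. The `version_tuple` and `uses_native_amp` helpers are hypothetical (not fastai APIs); the 2.2 boundary comes from the issue linked above, and in a notebook you would pass `fastai.__version__` in:

```python
def version_tuple(v: str) -> tuple:
    """Turn a dotted version string like '2.2.5' into a comparable tuple of ints."""
    return tuple(int(p) for p in v.split(".")[:3] if p.isdigit())

def uses_native_amp(version: str) -> bool:
    """True when this fastai release routes to_fp16() through PyTorch's native AMP
    (hypothetical helper; the 2.2 cutoff is from fastai issue #3127)."""
    return version_tuple(version) >= (2, 2)

# In a notebook: uses_native_amp(fastai.__version__)
print(uses_native_amp("2.1.10"))  # False -> fastai's own mixed-precision code
print(uses_native_amp("2.3.1"))   # True  -> PyTorch/NVIDIA native AMP
```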

You could also try using to_native_fp16() in your current environment.


As ali_baba said, make sure you have the latest version of fastai installed: if nothing breaks without FP16, then it's almost certainly the mixed-precision path causing the issue.

If updating the library doesn't help, the problem might be your NVIDIA drivers, CUDA version, PyTorch dependencies, etc. That's likely not the case, though.

Good luck!


Thank you guys!

Could it be related to this error I get when I import fastbook?

ERROR: torchtext 0.9.1 has requirement torch==1.8.1, but you’ll have torch 1.7.1 which is incompatible.
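It could well be related: that message means torchtext 0.9.1 pins torch==1.8.1 while torch 1.7.1 is actually installed, so the environment is inconsistent. As an illustration of what pip is complaining about, here is a tiny checker for exact `==` pins (`satisfies_pin` is a hypothetical helper, not a pip API; real resolution supports far richer specifiers):

```python
def satisfies_pin(installed: str, requirement: str) -> bool:
    """Check an exact '==' requirement such as 'torch==1.8.1' (illustrative only)."""
    _, required = requirement.split("==")
    return installed == required

# The mismatch pip is reporting above:
print(satisfies_pin("1.7.1", "torch==1.8.1"))  # False -> incompatible
print(satisfies_pin("1.8.1", "torch==1.8.1"))  # True  -> compatible
```

Reinstalling a matching pair (for example pinning torch to the version torchtext asks for, or upgrading fastai/fastbook so their pins agree) usually clears this, and is worth trying before digging further into the FP16 issue.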