Hi…
Do we need any additional installation, like the one below, to make fp16 work on Kaggle?
`conda install -y pytorch=1.1.0 cudatoolkit=10.0 -c pytorch`
Nope… If you are programming in pure PyTorch, add `.half()` to your tensors and models. If you are using fastai, add `.to_fp16()` to the Learner. It should then be using fp16 (fastai will use mixed precision).
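A minimal sketch of the two approaches above (assuming PyTorch is installed; the fastai `learn` object is hypothetical, standing in for whatever Learner you already have):

```python
import torch

# Pure PyTorch: cast tensors to half precision with .half()
x = torch.randn(8, 4)
x_fp16 = x.half()
print(x_fp16.dtype)  # torch.float16

# The same call works on a model: its parameters become float16
model = torch.nn.Linear(4, 2).half()
print(next(model.parameters()).dtype)  # torch.float16

# fastai: for an existing Learner (hypothetical `learn`),
# enable mixed precision with:
# learn = learn.to_fp16()
```

Note that `.half()` casts everything to fp16, whereas fastai's `to_fp16()` sets up mixed precision (fp16 forward/backward with fp32 master weights), which is generally more numerically stable for training.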
OK. I heard from some people on the forums that fp16 needs the above installation after recent updates on Kaggle, so I had doubts.
I just tried fp16 on Kaggle kernels yesterday. It doesn’t work. So you might be right that some installation is needed.
In what way did you find it not working?
I am able to use a larger batch size and see the model converging with fp16. The current torch version on Kaggle is 1.2, and CUDA is 10.1.
Waiting to hear back from @sgugger on whether fp16 in current fastai will work well with the above torch versions…
So far I’m not finding any difference.
Hi @ilovescience, by fp16 I meant `to_fp16` only… I wanted to confirm whether it has compatibility issues with torch 1.2 and CUDA 10.1…
In APTOS we used `to_fp16` throughout for the past 2 months…
Not sure when Kaggle updated torch and CUDA to 1.2 and 10.1.
The progress bar appears but never trains.
Not sure if that’s happening in my case as well… I see the loss coming down…