Decimals of get_preds vs pytorch predictions are different

The results I got from PyTorch have more decimals.

However, the results from get_preds only give a limited number of decimals.

How can I make fastai's predictions the same as PyTorch's predictions?

This is most likely because the dtype of pred1 is not the same as that of pred2.

pred1 has type float32, i.e. full precision: each float is represented in memory with 32 bits (4 bytes). PyTorch uses this by default unless you specify otherwise.

pred2 via fastai is using half precision (float16), most likely enabled at the time you declared your learn object. This uses only 16 bits (2 bytes) of memory for each number, so fewer significant digits survive.
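You can see the precision gap without any deep-learning library at all. This is a minimal sketch using Python's standard struct module, whose 'f' and 'e' format codes round-trip a value through float32 and float16 respectively (the value 0.123456789 is just an arbitrary example):

```python
import struct

x = 0.123456789

# Round-trip through float32: roughly 7 significant digits survive.
f32 = struct.unpack('f', struct.pack('f', x))[0]

# Round-trip through float16: only roughly 3-4 significant digits survive.
f16 = struct.unpack('e', struct.pack('e', x))[0]

print(f32)  # close to the original value
print(f16)  # visibly rounded
```

The float16 result lands noticeably far from the original, which is exactly the "fewer decimals" effect seen in get_preds.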

You can easily cast the half-precision numbers up to full precision (32 bits), or vice versa.
With fastai: learn.to_fp32() or learn.to_fp16()
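The same cast works directly on the prediction tensors in plain PyTorch, if you would rather convert the output than switch the learner. A small sketch (the tensor here is a stand-in for your predictions, not fastai's actual output):

```python
import torch

# Stand-in for half-precision predictions, as get_preds might return
# when the learner was created in fp16 mode.
preds_fp16 = torch.tensor([0.1235, 0.8765], dtype=torch.float16)

# Cast up to full precision (float32)...
preds_fp32 = preds_fp16.float()

# ...or cast a full-precision tensor back down to half precision.
back_to_fp16 = preds_fp32.half()

print(preds_fp32.dtype)    # torch.float32
print(back_to_fp16.dtype)  # torch.float16
```

Note that casting up recovers the wider dtype but not the digits already lost: if the model ran in fp16, the extra decimals were discarded during computation.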

More information can be found here: https://docs.fast.ai/callback.fp16.html
