I got the following warning, and then my system crashed:

fastai/courses/dl1/fastai/core.py:23: UserWarning: volatile was removed and now has no effect. Use `with torch.no_grad():` instead.
  x = Variable(T(x), volatile=volatile, requires_grad=requires_grad)
I raised this issue on the PyTorch forum, and here is the reply I received: "Are you sure you're not running out of memory? volatile greatly decreased memory usage, but it was deprecated in favor of torch.no_grad(). Try wrapping the code that uses x in this context manager."
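The replacement the reply suggests can be sketched like this (a minimal example, assuming a recent PyTorch; the small Linear model is hypothetical, just to show the pattern):

```python
import torch

# Old (deprecated) pattern from fastai/core.py:
#   x = Variable(T(x), volatile=True)
# New pattern: wrap the inference code in torch.no_grad()
model = torch.nn.Linear(4, 2)  # hypothetical tiny model for illustration
inp = torch.randn(1, 4)

with torch.no_grad():
    out = model(inp)

# Inside no_grad() no autograd graph is recorded, which is why
# memory usage drops during inference
print(out.requires_grad)
```

Because no graph is kept, `out.requires_grad` is `False` even though the model's parameters require gradients.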
I also tried reducing the image size variable to sz=100, and then it worked, but the accuracy I am getting is only 95%. Is there any fix for this?
I had the same problem and, as @jeremy said, it is due to the Python version: installing PIL downgrades Python to 2.7. As @sepehr mentioned, Pillow should be used instead of PIL. These are the commands that made my conda environment work properly:
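The exact commands were not included in the post; a likely equivalent, assuming a conda environment (the package names come from the post, the precise commands are my guess):

```shell
# Remove PIL, which pins the environment to Python 2.7,
# and install its maintained fork Pillow instead
conda remove pil
conda install pillow
```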
Please help, I can't import these lines of code:
from fastai.transforms import *
from fastai.conv_learner import *
from fastai.model import *
from fastai.dataset import *
from fastai.sgdr import *
from fastai.plots import *
I am using Anaconda on my Windows 10 laptop.
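If those imports fail with ModuleNotFoundError, a quick first check is whether the fastai package is visible to the interpreter at all (a minimal sketch; it only checks package visibility, not the fastai version):

```python
import importlib.util

# Check whether a package can be found on the current Python path
spec = importlib.util.find_spec("fastai")
if spec is None:
    print("fastai is not installed in this environment")
else:
    print("fastai found at", spec.origin)
```

If it prints "not installed", the active conda environment is probably not the one where fastai was set up; activating the right environment in the Anaconda Prompt before launching the notebook usually fixes this.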