While running the 3rd code snippet (lesson 1):
from fastai.transforms import *
from fastai.conv_learner import *
from fastai.model import *
from fastai.dataset import *
from fastai.sgdr import *
from fastai.plots import *
File "/Users/deeppepe/Desktop/Coding/fastai_master/courses/dl1/fastai/core.py", line 41
    if cuda: a = to_gpu(a, async=True)
SyntaxError: invalid syntax
Not sure what’s wrong… I tried doing a clean pull of all the files from Git, and it keeps throwing this error. Any suggestions?
The syntax is wrong. Try this:
if torch.cuda.is_available(): a = to_gpu(a, async=True)
Actually, you don’t need to check whether CUDA is available: the fastai library already takes care of that check inside the to_gpu() function.
This is triggered by the imports. I’m not doing any checking at all… the syntax error appears after running the 3rd cell (the imports).
Are you suggesting I change line 41 of fastai’s core.py to what you wrote? I’ll give that a try.
Edit 1: Tried it; it didn’t work. What did work, however, was commenting out the async argument on line 41 of core.py (in the fastai folder), changing the line to the following:
if cuda: a = to_gpu(a) #, async=True)
No more error is thrown upon running the imports. Why is this?
You don’t have to do that.
OK, found the root of the problem. Are you using Python 3.7?
Yes. I see… I shouldn’t have upgraded, I guess. Is there a syntax change?
According to Python’s deprecation plans, async becomes a reserved keyword in Python 3.7, which means it can no longer be used as an argument name; doing so is a SyntaxError.
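You can see this without touching fastai or PyTorch at all: on Python 3.7+, any call that spells async as a keyword argument fails to compile, before the line is ever executed, which is why the error shows up at import time. A minimal, self-contained sketch (the helper name is mine, not part of any library):

```python
def is_valid_syntax(src):
    """Return True if `src` compiles as Python on the running interpreter."""
    try:
        compile(src, "<example>", "exec")
        return True
    except SyntaxError:
        return False

# `non_blocking` is an ordinary identifier, so this always compiles.
print(is_valid_syntax("f(non_blocking=True)"))  # True
# `async` is a reserved keyword on Python 3.7+, so this fails to
# compile, regardless of whether `f` exists or is ever called.
print(is_valid_syntax("f(async=True)"))         # False on Python 3.7+
```

Note that compile() reports the SyntaxError without running anything, which matches what you saw: the traceback appears while the imports are merely being loaded.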
The async keyword argument was renamed to non_blocking in PyTorch 0.4.0. Ref: PyTorch CUDA semantics documentation, and the PyTorch pull request making this change.
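If you wanted to support both old and new PyTorch from the same code, one workaround is to pick the keyword at runtime and pass it via **kwargs, since "async" is only a SyntaxError when written as a literal keyword argument. This is just a sketch; cuda_transfer_kwargs is an invented helper name, not part of fastai or PyTorch:

```python
def cuda_transfer_kwargs(torch_version):
    """Pick the non-blocking-transfer keyword for a given torch version.

    PyTorch 0.4.0 renamed the `async` argument to `non_blocking`.
    """
    major, minor = (int(x) for x in torch_version.split(".")[:2])
    if (major, minor) >= (0, 4):
        return {"non_blocking": True}
    # As a string dict key, "async" is legal even on Python 3.7+;
    # only the literal spelling `f(async=True)` is a SyntaxError.
    return {"async": True}

# Usage sketch (assumes `torch` is installed and `a` is a tensor):
#   if torch.cuda.is_available():
#       a = a.cuda(**cuda_transfer_kwargs(torch.__version__))
```

The real fix, of course, was fastai updating its code for PyTorch >= 0.4; this shim just illustrates why the ** spelling sidesteps the parser.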
The fastai library does not support Python 3.7 yet. This is a bug; you can report the issue on the fastai GitHub repo.
Thank you so much for finding the root cause! I hope I’ll be half as skilled as you one day.
I have the same problem… Is there an issue on github?
I just need to mention I’ve come across this exact problem, and @cedric’s answer was helpful here.
Thank you very much for this.