I've checked multiple times (conda version 4.6, ran conda update conda, pulled the fastai library to the latest), but this strange problem persists for me.
from fastai.vision import * isn't bringing untar_data into the namespace, causing a NameError.
Only from fastai import * fixes it, bringing untar_data and many other names (like np etc.) into the namespace.
This indeed is weird.
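When a name like untar_data goes missing, it can help to check which installation your kernel is actually importing from, since pip and conda copies of a package can shadow each other. A minimal stdlib-only sketch (the use of "fastai" as the package of interest is just the example from this thread; any package name works):

```python
# Sketch: find the file a package would be imported from in this kernel.
# Useful for spotting a stale or duplicate install shadowing the one you
# just upgraded. Stdlib only; no fastai required to run the demo.
from importlib.util import find_spec
from typing import Optional

def locate(pkg: str) -> Optional[str]:
    """Return the path a package would be imported from, or None if absent."""
    spec = find_spec(pkg)
    return spec.origin if spec is not None else None

print(locate("json"))    # stdlib module, always resolvable
print(locate("fastai"))  # None if fastai isn't importable in this kernel
```

If the printed path points into a different environment than the one you upgraded, that usually explains why the upgrade "didn't take".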
Likewise. Just to be sure, I ran conda update conda again, after which ssh kept failing. It turns out conda update messed up my jupyter installation (a known issue)! So I installed jupyter and the conda kernels manually and tried again, then realised it had also wiped out my pytorch installation.
Going to create a new GCP instance and try afresh and report back later today.
@maya - with a brand new GCP instance things are working fine for me (as Jeremy indicated, with fastai.vision import alone). Note that I did not have to update conda or install/update fastai libraries - this is what I got as-is with the new instance:
Now I need to decide if I should try to debug my older instance and get it to this state, or abandon it and move my config/setup to this new one. Sigh.
Hi, I have set up a new instance on crestle.ai and I'm experiencing the same issue with the untar_data function not being found.
The conda version after running conda update conda is 4.2.6, and the fastai version installed by conda install -c fastai fastai is 1.0.34.
Running conda update fastai doesn't upgrade fastai, so I guess the conda package is not up to date. I can confirm, however, that pip install fastai --upgrade resolved the issue by upgrading fastai to version 1.0.42.
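Since the fix above hinged on getting past a specific release, here is a small sketch for checking whether an installed version string meets a minimum. The 1.0.42 threshold is taken from the post above; the naive dotted-integer parsing is an assumption and won't handle suffixes like ".post1":

```python
# Sketch: compare dotted version strings as integer tuples.
# Caveat: assumes plain "X.Y.Z" versions; pre/post-release suffixes
# would need a real parser such as packaging.version.
def meets_minimum(installed: str, minimum: str) -> bool:
    parse = lambda s: tuple(int(x) for x in s.split("."))
    return parse(installed) >= parse(minimum)

print(meets_minimum("1.0.34", "1.0.42"))  # False -> needs the upgrade
print(meets_minimum("1.0.42", "1.0.42"))  # True
```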
This is weird. I have my own local linux box that I used for the 2018 course, so it has a bunch of fastai v0.7 stuff. I ran Maya's two lines of code and it did seem to work, but my pytorch version disappeared!
For the peeps that are having issues with running untar_data(), I found out that I have to import the fastai library as a whole to make it work - from fastai import *.
This worked for me as well. I had just recently built the environment and installed fastai and pytorch, so I assumed I was on the latest version. FYI: I should add that I was confused about the NVIDIA GPU issue. I'm on a MacBook Pro 2017 running Mojave, so it may be possible I followed the wrong installation path. Anyway, I did the upgrade and restarted the kernel.
I had the same issue; adding from fastai import * fixed it for me, thanks. I have had to juggle a few version changes just to get things working due to an Intel MKL error, which ended up requiring an MKL downgrade to fix. Everything is working now except matplotlib, which is sufficient progress for tonight's troubleshooting.
Edit: Gah, I pushed it and tried to update fastai, as in the first post in the FAQ. That promptly broke mkl_intel_thread.dll again with the “ordinal 242” error.
It turned out I needed to follow this post and add the indicated line to my system variables to finally resolve the MKL errors, in case anyone else runs across something similar.
Conda installed 1.0.38 at most; I used pip to get 1.0.51, but neither works.
The funny thing is, I got it to work once, but when I copied lesson1 to make my own dataset, the "NameError: name 'untar_data' is not defined" came back in both notebooks.
Check to see which version of fastai your python kernel is using:
in jupyter:
"import fastai; fastai. _ _ version_ _ [no spaces for underscores; post formatting workaroud]
in terminal (of your deeplearning environment):
python [opens up python command line]
import fastai; fastai.__version__
If untar_data() is not working, you haven't got an updated version. Another way to see this is to run conda list and scroll to the fastai row.
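For a version check that doesn't depend on the package defining __version__ itself, you can also query the installed metadata. A stdlib sketch (Python 3.8+ assumed; "fastai" here is just the package this thread is about):

```python
# Sketch: look up a package's installed version from its metadata,
# falling back gracefully when the package isn't installed at all.
from importlib.metadata import version, PackageNotFoundError

def installed_version(pkg: str) -> str:
    try:
        return version(pkg)
    except PackageNotFoundError:
        return "not installed"

print(installed_version("pip"))     # whatever pip is in this environment
print(installed_version("fastai"))  # "not installed" if missing here
```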
pip install wasn’t able able to update the package for me either; I think conda overrides it.
A question:
After running untar_data(), the path I get is PosixPath('/home/abhishek/.fastai/data/oxford-iiit-pet'), but I can't find the data file at that path.
How can I find out where my data actually is?
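One likely explanation: untar_data stores everything under the hidden ~/.fastai directory, and file browsers often hide dot-directories, so the data can be there even if you don't see it. This sketch builds a stand-in layout in a temp directory and lists it with pathlib; on your own machine you would point root at Path.home()/".fastai"/"data" instead (the "oxford-iiit-pet" subfolders here are just illustrative):

```python
# Sketch: dot-directories like ~/.fastai are hidden from many file
# browsers, but pathlib lists their contents normally. Demo uses a
# temporary stand-in so it runs anywhere without fastai installed.
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp) / ".fastai" / "data" / "oxford-iiit-pet"
    (root / "images").mkdir(parents=True)
    (root / "annotations").mkdir()

    # iterdir() sees inside hidden dot-directories just fine
    print(sorted(p.name for p in root.iterdir()))
    # -> ['annotations', 'images']
```

In a terminal, enabling hidden files (e.g. ls -a) should reveal the .fastai folder at the printed path.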
untar_data() stopped working with the "not defined" error message the posts here mention. It used to work fine until a few days ago. I tried the various things mentioned above, but without success. I created a clean new virtual machine, installed a fresh Anaconda, installed fastai, and tried untar_data to download the imdb data as a test. It does not work… same "not defined" error. Any help would be appreciated…
I guess this is no longer an issue for others, but just in case anyone is still struggling with it, I'm sharing my experience here. My problem was the fastai version. When you pip install fastai, it installs fastai version 2.0 (I think this is different from fastai2). With that version, I could not even import fastai. The problem was fixed after I ran pip install fastai==1.0.61.
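Since the fix above comes down to being on the v1 line rather than v2, a notebook can guard against the wrong major version up front. A hedged sketch (the v1-vs-v2 boundary is taken from the post above; the helper name is made up for illustration):

```python
# Sketch: check only the major component of a version string, so a
# course-1 notebook can fail fast with a clear message instead of a
# confusing NameError later.
def is_v1(version_string: str) -> bool:
    major = int(version_string.split(".")[0])
    return major == 1

print(is_v1("1.0.61"))  # True  -> v1-era notebooks should work
print(is_v1("2.0.0"))   # False -> pin with: pip install fastai==1.0.61
```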