Anyone starting the course July 2019

While using predict(), it is throwing an AttributeError: ‘apply_tfms’ when I pass it a DataBunch object of my test set. Does anyone know how to rectify it?
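Not sure without seeing the code, but in fastai v1 predict() expects a single item (for example one Image) rather than a whole DataBunch; for a test set, get_preds is the usual route. A rough sketch, assuming an exported image classifier (the directory and file names here are placeholders):

from fastai.vision import *

# placeholder export directory and test-image folder
learn = load_learner(Path('export_dir'), test=ImageList.from_folder(Path('test_images')))

# predictions over the whole test set
preds, _ = learn.get_preds(ds_type=DatasetType.Test)

# or, for a single image, pass one item to predict()
img = open_image(Path('test_images/example.jpg'))
pred_class, pred_idx, probs = learn.predict(img)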

Hi everyone,

I also just started last week (beginning lesson 2 now). Anyone interested in starting a more “formal” study group, where we can discuss the lessons in more detail and potentially collaborate on projects? I’m currently using Colab (it seems relatively straightforward and integrates nicely with Google Drive for sharing).

  • Lee

I just started lesson 1. I am trying to practice with a dataset I have on my local machine but don’t know how to load it into my notebook for training.

I tried using => path = untar_data(‘C:\Users\Ixxxl\Documents\notebooks\nobs\santa’);

Sure.


How can I rectify this error?

From the snapshot, it seems the filepath string has become the concatenation of all the filenames in your directory.
Are you sure you want to do that?

Are you sure your ItemList isn’t empty?
In my experience, split_by_rand_pct works as long as it is passed a non-empty ItemList and LabelList.

Try printing your ItemList and LabelList. Also, random_split_by_pct is deprecated; use split_by_rand_pct instead.
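If it helps, a rough sketch of that check, assuming a local folder of labelled images (the path and layout here are just placeholders):

from fastai.vision import *

# placeholder path to a folder with one subfolder per class
path = Path('data/my_images')

il = ImageList.from_folder(path)
print(il)   # printing the ItemList shows how many items were found

# split_by_rand_pct replaces the deprecated random_split_by_pct
src = il.split_by_rand_pct(0.2).label_from_folder()
print(src)  # the LabelLists should show non-empty train/valid sets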

The path was actually p1/‘images’. I tried printing the ItemList. This is what I got.

I started in July. I’m up to the 4th lesson now, but before that I am trying to do segmentation on the ADE20k dataset.

I rectified the problem and now I have created a DataBunch. While using fit_one_cycle(), I am getting a “_thnn_conv2d_forward not supported on CPUType for Half” error. Did anyone else face this problem?

untar_data seems to be used for downloading and unpacking a dataset from a URL (see the docs), not for loading in data that is local. The lesson2-download notebook may be more useful as a reference on how to work with data that you already have.

If you’re still running into issues, could you show more of your commands and what the errors are?
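For loading data that is already local, a rough sketch assuming one subfolder per class (the path is just a placeholder):

from fastai.vision import *

# placeholder local path containing one subfolder per class, e.g. santa/ and not_santa/
path = Path('path/to/your/images')

data = ImageDataBunch.from_folder(path, train='.', valid_pct=0.2,
                                  ds_tfms=get_transforms(), size=224)
data.show_batch(rows=3)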

#Murali
I am on week 3.
Would you like to collaborate?

I am on week 1. I am using Google Colab.

Hi all, another newbie here! I am using Colab and I am trying to get going with the build-your-own image classifier, but I am getting stuck at the part where you have to upload your data and the csv files (I am using lecture 2 as a template). What I reckon is happening is that the fastai library is pointing at some other location, and I can’t understand how to make the root directory be in my Google Drive… The bit in the lecture assumes that you are running your Jupyter notebook off your local machine, so it’s not helpful with this, to the best of my understanding.
I think I managed to get Colab to be rooted in the Drive folder by using the code from this article (where it says ‘Mounting Google Drive’): https://medium.com/lean-in-women-in-tech-india/google-colab-the-beginners-guide-5ad3b417dfa

But even assuming that’s working, I don’t think I’ve managed to get fastai to look into the Google Drive. It seems to me I am missing some obvious key bit here.
Would appreciate your help!
Cheers
Val

Hi Val,

Could you give some more details on your error?

I changed the code in the section “Create directory and upload urls file into your server”, making sure path points to where your GDrive is mounted and where you want to save your images, like /content/gdrive/My Drive/fastai-v3/data/bears. I think that was the only place where I needed to change the path to be specific to GDrive.

In the part where you upload your URL files, I instead went to my GDrive and put the files there (instead of using the Upload button that is shown in the notebook). I did this after running the dest.mkdir() commands, so the proper folders would be created and I could see them in my GDrive (such as fastai-v3/data/bears in this example). It took a few seconds to sync at some point, and I might have had to refresh the Drive page.
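A minimal sketch of what those cells might look like, using the teddys example names from the lesson2-download notebook (adjust the GDrive path to wherever you keep your data):

from google.colab import drive
drive.mount('/content/gdrive')

from fastai.vision import *

# path pointing into the mounted GDrive, matching the bears example above
path = Path('/content/gdrive/My Drive/fastai-v3/data/bears')
dest = path/'teddys'
dest.mkdir(parents=True, exist_ok=True)   # folder shows up in GDrive after a short sync

# urls_teddys.csv was dropped into the bears folder by hand in GDrive,
# so it can be read from there instead of using the notebook's upload widget
download_images(path/'urls_teddys.csv', dest, max_pics=200)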

Hope this helps,
Lee

Hi @Valentin,
Here is a Jupyter notebook that shows you how to set up for fastai v1 in Google Colab


Hope it will help you solve the problem.
Cheers,
Joseph

Hello, thanks for your help! I followed the suggested notebook and cloned the course v3 repo into my Drive as suggested.
I am still struggling to understand how the file structure works.
Basically, I can’t understand the right location to upload the two csv files which were output by the Google Images javascript code.
I presumed at first they should go in the root directory (in that case straight into the main My Drive), but I got an error.
Then I thought I should create a folder data in the root directory and then a subfolder ‘guitars’ (I am trying to create a classifier which looks at pictures of a person playing guitar and to see whether the resnet would correctly classify the guitar model… so this mirrors the ‘bears’ structure from the lecture file), with the files placed there to be discovered when running the script. However, I am getting an error that it is not finding the files or folder, so I am not sure where I went wrong. Presumably it’s something dead obvious that I can’t see at the moment… sorry! Here is a screenshot of the error.

@jianjye Any luck with your personal project …

The location of the csv files seems correct. I think the path variable may need to be the full path (so something like path = "/content/gdrive/My Drive/.../data/guitars", where the ... is whatever the full path is from the root of your GDrive).

For example this is what your GDrive directory structure could look like:

My\ Drive
  ./fastai-v3
    ./data
      ./guitars
        ./strat
        ./les_paul
        ./strat.csv
        ./les_paul.csv

and here’s some corresponding python code which should work with that:

from fastai.vision import *          # provides Path and download_images
from google.colab import drive

drive.mount('/content/gdrive', force_remount=True)

classes = ['strat', 'les_paul']

# run this cell once per class: first with the 'strat' pair,
# then again with the 'les_paul' pair uncommented instead
folder = 'strat'
file = 'strat.csv'
# folder = 'les_paul'
# file = 'les_paul.csv'

path = Path('/content/gdrive/My Drive/fastai-v3/data/guitars')
dest = path/folder
dest.mkdir(parents=True, exist_ok=True)
download_images(path/file, dest, max_pics=200)
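
After the downloads finish, a possible next step (following the lesson2-download notebook) is to prune any images that didn’t download properly before building the DataBunch:

# remove broken downloads and cap image size, once per class
for c in classes:
    print(c)
    verify_images(path/c, delete=True, max_size=500)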

Hope this helps!