Lesson 2 official topic

Hi, thank you very much for responding.
The code I am using is from "Chapter 2, Production" (on Colab):

02_production.ipynb

dls = bears.dataloaders(path)


TypeError                                 Traceback (most recent call last)
in <cell line: 1>()
----> 1 dls = bears.dataloaders(path)

6 frames
/usr/local/lib/python3.10/dist-packages/fastai/data/core.py in setup(self, train_setup)
    395                 x = f(x)
    396                 self.types.append(type(x))
--> 397         types = L(t if is_listy(t) else [t] for t in self.types).concat().unique()
    398         self.pretty_types = '\n'.join([f'  - {t}' for t in types])
    399

TypeError: 'NoneType' object is not iterable


To fix this problem, first make sure to use search_images_ddg instead of search_images_bing.
Then modify the following cells:

results = search_images_ddg('grizzly bear') # search_images_ddg does not need an API key
ims = results # no .attrgot('contentUrl') needed: search_images_ddg already returns an L of URLs
len(ims)

Then modify the code that downloads the images:

if not path.exists():
    path.mkdir()
    for o in bear_types:
        dest = (path/o)
        dest.mkdir(exist_ok=True)
        results = search_images_ddg(f'{o} bear') # we change bing to ddg and remove the key
        download_images(dest, urls=results) # we remove attrgot('contentUrl')
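
Some of the DDG results are dead links, so after downloading it is worth deleting any files that can't be opened. A minimal sketch, using the verify_images cell from the same notebook:

failed = verify_images(get_image_files(path))
failed.map(Path.unlink) # remove files that can't be opened as images
len(failed)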

Now you should be fine :slight_smile:


Hi there! Looking for some help with the actual deployment of the model.

In the colab from the book, we train the model, export it, and then import it again all in the same notebook. I’m trying to figure out how to export from one notebook and import it into another. So far, I have exported the model to my Google Drive, but I’m struggling to import it into a different notebook using the load_learner() function.

In simple terms: my goal is to create a notebook in which a user can just insert an image and have it classified. I imagine this can be done by importing the model from a remote location with load_learner() and then going from there. But where can the model be stored, and how do I import it correctly?

You must export your model as a .pkl file, and the new notebook must then have access to that file in order to load it and work with it again.
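
A minimal sketch of the two halves (the file names here are just placeholders):

# In the training notebook: serialize the trained Learner to a .pkl file
learn.export('bears_model.pkl')

# In the new notebook: recreate the Learner from that file
from fastai.vision.all import load_learner
learn_inf = load_learner('bears_model.pkl')
learn_inf.predict('some_bear.jpg') # returns (class, class index, probabilities)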

I have exported the model, created the other notebook, and tried to reference the model from the new notebook using load_learner(). It works when I use the Colab directory feature to copy the path that leads to the file, but I noticed this is specifically referencing my personal Drive, which is what the notebook is connected to.

So far, the code looks like this:
bear_classifier = load_learner("/content/gdrive/MyDrive/Colab Notebooks/bears_model.pkl")
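
For anyone following along, that path only resolves after mounting Drive in the notebook first:

from google.colab import drive
drive.mount('/content/gdrive') # prompts for Google authorization

bear_classifier = load_learner('/content/gdrive/MyDrive/Colab Notebooks/bears_model.pkl')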

The problem is that, if someone else downloads this notebook and tries to run it to classify their own bear image, it won’t work, because load_learner() will try to connect to their Drive and follow the same path (but the model won’t be there).

So my question is: is there any way to make load_learner() load a model based on a URL, so that whenever a third party downloads the notebook to predict an image, it will work?


How about deploying your model with Gradio on Hugging Face Spaces? You would get a URL that any user can visit to upload an image and get a prediction from your model.
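
A minimal sketch of such a Gradio app, assuming your exported Learner is named bears_model.pkl (check the Gradio docs for the exact component names in your version):

import gradio as gr
from fastai.vision.all import load_learner, PILImage

learn = load_learner('bears_model.pkl') # placeholder file name
labels = learn.dls.vocab

def classify(img):
    # learn.predict returns (decoded class, class index, probabilities)
    pred, idx, probs = learn.predict(PILImage.create(img))
    return {labels[i]: float(probs[i]) for i in range(len(labels))}

gr.Interface(fn=classify, inputs=gr.Image(), outputs=gr.Label(num_top_classes=3)).launch()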

If you still want to do it on Colab, maybe put your pickle file on your GitHub and try reading it from there?
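
A sketch of the GitHub approach (the URL below is hypothetical; use the raw link to your own file, and note that plain GitHub repos reject files over 100 MB):

from urllib.request import urlretrieve
from fastai.vision.all import load_learner

model_url = 'https://github.com/<user>/<repo>/raw/main/bears_model.pkl' # hypothetical
urlretrieve(model_url, 'bears_model.pkl') # download into the local Colab filesystem
bear_classifier = load_learner('bears_model.pkl')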

Thank you very much, especially for sharing your knowledge and time with a complete stranger. I will test this and let you know if it worked. :slight_smile: All the best.


I am glad to be of help :slight_smile: Let me know the result.

Image_Classifier with GitHub Actions workflow