A walk with fastai2 - Vision - Study Group and Online Lectures Megathread

One more question: I'm trying to get my head around L, or more specifically fastcore.foundation.L. Is there a page on https://dev.fast.ai/index.html which describes it? I think it was mentioned in one of the lectures, but I can't find it, and I also cannot find it in https://dev.fast.ai/torch_core :frowning:

Have you tried doing L?? and scrolling down to the bottom? You should find the location there:
/usr/local/lib/python3.6/dist-packages/fastcore/foundation.py :slight_smile:
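If you're outside a notebook (where ?? isn't available), the standard-library inspect module gives you the same information. A sketch using pathlib.Path as a stand-in; the same calls work on fastcore.foundation.L once fastcore is installed:

```python
import inspect
import pathlib

# ?? in IPython/Jupyter is roughly a pretty-printed combination of these calls;
# swap pathlib.Path for fastcore.foundation.L to locate L's source file.
print(inspect.getsourcefile(pathlib.Path))    # path to the defining .py file
print(inspect.getsource(pathlib.Path)[:200])  # first few lines of the class source
```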

2 Likes

ah, thanks. I knew it was a silly question.

not at all. In fact there should be an easier way, with show_doc showing you a link; let me see if I can find what it is, or someone else can chime in. :slight_smile:

1 Like

I am just going through the lecture now and saw a question about hooks. What helped me was going to https://dev.fast.ai/callback.hook#What-are-hooks? and pasting the first example into a Colab notebook and stepping through it.

From that, my intuition is that hooks.stored (as used in the notebook) holds the inputs and outputs of the layer. After that, I realized it was pretty much the same as what Zachary said.

Parameters are the weights and biases; these are the things the model updates based on the loss.
Activations are numbers that are calculated, like the output of a matrix multiplication (convolution) or the outputs of a ReLU (or any other activation layer).
So there is a difference between parameters and activations.
Here the hooks are used to grab the activations, not the parameters, if I'm not wrong.
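For intuition, the hook mechanism itself is tiny. This is a stdlib-only sketch of the idea, not the PyTorch/fastai API: a layer runs its computation, then hands its input and output to any registered callbacks, and the hook just stores them:

```python
# Minimal sketch of the forward-hook idea, independent of PyTorch/fastai.
class ToyLayer:
    def __init__(self, weight):
        self.weight = weight   # a parameter: updated by training
        self.hooks = []        # registered callbacks

    def __call__(self, x):
        out = [self.weight * v for v in x]  # the activation: computed each forward pass
        for hook in self.hooks:
            hook(self, x, out)              # hand (input, output) to every hook
        return out

class StoreHook:
    """Stores what flowed through the layer, like hooks.stored in the notebook."""
    def __init__(self, layer):
        self.stored = None
        layer.hooks.append(self)

    def __call__(self, layer, inp, out):
        self.stored = (inp, out)

layer = ToyLayer(weight=2)
h = StoreHook(layer)
layer([1, 2, 3])
print(h.stored)  # ([1, 2, 3], [2, 4, 6]) -- the activations, not the weight itself
```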

@muellerzr, I am going through the notebook 01_custom.ipynb in Colab. Can you please explain how to do the step "Create directory and upload URL file into your server"? I have two files called covered.txt and uncovered.txt sitting on my laptop which contain the URLs of the images that I need to perform classification on.

Hi vahuja4, hope you're having a good day!

Create directory and upload URL file into your server". I have two files called covered.txt and uncovered.txt sitting in my laptop which contain the URLs of the images that I need to perform classification on.

In the folders list you should have covered and uncovered.
In the files list you should have covered.txt and uncovered.txt

When you have created the directory structure, you can drag and drop your text files into the same directory as the folders and run the remaining cells.
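A sketch of building that structure from Python rather than by hand; the data root and label names here are assumptions based on your two txt files:

```python
from pathlib import Path

# Directory layout the notebook expects (label names come from your files):
# data/
#   covered/        <- images get downloaded here
#   uncovered/
#   covered.txt     <- URL files sit next to the folders, not inside them
#   uncovered.txt
root = Path("data")
for label in ("covered", "uncovered"):
    (root / label).mkdir(parents=True, exist_ok=True)
# then upload/copy covered.txt and uncovered.txt into data/ itself
```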

Hope this helps

1 Like

@barnacl can you tell me more about this resizing issue? I believe it must be in the defaults, since it is working fine now, but I could not find it in the item_tfms.

@mrfabulous1, thank you! I tried what you suggested. Please take a look at the error in the screenshot. Everything seems to be in place, but I am getting this error.

@mgloria it’s in the resize transform:

https://github.com/fastai/fastai2/blob/master/fastai2/vision/augment.py#L192

Specifically, in that method you can see we use Image.NEAREST, which grabs the pixel nearest the original.
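Nearest-neighbor just picks the single closest source pixel for each destination pixel. A stdlib-only sketch of the idea on a 1-D row of pixels; this illustrates the concept, not PIL's exact index arithmetic:

```python
def resize_nearest_1d(row, new_len):
    """Nearest-neighbor resize of a 1-D 'image' row: each output pixel
    copies the source pixel whose scaled index is closest."""
    scale = len(row) / new_len
    return [row[min(int(i * scale), len(row) - 1)] for i in range(new_len)]

# Upscaling 4 pixels to 8: each source pixel is simply repeated,
# which is why nearest-neighbor looks blocky but never blends values.
print(resize_nearest_1d([10, 20, 30, 40], 8))  # [10, 10, 20, 20, 30, 30, 40, 40]
```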

2 Likes

The assumption in the notebook is that you don't have the txt files nested in the folders. Since you do, you need to point a path to each one of the files too. It thinks they're all in the base directory (hence not found). The different directory structure is shown in the video.

I.e. path_f now becomes data/sla/notarp

If you take a look at the screenshot, I am printing the variable path/n and it is pointing to data/sla/tarp, and path_f is tarp.txt. Is that not correct?

They both need it in this case: the first path is the destination, and the second path should point to exactly where our file is stored.

1 Like

@muellerzr, thank you. I have it running now. Can you please answer one more question: Now that I have a trained model, how can I use it behind a flask server (CPU only) for inference? If you could point me to some resources, please. I can see that there are two files called stage1.pth and stage2.pth, created in the models directory. How do I take them and use them for inference?

We go into how to set your code up in general in the next lesson (geared towards Render, but the approach is applicable elsewhere). Otherwise others may chime in. The rest of your questions are answered then too. But if you right-click the file you can download it.

To better understand the L class, you may clone the fastcore repo on your local machine and run the nbs\01_foundation.ipynb notebook (it's the one that generates foundation.py). There are many tests that show the L class's use cases. For example:

# imports needed to run these tests outside the notebook
import operator
from fastcore.foundation import L
from fastcore.test import test_eq, test_ne, test_shuffled

t = L(range(12))
test_eq(t, list(range(12)))
test_ne(t, list(range(11)))
t.reverse()
test_eq(t[0], 11)
t[3] = "h"
test_eq(t[3], "h")
t[3,5] = ("j","k")
test_eq(t[3,5], ["j","k"])
test_eq(t, L(t))
test_eq(L(L(1,2),[3,4]), ([1,2],[3,4]))
t = L()
test_eq(t, [])
t.append(1)
test_eq(t, [1])
t += [3,2]
test_eq(t, [1,3,2])
t = t + [4]
test_eq(t, [1,3,2,4])
t = 5 + t
test_eq(t, [5,1,3,2,4])
test_eq(L(1,2,3), [1,2,3])
test_eq(L(1,2,3), L(1,2,3))
t = L(1)*5
t = t.map(operator.neg)
test_eq(t,[-1]*5)
test_eq(~L([True,False,False]), L([False,True,True]))
t = L(range(4))
test_eq(zip(t, L(1).cycle()), zip(range(4),(1,1,1,1)))
t = L.range(100)
test_shuffled(t,t.shuffle())

You can also create new cells in the same notebook and run your own experiments. For instance, if you are intrigued by the output of, say, zip(t, L(1).cycle()), you can create a cell like the one below, run it, and display the output:

t = L(range(4))
r = zip(t, L(1).cycle())
list(r)  # materialize the zip so the pairs display

If you want to restore the original file, you can run git restore nbs\01_foundation.ipynb

2 Likes

Hi vahuja4, hope all is well!

If you could point me to some resources, please.

I suggest you read the following threads.

https://forums.fast.ai/t/free-web-deployment/62333 - This shows the challenges of deploying on a platform which you haven't used before.

Deployment Platform: Render ✅ This is the thread I would recommend if you're not an expert in web apps, because it uses starter code that can be found here: https://course.fast.ai/deployment_render.html. It can also be set up locally, and there are many posts on the Render thread discussing issues.

Also in the previous link are four other options, such as AWS; however, they can all be tricky if you're not experienced in HTML, CSS, JavaScript, Docker, web apps, server-side coding, etc.

I can see that there are two files called stage1.pth and stage2.pth, created in the models directory.
How do I take them and use them for inference?

I would suggest you complete the production part of the notebook and create a .pkl file for your model. Save this on your Google Drive or locally, then run pip list or pip freeze and save the contents to a text file, as you will need to make sure the library versions on the platform you deploy on are the same as on the platform you trained on. Save this with your .pth files, though I see more deployments with .pkl files than with .pth files.
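A sketch of capturing those versions from the training environment, assuming pip is on your PATH there:

```shell
# Record the exact library versions of the training environment;
# keep this file with your model and match it on the deployment platform.
pip freeze > requirements.txt
head -n 5 requirements.txt   # quick sanity check of what was captured
```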

Remember fastai2 is still in development so there are not many deployment resources yet.
All the resources mentioned above have worked for fastai version 1.

Many people find the deploying part way more difficult than running through the notebook, and Jeremy recommends building an app as it really shows how much you have understood.

If you're a web app genius you can create your own app for inference!

Cheers mrfabulous1 :smiley: :smiley:

2 Likes

Good afternoon everyone hope you are all having a jolly day!

Has anyone managed to deploy lesson one or lesson five as a web app? I have created tens of apps with fastai1 but have not been able to complete the hello world of fastai (the lesson 1 image classifier).

If someone could share a working repository of an app they have got working using lesson 1 or lesson 5, that would be great.

For lessons 1 and 5 I am using https://github.com/render-examples/fastai-v3 as a template.

I get the following error when deploying the lesson 5 app:

   (fastai2) Mrs-MacBook-Pro:fastai-v3-master fabulous$ python app/server.py serve
Traceback (most recent call last):
  File "app/server.py", line 51, in <module>
    learn = loop.run_until_complete(asyncio.gather(*tasks))[0]
  File "/opt/anaconda3/envs/fastai2/lib/python3.7/asyncio/base_events.py", line 583, in run_until_complete
    return future.result()
  File "app/server.py", line 38, in setup_learner
    learn = torch.load(path/export_file_name, map_location=torch.device('cpu'))
  File "/opt/anaconda3/envs/fastai2/lib/python3.7/site-packages/torch/serialization.py", line 529, in load
    return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
  File "/opt/anaconda3/envs/fastai2/lib/python3.7/site-packages/torch/serialization.py", line 702, in _legacy_load
    result = unpickler.load()
AttributeError: Can't get attribute 'TransformerNet' on <module '__main__' from 'app/server.py'>

And this error when deploying the lesson 1 app:

Successfully installed aiofiles-0.4.0 aiohttp-3.5.4 asyncio-3.4.3 fastai2-0.0.7 h11-0.8.1 httptools-0.0.13 numpy-1.17.5 pillow-6.2.1 python-multipart-0.0.5 starlette-0.12.0 torch-1.3.1 torchvision-0.4.2 uvicorn-0.7.1 uvloop-0.12.2 websockets-7.0
(venv) (base) Mrs-MacBook-Pro:fastai-v3-master fabulous$ python app/server.py serve
INFO: Started server process [12342]
INFO: Waiting for application startup.
INFO: Uvicorn running on http://0.0.0.0:5000 (Press CTRL+C to quit)
INFO: ('127.0.0.1', 51083) - "GET / HTTP/1.1" 200
INFO: ('127.0.0.1', 51083) - "GET /static/style.css HTTP/1.1" 200
INFO: ('127.0.0.1', 51085) - "GET /static/client.js HTTP/1.1" 304
INFO: ('127.0.0.1', 51090) - "POST /analyze HTTP/1.1" 500
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "/opt/anaconda3/lib/python3.7/site-packages/uvicorn/protocols/http/httptools_impl.py", line 368, in run_asgi
    result = await app(self.scope, self.receive, self.send)
  File "/opt/anaconda3/lib/python3.7/site-packages/starlette/applications.py", line 133, in __call__
    await self.error_middleware(scope, receive, send)
  File "/opt/anaconda3/lib/python3.7/site-packages/starlette/middleware/errors.py", line 122, in __call__
    raise exc from None
  File "/opt/anaconda3/lib/python3.7/site-packages/starlette/middleware/errors.py", line 100, in __call__
    await self.app(scope, receive, _send)
  File "/opt/anaconda3/lib/python3.7/site-packages/starlette/middleware/cors.py", line 84, in __call__
    await self.simple_response(scope, receive, send, request_headers=headers)
  File "/opt/anaconda3/lib/python3.7/site-packages/starlette/middleware/cors.py", line 140, in simple_response
    await self.app(scope, receive, send)
  File "/opt/anaconda3/lib/python3.7/site-packages/starlette/exceptions.py", line 73, in __call__
    raise exc from None
  File "/opt/anaconda3/lib/python3.7/site-packages/starlette/exceptions.py", line 62, in __call__
    await self.app(scope, receive, sender)
  File "/opt/anaconda3/lib/python3.7/site-packages/starlette/routing.py", line 585, in __call__
    await route(scope, receive, send)
  File "/opt/anaconda3/lib/python3.7/site-packages/starlette/routing.py", line 207, in __call__
    await self.app(scope, receive, send)
  File "/opt/anaconda3/lib/python3.7/site-packages/starlette/routing.py", line 40, in app
    response = await func(request)
  File "app/server.py", line 160, in analyze
    pred = learn.predict(BytesIO(img_bytes))[0]
  File "/opt/anaconda3/lib/python3.7/site-packages/fastai2/learner.py", line 325, in predict
    dl = self.dls.test_dl([item], rm_type_tfms=rm_type_tfms)
  File "/opt/anaconda3/lib/python3.7/site-packages/fastai2/data/core.py", line 315, in test_dl
    test_ds = test_set(self.valid_ds, test_items, rm_tfms=rm_type_tfms) if isinstance(self.valid_ds, Datasets) else test_items
  File "/opt/anaconda3/lib/python3.7/site-packages/fastai2/data/core.py", line 305, in test_set
    if rm_tfms is None: rm_tfms = [tl.infer_idx(test_items[0]) for tl in test_tls]
  File "/opt/anaconda3/lib/python3.7/site-packages/fastai2/data/core.py", line 305, in <listcomp>
    if rm_tfms is None: rm_tfms = [tl.infer_idx(test_items[0]) for tl in test_tls]
  File "/opt/anaconda3/lib/python3.7/site-packages/fastai2/data/core.py", line 222, in infer_idx
    assert idx < len(self.types), f"Expected an input of type in \n{pretty_types}\n but got {type(x)}"
AssertionError: Expected an input of type in 
  - <class 'pathlib.PosixPath'>
  - <class 'pathlib.Path'>
  - <class 'str'>
  - <class 'torch.Tensor'>
  - <class 'numpy.ndarray'>
  - <class 'bytes'>
  - <class 'fastai2.vision.core.PILImage'>
 but got <class '_io.BytesIO'>

Any suggestions, ideas on how to debug, or words of wisdom welcome.

Cheers mrfabulous1 :smiley: :smiley:

@mrfabulous1, on the TransformerNet issue: you need the style_transfer.py file you generated in the app directory, so the class can be found and used when the model is loaded. On the BytesIO issue, see sgugger's comment earlier when this was brought up. I don't have the time to fix the bug myself.
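That AttributeError is standard pickle behavior rather than anything fastai-specific: unpickling looks the class up by module and name, so the definition must be importable where you load. A stdlib sketch of the same failure mode; the class name here is a made-up stand-in for illustration:

```python
import pickle

class TransformerNetDemo:
    """Stand-in for the model class. pickle stores only the reference
    '__main__.TransformerNetDemo', never the class definition itself."""
    def __init__(self, layers):
        self.layers = layers

blob = pickle.dumps(TransformerNetDemo(layers=3))

# Loading succeeds only because TransformerNetDemo is defined in this module.
# In server.py the analogous fix is to import/define the model class
# before calling torch.load, so the unpickler can resolve the name.
restored = pickle.loads(blob)
print(restored.layers)  # 3
```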

Specifically: A walk with fastai2 - Study Group and Online Lectures Megathread

1 Like