Lesson 2 - Official Topic

Thanks.

1 Like

I feel really stupid, but I finally got the deployment to work! I didn’t fully grasp the deployment concept until after the third lesson and after reviewing all of the posts. Now my Bing Search API 7-day trial has expired, so I haven’t added ipywidgets to this one, but I’m playing around with them in another notebook.

https://hub.gke.mybinder.org/user/grace-bit-prog-deeplearning-y2u8w7tn/voila/render/beeapp.ipynb?token=mpOKkjQFQ7uyCgmsRNJ_YQ

Very simple question, interesting answers. I kind of agree with everyone who has replied here, even if they mostly disagree among them :sweat_smile:

1 Like

Hi @DanielLam,

I am getting a 404: NOT FOUND error for the app, while my Binder link works, as shown in the screenshot. Any suggestions? (https://hub.gke.mybinder.org/user/tissa2-webapp-h6ulacmy/notebooks/mywebapp.ipynb)

1 Like

Jeremy mentions the pros and cons of deep learning for text. I would like to know what he thinks of the performance of these models on tasks such as NER or intent and slot filling. Is the performance better than other statistical approaches? Are there any cons to be aware of? I haven’t found much by googling these topics, compared to, say, text generation or predicting the next token.

Hi,

If it’s working on Binder, then you might need to change the voila/render/classifier.ipynb field from File to URL in the Binder form.

christian.acuna has an example here: https://forums.fast.ai/t/binder-and-fastai2/67810/8

1 Like

I wonder if I am doing things right here… the sign-up process is not really simple, and it requires giving credit card information. Is there no easier way to get a few images? :sweat_smile:

I’m feeling a bit dumb here, but I can’t even get images :frowning:

I am using Gradient (Paperspace) with the pre-loaded fastai notebooks.

I have created an account in Azure and set up a Computer Vision “resource”, which provides me with a key and an endpoint. I have pasted the key into the Jupyter notebook, but then I get a PermissionDenied error when fetching images.

---------------------------------------------------------------------------
ErrorResponseException                    Traceback (most recent call last)
<ipython-input-4-cddb73f3292e> in <module>
----> 1 results = search_images_bing(key, 'grizzly bear')
      2 ims = results.attrgot('content_url')
      3 len(ims)

/notebooks/course-v4/nbs/utils.py in search_images_bing(key, term, min_sz)
     31 def search_images_bing(key, term, min_sz=128):
     32     client = api('https://api.cognitive.microsoft.com', auth(key))
---> 33     return L(client.images.search(query=term, count=150, min_height=min_sz, min_width=min_sz).value)
     34 
     35 

/opt/conda/envs/fastai/lib/python3.7/site-packages/azure/cognitiveservices/search/imagesearch/operations/_images_operations.py in search(self, query, accept_language, user_agent, client_id, client_ip, location, aspect, color, country_code, count, freshness, height, id, image_content, image_type, license, market, max_file_size, max_height, max_width, min_file_size, min_height, min_width, offset, safe_search, size, set_lang, width, custom_headers, raw, **operation_config)
    489 
    490         if response.status_code not in [200]:
--> 491             raise models.ErrorResponseException(self._deserialize, response)
    492 
    493         deserialized = None

ErrorResponseException: Operation returned an invalid status code 'PermissionDenied'

You have an example here

1 Like

Thanks for the help! It doesn’t solve my PermissionDenied problem, but it’s a good alternative for getting images :slight_smile:

The requirements file is missing, but I only needed to set up wget to make it work:

!apt-get update
!apt-get install -y wget
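
For anyone else going this route: once wget is available, pulling down a list of image URLs is a one-liner (urls.txt and the destination folder are just example names, not from the original script):

!wget -i urls.txt -P images/grizzly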

I’m struggling with show_batch and data augmentation. Whenever I try something like

dls.train.show_batch(max_n=4, nrows=1, unique=True)

I get: AttributeError: 'AxesImage' object has no property 'unique'

Has anyone else found this issue or is it just me?

Full error:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-24-02b5d181856c> in <module>
      1 bears = bears.new(item_tfms=Resize(128), batch_tfms=aug_transforms(mult=2))
      2 dls = bears.dataloaders(path)
----> 3 dls.train.show_batch(max_n=8, nrows=2, unique=True)
      4 # dls.train.show_batch(max_n=8, nrows=2)

/opt/conda/envs/fastai/lib/python3.7/site-packages/fastai2/data/core.py in show_batch(self, b, max_n, ctxs, show, **kwargs)
     90         if b is None: b = self.one_batch()
     91         if not show: return self._pre_show_batch(b, max_n=max_n)
---> 92         show_batch(*self._pre_show_batch(b, max_n=max_n), ctxs=ctxs, max_n=max_n, **kwargs)
     93 
     94     def show_results(self, b, out, max_n=9, ctxs=None, show=True, **kwargs):

/opt/conda/envs/fastai/lib/python3.7/site-packages/fastcore/dispatch.py in __call__(self, *args, **kwargs)
     96         if not f: return args[0]
     97         if self.inst is not None: f = MethodType(f, self.inst)
---> 98         return f(*args, **kwargs)
     99 
    100     def __get__(self, inst, owner):

/opt/conda/envs/fastai/lib/python3.7/site-packages/fastai2/vision/data.py in show_batch(x, y, samples, ctxs, max_n, nrows, ncols, figsize, **kwargs)
     43 def show_batch(x:TensorImage, y, samples, ctxs=None, max_n=10, nrows=None, ncols=None, figsize=None, **kwargs):
     44     if ctxs is None: ctxs = get_grid(min(len(samples), max_n), nrows=nrows, ncols=ncols, figsize=figsize)
---> 45     ctxs = show_batch[object](x, y, samples, ctxs=ctxs, max_n=max_n, **kwargs)
     46     return ctxs
     47 

/opt/conda/envs/fastai/lib/python3.7/site-packages/fastai2/data/core.py in show_batch(x, y, samples, ctxs, max_n, **kwargs)
     13     if ctxs is None: ctxs = Inf.nones
     14     for i in range_of(samples[0]):
---> 15         ctxs = [b.show(ctx=c, **kwargs) for b,c,_ in zip(samples.itemgot(i),ctxs,range(max_n))]
     16     return ctxs
     17 

/opt/conda/envs/fastai/lib/python3.7/site-packages/fastai2/data/core.py in <listcomp>(.0)
     13     if ctxs is None: ctxs = Inf.nones
     14     for i in range_of(samples[0]):
---> 15         ctxs = [b.show(ctx=c, **kwargs) for b,c,_ in zip(samples.itemgot(i),ctxs,range(max_n))]
     16     return ctxs
     17 

/opt/conda/envs/fastai/lib/python3.7/site-packages/fastai2/torch_core.py in show(self, ctx, **kwargs)
    296     _show_args = ArrayImageBase._show_args
    297     def show(self, ctx=None, **kwargs):
--> 298         return show_image(self, ctx=ctx, **{**self._show_args, **kwargs})
    299 
    300 # Cell

/opt/conda/envs/fastai/lib/python3.7/site-packages/fastai2/torch_core.py in show_image(im, ax, figsize, title, ctx, **kwargs)
     52     if figsize is None: figsize = (_fig_bounds(im.shape[0]), _fig_bounds(im.shape[1]))
     53     if ax is None: _,ax = plt.subplots(figsize=figsize)
---> 54     ax.imshow(im, **kwargs)
     55     if title is not None: ax.set_title(title)
     56     ax.axis('off')

/opt/conda/envs/fastai/lib/python3.7/site-packages/matplotlib/__init__.py in inner(ax, data, *args, **kwargs)
   1541     def inner(ax, *args, data=None, **kwargs):
   1542         if data is None:
-> 1543             return func(ax, *map(sanitize_sequence, args), **kwargs)
   1544 
   1545         bound = new_sig.bind(ax, *args, **kwargs)

/opt/conda/envs/fastai/lib/python3.7/site-packages/matplotlib/cbook/deprecation.py in wrapper(*args, **kwargs)
    356                 f"%(removal)s.  If any parameter follows {name!r}, they "
    357                 f"should be pass as keyword, not positionally.")
--> 358         return func(*args, **kwargs)
    359 
    360     return wrapper

/opt/conda/envs/fastai/lib/python3.7/site-packages/matplotlib/cbook/deprecation.py in wrapper(*args, **kwargs)
    356                 f"%(removal)s.  If any parameter follows {name!r}, they "
    357                 f"should be pass as keyword, not positionally.")
--> 358         return func(*args, **kwargs)
    359 
    360     return wrapper

/opt/conda/envs/fastai/lib/python3.7/site-packages/matplotlib/axes/_axes.py in imshow(self, X, cmap, norm, aspect, interpolation, alpha, vmin, vmax, origin, extent, shape, filternorm, filterrad, imlim, resample, url, **kwargs)
   5611         im = mimage.AxesImage(self, cmap, norm, interpolation, origin, extent,
   5612                               filternorm=filternorm, filterrad=filterrad,
-> 5613                               resample=resample, **kwargs)
   5614 
   5615         im.set_data(X)

/opt/conda/envs/fastai/lib/python3.7/site-packages/matplotlib/image.py in __init__(self, ax, cmap, norm, interpolation, origin, extent, filternorm, filterrad, resample, **kwargs)
    897             filterrad=filterrad,
    898             resample=resample,
--> 899             **kwargs
    900         )
    901 

/opt/conda/envs/fastai/lib/python3.7/site-packages/matplotlib/image.py in __init__(self, ax, cmap, norm, interpolation, origin, filternorm, filterrad, resample, **kwargs)
    259         self._imcache = None
    260 
--> 261         self.update(kwargs)
    262 
    263     def __getstate__(self):

/opt/conda/envs/fastai/lib/python3.7/site-packages/matplotlib/artist.py in update(self, props)
   1004 
   1005         with cbook._setattr_cm(self, eventson=False):
-> 1006             ret = [_update_property(self, k, v) for k, v in props.items()]
   1007 
   1008         if len(ret):

/opt/conda/envs/fastai/lib/python3.7/site-packages/matplotlib/artist.py in <listcomp>(.0)
   1004 
   1005         with cbook._setattr_cm(self, eventson=False):
-> 1006             ret = [_update_property(self, k, v) for k, v in props.items()]
   1007 
   1008         if len(ret):

/opt/conda/envs/fastai/lib/python3.7/site-packages/matplotlib/artist.py in _update_property(self, k, v)
   1000                 if not callable(func):
   1001                     raise AttributeError('{!r} object has no property {!r}'
-> 1002                                          .format(type(self).__name__, k))
   1003                 return func(v)
   1004 

AttributeError: 'AxesImage' object has no property 'unique'

If you need just a few images, you could simply download them manually.

Using Bing’s API is useful when you want to generate a large dataset.
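
For reference, the usual pattern for building a larger dataset with the course helper (once the key works) looks roughly like this; it’s a sketch based on the 02_production notebook, so treat the exact names as illustrative:

import os
from fastai2.vision.all import *
from utils import search_images_bing  # the helper from course-v4/nbs/utils.py shown in the traceback above

key = os.environ.get('AZURE_SEARCH_KEY', 'XXX')  # your Bing Image Search key
bear_types = 'grizzly', 'black', 'teddy'
path = Path('bears')

for o in bear_types:
    dest = path/o
    dest.mkdir(exist_ok=True, parents=True)
    results = search_images_bing(key, f'{o} bear')
    # bulk-download the URLs returned by the search
    download_images(dest, urls=results.attrgot('content_url'))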

Yes, I got a few URLs manually… but it was not enough for training. I could have grabbed a few more, but @imrandude’s solution worked fine. I am still interested in this problem, though, since I am considering reusing some of this code in the future, when I will want large datasets.

Not sure, but I wonder if it means that you only have one image for each sample.

In contrast, the example in fastbook’s 02_production notebook,

bears = bears.new(item_tfms=RandomResizedCrop(128, min_scale=0.3))
dls = bears.dataloaders(path)
dls.train.show_batch(max_n=4, nrows=1, unique=True)

generates randomly resized/cropped(?) versions of each image.

In the code that you are trying to run, it seems to me that you simply resize the image to a fixed size.

Are you on the most recent fastai2 and fastcore? If so, can you try the dev installs? IIRC this was a bug that was fixed recently.
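
If it helps, the pip versions of those upgrades look roughly like this (adjust to your environment; an editable install from a local clone works too):

!pip install -U fastai2 fastcore
# or the dev versions straight from GitHub:
!pip install git+https://github.com/fastai/fastai2.git
!pip install git+https://github.com/fastai/fastcore.git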

1 Like

Sure, that must be it, thanks [it was]. I always make sure to get the most recent code before posting, but I don’t normally work with Jupyter, so I hadn’t thought about updating the packages this time. I’m still a bit uncomfortable in this environment, which is a big part of why I’m trying to actually do something with it for once :sweat_smile:

1 Like

Sorry, I had missed your post, Antoine! No, I was just quoting the last line of the example you showed. I fixed it by updating fastai2 and fastcore (embarrassing, but true).

1 Like

One thing you might try is to download your .pkl file from Google Drive, S3, or even Git LFS at runtime, before starting the app.
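
A minimal sketch of that idea, assuming the exported learner is reachable at some direct-download link (MODEL_URL below is a placeholder, not a real endpoint):

import os
from urllib.request import urlretrieve
from fastai2.vision.all import *  # load_learner comes from here

MODEL_URL = 'https://example.com/export.pkl'  # e.g. a Drive/S3/Git LFS direct link
MODEL_PATH = 'export.pkl'

# fetch the exported learner once, before the app starts serving requests
if not os.path.exists(MODEL_PATH):
    urlretrieve(MODEL_URL, MODEL_PATH)

learn = load_learner(MODEL_PATH)

That way export.pkl doesn’t have to live in the repo itself, which keeps the Binder image small.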

I, too, am getting “Unable to add the product to your subscription. Please try again.” Any solutions to this?

Number 4 was a great tip. Thanks @DanielLam!