Lesson 1 official topic

@dokkosean, try…

pred,ndx,probs = learn.predict(....)
print(....pred....)
print(....probs[ndx]....)
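
Filled in, it would look something like this (the filename is just a placeholder):

pred, ndx, probs = learn.predict(PILImage.create('your_image.jpg'))   # 'your_image.jpg' stands in for your file
print(pred)          # the predicted category label
print(probs[ndx])    # the probability assigned to that category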

D’oh! That worked! Thanks @bencoman!

I’m modifying the “Is it a bird?” notebook to experiment after Lesson 1. How does the download_images function work? I’m not getting nearly as many images as I’d expect. The only parameter I can see that controls the count is a max, which seems to default to 1000, yet it’s yielding 56 images for one search term and 29 for another. What am I missing?

Thanks!

Hi, I am getting errors after installing cudnn when running Lesson 1, and I am having a hard time debugging this. The internet says it might have to do with a mismatch in a matrix multiplication (matmul), but if that were the case I suspect it would be happening to more people. I have a 3070 Ti with 8 GB of memory, and this lesson deals with small data, so I don’t think it’s a memory issue.

RuntimeError                              Traceback (most recent call last)
Cell In[14], line 2
      1 learn = vision_learner(dls, resnet18, metrics=error_rate)
----> 2 learn.fine_tune(3)

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/fastai/callback/schedule.py:165, in fine_tune(self, epochs, base_lr, freeze_epochs, lr_mult, pct_start, div, **kwargs)
    163 "Fine tune with `Learner.freeze` for `freeze_epochs`, then with `Learner.unfreeze` for `epochs`, using discriminative LR."
    164 self.freeze()
--> 165 self.fit_one_cycle(freeze_epochs, slice(base_lr), pct_start=0.99, **kwargs)
    166 base_lr /= 2
    167 self.unfreeze()

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/fastai/callback/schedule.py:119, in fit_one_cycle(self, n_epoch, lr_max, div, div_final, pct_start, wd, moms, cbs, reset_opt, start_epoch)
    116 lr_max = np.array([h['lr'] for h in self.opt.hypers])
    117 scheds = {'lr': combined_cos(pct_start, lr_max/div, lr_max, lr_max/div_final),
    118           'mom': combined_cos(pct_start, *(self.moms if moms is None else moms))}
--> 119 self.fit(n_epoch, cbs=ParamScheduler(scheds)+L(cbs), reset_opt=reset_opt, wd=wd, start_epoch=start_epoch)

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/fastai/learner.py:256, in Learner.fit(self, n_epoch, lr, wd, cbs, reset_opt, start_epoch)
    254 self.opt.set_hypers(lr=self.lr if lr is None else lr)
    255 self.n_epoch = n_epoch
--> 256 self._with_events(self._do_fit, 'fit', CancelFitException, self._end_cleanup)

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/fastai/learner.py:193, in Learner._with_events(self, f, event_type, ex, final)
    192 def _with_events(self, f, event_type, ex, final=noop):
--> 193     try: self(f'before_{event_type}');  f()
    194     except ex: self(f'after_cancel_{event_type}')
    195     self(f'after_{event_type}');  final()

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/fastai/learner.py:245, in Learner._do_fit(self)
    243 for epoch in range(self.n_epoch):
    244     self.epoch=epoch
--> 245     self._with_events(self._do_epoch, 'epoch', CancelEpochException)

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/fastai/learner.py:193, in Learner._with_events(self, f, event_type, ex, final)
    192 def _with_events(self, f, event_type, ex, final=noop):
--> 193     try: self(f'before_{event_type}');  f()
    194     except ex: self(f'after_cancel_{event_type}')
    195     self(f'after_{event_type}');  final()

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/fastai/learner.py:239, in Learner._do_epoch(self)
    238 def _do_epoch(self):
--> 239     self._do_epoch_train()
    240     self._do_epoch_validate()

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/fastai/learner.py:231, in Learner._do_epoch_train(self)
    229 def _do_epoch_train(self):
    230     self.dl = self.dls.train
--> 231     self._with_events(self.all_batches, 'train', CancelTrainException)

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/fastai/learner.py:193, in Learner._with_events(self, f, event_type, ex, final)
    192 def _with_events(self, f, event_type, ex, final=noop):
--> 193     try: self(f'before_{event_type}');  f()
    194     except ex: self(f'after_cancel_{event_type}')
    195     self(f'after_{event_type}');  final()

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/fastai/learner.py:199, in Learner.all_batches(self)
    197 def all_batches(self):
    198     self.n_iter = len(self.dl)
--> 199     for o in enumerate(self.dl): self.one_batch(*o)

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/fastai/learner.py:227, in Learner.one_batch(self, i, b)
    225 b = self._set_device(b)
    226 self._split(b)
--> 227 self._with_events(self._do_one_batch, 'batch', CancelBatchException)

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/fastai/learner.py:193, in Learner._with_events(self, f, event_type, ex, final)
    192 def _with_events(self, f, event_type, ex, final=noop):
--> 193     try: self(f'before_{event_type}');  f()
    194     except ex: self(f'after_cancel_{event_type}')
    195     self(f'after_{event_type}');  final()

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/fastai/learner.py:205, in Learner._do_one_batch(self)
    204 def _do_one_batch(self):
--> 205     self.pred = self.model(*self.xb)
    206     self('after_pred')
    207     if len(self.yb):

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/torch/nn/modules/module.py:1190, in Module._call_impl(self, *input, **kwargs)
   1186 # If we don't have any hooks, we want to skip the rest of the logic in
   1187 # this function, and just call forward.
   1188 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
   1189         or _global_forward_hooks or _global_forward_pre_hooks):
-> 1190     return forward_call(*input, **kwargs)
   1191 # Do not call functions when jit is used
   1192 full_backward_hooks, non_full_backward_hooks = [], []

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/torch/nn/modules/container.py:204, in Sequential.forward(self, input)
    202 def forward(self, input):
    203     for module in self:
--> 204         input = module(input)
    205     return input

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/torch/nn/modules/module.py:1190, in Module._call_impl(self, *input, **kwargs)
   1186 # If we don't have any hooks, we want to skip the rest of the logic in
   1187 # this function, and just call forward.
   1188 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
   1189         or _global_forward_hooks or _global_forward_pre_hooks):
-> 1190     return forward_call(*input, **kwargs)
   1191 # Do not call functions when jit is used
   1192 full_backward_hooks, non_full_backward_hooks = [], []

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/torch/nn/modules/container.py:204, in Sequential.forward(self, input)
    202 def forward(self, input):
    203     for module in self:
--> 204         input = module(input)
    205     return input

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/torch/nn/modules/module.py:1190, in Module._call_impl(self, *input, **kwargs)
   1186 # If we don't have any hooks, we want to skip the rest of the logic in
   1187 # this function, and just call forward.
   1188 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
   1189         or _global_forward_hooks or _global_forward_pre_hooks):
-> 1190     return forward_call(*input, **kwargs)
   1191 # Do not call functions when jit is used
   1192 full_backward_hooks, non_full_backward_hooks = [], []

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/torch/nn/modules/linear.py:114, in Linear.forward(self, input)
    113 def forward(self, input: Tensor) -> Tensor:
--> 114     return F.linear(input, self.weight, self.bias)

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/fastai/torch_core.py:378, in TensorBase.__torch_function__(cls, func, types, args, kwargs)
    376 if cls.debug and func.__name__ not in ('__str__','__repr__'): print(func, types, args, kwargs)
    377 if _torch_handled(args, cls._opt, func): types = (torch.Tensor,)
--> 378 res = super().__torch_function__(func, types, args, ifnone(kwargs, {}))
    379 dict_objs = _find_args(args) if args else _find_args(list(kwargs.values()))
    380 if issubclass(type(res),TensorBase) and dict_objs: res.set_meta(dict_objs[0],as_copy=True)

File ~/miniconda3/envs/d2l/lib/python3.9/site-packages/torch/_tensor.py:1278, in Tensor.__torch_function__(cls, func, types, args, kwargs)
   1275     return NotImplemented
   1277 with _C.DisableTorchFunction():
-> 1278     ret = func(*args, **kwargs)
   1279     if func in get_default_nowrap_functions():
   1280         return ret

RuntimeError: CUDA error: CUBLAS_STATUS_INVALID_VALUE when calling `cublasSgemm( handle, opa, opb, m, n, k, &alpha, a, lda, b, ldb, &beta, c, ldc)`

Can you make your notebook public and share a link to it?
If you are not using a cloud service, I recommend you do that first; it makes it easier to share your notebook for review.

I am not able to advise on your error itself, so I can only offer an alternative path to let you proceed while waiting for a better answer.

if that was the case I suspect it would be happening to more people

Most people are using a cloud service (as recommended by Jeremy in the course, so you can focus on machine learning rather than sysadmin). You should try one. I’ve had success with Paperspace using GitHub - bencoman/paperspace-setup: Setup a paperspace instance for fastai, which evolved out of the live coding walkthroughs: Live-coding (aka walk-thrus) ✅. The walkthroughs might also help with your local install.
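
If you do want to keep poking at the local install in the meantime, one sanity check (just a guess at where the problem might be, not a fix) is to run a bare matmul on the GPU with plain PyTorch, outside fastai. If the same cuBLAS error shows up there, it points at the PyTorch/CUDA/cudnn install rather than the lesson code:

import torch
# print the PyTorch version, the CUDA version it was built against, and whether the GPU is visible
print(torch.__version__, torch.version.cuda, torch.cuda.is_available())
x = torch.randn(64, 64, device='cuda')
print((x @ x).shape)   # exercises the same cublasSgemm path as the failing Linear layer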

I figured it out. The download_images function relies on a list of URLs, which is generated by the search_images function we define in the notebook. The default max_images there was 30.
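
For anyone else who hits this: passing a larger limit through to search_images does the trick. Roughly like this, assuming the notebook’s imports and its search_images helper (the search term, count, and folder name are just placeholders):

urls = search_images('bird photos', max_images=200)   # notebook-defined helper; its default max_images is 30
dest = Path('birds')                                   # placeholder destination folder
download_images(dest, urls=urls)                       # fastai's download_images caps at max_pics=1000 by default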


Hello! This is my first post. Thank you for your effort and for giving everybody the opportunity to learn.

I am enjoying this journey, but sometimes I run into difficulties. If you would be so kind as to enlighten me, I would appreciate it. Also, English is not my first language, which creates confusion at certain points.

I have a silly question about the example with forests and birds:

is_bird,_,probs = learn.predict(PILImage.create('bird.jpg'))
print(f"This is a: {is_bird}.")
print(f"Probability it's a bird: {probs[0]:.4f}")

If I am correct, the method predict returns probs, with the probability of being one category or the other. That means that probs[0] represents the probability of being a bird and probs[1] the probability of being a forest.

My question is: where in the code did we set this order? How do I know the category ‘birds’ comes first?

And another question: What if I want to have more than two categories? Are there more examples later in the book?

Thank you in advance.

Hi citaconrama! That is explained in the lesson 3 lecture (here’s a link).
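
In the meantime, a quick way to check the order yourself is to print the vocab of your DataLoaders; the categories are sorted (alphabetically by default), which is why 'bird' sits at index 0 ahead of 'forest':

print(learn.dls.vocab)   # e.g. ['bird', 'forest']; probs follows this order
is_bird, ndx, probs = learn.predict(PILImage.create('bird.jpg'))
print(probs[ndx])        # probability of whichever category was predicted, regardless of order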


Hi there, I recognize that this might be an obvious question to some, but I am lost on what even the first step of setup is. Do I need to download something or rent a machine, or can I complete the entire course with Kaggle or Jupyter notebooks? If so, what are the links to those notebooks?

Welcome. All you need is to run notebooks in Kaggle or Colab.

To get started, follow the videos and look for the lesson links and additional resources at the bottom of each page; that is where you will find additional notes on how to use Kaggle, etc.

(No need to set up your own hardware; you can finish the course without going down that rabbit hole yet.)

I need help adding pictures to a Jupyter notebook.
The “copy paste” instructions do not work.
Drag and drop also does not work.
Is there another way to add local images into the notebook?

Thanks
j

Figured it out.
I was in a code cell, so only the file name was pasted in.
Pasting into a Markdown cell works as expected.

j

You might find the Live Walkthrough useful… Live-coding (aka walk-thrus) ✅


Thanks for the live walkthrough links, Ben. The first lesson was very helpful! I have another question: how do I install the fastai library?
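
On Kaggle and Colab, fastai is usually already installed; otherwise the usual route is pip, run from a notebook cell (or drop the leading ! in a terminal):

!pip install -Uqq fastai   # -U upgrades, -qq keeps the output quiet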


Hello,
I just started the 1st FastAI course in conjunction with the recommended book.
The author advises using tools like Kaggle or Paperspace instead of a local environment, for practical reasons (GPU access in particular).

What do you recommend? Kaggle or Paperspace?

What is the general process? Git clone the fastai GitHub repository into a Kaggle notebook and then browse each notebook?
Although I did that, I could not navigate it, not being too familiar with the Kaggle platform: I can see the .ipynb file names, but I can’t open each of the cloned notebooks from within the notebook.

Sounds like ‘GPU’ is not selected. See this thread for my note on how to select it.

1 Like

Hi Saad,

I haven’t checked out Paperspace so far, so I wouldn’t be able to give a comparison. But here is the process to run the notebooks on other platforms.

If you are looking to run the code in the book, then you can use Colab: https://colab.research.google.com/

  • What you can do is simply replace github.com with colab.research.google.com/github and keep the rest of the address. For example: https:// github.com/ fastai/fastbook/blob/master/01_intro.ipynb can be opened on Colab as https:// colab.research.google.com/ github/ fastai/fastbook/blob/master/01_intro.ipynb (Note: I’ve added spaces in the URLs because they were turning into hyperlinks and the modification wouldn’t be clear)

  • In order to use a GPU on Colab, you can “Change runtime type” and add a GPU accelerator

In addition to the textbook code, the official notebook for this lecture is already on Kaggle: Is it a bird? Creating a model from your own data | Kaggle
So all you have to do is copy and run it within Kaggle. Then you can modify it for the problem of your choice!
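
And if you do still want the whole fastbook repository available inside a notebook (as you tried), you can clone it from a cell and then open the .ipynb files from the platform’s file browser; a rough sketch:

!git clone https://github.com/fastai/fastbook.git   # the book's notebooks, 01_intro.ipynb onwards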

Hello,
I just completed Lesson 1 and tried to alter the official Kaggle notebook. I added a dataset of chess pieces and want to classify them, but I often get bad results.

  • Some pieces are classified correctly
  • Some easy-looking pieces are classified incorrectly (see the link below for an example)
  • Kings are always classified incorrectly

Can someone please help me out? I tried a few different things but I just can’t find the reason… https://www.kaggle.com/code/maxwe000/is-it-a-king-creating-a-model-from-your-own-data