Beginner: Using Colab or Kaggle ✅

Hello, this is my very first time with fastai! I'm trying the "Is it a bird?" notebook in Kaggle.
At the step where I use my trained model, it appears that it is not able to read the file supplied. I copied the "Is it a bird?" notebook in Kaggle and am running it there. What am I missing? Thanks for your help!

The code block

is_bird,_,probs = learn.predict(PILImage.create('bird.jpg'))
print(f"This is a: {is_bird}.")
print(f"Probability it's a bird: {probs[0]:.4f}")

Error

/opt/conda/lib/python3.7/site-packages/PIL/Image.py in __getattr__(self, name)
    539             )
    540             return self._category
--> 541         raise AttributeError(name)
    542 
    543     @property

AttributeError: read

You just have to pass the path of the file, not the PILImage object, as per the latest release.
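
For example, assuming bird.jpg was already downloaded by an earlier cell in the notebook, the cell becomes:

is_bird,_,probs = learn.predict('bird.jpg')
print(f"This is a: {is_bird}.")
print(f"Probability it's a bird: {probs[0]:.4f}")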

2 Likes

That was it. Thanks Rohan!

1 Like

Hi :slight_smile: I’m trying to run the lesson 1 notebook, and at the last code cell I always get an error. I didn’t modify anything. Is there any chance that some module is deprecated or something?

I also have trouble running the "Chapter 2, Production" Colab. The code after "Our folder has image files, as we'd expect:" always returns an array of 0 items.
Thank you in advance.

Edit:
I tried replacing the first line with this one and now it works: is_bird,_,probs = learn.predict("forest.jpg"). It doesn't seem like a very elegant way to do it, though, does it?

1 Like

Hello,
I am learning fastai for medical research and I am an absolute beginner with Kaggle. I copied and pasted the code for the lesson 1 homework "Is it a bird?".
Everything was working fine until I ran the following code:

is_bird,_,probs = learn.predict(PILImage.create('bird.jpg'))
print(f"This is a: {is_bird}.")
print(f"Probability it's a bird: {probs[0]:.4f}")

And I got the message below.

Please help

Time     Log Message
27.6s    Searching for 'bird photos'
35.3s    Searching for 'forest photos'
36.7s    Searching for 'forest photo'
104.2s   Searching for 'forest sun photo'
119.8s   Searching for 'forest shade photo'
144.3s   Searching for 'bird photo'
157.9s   Searching for 'bird sun photo'
174.9s   Searching for 'bird shade photo'
197.2s   /opt/conda/lib/python3.7/site-packages/torchvision/models/_utils.py:209: UserWarning: The parameter 'pretrained' is deprecated since 0.13 and may be removed in the future, please use 'weights' instead.
197.2s   /opt/conda/lib/python3.7/site-packages/torchvision/models/_utils.py:223: UserWarning: Arguments other than a weight enum or None for 'weights' are deprecated since 0.13 and may be removed in the future. The current behavior is equivalent to passing weights=ResNet18_Weights.IMAGENET1K_V1. You can also use weights=ResNet18_Weights.DEFAULT to get the most up-to-date weights.
197.4s   Downloading: "https://download.pytorch.org/models/resnet18-f37072fd.pth" to /root/.cache/torch/hub/checkpoints/resnet18-f37072fd.pth
270.5s   Traceback (most recent call last):
           File "<string>", line 1, in <module>
           File "/opt/conda/lib/python3.7/site-packages/papermill/execute.py", line 128, in execute_notebook
             raise_for_execution_errors(nb, output_path)
           File "/opt/conda/lib/python3.7/site-packages/papermill/execute.py", line 232, in raise_for_execution_errors
             raise error
         papermill.exceptions.PapermillExecutionError:
         ---------------------------------------------------------------------------
         Exception encountered at "In [12]":
         ---------------------------------------------------------------------------
         AttributeError                            Traceback (most recent call last)
         /tmp/ipykernel_19/1213330699.py in <module>
         ----> 1 is_bird,_,probs = learn.predict(PILImage.create('bird.jpg'))
               2 print(f"This is a: {is_bird}.")
               3 print(f"Probability it's a bird: {probs[0]:.4f}")

         /opt/conda/lib/python3.7/site-packages/fastai/learner.py in predict(self, item, rm_type_tfms, with_input)
         --> 321     inp,preds,_,dec_preds = self.get_preds(dl=dl, with_input=True, with_decoded=True)

         /opt/conda/lib/python3.7/site-packages/fastai/learner.py in get_preds(self, ds_idx, dl, with_input, with_decoded, with_loss, act, inner, reorder, cbs, **kwargs)
         --> 308     self._do_epoch_validate(dl=dl)

         /opt/conda/lib/python3.7/site-packages/fastai/learner.py in _do_epoch_validate(self, ds_idx, dl)
         --> 244     with torch.no_grad(): self._with_events(self.all_batches, 'validate', CancelValidException)

         /opt/conda/lib/python3.7/site-packages/fastai/learner.py in _with_events(self, f, event_type, ex, final)
         --> 199     try: self(f'before_{event_type}'); f()

         /opt/conda/lib/python3.7/site-packages/fastai/learner.py in all_batches(self)
         --> 205     for o in enumerate(self.dl): self.one_batch(*o)

         /opt/conda/lib/python3.7/site-packages/fastai/data/load.py in __iter__(self)
         --> 127     for b in _loaders[self.fake_l.num_workers==0](self.fake_l):

         /opt/conda/lib/python3.7/site-packages/torch/utils/data/dataloader.py in __next__(self)
         --> 628     data = self._next_data()

         /opt/conda/lib/python3.7/site-packages/torch/utils/data/dataloader.py in _next_data(self)
         --> 671     data = self._dataset_fetcher.fetch(index)  # may raise StopIteration

         /opt/conda/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py in fetch(self, possibly_batched_index)
         --> 43      data = next(self.dataset_iter)

         /opt/conda/lib/python3.7/site-packages/fastai/data/load.py in create_batches(self, samps)
         --> 138     yield from map(self.do_batch, self.chunkify(res))

         /opt/conda/lib/python3.7/site-packages/fastcore/basics.py in chunked(it, chunk_sz, drop_last, n_chunks)
         --> 230     res = list(itertools.islice(it, chunk_sz))

         /opt/conda/lib/python3.7/site-packages/fastai/data/load.py in do_item(self, s)
         --> 153     try: return self.after_item(self.create_item(s))

         /opt/conda/lib/python3.7/site-packages/fastai/data/load.py in create_item(self, s)
         --> 160     if self.indexed: return self.dataset[s or 0]

         /opt/conda/lib/python3.7/site-packages/fastai/data/core.py in __getitem__(self, it)
         --> 458     res = tuple([tl[it] for tl in self.tls])

         /opt/conda/lib/python3.7/site-packages/fastai/data/core.py in <listcomp>(.0)
         --> 458     res = tuple([tl[it] for tl in self.tls])

         /opt/conda/lib/python3.7/site-packages/fastai/data/core.py in __getitem__(self, idx)
         --> 417     return self._after_item(res) if is_indexer(idx) else res.map(self._after_item)

         /opt/conda/lib/python3.7/site-packages/fastai/data/core.py in _after_item(self, o)
         --> 377     def _after_item(self, o): return self.tfms(o)

         /opt/conda/lib/python3.7/site-packages/fastcore/transform.py in __call__(self, o)
         --> 208     def __call__(self, o): return compose_tfms(o, tfms=self.fs, split_idx=self.split_idx)

         /opt/conda/lib/python3.7/site-packages/fastcore/transform.py in compose_tfms(x, tfms, is_enc, reverse, **kwargs)
         --> 158     x = f(x, **kwargs)

         /opt/conda/lib/python3.7/site-packages/fastcore/transform.py in __call__(self, x, **kwargs)
         --> 81      def __call__(self, x, **kwargs): return self._call('encodes', x, **kwargs)

         /opt/conda/lib/python3.7/site-packages/fastcore/transform.py in _call(self, fn, x, split_idx, **kwargs)
         --> 91      return self._do_call(getattr(self, fn), x, **kwargs)

         /opt/conda/lib/python3.7/site-packages/fastcore/transform.py in _do_call(self, f, x, **kwargs)
         --> 97      return retain_type(f(x, **kwargs), x, ret)

         /opt/conda/lib/python3.7/site-packages/fastcore/dispatch.py in __call__(self, *args, **kwargs)
         --> 120     return f(*args, **kwargs)

         /opt/conda/lib/python3.7/site-packages/fastai/vision/core.py in create(cls, fn, **kwargs)
         --> 125     return cls(load_image(fn, **merge(cls._open_args, kwargs)))

         /opt/conda/lib/python3.7/site-packages/fastai/vision/core.py in load_image(fn, mode)
         --> 98      im = Image.open(fn)

         /opt/conda/lib/python3.7/site-packages/PIL/Image.py in open(fp, mode, formats)
         --> 3140    prefix = fp.read(16)

         /opt/conda/lib/python3.7/site-packages/PIL/Image.py in __getattr__(self, name)
         --> 517     raise AttributeError(name)

         AttributeError: read

285.6s   /opt/conda/lib/python3.7/site-packages/traitlets/traitlets.py:2935: FutureWarning: --Exporter.preprocessors=["remove_papermill_header.RemovePapermillHeader"] for containers is deprecated in traitlets 5.0. You can pass --Exporter.preprocessors item ... multiple times to add items to a list.
285.6s   [NbConvertApp] Converting notebook notebook.ipynb to notebook
286.0s   [NbConvertApp] Writing 986114 bytes to notebook.ipynb
300.9s   /opt/conda/lib/python3.7/site-packages/traitlets/traitlets.py:2935: FutureWarning: --Exporter.preprocessors=["nbconvert.preprocessors.ExtractOutputPreprocessor"] for containers is deprecated in traitlets 5.0. You can pass --Exporter.preprocessors item ... multiple times to add items to a list.
300.9s   [NbConvertApp] Converting notebook notebook.ipynb to html
302.1s   /opt/conda/lib/python3.7/site-packages/bleach/sanitizer.py:168: NoCssSanitizerWarning: 'style' attribute specified, but css_sanitizer not set.
302.1s   [NbConvertApp] Support files will be in __results___files/
302.1s   [NbConvertApp] Making directory __results___files

Yes, since the video was recorded a change has been made in the library.
Here is the new way; no need to create an Image object:
is_bird,_,probs = learn.predict('bird.jpg’)

1 Like

Thank you for the response. I copied and pasted the code exactly as you listed it here and got this error message:

File “/tmp/ipykernel_27/3609130149.py”, line 3
is_bird,_,probs = learn.predict('bird.jpg’)
^

SyntaxError: EOL while scanning string literal

Never mind, I changed ‘bird.jpg’ to "bird.jpg" and it worked.

Thanks
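
In case it helps anyone else: the forum converts straight quotes to curly "smart" quotes when code is pasted as plain text, and Python can't parse those, which is exactly what produces the EOL while scanning string literal error. Retype the quotes after pasting:

is_bird,_,probs = learn.predict('bird.jpg')    # straight quotes: parses fine
is_bird,_,probs = learn.predict('bird.jpg’)    # curly closing quote from copy-paste: SyntaxError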

1 Like

I am now on to Chapter 2. I put in the following code:

! [ -e /content ] && pip install -Uqq fastbook
import fastbook
fastbook.setup_book()


ModuleNotFoundError                       Traceback (most recent call last)
/tmp/ipykernel_27/242012523.py in <module>
      1 get_ipython().system(' [ -e /content ] && pip install -Uqq fastbook')
----> 2 import fastbook
      3 fastbook.setup_book()

ModuleNotFoundError: No module named 'fastbook'

What is the latest version?

[quote="D3valgo, post:114, topic:96280, full:true"]
I am now on to Chapter 2. I put in the following code:

! [ -e /content ] && pip install -Uqq fastbook
import fastbook
fastbook.setup_book()

ModuleNotFoundError: No module named 'fastbook'

What is the latest version?
[/quote]

Maybe I can help you. Are you on Kaggle, or are you running the notebook locally?

Kaggle. As a beginner, I can only copy and paste code from the fastai Jupyter notebooks to follow along with the lecture.

1 Like

Try replacing ! [ -e /content ] with ! [ -e /kaggle ]

! [ -e /kaggle ] && pip install -Uqq fastbook
import fastbook
fastbook.setup_book()

Explanation: ! [ -e /kaggle ] && means that the following command, in this case pip install -Uqq fastbook, will only be executed if the folder /kaggle exists.

I just tried it out, and it looks like Kaggle doesn't have a /content folder; that's why you should replace it with /kaggle if you're on Kaggle, or drop the check altogether if neither folder exists in your environment.
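
If you switch back and forth between Colab and Kaggle, a check for either folder saves editing the cell each time (untested sketch; on a local machine where neither folder exists, install fastbook manually first):

! ([ -e /content ] || [ -e /kaggle ]) && pip install -Uqq fastbook
import fastbook
fastbook.setup_book()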

1 Like

Thank you for the response. It looks like it worked. I have another problem though…
When I run:
if not path.exists():
    path.mkdir()
    for o in bear_types:
        dest = (path/o)
        dest.mkdir(exist_ok=True)
        results = search_images_ddg(key, f'{o} bear')
        download_images(dest, urls=results.attrgot('contentUrl'))

I get this error now:

NameError                                 Traceback (most recent call last)
/tmp/ipykernel_27/2701214052.py in <module>
      4         dest = (path/o)
      5         dest.mkdir(exist_ok=True)
----> 6         results = search_images_ddg(key, f'{o} bear')
      7         download_images(dest, urls=results.attrgot('contentUrl'))

NameError: name 'key' is not defined

and with
dls = bears.dataloaders(path)

I get

TypeError                                 Traceback (most recent call last)
/tmp/ipykernel_27/3866548988.py in <module>
----> 1 dls = bears.dataloaders(path)

/opt/conda/lib/python3.7/site-packages/fastai/data/block.py in dataloaders(self, source, path, verbose, **kwargs)
--> 155     dsets = self.datasets(source, verbose=verbose)

/opt/conda/lib/python3.7/site-packages/fastai/data/block.py in datasets(self, source, verbose)
--> 147     return Datasets(items, tfms=self._combine_type_tfms(), splits=splits, dl_type=self.dl_type, n_inp=self.n_inp, verbose=verbose)

/opt/conda/lib/python3.7/site-packages/fastai/data/core.py in __init__(self, items, tfms, tls, n_inp, dl_type, **kwargs)
--> 454     self.tls = L(tls if tls else [TfmdLists(items, t, **kwargs) for t in L(ifnone(tfms,[None]))])

/opt/conda/lib/python3.7/site-packages/fastai/data/core.py in <listcomp>(.0)
--> 454     self.tls = L(tls if tls else [TfmdLists(items, t, **kwargs) for t in L(ifnone(tfms,[None]))])

/opt/conda/lib/python3.7/site-packages/fastcore/foundation.py in __call__(cls, x, *args, **kwargs)
--> 98      return super().__call__(x, *args, **kwargs)

/opt/conda/lib/python3.7/site-packages/fastai/data/core.py in __init__(self, items, tfms, use_list, do_setup, split_idx, train_setup, splits, types, verbose, dl_type)
--> 368     self.setup(train_setup=train_setup)

/opt/conda/lib/python3.7/site-packages/fastai/data/core.py in setup(self, train_setup)
--> 397     types = L(t if is_listy(t) else [t] for t in self.types).concat().unique()

TypeError: 'NoneType' object is not iterable

I ran "all the cells" at once, including the one defining dls:

bears = DataBlock(
    blocks=(ImageBlock, CategoryBlock),
    get_items=get_image_files,
    splitter=RandomSplitter(valid_pct=0.2, seed=42),
    get_y=parent_label,
    item_tfms=Resize(128))

and did not receive any errors other than those.

Thank you @nicolasca, this was indeed bothersome as I tried to refresh my memory with the new version of the materials (and Kaggle).

Kaggle seems much faster than Colab for me.

Can someone tell me the easiest way to export all the notebooks from GitHub to my Kaggle repository?

Thanks in advance.

3 Likes

You can probably use the kaggle CLI for that: GitHub - Kaggle/kaggle-api: Official Kaggle API, for example by doing something like this:

kaggle kernels push -p /path/to/kernel
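
Note that kaggle kernels push uploads one notebook (kernel) at a time and expects a kernel-metadata.json file next to it; kaggle kernels init generates a template you can edit. A rough sketch, assuming the notebooks are already cloned into a local folder:

kaggle kernels init -p path/to/notebook_folder     # writes a kernel-metadata.json template
# edit kernel-metadata.json (id, title, code_file, ...), then:
kaggle kernels push -p path/to/notebook_folder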

I have cloned the repo locally and am uploading each notebook one after another. It might not be ideal; I know it would cause issues when there are supporting files such as utils.py to be used, but I'm not sure there is any better option.

I have a quick question about the lesson 1 "Is it a bird?" notebook. I imported the code into Google Colab, but I'm running into an error saying a list index is out of range when searching for forest images.

Is there a link anywhere that can help someone understand the reasons for so many different environments (Kaggle, Colab, Jupyter, Hugging Face Spaces, GitHub)? I am working with a few folks and it seems this is one of the issues they are having difficulty grasping. (To be open, I am a bit confused by them all as well!)