Fastai2 and new course now released

He said this: "EDIT: Ignore me! I was trying to use 'error-rate' as my metric for fitting and it wasn't liking it."

(Apologies if this question doesn’t belong here.)
(I thought it would be useful as many other people will be using VS Code.)

Hey, I recently moved from fastai v1 to fastai v2.
I noticed something interesting in VS Code: my Outline and IntelliSense are not working. It just keeps showing "Loading Document Symbol for ...". Initially I thought it might be a problem with the interpreter or something, but I tried everything I could and it is still not working. I switched back to the environment that has fastai v1 and it works fine there. I don't know if it's a problem on my end.

I would like to know if anyone else is facing the same problem.

Help would be appreciated.

Thank you

EDIT:
I updated my environment containing fastai v1 to fastai v2 and it stopped working again. (Initially it was working in the env containing fastai v1.)

After many hours of searching, I found that installing fastai downgrades some of the packages that VS Code uses for IntelliSense, possibly npm packages (not sure about this!). But I still don't know how to fix it!
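One way to narrow this down (just a rough sketch; the environment names below are placeholders for your own envs) is to export the package list from both environments and diff them to see what changed during the fastai install:

conda list -n fastai_v1_env --export > v1.txt
conda list -n fastai_v2_env --export > v2.txt
diff v1.txt v2.txt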

Hi Ceremona! I ran into the same problem when running some notebooks in Google Colab. The way I got the notebooks to work was to factory reset the runtime: Runtime >> Factory Reset Runtime. I have no idea why it works but it does work soooo… :slight_smile:

Hope this helps!

In the Jupyter notebook menu, you can select Kernel -> Restart & Clear Output

Hopefully this helps!

I reported this problem on another thread, but it's apparently not solved yet.

GPU mode is turned ON, but fast.ai can't detect it when using to_fp16(...). The same problem shows up in fastbook_10_nlp.

Related to this: Fastai doesn't detect the gpu in kaggle issue
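
For anyone else hitting this, a quick sanity check (a minimal sketch, assuming a standard PyTorch/CUDA setup) is to confirm that PyTorch itself can see the GPU before calling to_fp16():

import torch

# Should print True if the GPU is visible to PyTorch; if it is False here,
# the problem is with the driver/CUDA setup rather than with fastai.
print(torch.cuda.is_available())
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # name of the detected GPU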

Hey, I am using an Amazon EC2 instance to run my code. I conda-installed fastai, but it is still showing the version as fastai 1.0.61.
Can someone help me with that?

@ksjoe30 you can run conda list to check whether you have more than one install. If there is only one, I believe conda update fastai should work. You can also uninstall and reinstall as recommended here: https://docs.fast.ai/#Installing
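
For example (just a sketch of the commands mentioned above; the channel flags assume you want the conda packages from the fastai and pytorch channels):

conda list fastai
conda update -c fastai -c pytorch fastai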

I was having the same issue when I used conda install fastai.

I solved it by creating a fresh new environment and running

conda install -c fastai -c pytorch fastai

to install it, and that fixed the problem for me.
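
For reference, the full sequence is roughly this (a sketch; the environment name and Python version are just examples, adjust to taste):

conda create -n fastai2_env python=3.8
conda activate fastai2_env
conda install -c fastai -c pytorch fastai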

learn.summary() causes the following error:
AttributeError: 'NBProgressBar' object has no attribute 'wait_for'


AttributeError                            Traceback (most recent call last)
in
----> 1 learn.summary()

/opt/conda/lib/python3.7/site-packages/fastai2/callback/hook.py in summary(self)
    187     "Print a summary of the model, optimizer and loss function."
    188     xb = self.dls.train.one_batch()[:self.dls.train.n_inp]
--> 189     res = module_summary(self, *xb)
    190     res += f"Optimizer used: {self.opt_func}\nLoss function: {self.loss_func}\n\n"
    191     if self.opt is not None:

/opt/conda/lib/python3.7/site-packages/fastai2/callback/hook.py in module_summary(learn, *xb)
    162     # thus are not counted inside the summary
    163     #TODO: find a way to have them counted in param number somehow
--> 164     infos = layer_info(learn, *xb)
    165     n,bs = 64,find_bs(xb)
    166     inp_sz = _print_shapes(apply(lambda x:x.shape, xb), bs)

/opt/conda/lib/python3.7/site-packages/fastai2/callback/hook.py in layer_info(learn, *xb)
    148     with Hooks(flatten_model(learn.model), _track) as h:
    149         batch = apply(lambda o:o[:1], xb)
--> 150         with learn: r = learn.get_preds(dl=[batch], reorder=False)
    151         return h.stored
    152

/opt/conda/lib/python3.7/site-packages/fastai2/learner.py in __enter__(self)
    205
    206     def _end_cleanup(self): self.dl,self.xb,self.yb,self.pred,self.loss = None,(None,),(None,),None,None
--> 207     def __enter__(self): self(_before_epoch); return self
    208     def __exit__(self, exc_type, exc_value, tb): self(_after_epoch)
    209

/opt/conda/lib/python3.7/site-packages/fastai2/learner.py in __call__(self, event_name)
    131     def ordered_cbs(self, event): return [cb for cb in sort_by_run(self.cbs) if hasattr(cb, event)]
    132
--> 133     def __call__(self, event_name): L(event_name).map(self._call_one)
    134
    135     def _call_one(self, event_name):

/opt/conda/lib/python3.7/site-packages/fastcore/foundation.py in map(self, f, *args, **kwargs)
    381              else f.format if isinstance(f,str)
    382              else f.__getitem__)
--> 383         return self._new(map(g, self))
    384
    385     def filter(self, f, negate=False, **kwargs):

/opt/conda/lib/python3.7/site-packages/fastcore/foundation.py in _new(self, items, *args, **kwargs)
    331     @property
    332     def _xtra(self): return None
--> 333     def _new(self, items, *args, **kwargs): return type(self)(items, *args, use_list=None, **kwargs)
    334     def __getitem__(self, idx): return self._get(idx) if is_indexer(idx) else L(self._get(idx), use_list=None)
    335     def copy(self): return self._new(self.items.copy())

/opt/conda/lib/python3.7/site-packages/fastcore/foundation.py in __call__(cls, x, *args, **kwargs)
     45             return x
     46
---> 47         res = super().__call__(*((x,) + args), **kwargs)
     48         res._newchk = 0
     49         return res

/opt/conda/lib/python3.7/site-packages/fastcore/foundation.py in __init__(self, items, use_list, match, *rest)
    322         if items is None: items = []
    323         if (use_list is not None) or not _is_array(items):
--> 324             items = list(items) if use_list else _listify(items)
    325         if match is not None:
    326             if is_coll(match): match = len(match)

/opt/conda/lib/python3.7/site-packages/fastcore/foundation.py in _listify(o)
    258     if isinstance(o, list): return o
    259     if isinstance(o, str) or _is_array(o): return [o]
--> 260     if is_iter(o): return list(o)
    261     return [o]
    262

/opt/conda/lib/python3.7/site-packages/fastcore/foundation.py in __call__(self, *args, **kwargs)
    224         if isinstance(v,_Arg): kwargs[k] = args.pop(v.i)
    225         fargs = [args[x.i] if isinstance(x, _Arg) else x for x in self.pargs] + args[self.maxi+1:]
--> 226         return self.fn(*fargs, **kwargs)
    227
    228 # Cell

/opt/conda/lib/python3.7/site-packages/fastai2/learner.py in _call_one(self, event_name)
    135     def _call_one(self, event_name):
    136         assert hasattr(event, event_name), event_name
--> 137         [cb(event_name) for cb in sort_by_run(self.cbs)]
    138
    139     def _bn_bias_state(self, with_bias): return norm_bias_params(self.model, with_bias).map(self.opt.state)

/opt/conda/lib/python3.7/site-packages/fastai2/learner.py in <listcomp>(.0)
    135     def _call_one(self, event_name):
    136         assert hasattr(event, event_name), event_name
--> 137         [cb(event_name) for cb in sort_by_run(self.cbs)]
    138
    139     def _bn_bias_state(self, with_bias): return norm_bias_params(self.model, with_bias).map(self.opt.state)

/opt/conda/lib/python3.7/site-packages/fastai2/callback/core.py in __call__(self, event_name)
     42                (self.run_valid and not getattr(self, 'training', False)))
     43         res = None
---> 44         if self.run and _run: res = getattr(self, event_name, noop)()
     45         if event_name=='after_fit': self.run=True #Reset self.run to True at each end of fit
     46         return res

/opt/conda/lib/python3.7/site-packages/fastai2/callback/progress.py in before_epoch(self)
     21
     22     def before_epoch(self):
---> 23         if getattr(self, 'mbar', False): self.mbar.update(self.epoch)
     24
     25     def before_train(self): self._launch_pbar()

/opt/conda/lib/python3.7/site-packages/fastprogress/fastprogress.py in update(self, val)
     92             yield o
     93
---> 94     def update(self, val): self.main_bar.update(val)
     95
     96 # Cell

/opt/conda/lib/python3.7/site-packages/fastprogress/fastprogress.py in update(self, val)
     55             self.pred_t,self.last_v,self.wait_for = 0,0,1
     56             self.update_bar(0)
---> 57         elif val <= self.first_its or val >= self.last_v + self.wait_for or val >= self.total:
     58             cur_t = time.time()
     59             avg_t = (cur_t - self.start_t) / val

AttributeError: 'NBProgressBar' object has no attribute 'wait_for'

Hello All,

I’m really loving the new FASTAI and FASTBOOK…

However, I have a few questions to help me permanently migrate to the new course:

  1. Is there an equivalent of the config.yml file in the new course? I would like to modify the location of my download folders.

  2. Also, is there a notebook similar to the fastai v1 planet dataset notebook? The new multiclass notebook seems to tackle a different problem from the one I needed it for.

I'm running fastai on Windows 10 with WSL2.

Thanks for your help…

You're the third person (including myself) running into the "'NBProgressBar' object has no attribute 'start_t'" error when calling learn.summary(). Did you find a workaround?

@ai_padawan unfortunately not.
I have had a lot of problems with fast.ai since their 21 August release.

I was previously using v0.0.20 for many weeks without any issues.

I’m having the same issue, just posting to keep updated :slight_smile:

Bianca’s reply was very useful, thanks for that.

Also, it seems they changed the structure of the course, but browsing through the repository I found that if you go to the fastbook GitHub repo, the course is in there with all the annotations.

Hope this helps!

If you call learn.summary() before training the model, does it work? I ran into a similar issue that was kind of inconsistent until I realized that calling summary() on a learner worked until you trained it. I think there is a bug in there; I'm trying to track it down so I can send in a patch.

@suresk have you tried this lately? I believe this was fixed a few weeks back or so (could be wrong :slight_smile: )

I am able to reproduce this in 2.0.11, which was released yesterday. Unless you’re referring to unreleased changes?

Here is a fairly small repro of it:

import fastai
print(fastai.__version__)

from fastai.tabular.all import *
df = pd.DataFrame({'x': [1, 2, 3], 'y': [2, 4, 6], 'z': [2, 8, 18]})

pan = TabularPandas(df, cont_names=['x', 'y'], y_names='z')
dls = pan.dataloaders(bs=2)
learn = tabular_learner(dls)

print(learn.summary())  # works here, before training

learn.fit_one_cycle(2, 1e-3)

print(learn.summary())  # fails here, after training, with the NBProgressBar AttributeError

I think the issue for both tta() and summary() is that they need to occur in a no_mbar context, so that it doesn’t try to update the progress bar (it is started in the begin_train callback which isn’t called for these two operations). I am working on a fix for both here:

It fixes it for me, but I’m having trouble getting all of the tests to run on my machine, trying to work through that now…
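
In the meantime, a possible workaround (just a sketch, assuming your fastai version has the learn.no_bar() context manager) is to call summary() with the progress bar disabled:

# Temporarily remove the progress-bar callback so summary() doesn't try to
# update a bar that was never started.
with learn.no_bar():
    print(learn.summary())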

Hello, I may be kinda late on this. Is the clean folder the stripped version for the course?