- Download from http://ctags.sourceforge.net/
- Unzip it into any folder
- Add that folder to your PATH environment variable
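As a concrete sketch of the last step (assuming a Linux/macOS shell and that ctags was unzipped to ~/tools/ctags; on Windows, edit PATH via System Properties instead):

```shell
# Append the hypothetical ctags folder to PATH so the executable is found.
export PATH="$PATH:$HOME/tools/ctags"
# Verify the folder is now on PATH.
echo "$PATH" | tr ':' '\n' | grep "tools/ctags"
```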
Should we change from ['bbox'] to ['bbox_new'] as below? Otherwise, the bbox_str won’t capture the new orientations.
largest_bbox['bbox_str'] = largest_bbox['bbox_new'].apply(lambda x: ' '.join(str(y) for y in x))
Oh yes. You’re right. I’ll update the gist in a while.
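For anyone following along, here is a standalone sketch of that fix (the dataframe below is a made-up miniature of the gist's, with assumed column names and box values):

```python
import pandas as pd

# Hypothetical miniature of the gist's dataframe: each row holds a
# bounding box as a list of ints.
largest_bbox = pd.DataFrame({'bbox_new': [[96, 155, 269, 350], [77, 89, 335, 402]]})

# Join each box's coordinates into a space-separated string.
largest_bbox['bbox_str'] = largest_bbox['bbox_new'].apply(lambda x: ' '.join(str(y) for y in x))
print(largest_bbox['bbox_str'].tolist())  # ['96 155 269 350', '77 89 335 402']
```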
The argument to the open_image function has to be cast to a string, so use it like this: open_image(str(IMG_PATH/trn_fns[i]))
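A quick illustration of the cast (the path and filename below are assumptions standing in for the notebook's IMG_PATH and trn_fns):

```python
from pathlib import Path

# Hypothetical stand-ins for the notebook's variables.
IMG_PATH = Path('data/pascal/JPEGImages')
trn_fns = ['000012.jpg']

# open_image (fastai 0.7) expects a plain string, so cast the Path first.
p = str(IMG_PATH / trn_fns[0])
print(p)
```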
I have an error in F.softmax(predict_batch(learn.model, x), -1):
NameError: name 'predict_batch' is not defined
I have searched for predict_batch in the whole notebook but it isn’t there. Maybe it was used in past lectures. Please help me.
Thanks in advance.
Just found this really great intro to matplotlib, focusing on the OO API: https://realpython.com/python-matplotlib-guide/
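For anyone who hasn't met the OO API yet, a minimal sketch of the style that guide advocates: create Figure and Axes objects explicitly instead of going through the pyplot state machine (the data here is made up):

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend so this runs headless
import matplotlib.pyplot as plt

# Object-oriented style: methods on the Axes, not module-level pyplot calls.
fig, ax = plt.subplots()
ax.plot([0, 1, 2], [0, 1, 4])
ax.set_xlabel('x')
ax.set_title('OO API example')
fig.savefig('example.png')
```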
Same issue here. I’ve installed ctags but cannot find symbols.
Shift+Command+F works for me as well.
Hi, after installing ctags please add
to your user settings.json file in Visual Studio Code. The symbols now work for me.
These predictions come from a fully trained model as in the notebook:
I just wanted to say: wow. This is so impressive that a neural net can do this! And that is building on top of resnet that was designed to do something quite different.
This is amazing
What you can do to come up with the number 25088 is to remove the nn.Linear() part and simply check the size of the flattened final layer.
head_reg4 = nn.Sequential(Flatten())
learn = ConvLearner.pretrained(f_model, md, custom_head=head_reg4)
learn.opt_fn = optim.Adam
learn.crit = nn.L1Loss()
learn.summary()
It will show you
('Flatten-123', OrderedDict([('input_shape', [-1, 512, 7, 7]), ('output_shape', [-1, 25088]), ('nb_params', 0)]))])
at the end of the output.
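In other words, the conv body ends with a 512×7×7 activation, and 512 * 7 * 7 = 25088. A quick plain-PyTorch sketch of the same check (the shape assumes resnet34 with 224×224 inputs):

```python
import torch
import torch.nn as nn

# Stand-in for the final conv activation: [batch, 512, 7, 7].
x = torch.zeros(1, 512, 7, 7)
flat = nn.Flatten()(x)
print(flat.shape)  # torch.Size([1, 25088])
assert flat.shape[1] == 512 * 7 * 7  # 25088
```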
I found a few things.
Moving in Atom – handy navigation manual.
atom-ide-ui. Among other things, it lets you find all references to a function (either this or ide-python gives you hover documentation like VSC). It’s sort of a base package for IDE functionality.
ide-python builds atop that and allows you to search symbols / function declarations in the current project, not just the current file. However, it requires the Python language server to work, which is maintained by Palantir – so I don’t know how shady/safe that is. It also lets you hover over functions/classes for documentation, even for out-of-project imports, and lets you CMD-click on a function to go straight to its declaration, even out-of-project. I haven’t seen this work in all cases (it worked for scipy.ndimage imports but not sklearn.metrics), but I’ve been able to CMD-click directly into the NumPy source code with this.
atom-ctags enables the built-in Atom search features. It builds a ctags file of recognized symbols per project (I think VSC does this automatically behind the scenes), and lets you use CMD-Shift-R (Mac) to search symbols in a project (the “opim” search Jeremy did).
The functionality does come at a price. On my MacBook, enabling atom-ide-ui adds a solid half-second to Atom’s start time. Enabling it together with ide-python makes that almost a full second, and it feels longer.
Having played with it a bit, I think if I want to keep Atom’s speed & minimalism, I’d stick with atom-ctags to search symbols or go to definitions (CMD-Shift-Down, and CMD-Shift-Up to come back) – although it doesn’t always work: I’m not sure when symbols/ctags do or don’t get generated.
I may check out VSC for Mac (or just Visual Studio?) if I find I need the functionality, but that’s what I’ve found so far.
I was wondering why we use the predict_batch function for making predictions in the case of the largest item classifier but don’t use it in other cases. As far as I can see, this code:
x,y = next(iter(md.val_dl))
predict_batch(learn.model, x)
and this code:
x,y = next(iter(md.val_dl))
learn.model(VV(x))
give the same output. Why do we use “predict_batch” then?
Just because it ensures that eval and reset are called first. I don’t use it in some of the lessons since I want to teach how to do it manually.
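For reference, a minimal sketch of what predict_batch boils down to (this is only the gist, not fastai 0.7's exact code): switch to eval mode, reset any recurrent state if the model has it, then run the forward pass without tracking gradients.

```python
import torch

def predict_batch(m, x):
    m.eval()                       # disable dropout / use batchnorm running stats
    if hasattr(m, 'reset'):
        m.reset()                  # clear RNN hidden state, if any
    with torch.no_grad():
        return m(x)
```

Calling learn.model(VV(x)) directly skips the eval()/reset() step, which is why the two can differ when dropout, batchnorm, or RNN state is in play.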
All the videos are “Unlisted” on YouTube. Now that part 2 has officially launched, is this intended?
The links in the time line are broken.
Many thanks - fixed now.