There is not. There should be one in March 2020.
Sounds good. Thanks for the clarification!
@jeremy/@sgugger mentioned in the blog/course/forum that everything in v2 will be in a notebook, and every feature or pull request will have a notebook to demonstrate and test it.
So this is great in terms of showing how things should work.
But the knowledge about common mistakes, anything from passing the wrong data type to a function to high-level bad practices, usually lives in forum posts and not in the code itself. (This is not specific to fastai; it's true of most projects.)
Is there a plan to incorporate "extremely verbose exceptions", ones that don't just say "invalid data type" but actually suggest what the right one probably is, and give references and an explanation of the situation?
Whenever I use a library, get an exception, and then find such an explanation inside it, it's a huge time saver and relief. On the other hand, getting a super cryptic exception from a library, with equally cryptic documentation, just to search and find a StackOverflow question with a million participants because everyone gets this error, is quite frustrating. Pretty sure there are such posts on the forums here as well. So let's reduce frustration.
Thanks
I think that L should be a library on its own. I would like to use it elsewhere but not have to install the whole of fastai as a dependency. Are you planning on making it its own pip package?
Just a cross-reference: a similar issue has been addressed in Fastai v2 code walk-thru 5
WSL 2 worked fine for me - I loaded Ubuntu 18.04 LTS from the Windows store and installed Anaconda - then the fastai environment could be built. Jupyter Notebooks can then be used from the Windows browser. I did see an issue with the CUDA calls, but that's expected, as WSL 2 has no GPU access.
So first of all, I just want to say that, after watching the last code walkthrough, the use of Python metaclass attributes is brilliant! I had no idea Python was this flexible!
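For readers who haven't watched the walkthrough, here is a minimal sketch of the kind of metaclass trick being praised (similar in spirit to fastai v2's PrePostInitMeta, but not its actual implementation): a metaclass whose `__call__` wraps `__init__` with optional pre/post hooks.

```python
# Sketch of a metaclass that runs __pre_init__ and __post_init__
# around __init__ on every instantiation.
class PrePostInitMeta(type):
    def __call__(cls, *args, **kwargs):
        obj = cls.__new__(cls)
        if hasattr(obj, '__pre_init__'):  obj.__pre_init__()
        obj.__init__(*args, **kwargs)
        if hasattr(obj, '__post_init__'): obj.__post_init__()
        return obj

class Tracked(metaclass=PrePostInitMeta):
    def __pre_init__(self):  self.log = ['pre']
    def __init__(self):      self.log.append('init')
    def __post_init__(self): self.log.append('post')

t = Tracked()
print(t.log)  # ['pre', 'init', 'post']
```

This lets subclass authors hook setup and teardown logic without remembering to call `super().__init__()` in the right place.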
Also I would like to ask: do you plan for fastai v2 to support pruning weights or conv filters? I'd be interested in helping to implement pruning techniques.
I haven't done any work on pruning. I'd be happy to consider contributions in that area, although I'd suggest first doing it in a separate repo, and we could see whether closer integration would be useful later.
I noticed that there isn't any learning rate finder yet in v2. I would like to contribute that if no one else is already working on it. Also, is there a general todo list, or should people just start contributing whatever they see missing?
There are a bunch of TODOs. Find them with ack "#TODO".
Here is the current output:
19:21 $ ack "#TODO" *.ipynb
03_data_pipeline.ipynb
597: "#TODO: method examples"
1170: "#TODO: do something here\n",
05_data_core.ipynb
1478: "#TODO: make the above check a proper test"
07_vision_core.ipynb
45: "#TODO: investigate"
248: "#TODO function to resize_max all images in a path (optionally recursively) and save them somewhere (same relative dirs if recursive)"
637: "#TODO explain and/or simplify this\n",
942: "#TODO: Transform on a whole tuple lose types, see if we can simplify that?\n",
1069: "#TODO tests\n",
09_vision_augment.ipynb
786: "#TODO test"
869: "#TODO: test"
19_callback_mixup.ipynb
115: "#TODO: make less ugly\n",
30_text_core.ipynb
885: "#TODO: test + rework\n",
940: "class SentencePieceTokenizer():#TODO: pass the special tokens symbol to sp\n",
_42_tabular_rapids.ipynb
159: "#TODO Categorical\n",
50_data_block.ipynb
243: "#TODO: access vocab\n",
758: " #TODO: dupe code from DataBlock.databunch and DataSource.databunch to refactor\n",
92_notebook_showdoc.ipynb
711: " link = get_source_link(elt) #TODO: use get_source_link when it works\n",
_tabular_fast.ipynb
648: " df[n] = df[n].fillna(df[n].mean()) #TODO: request median\n",
The learning rate finder is defined in notebook 14 with the schedulers.
As for contributions, the most helpful for us now is trying to use the library and point us to bugs/things that are unclear or behave weirdly. The TODO notes are mostly for Jeremy and me and may not be very understandable.
Thanks! I am trying to use the library and didn't find the LRFinder, because it is not in the docs. Do the docs just need a rebuild?
Just so I understand this correctly: Jeremy and you are mainly looking for feedback, but not contributions (which is absolutely fine, just want to make sure I understand correctly)?
It's in the docs, look here.
Yes, we're looking for feedback mainly, although help with the docs or some interesting tests is always welcome. But since the API is still changing as we're developing, it's probably harder to contribute something new. In a few weeks, when v2 starts to be more stable, we will definitely welcome contributions. We're also thinking of having a fastai-contrib repo, like pytorch hub or tf-contrib.
My bad, I was confused about the structure of the docs, and as the search works on the current docs, not the new ones, I didn't find it.
Good to know. I understand there is always a tension between getting people involved and too many cooks spoiling the broth. I'm sure you'll find the right balance in the next weeks.
Hi, is there a way to use @patch_property while wanting to add a setter?
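While waiting for an answer on @patch_property itself, here is a plain-Python workaround (not fastai's API) for the underlying goal: attaching a full property, with both getter and setter, to an existing class after the fact. The Learner stand-in and the lr attribute are hypothetical.

```python
# Plain-Python workaround: assign a property object with both a getter
# and a setter onto an already-defined class.
class Learner: pass  # stand-in for an existing class

def _get_lr(self): return getattr(self, '_lr', 1e-3)
def _set_lr(self, v): self._lr = v

Learner.lr = property(_get_lr, _set_lr)

learn = Learner()
print(learn.lr)  # 0.001 (default from the getter)
learn.lr = 1e-2
print(learn.lr)  # 0.01
```

A decorator like @patch_property presumably wraps something like this; the manual version gives you the setter explicitly.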
Docs say there should be a to_fp16 method on a Learner instance, but for me there is not. (The search function redirected me to v1 docs. I'm not sure v2 actually has to_fp16.) This is the code I am using:
learn = Learner(model, data, metrics=accuracy, loss_func=CrossEntropyLossFlat, opt_func=opt)
learn.to_fp16()
AttributeError: 'Learner' object has no attribute 'to_fp16'
Any idea why that would be? I did a fresh git clone to make sure I have the latest version.
Did you import the module that contains to_fp16?
I imported with from local.callback.all import *, that should include it, right?
Ah, you're looking at the v1 docs. v2 doesn't seem to have it yet. Note that v2 docs search is still searching v1, which is confusing!
Yes, sorry, I should have noticed. So v2 probably doesn't have a to_fp16 convenience function yet, I suppose.