New PyTorch tutorial draft - feedback welcome

I completed part 1 of fastai, reviewed PyTorch’s official tutorials, and decided to take a run at Kaggle competitions before moving on to part 2. For just one competition I built lots of models with many combinations of hyper-parameters and datasets, saved these configurations in a directory, and applied a versioning scheme. Reviewing, tracking, and comparing them is challenging, so learning to manage these configurations better seems like a good way to build intuition.

Sounds like an interesting topic for a post, but out of scope for this intro.

Great stuff, will go over it in a few iterations and check the PyTorch tutorials another time.

One thing that would add even more value for me: a hint explaining what the ‘dynamic graph’ that PyTorch favors for the model’s computation graph actually is. Given the headline ‘What is torch.nn really?’, it could be good to briefly explain what seems to be an important difference from TensorFlow’s static computation graph.

I’m not sure; this is something one could even explain in a small section with code. If there are practical implications to how one can work with a dynamic vs. a static graph deep learning library, it would also be great to have e.g. a link on that.
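For instance, a minimal sketch of what “define-by-run” means in practice (this is just my own illustration, assuming PyTorch is installed; `forward` and `n_steps` are names I made up for the example):

```python
import torch

w = torch.randn(3, 3, requires_grad=True)

def forward(x, n_steps):
    # An ordinary Python loop: PyTorch records each operation as it runs,
    # so the graph built this call contains n_steps matmul/relu steps and
    # could contain a different number on the next call.
    for _ in range(n_steps):
        x = torch.relu(x @ w)
    return x.sum()

loss = forward(torch.randn(2, 3), n_steps=2)
loss.backward()          # gradients flow through the graph that was just built
print(w.grad.shape)      # torch.Size([3, 3])
```

In a static-graph framework, the loop would have to be expressed as part of the graph itself before running it; here it is just Python control flow.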

I agree that would be helpful, but it’s not actually directly related to torch.nn, but to PyTorch more generally. So that would be a topic for a whole different tutorial… :slight_smile:


I went through part of the tutorial a couple of weeks ago and thought it was really solid. I was using it to get back up to speed with PyTorch.

To answer my own question (the link might be useful for others): https://www.youtube.com/watch?feature=youtu.be&utm_campaign=NLP+News&utm_medium=email&utm_source=Revue+newsletter&v=WTNH0tcscqo