PyTorch Cheatsheet

Over the last few weeks I’ve been using PyTorch exclusively for all my deep learning work. I’ve come to enjoy it and I think it’s made me more productive. PyTorch has some great tutorials to get you started, but when you’re ready to get things done, here is a short cheatsheet of the functions I found myself writing over and over. Inside is an Experiment class for monitoring and resuming training runs, along with some functions for saving and plotting loss history and a web server for monitoring training away from your computer.
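
The repo has the full versions, but to give a flavor, here is a stripped-down sketch of the checkpointing idea behind the Experiment class (the names below are illustrative, not necessarily the repo’s exact API):

```python
import torch

class Experiment:
    """Illustrative sketch: save and resume a training run from a checkpoint."""

    def __init__(self, model, optimizer, path='checkpoint.pth'):
        self.model = model
        self.optimizer = optimizer
        self.path = path
        self.epoch = 0
        self.loss_history = []

    def save(self):
        # Everything needed to pick up training where it left off.
        torch.save({
            'epoch': self.epoch,
            'model_state': self.model.state_dict(),
            'optimizer_state': self.optimizer.state_dict(),
            'loss_history': self.loss_history,
        }, self.path)

    def resume(self):
        checkpoint = torch.load(self.path)
        self.model.load_state_dict(checkpoint['model_state'])
        self.optimizer.load_state_dict(checkpoint['optimizer_state'])
        self.epoch = checkpoint['epoch']
        self.loss_history = checkpoint['loss_history']
```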

Eventually, I’d like to expand the repo into a “PyTorch Starter Template” with boilerplate for quickly launching projects with a variety of architectures and setups. Please let me know if you’re interested in collaborating; I would love to work with others on this. :)

Okay, to kickstart a flame war… TensorFlow and Keras are great frameworks, but PyTorch feels younger, scrappier, and more fun. If I were to guess, I’d say it overtakes vanilla TensorFlow in a few years, if not Keras as well. It’s like Java vs Python.


Count me in. I have not played much with PyTorch, but in my short effort implementing LSH for mean-shift clustering, it seemed very intuitive. I am running some experiments, so it may be a few days before I can touch PyTorch, though.

Keras, PyTorch, TensorFlow – whatever the environment might be, it might be better to have separate wiki pages of sorts for sharing some well-known workflow “secrets”/tips. Ideally a notebook would be great, but it does not lend itself to wiki-style editing.


The thing that strikes me as awesome about PyTorch is how easy it makes it to use CUDA for general NumPy stuff.
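
For example (a rough sketch, assuming a CUDA-capable GPU):

```python
import torch

# The usual NumPy-style operations, just on CUDA tensors.
a = torch.randn(1000, 1000).cuda()
b = torch.randn(1000, 1000).cuda()

c = a @ b                   # matrix multiply on the GPU
row_means = c.mean(dim=1)   # reductions work the same as on CPU tensors
back_on_cpu = row_means.cpu().numpy()  # copy back and convert when needed
```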

Messing around with calculating spatial gradients in PyTorch gave me some ideas for other CV stuff that would be cool to have GPU-accelerated.
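
As a sketch of the kind of thing I mean (Sobel-style gradients on the GPU; illustrative only, not exactly what I was running):

```python
import torch
import torch.nn.functional as F

# A batch of grayscale images, shape (N, 1, H, W), stands in for real data.
img = torch.rand(1, 1, 256, 256).cuda()

sobel_x = torch.tensor([[-1., 0., 1.],
                        [-2., 0., 2.],
                        [-1., 0., 1.]]).view(1, 1, 3, 3).cuda()
sobel_y = sobel_x.transpose(2, 3)

gx = F.conv2d(img, sobel_x, padding=1)   # horizontal gradient
gy = F.conv2d(img, sobel_y, padding=1)   # vertical gradient
magnitude = (gx ** 2 + gy ** 2).sqrt()   # gradient magnitude, still on the GPU
```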

I also think it wouldn’t be too hard to add PyTorch as a Keras backend; most of the required functions would be easy enough.

Keras should allow a backend to be imported as a separate module so it doesn’t all have to be maintained in the same repo. Then if a few cowboys decide to make a PyTorch backend even though the library is still pretty raw, we would be able to use it with Keras without giving FChollet too many new headaches.


I like PyTorch; it is more intuitive than Keras, but it seems to be more of a research framework and not that useful for production.
With Keras, after building a model you can access the TF computation graph and then work in TF from then on, and also use TensorFlow Serving to serve the models trained with TF.
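
Roughly what I mean, with the TensorFlow backend (a minimal sketch):

```python
from keras.models import Sequential
from keras.layers import Dense
import keras.backend as K

# Build a Keras model, then drop down to the underlying TF session and graph.
model = Sequential([Dense(10, activation='relu', input_shape=(20,)),
                    Dense(1)])

sess = K.get_session()   # the tf.Session Keras is using
graph = sess.graph       # the TF computation graph containing the model

# model.input and model.output are ordinary TF tensors here, so they can be
# fed/fetched with plain TensorFlow or exported for TensorFlow Serving.
```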


Thank you for this cheatsheet @brendan. I started going through the PyTorch tutorials a couple of days ago and found the NumPy-on-GPUs thing fascinating. To start off with a mini-project, I have been working on converting the lesson-4 CF model from Keras to PyTorch. I got stuck and I’m still trying to figure my way out. Let me know if you are up for collaborating. I shall share my notebook. Cheers!
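
For context, the heart of what I’m trying to reproduce is the embedding dot-product model; my rough sketch of the PyTorch side looks something like this (names are my own):

```python
import torch
import torch.nn as nn

class EmbeddingDot(nn.Module):
    """Rough PyTorch take on the lesson-4 embedding dot-product CF model."""

    def __init__(self, n_users, n_items, n_factors=50):
        super().__init__()
        self.u = nn.Embedding(n_users, n_factors)
        self.i = nn.Embedding(n_items, n_factors)

    def forward(self, users, items):
        # Look up latent factors and take the per-row dot product as the rating.
        return (self.u(users) * self.i(items)).sum(dim=1)
```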
