Have we reached peak Python and peak PyTorch?

Recent enhancements have turned TensorFlow into a rapidly expanding ecosystem, and I have trouble imagining how PyTorch can keep pace. I greatly value competitive environments, but what will PyTorch’s answer be to TensorFlow 2.0, TensorFlow Lite (embedded, mobile), TensorFlow.js, and Coral (embedded TPUs)? Have we arrived at peak PyTorch? Has TensorFlow become so compelling that it will steal market share from PyTorch?

Is there a similar situation developing with Python? There are compelling reasons to use TensorFlow.js, because JavaScript has the greatest market share and native browser support. Now fastai has embraced the Swift for TensorFlow initiative. Will TensorFlow.js and other languages (Swift, C++, .NET) reduce Python’s machine-learning market share? Will Python for Excel make a difference?

7 Likes

I’m guessing you haven’t used TensorFlow much. :wink: Honestly, all those things you mention are such a pain to use compared to PyTorch.

I hope that Swift eats into Python’s mind-share, because I like Swift much better than Python :slight_smile:

9 Likes

Then why aren’t we using Swift for PyTorch?

I’m not an expert on PyTorch, but “Swift for PyTorch” is definitely possible: it would mean using the PyTorch ops and kernels instead of the TensorFlow ones.

That said, I don’t see what the advantage of doing so would be. My understanding is that TensorFlow (from the op/kernel level down) has a much larger ecosystem than PyTorch does, and all of the usability issues are at the level that S4TF is tackling. There are also some exciting things coming that are going to redefine how ops work in TensorFlow, and they dovetail very nicely with S4TF.
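To make “the level that S4TF is tackling” a bit more concrete: differentiation lives in the language and toolchain themselves, so you can take an ordinary Swift function and ask for its gradient. A minimal sketch, assuming a Swift for TensorFlow toolchain (the exact spelling of `gradient(at:in:)` has varied a little between releases):

```swift
import TensorFlow  // a Swift for TensorFlow toolchain; autodiff is built into the compiler

// An ordinary Swift function, marked as differentiable.
@differentiable
func f(_ x: Double) -> Double {
    return x * x + 3.0 * x
}

// Ask the compiler for the derivative: f'(x) = 2x + 3, so f'(2) = 7.
let df = gradient(at: 2.0, in: f)
print(df)  // 7.0
```

No tape, no session, no graph-building API: the compiler generates the derivative.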

-Chris

12 Likes

Hi Chris. Nice job on Swift. I too wrote a Mac language. I implemented the first C compiler for the Mac back in 1984. It was called Softworks C. FWIW, it was difficult to convince Apple of the importance of C because they were married to Pascal.

Are you and Jeremy thinking that all the good things about PyTorch will be even better with S4TF? Otherwise, I’m not sure how to reconcile Jeremy’s comment above.

For me, a couple of things are missing from the S4TF conversation. I’ve read Jeremy’s blog posts about Swift, but I’d like to see a white paper that clearly and compellingly details why S4TF is the way forward. I’m concerned that in a couple of years, when S4TF becomes a genuine option, people will revisit the same question, but in the context of newer tools. I’m particularly concerned about robust official cross-platform support, and in particular about the 800-lb gorilla that’s being ignored: what about official Windows support?

4 Likes

Because S4TF seems likely to eventually target MLIR, rather than the current TF runtime, which should give us something much better than either PyTorch or TF! :smiley:

5 Likes

Whilst they’re a little out of date, have you read the justification and design docs?:

Also, have you seen Chris’ slides on MLIR?:

https://drive.google.com/file/d/1hUeAJXcAXwz82RXA5VtO5ZoH8cVQhrOK/view

Although it hasn’t quite been laid out in black and white, a little imagination along with the above docs should give you a sense of where this is (hopefully!) going to end up.

2 Likes

Very cool! I just discovered you wrote a BASIC implementation as well… :slight_smile:

https://www.atarimagazines.com/startv1n3/Shoestring.html

2 Likes

Great work on Softworks, @bsalita.

Windows support exists and is making progress, but it’s still fairly early and I haven’t tried it myself. This would be a great way to get involved if you’re interested.

-Chris

Nice find. I wrote APL, Fortran, and VB6 compilers, and other dev tools too. I’m now reinventing myself as an ML wannabe. Big thanks to Jeremy for that.

6 Likes

I wasn’t aware of Chris’ slides or MLIR. I’m going through the info now.

1 Like

Jeremy,

A jumped-up Unix admin language (Guido van Rossum’s Python) has taken over the fastest-moving area of CS research. Surely that must mean something? OK, so it’s not Perl, at least :slight_smile:

Methinks better hacking and prototyping tools are going to force change faster than back-end compiler rewrites, or going from C to C++ or C**2. Machine learning needs its HyperCard, not a better C compiler :slight_smile:

BTW, Robert somehow tricked me into helping with the Paris study group. Beware, he may have powers of hypnotic suggestion :slight_smile:

Edmund

1 Like

You may not be aware that you’re talking to the person who chaired the Perl 6 working group for numeric programming… :wink: https://perl6.org/archive/rfc/202.pod

15 Likes

Oops. Shots fired

2 Likes

Swift for TensorFlow looks pretty fun to me: Swift for TensorFlow - TFiwS (TensorFlow Dev Summit 2018). And reading on Chris’ homepage about how Bret Victor’s work inspired the Swift team (at least the Playgrounds part) gives me hope :smiley:

Python is a lot of fun, too, don’t get me wrong. But why waste all those precious CPU-, GPU- and TPU-cycles?

Why not both?! Better hacking and prototyping tools, and at the same time a community of passionate language developers who take a holistic view of computation and optimise the entire chain, from your model all the way down to the hardware? Everybody keeps hacking away, and one day we might get out of the current “local optimum”, as Noah Goodman puts it so nicely in the panel discussion with Jeremy and Chris.

BTW: This is Chris’ talk at the 2019 TF Dev Summit: Swift for TensorFlow: The Next-Generation Machine Learning Framework (TF Dev Summit '19).
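To give a flavour of why it looks fun: a small model in S4TF reads like ordinary Swift. This is only a sketch based on the `Layer`/`Dense` API of the 0.x releases (exact method names shifted a bit between versions):

```swift
import TensorFlow

// A two-layer classifier for flattened 28x28 images.
struct Classifier: Layer {
    var hidden = Dense<Float>(inputSize: 784, outputSize: 128, activation: relu)
    var output = Dense<Float>(inputSize: 128, outputSize: 10)

    @differentiable
    func callAsFunction(_ input: Tensor<Float>) -> Tensor<Float> {
        return output(hidden(input))
    }
}

let model = Classifier()
let logits = model(Tensor<Float>(randomNormal: [1, 784]))  // forward pass on a dummy batch
print(logits.shape)  // [1, 10]
```

Plain structs and plain functions, with the `@differentiable` attribute doing the autodiff work.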

4 Likes

Any harebrainers want to travel to Brussels, April 8-9, to hear Tatiana and Chris give the 2019 European LLVM Developers Meeting keynote on MLIR?

“We will illustrate in this talk how MLIR can be used to build an optimizing compiler infrastructure for deep learning applications”
https://llvm.org/devmtg/2019-04/talks.html#Keynote_1

4 Likes

I’ll be there. :slight_smile: We are also giving an hour-long tutorial/demo. Both will be targeted at the compiler-developer audience.

4 Likes

I would say that the deep learning world has been changing quite fast these last few years. I started a couple of years ago, using Keras with a Theano (!) backend for my projects. Then I abandoned Keras/TensorFlow in favor of PyTorch, with its very “pythonic” nature. And now it seems like time to come back home again :smile:

The greatest thing is that if you know some generic concepts and a bit of the “low-level” machinery, you can switch from one tool to another, or use them in parallel for different tasks, just like you would with programming languages in general.
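As a rough illustration of how the concepts carry over, here is what one training step looks like in Swift for TensorFlow, with approximate PyTorch equivalents in the comments. This is only a sketch against the 0.x-era API (details like `optimizer.update(_:along:)` changed slightly between releases):

```swift
import TensorFlow

var model = Dense<Float>(inputSize: 784, outputSize: 10)  // ~ model = nn.Linear(784, 10)
let optimizer = SGD(for: model, learningRate: 0.01)       // ~ opt = optim.SGD(model.parameters(), lr=0.01)

let x = Tensor<Float>(randomNormal: [64, 784])            // stand-in input batch
let y = Tensor<Int32>(zeros: [64])                        // stand-in labels

// Forward pass and loss, with gradients w.r.t. the model parameters.
// ~ loss = F.cross_entropy(model(x), y); loss.backward()
let (loss, grads) = valueWithGradient(at: model) { model -> Tensor<Float> in
    softmaxCrossEntropy(logits: model(x), labels: y)
}

optimizer.update(&model, along: grads)                    // ~ opt.step()
print(loss)
```

Different spelling, same forward/loss/gradient/update loop.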

5 Likes

I watched the video on Swift for TensorFlow, which is very interesting. And one does see why Swift might (or rather will) make a better Python for assembling batches of API calls, which, let’s face it, is what Python programmers here do these days: calling fastai, or TF, or NumPy.

But in the end, I think Alan Kay’s Smalltalk environment of 40 years ago (revived as Squeak), with its integrated code browser and its change-and-inspect-everything-on-the-fly workflow, or the Lisp Machines, was a better answer to a thinking dev’s prototyping requirements, even though that fragmented, browse-the-code view scales badly to industrial practice.

To me, Swift for TF looks like an answer to the industrial requirements of Google, to the same degree that C++ was an answer to those of AT&T in its day. It may (it will) take over the world. And when it does, I’ll still hate it every time I have to use it :slight_smile:

Edmund

3 Likes

I’m not sure that’s true. The PyTorch kernel seems much more cleanly designed, and as a result it generally stays a bit ahead of TensorFlow in terms of capabilities. Whilst from the outside TF appears to have a bigger ecosystem, once you actually scratch the surface, trying to use much of that ecosystem turns out to be impractical and/or clunky.

Perhaps if you’re inside Google things are different. But in the world outside Google, everything from the poor installation experience, to the lack of true programmability of TPUs, to the weak design, implementation, and ecosystem of XLA, to the frequent announcements of TF projects at the dev summits followed by failed delivery, and especially the massive amount of code repetition and tech debt inside the TF code itself, suggests that TF comes with a lot of baggage.

I expect the Swift for TensorFlow team to succeed and create something better than either PyTorch or Python TF. But if they do, it will be despite TensorFlow, not because of it. E.g. MLIR, which @clattner links to above, will (hopefully) allow S4TF to largely extricate itself from the TF baggage and forge its own path.

OTOH, PyTorch has to deal with the baggage of Python, which was not designed with today’s parallel processors or data science in mind, and as a result the PyTorch devs have to do heroic things, like building a JIT that works around some of these foundational problems. I think in the long term this won’t be sustainable.

So whilst I think S4TF can, over time, work around and/or bypass the TF baggage, I’m not convinced PyTorch can do the same with the Python baggage. For the next couple of years at least, I’d expect PyTorch to remain the first choice for data scientists who just want to get stuff done, but I don’t think that will remain the case in the long term.

34 Likes