Ask Jeremy & Chris anything about Swift for TensorFlow

As S4TF evolves, what do you think will be the first kind of machine learning work, accessible to people who don’t have access to big corporate data centers, where S4TF’s particular strengths will make it a better choice than the more traditional Python-based frameworks?


One great resource is the online Swift Language book. I’d recommend starting with the Swift tour to get a brief overview of lots of different things.



What will S4TF + MLIR/XLA mean for the deployability of models? For example, will it make it easier to train a model on e.g. a large TPU cluster, then produce inference-focused saved models for desktop, mobile & embedded applications with fewer steps?


Do you think Swift will make deep learning debugging easier?


All of the lectures (as I recall) were streamed using Windows. Is there a good story for Swift on Windows or does Jeremy plan to use an alternative approach for these lessons?

I have no stake in this (I use Ubuntu) but this will probably impact a good portion of the students.

All lectures are streamed from a Windows computer, but I believe Jeremy uses a Jupyter notebook running on an Ubuntu system. So I think Ubuntu is going to be the main OS for most things, including Swift.


What’s the fuss about Swift’s compiler infrastructure and ‘Multi-Level Intermediate Representation’ (MLIR)? How is this different from known compilers/interpreters, and where is the big advantage for deep learning?

Some ways I tried self answering this:

  • speed: as fast as C, but Swift is much nicer to use
  • debugging: in Python we often call C libraries, so we do not know what’s going on; in Swift we can debug all the way down the stack?
  • it’s not really about the infrastructure: rather, Swift is well designed and has static types (better than Python)
  • …?
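On the types point, here is a minimal sketch of my own (the struct and function are hypothetical, not from any S4TF API) of how Swift’s static types surface mistakes at compile time that Python would only discover at runtime:

```swift
// A hypothetical 2-D tensor type: the shape lives in the type's fields.
struct Tensor2D {
    var rows: Int
    var cols: Int
    var data: [Float]
}

// Returns the result shape of a matrix multiply, or nil on mismatch.
func matmulShape(_ a: Tensor2D, _ b: Tensor2D) -> (rows: Int, cols: Int)? {
    // The guard makes the shape check explicit; the types already
    // guarantee nobody can pass, say, a String where a tensor goes.
    guard a.cols == b.rows else { return nil }
    return (a.rows, b.cols)
}

let a = Tensor2D(rows: 2, cols: 3, data: Array(repeating: 0, count: 6))
let b = Tensor2D(rows: 3, cols: 4, data: Array(repeating: 0, count: 12))
print(matmulShape(a, b) as Any)   // prints the resulting shape
// matmulShape(a, "oops")         // rejected by the compiler, not a runtime crash
```

The runtime `guard` still handles genuinely dynamic mismatches, but the whole class of “passed the wrong kind of thing” bugs is gone before the program ever runs.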

Check this thread: Best book, podcast, tutorial a.o. for swift architecture & design


For someone who is just getting comfortable with the landscape of deep learning and coding in Python + PyTorch/fastai (also Keras/TensorFlow, for that matter), do you think jumping into Swift now would be a good move? Or should we first get comfortable with PyTorch/Python/fastai to the point where we can be good practitioners/coders in this stack, and only then think of moving to Swift?
I don’t want to end up learning bits and pieces of everything while not being comfortable (or fluent) in anything.


Is Jeremy going to follow established Swift coding and naming conventions? :stuck_out_tongue:

I’m saying this a bit tongue-in-cheek, but I do worry that a lot of people will be exposed to Swift for the first time through Jeremy’s code examples, and in my opinion, they’re not representative of how most Swift code is currently written. (For example, almost none of Jeremy’s published Swift code uses named parameters, i.e. argument labels.)

There are some efforts in the Swift community to come up with an “official” style guide, but in the meantime here is one that is commonly used, in case anyone is interested in one opinion of how to write readable Swift code:
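To make the argument-label point concrete, here is a small sketch (the function and its parameters are made up for illustration) of the convention the Swift API Design Guidelines encourage:

```swift
// Idiomatic Swift labels arguments so the call site reads like a sentence.
func resize(image: [Float], toWidth width: Int, height: Int) -> [Float] {
    // (implementation elided; we just echo the input here)
    return image
}

let pixels: [Float] = [0.1, 0.2, 0.3]
let small = resize(image: pixels, toWidth: 224, height: 224)

// The style the post worries about suppresses every label with `_`:
//   func resize(_ image: [Float], _ w: Int, _ h: Int) -> [Float]
//   let small = resize(pixels, 224, 224)   // valid, but harder to read
```

Both compile fine; the difference is purely about how self-documenting call sites are.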


In my view, this is not an issue. From the tutorials I’ve seen, Swift will be easier to learn and read, e.g. thanks to its types.

Furthermore, the Windows environment does not cope too well with LLVM.

Web based dev environments like Jupyter and Colab work great, and native Windows support is early but well underway.

If you are interested, I recommend following the Windows tag.


I am hoping Swift could solve the following problems that I see in deep learning in practice:

  1. One language for experimentation and deployment. Today, experimentation happens mostly in Python while deployment is in Python/Java/C++, which creates unnecessary handoffs between teams and makes the train → deploy → retrain cycle harder.
  2. A language designed for production use and for collaboration/long-term maintenance. Many problems come with using Python as a primary language: multithreading limitations, the costs of a dynamically typed language, etc.
  3. Hoping that LLVM being a first-class citizen in the Swift world will mean moving towards a dream world of better interoperability across hardware.
  4. Since understanding the stack all the way down to the hardware-acceleration layer is valuable in deep learning, one language for the entire stack could enable people to understand and modify things at a deeper level.

Would love to hear your thoughts on the above. Also, would love it if you could give a primer on LLVM and MLIR.

Will Swift for TensorFlow enable the practical use of DL on accelerators besides Nvidia GPUs and Google TPUs? Nvidia’s monopoly seems to be a huge barrier to the exponential growth of DL. Their latest 2080 Ti is 2x faster than the 1080 Ti, but also 2x as expensive.


Will Swift for TensorFlow enable dynamic optimization of the allocation of GPU cores and GPU memory to advance hardware parallelization with GPUs? Will it improve on CUDA, or will it just use CUDA?


Will Swift for TensorFlow simplify, or improve the performance of, training with multiple GPUs in the training loop?

I thought about that, too. Perhaps once MLIR is out it will be easier to create “backends” for AMD Vega and other cards.


The fastai Python libraries are well documented and have great support through the community, forums, courses, etc. It is very easy to provision virtual resources to learn deep learning and apply it to a problem of interest.

How does the setup of Swift for TensorFlow compare to the fastai setup? What hosting options are available?

Fastai can only perform well on machines with Nvidia GPUs. Does S4TF have the same limitation? Others have asked about Windows compatibility; what about macOS?

Why not use Swift on top of PyTorch? Why use TensorFlow?