Lesson 14 (2019) discussion and wiki

Thanks a lot Jeremy and Rachel. And Chris Lattner for the amazing two weeks.

4 Likes

Sure… :grinning:

Great to be part of this amazing community. So much to learn. S4TF will surely conquer the depths of practical deep learning. However, “take one particular thing or project and make it fantastic” is what really keeps me going.

4 Likes

Thanks for a great course, wow! So much material in so little time. Thanks Rachel, Jeremy and the team, what a great way to spend my evenings (and my future evenings).

1 Like

Big thanks to Rachel, Sylvain and Jeremy from fast.ai, and to Chris and the team from S4TF, for this class. Amazing to be part of this!

1 Like

Well, that takes up the rest of 2019 and a good part of 2020. Very entertaining. It’s obvious that a lot of effort by a lot of people has gone into this, including some chance meetings by some very focused individuals.

A great many thanks to all

2 Likes

Thanks Jeremy, Rachel, Chris and fastai + S4TF teams for this great Part 2 v3! Looking forward to the future with Swift TF.

1 Like

Thank you Jeremy, Chris & Rachel, for this amazing course and a very exciting last two weeks. Looking forward to contributing to S4TF.

1 Like

“You can use his camera while looking through your Topology eye wear glasses.” :rofl:

1 Like

Since no one has said this so far, let me:
Jeremy-months need to be a thing. Henceforth all effort should be measured in them.

Which makes me realise that when Jeremy says “I pack enough material to keep you busy for a year” I need to allocate the next 6 yrs for it! :grimacing:

Thanks Chris for the reality check!

4 Likes

The forum probably doesn’t allow emotions, but allow me:

I feel both sad and heavy now that the course is over!

5 Likes

Well, the course is of course over, but the good thing is that we still have one more bonus class. It will be nice to see Jeremy and Rachel teaching again very soon!

3 Likes

In case anybody else also found the “so we simply define a new compose operator >|” part very inspiring and thought “Can we do that in Python, too?”, the short answer is “No”, not without modifying the underlying CPython interpreter (so, point goes to Swift, where anything is hackable :wink: ).

But, as Jeremy has shown before, overloading/redefining the existing special dunder methods of objects is very easy, so repurposing the pipe operator would be quite straightforward (| in Python is bitwise or, implemented by .__or__(self, other), with the reflected version .__ror__(self, other)).
To avoid possible conflicts with existing uses of | in the code, and for readability/clarity, it might be better to use something rarely used like the shift operator (>>, which is .__rshift__(self, other)).
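To make that concrete, here is a minimal Python sketch (my own illustration, not from the lesson notebooks): since we can’t invent a brand-new operator like >| in Python, we wrap functions in a small helper class and overload >> via __rshift__ to mean “apply the left function, then the right one”. The names Composable, double, inc and pipeline are hypothetical, just for the example.

```python
class Composable:
    """Wraps a function so that (f >> g)(x) means g(f(x)), i.e. left-to-right composition."""
    def __init__(self, f):
        self.f = f

    def __call__(self, *args, **kwargs):
        return self.f(*args, **kwargs)

    def __rshift__(self, other):
        # Build a new Composable that applies self first, then other.
        return Composable(lambda *args, **kwargs: other(self.f(*args, **kwargs)))


# Usage: a tiny pipeline, read left to right.
double = Composable(lambda x: x * 2)
inc = Composable(lambda x: x + 1)

pipeline = double >> inc   # "double, then inc"
print(pipeline(10))        # prints 21
```

Overloading __or__ on the same wrapper would work just as well; >> simply avoids clashing with code that already uses | for bitwise or set operations.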

Ideas from here:

3 Likes

I don’t watch TV or play computer games or get lost in social media so I maximize the time I spend on things I care about. So over time I’ve gotten faster at doing stuff, since I’ve been practicing and learning lots.

12 Likes

That’s an excellent thread, thanks for sharing!

That said, I do want to point out that each of us is in a different situation and faces different challenges. When I say I’ll need 6 years, I’m factoring all that in! Earlier I would compare and feel overwhelmed or disheartened. But nowadays I go read @radek’s tweet thread and remind myself to “keep walking”. :slight_smile:

5 Likes

Well, most normal human beings (the non-Jeremy-like ones :slight_smile: ) will go through these feelings. I can personally relate to them, as I was so overwhelmed after Part 2 last year that I gave up midway through the course. I forgot that I only need to be a little better today than I was yesterday; I was probably trying to keep up with the pace of the course without being ready for it. But I persisted with the course this year, both Part 1 and Part 2. I also got involved in communities like TWiML AI, which helped a lot. Long story short, I am much more comfortable with the course and the library than I was a year ago, and now I feel challenged but not overwhelmed. Hopefully a year from now I will be much more at ease with Swift for TensorFlow as well.

9 Likes

In notebook 02c_autodiff, Step 6: should valueWithDeriv be valueWithPullback, as in https://github.com/tensorflow/swift/blob/master/docs/DifferentiableFunctions.md?

This should be easy to check in an IDE, where you can click on the respective function to jump to its source. In a Jupyter notebook, something like ??functionname may work, though I haven’t tried it.

Here’s Chris Lattner talking to Lex Fridman. S4TF is mentioned from 51:36, especially 54:17 to 57:40: TF as a compiler, with Swift as a front-end language to which we can add language features.

Video : https://www.youtube.com/watch?v=yCd3CzGSte8
Audio : https://lexfridman.com/chris-lattner

6 Likes

I think Chris mentioned it in the last lesson, not the first Swift class:

MLIR is like TensorFlow’s XLA: it tackles graph-level optimizations in TF and extends beyond XLA’s focus on dense linear algebra. Currently we’re enjoying S4TF with XLA, but the MLIR project is going to make it even better.

It has been open-sourced, and you can join SIG MLIR, which holds design meetings with the community:

https://groups.google.com/a/tensorflow.org/forum/#!topic/mlir/vj34VmkyTic