# About the Part 2 (2019) category

Thanks for the invitation.

Super exciting developments in NLP of late. Can’t wait to see what topics Jeremy covers in this round of Part 2. Hopefully some topics on the semantic side of NLP.

3 Likes

Thanks for the invite. Just finishing up part 1 this weekend (only the last lecture left) and have watched both parts from 2018 “back in the day”. Very excited about the overall changes and the quality of the library and seeing part 2 with v1 of the library.

Quick question: will there be a “secret youtube” again? I usually download lectures and rewatch them a couple of times. Do I have to screen capture the live lectures or will they be made available (and if so roughly how long after they “aired”)?

Very excited. I really think you are making state-of-the-art deep learning accessible to the masses, which has very exciting implications.

1 Like

Jeremy usually uploads a nicely edited video on the same day, at most 12 hours after the live stream.

5 Likes

Thanks for the invitation. I contribute to this and other AI forums because I love what I do and I love to share knowledge. Let’s keep making this a community that is very useful to everyone.

1 Like

I must thank you for your invitation. I am not sure how active I can be during the live videos, as for me they come in the early hours of the morning, being 9/8 hours ahead of PDT. We don’t spring forward here until March 31st.
However I am honoured and look forward to this opportunity greatly.

1 Like

Thank you @init_27. It would be great to get your feedback in this post.

1 Like

Done! Thanks for the feedback efforts!

1 Like

Thank you so much for the invite!
I’m still catching up on the first part and doing all the assignments and hope to be in shape for the start of part 2. Thanks again for the inspiring first course, looking forward to the next part.

Will we be learning unsupervised training of CNNs in this part?

Jeremy has given a teaser about TF Summit!

4 Likes

Your invitation email is just the nudge I need to re-commit to upgrading my skill set!

Many thanks for the opportunity, and for all you do to make this area accessible to so many people who wouldn’t otherwise have the opportunity.

Best regards,
Maureen

2 Likes

Screenshot at 10:00 Pacific.

Explanation of the screenshot in Jeremy’s tweet (https://twitter.com/jeremyphoward/status/1103354717206245376): “If you missed it: the 1st announcement was Swift 4 TF will be part of the next course. Tune in to @clattner_llvm 's talk on the livestream at 12.30p for more.”

Video from Jeremy at the #TFDevSummit.

5 Likes

I came here to find out more details on this exact topic.
Are we going to be using Swift + TF 2.0 in part 2? Or is it an entirely different course that is in the works?

1 Like

Here you go:
https://www.fast.ai/2019/03/06/fastai-swift/

5 Likes

Looks like it’s going to be included in part 2, starting soon!

if you want to be part of making this happen, be sure to join the upcoming class, either in person at the University of San Francisco, or in the next part 2 MOOC (coming out June 2019).

1 Like

Thank you Jeremy for the invitation. Feels like an honor.
As I’ll be starting my sabbatical soon, I pledge to contribute more this time.

Oh what a happy day. Thank you so much for including us.

On Swift For TensorFlow and FastAI, I am also interested in what’s brewing…

So let me just chime in a bit…

I learnt Swift in order to switch away from Objective-C (for iOS dev), and I really like the language. I just never guessed it might soon join high-performance numerics and DL development, and some of its features would make even a Ruby programmer happy.

For now, some of the TF API names look a bit long and clumsy. But Swift lets you define your own domain-specific “sub-language”, e.g. you can write this if you want:

    let square = { (x: Int) in Double(x*x) }
    let sum = ∑^{0..<4} ∘ square
    // sum = 0^2 + 1^2 + 2^2 + 3^2


where square is a closure/func, and ∑^ is a prefix operator. And this:

    let π = Double.pi
    let trig_sum = ∑^{0..<100} ∘ { sin($0 * π/4.0) }    // \sum_{i=0}^{99} \sin(iπ/4) in TeX
    trig_sum


This is almost something you can translate into LaTeX and import into your paper!
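For anyone curious how operators like these can be wired up, here is a minimal, self-contained sketch. This is my own guess at one possible implementation, not the author’s gist: the operator definitions are assumptions, and I use a parenthesized `(0..<4)` range rather than the braced form shown above.

```swift
// Hypothetical sketch of a summation DSL using custom Swift operators.
prefix operator ∑^
infix operator ∘: AdditionPrecedence

// ∑^ takes an integer range and returns a "summer": a function that
// sums f(i) over every i in the range.
prefix func ∑^ (range: Range<Int>) -> ((Int) -> Double) -> Double {
    return { f in range.reduce(0.0) { $0 + f($1) } }
}

// ∘ applies a summer to a term function, yielding the summed value.
func ∘ (summer: ((Int) -> Double) -> Double,
        term: (Int) -> Double) -> Double {
    return summer(term)
}

let square = { (x: Int) in Double(x * x) }
let sum = ∑^(0..<4) ∘ square   // 0 + 1 + 4 + 9 = 14.0
print(sum)
```

Since prefix operators bind tighter than infix ones, `∑^(0..<4) ∘ square` parses as `(∑^(0..<4)) ∘ square`, which is what makes the math-like reading work.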

If you love this, I can post more in a gist; it is good motivation to learn Swift. I am looking forward to seeing this kind of syntax used in FastAI.

Update: Here’s the gist. I just realized this may be for an older version of Swift, but I think it should still work. You can try creating a Google Colab notebook for Swift and pasting the code into a single cell. For that, you need to open a pre-existing notebook available on the S4TF Colab page, copy it into your Google Drive, and edit it. If you are a Mac dev, you know what to do.

Update 2: Google Colab for Swift has a bug around custom operator implementations: they only work if you keep all your code within the scope of a single cell. I will try to file a bug for this.

Here is the Colab notebook (highlighting the bug):

2 Likes

I posted this link in the harebrain forum, but you might enjoy watching this too: https://www.dotconferences.com/2019/01/jeff-biggus-scientific-swift

Will these lessons be recorded or do we need to watch them in real time? They’re a little late in the night for me…