Lesson 8 (2019) discussion & wiki

I think it’s lesson 1 instead of Lesson 8.

1 Like

I believe the lesson numbering is continuous: Part 1 had the first seven lessons, so Part 2’s lessons start at #8.

8 Likes

It is lesson 8 mod 7 :)

10 Likes

https://www.crestle.com/dashboard

I don’t see the course ml2 link; will it be open soon?

1 Like

Small question before the lesson begins: how redundant is part 2 v3 with part 2 v2? Is it worth going through part 2 v2 after finishing part 2 v3?

I got them from GitHub: https://github.com/fastai/fastai_docs/tree/master/dev_course/dl2
which, the forum tells me, @jeremy also posted here recently.

The notebooks for today’s class look great!

2 Likes

Gentle reminder to avoid @ mentioning unless necessary.

9 Likes

Noob tip:
I keep two setups:

  • Bleeding edge (pip installed)
  • Conda-installed setup (for no bleeding)
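Roughly, the two environments might look like the commands below (the env names are just illustrative, and you should check the fastai README for the current install commands):

    # stable setup: released fastai installed via conda
    conda create -n fastai-stable python=3.7
    conda activate fastai-stable
    conda install -c pytorch -c fastai fastai

    # bleeding edge: editable pip install from a git checkout
    conda create -n fastai-dev python=3.7
    conda activate fastai-dev
    git clone https://github.com/fastai/fastai.git
    cd fastai
    pip install -e ".[dev]"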
13 Likes

What will the “suggested homework” look like for this new format of fastai part 2? In part 1 it was, for example, applying what we saw to Kaggle competitions; how do we do that with lessons focused on building fastai from the foundations?

1 Like

#off topic
Finally Jeremy is kind of teaching Python as well, indirectly!
Thanks :slight_smile:

6 Likes

I’m so glad you asked the question! Homework is writing docs and new tests :slight_smile:
You will know how the library works, so you’ll have all the tools to do that!

29 Likes

What is the best way to study this part 2?

(For part 1 we learnt that the best way is to watch the videos 3 times, write blog posts, and so on.) Is the approach for part 2 the same?

3 Likes

That’s another way to get lots of contributors to the fastai library haha :wink:

9 Likes

For the distributed training, will we examine multi-node single/multi-GPU in addition to single-node multi-GPU?

Not sure. It will probably be single-node only.

I love the new direction of part 2 of the course! Getting into the fundamentals can eliminate those last little bits of hesitation and delay when modifying and customizing state-of-the-art training algorithms. I’m also very excited about the direction of performance optimization, distributed training, and the emphasis on engineering. I have already convinced our engineers to get into PyTorch, and I will have a year or two before I can get them into Swift.

21 Likes

I heard the Julia language is built for numerical computation; why not Julia instead?

5 Likes

Jeremy is answering it :wink:

2 Likes