Another architecture that is very powerful, especially in “sequence-to-sequence” problems (that is, problems where the dependent variable is itself a variable-length sequence, such as language translation), is the Transformers architecture. You can find it in a bonus chapter on the book’s website.
There is a short passage in chapter 12 of the book that sounds promising, so maybe we can hope for Transformers to be included in later parts of the course.
I don't think it's going to be released any time soon.
They are releasing fastai2 along with the course, and that dependency is causing a delay.
I'm guessing at least 1-2 months.
It's better to start with the fastai v3 course for now.
Only Jeremy knows the exact timing, and I doubt it will be that long, but yes, when it is done it will be released. Jeremy has been working extremely hard to get it ready to push, so please just be patient.
I started this on O'Reilly and ran into some issues, particularly with the PIL version. I'm guessing requirements.txt on GitHub was not updated to the right version.
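If it helps anyone debugging the same thing, here is a minimal sketch for checking whether the installed Pillow (the package imported as `PIL`) matches a pin. The `(7, 0, 0)` upper bound is a hypothetical placeholder, not the real constraint — check the repo's actual requirements.txt for the correct pin:

```python
# Minimal sketch for diagnosing a Pillow/PIL version mismatch.
# Assumptions: the package in question is Pillow, and the "< 7.0.0"
# bound is hypothetical -- substitute the real pin from requirements.txt.
import importlib.metadata


def version_tuple(version: str) -> tuple:
    """Turn a version string like '6.2.1' into (6, 2, 1) for comparison."""
    return tuple(int(part) for part in version.split(".")[:3])


def check_pillow(max_exclusive=(7, 0, 0)) -> str:
    """Report whether the installed Pillow satisfies the assumed pin."""
    try:
        installed = importlib.metadata.version("Pillow")
    except importlib.metadata.PackageNotFoundError:
        return "Pillow is not installed"
    if version_tuple(installed) >= max_exclusive:
        return f"Pillow {installed} exceeds the assumed pin; try: pip install 'pillow<7'"
    return f"Pillow {installed} satisfies the assumed pin"


print(check_pillow())
```

If the installed version is too new, downgrading with a pinned `pip install` (as printed above) usually resolves this kind of notebook breakage.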