Chapter 12 of the book says:

> Another architecture that is very powerful, especially in "sequence-to-sequence" problems (that is, problems where the dependent variable is itself a variable-length sequence, such as language translation), is the Transformers architecture. You can find it in a bonus chapter on the book's website.
It links to https://book.fast.ai/, which redirects to https://course.fast.ai/.
I checked the Practical Deep Learning for Coders book site and the fastai/fastbook GitHub repo (the fastai book, published as Jupyter Notebooks), but wasn't able to find it.
Does anyone know where I can find the bonus chapter? Thanks!