Can't find bonus chapter mentioned in chapter 12 of the book

Chapter 12 of the book mentions:

Another architecture that is very powerful, especially in “sequence-to-sequence” problems (that is, problems where the dependent variable is itself a variable-length sequence, such as language translation), is the Transformers architecture. You can find it in a bonus chapter on the book’s website.

The chapter links to a page that redirects elsewhere. I checked Practical Deep Learning for Coders (the book's site) and GitHub - fastai/fastbook (the fastai book, published as Jupyter Notebooks), and wasn't able to find it.
Does anyone know where I can find the bonus chapter? Thanks!

An example of how to incorporate the transformers library from Hugging Face with fastai:

In this tutorial, we will see how we can use the fastai library to fine-tune a pretrained transformer model from the transformers library by Hugging Face. We will use the mid-level API to gather the data. Even though this tutorial is self-contained, it might help to check the imagenette tutorial for a second look at the mid-level API (with a gentle introduction using the higher-level APIs) in computer vision.

There is also an NLP with Transformers walkthrough/lesson here: Practical Deep Learning for Coders - 4: Natural Language (NLP)

And for further exploration:

BLURR is a library designed for fastai developers who want to train and deploy Hugging Face transformers.

Hi Isaac Fung,

Try the fast.ai NLP course ("A Code-First Introduction to Natural Language Processing").

Regards, Conwyn
