This is the topic for any non-beginner discussion around lesson 8. It won’t be actively monitored by Jeremy or me tonight, but we will answer any outstanding questions in here tomorrow (if needed).
Hi there,
Unless we are adding to or expanding our vocab_sz, the Embedding layer keeps the vocab_size of the pre-trained model you are fine-tuning, which hopefully was trained on much more data than what we are fine-tuning with.
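To make that concrete, here is a rough PyTorch sketch (the sizes and the mean-initialisation choice are just made up for illustration, not what any particular model does): if the fine-tuning vocab matches the pre-trained one, the embedding is reused as-is; if the vocab grows, the old rows are copied over and only the new rows need initialising.

```python
import torch
import torch.nn as nn

# Hypothetical sizes for illustration only.
pretrained_vocab_sz, emb_dim = 30000, 400
new_vocab_sz = 30500  # suppose the fine-tuning corpus adds 500 new tokens

# Embedding as it comes from the pre-trained model.
pretrained_emb = nn.Embedding(pretrained_vocab_sz, emb_dim)

if new_vocab_sz == pretrained_vocab_sz:
    # Vocab unchanged: reuse the pre-trained embedding as-is.
    emb = pretrained_emb
else:
    # Vocab expanded: copy the old rows, then initialise the new rows
    # (here with the mean of the old embeddings, one common choice).
    emb = nn.Embedding(new_vocab_sz, emb_dim)
    with torch.no_grad():
        emb.weight[:pretrained_vocab_sz] = pretrained_emb.weight
        emb.weight[pretrained_vocab_sz:] = pretrained_emb.weight.mean(0)
```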