Size mismatch for encoder.weight

I’m using fastai 2.4 for text classification.

Here is what the data looks like:

Here is my code:

from fastai.text.all import *  # needed for TextDataLoaders, AWD_LSTM, etc.

dls_lm = TextDataLoaders.from_df(words_df,
                                 text_col='Question',
                                 valid_pct = .2,
                                 is_lm = True,
                                 seq_len = 22,
                                 bs = 64,
                                 seed=20)
learn = language_model_learner(dls_lm, 
                               AWD_LSTM,
                               drop_mult = .4,
                               metrics = [accuracy, Perplexity()]).to_fp16()

# some fit and training.....
learn.save_encoder('finetuned_lng_encoder')


dls_blk = DataBlock(blocks = (TextBlock.from_df(text_cols = "text", seq_len = 22),
                              CategoryBlock),
                    get_x = ColReader(cols='text'),
                    get_y = ColReader(cols = "label"),
                    splitter = TrainTestSplitter(test_size = 0.2, random_state = 21, stratify=df_small.label))

dls_clf = dls_blk.dataloaders(df_small,
                              bs = 64,
                              seq_len=22,
                              seed = 20)

learn_tc = text_classifier_learner(dls_clf, 
                                    AWD_LSTM, 
                                    drop_mult=0.4,
                                    metrics = accuracy).to_fp16()  # CategoryBlock is single-label, so accuracy rather than accuracy_multi

learn_tc = learn_tc.load_encoder("finetuned_lng_encoder")

Then I got the size mismatch for encoder.weight error from the title:

Could someone help please?
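My guess (unconfirmed) is that the encoder saved by save_encoder has an embedding sized to dls_lm’s vocab, while text_classifier_learner builds its encoder from dls_clf’s own vocab, so if the two vocabs differ in length the weights can’t be loaded. A minimal sketch reproducing the same kind of failure with plain PyTorch (the vocab sizes here are made up):

```python
import torch.nn as nn

# Hypothetical sizes: the LM vocab and the classifier vocab differ in length
lm_vocab_size, clf_vocab_size, emb_dim = 120, 96, 400

lm_encoder = nn.Embedding(lm_vocab_size, emb_dim)    # like what save_encoder stored
clf_encoder = nn.Embedding(clf_vocab_size, emb_dim)  # like what the classifier builds

try:
    clf_encoder.load_state_dict(lm_encoder.state_dict())
except RuntimeError as e:
    print(e)  # "... size mismatch for weight ..."
```

If that is the cause, I believe TextBlock.from_df accepts a vocab argument, so passing vocab=dls_lm.vocab in the classifier DataBlock might make the two vocabs match. Is that the right fix?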