Using ULMFiT for Natural Language Inference

I guess technically you could somehow replace the embeddings with BERT or ELMo, but I don’t think that’s a good strategy: you would end up with an embedding layer that has been pre-trained on a completely different corpus than the rest of the model. Also, BERT requires a different tokenization scheme than the one ULMFiT uses.

If you want to use BERT or other contextual embeddings, I would recommend either building a new model on top of them or directly fine-tuning the whole pretrained model. That said, ULMFiT is pretty powerful by itself and always a great starting point.

Thanks a lot, understood :slight_smile: