AdaptNLP: ULMFiT Approach with Transformers

Hi All,

I’ve been working on an experimental library called AdaptNLP that uses Flair and Transformers to streamline the training, inference, and deployment processes of a ULMFiT (transfer learning) approach with the latest state-of-the-art pre-trained language models.

A couple of features include:

  • Easy-to-use API for running batch inference on state-of-the-art NLP models like:

    • Text/Sequence Classification
    • Token Tagging
    • Span-based Question Answering
  • A ULMFiT approach to fine-tuning Transformers language models and training your own NLP-task Classifiers

    • Fine-tune language models such as BERT, ALBERT, GPT2, etc.
    • Train classifiers that can be loaded into the above-mentioned API
  • Deploy open pre-trained or custom-trained models as a microservice

    • Uses FastAPI
    • Two-step Docker deployment
    • GPU compatible

The library is here:
It is available on PyPI and can be installed with pip install adaptnlp.
Please feel free to try it out!

Since the library is still in its early stages of development, feedback and issue threads in the AdaptNLP repo would be very much appreciated.



Is it possible to fine-tune a question-answering model with your library? If so, can you provide an example? Thanks.


@xjdeng As of now adaptnlp doesn't have retraining capabilities for question answering, but it's something we're working on.

If you want to dive deeper into the transformers library, it is possible to fine-tune and train a QA model with its Trainer module (which adaptnlp uses). An example can be found here:
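For context on what that fine-tuning setup involves: QA training data usually marks the answer by character offset in the context, and a preprocessing step must convert that into token-level start/end labels before training. Here is a toy sketch using a plain whitespace tokenizer; the helper below is hypothetical and not part of adaptnlp or transformers (which use subword tokenizers with offset mappings).

```python
# Convert a character-level answer span into token-level start/end labels,
# using a naive whitespace tokenizer. Real pipelines use subword tokenizers
# with offset mappings, but the labeling idea is the same.

def char_span_to_token_labels(context, answer_start, answer_text):
    answer_end = answer_start + len(answer_text)
    tokens, offsets, pos = [], [], 0
    for tok in context.split():
        start = context.index(tok, pos)
        offsets.append((start, start + len(tok)))
        tokens.append(tok)
        pos = start + len(tok)
    # A token belongs to the answer if its char range overlaps the answer span.
    inside = [i for i, (s, e) in enumerate(offsets)
              if s < answer_end and e > answer_start]
    return tokens, inside[0], inside[-1]

context = "AdaptNLP streamlines training and inference for NLP models"
tokens, start, end = char_span_to_token_labels(
    context, context.index("training"), "training and inference")
print(start, end, tokens[start:end + 1])  # token labels for the answer span
```

The resulting start/end token indices are what a QA model is trained to predict.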

Hi @aychang! Is the adaptnlp library still actively maintained?
I tried to install it, and I noticed a few failures right away when running the nbdev_test_nbs command.
I was wondering if that is a known issue and/or if it’s worth spending time looking into it.


I was the last maintainer, so I'm not sure; the library is no longer being worked on (I'm now at HF, and Andrew has left too).

I’d recommend installing the fastai version that was current around the time of AdaptNLP’s latest release.