Gradient boosting algorithms for NLP tasks

Hello, fastai community! I am wondering about applying gradient boosting algorithms such as XGBoost, together with automated hyperparameter search like grid search with cross-validation, to an NLP classification problem. Is that worth it, instead of transformers? Thanks!


While transformers are generally the SOTA, it doesn’t hurt to use gradient boosting or other classic ML techniques as a baseline for NLP tasks.
One good benefit is that, while doing that, you’ll get an understanding of how techniques like TF-IDF and Word2Vec map text data into numbers - something that computers understand.

Moreover, one might also resort to classic ML algorithms if compute (e.g. during inference) is a constraint.

Hope this helps!
