Creating an ensemble of CNNs

I have been reading up on the solutions of many Kaggle winners, and most of them use an ensemble of CNNs combined with some boosting method like XGBoost. Can anyone help me do the same with fastai?

Maybe we can save the weights at the end of each training cycle with a callback and use those snapshots to generate an ensemble of predictions? Anyone have more ideas on this?
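
A rough sketch of that idea, assuming a fastai v2 `Learner` called `learn` is already set up; the snapshot names like `cycle_{i}` are just illustrative. fastai's `SaveModelCallback` can also handle the per-epoch saving if you prefer a callback, but the manual loop is easier to follow:

```python
import torch

# Train for several one-cycle runs, snapshotting the weights after each one.
n_cycles = 3
for i in range(n_cycles):
    learn.fit_one_cycle(5)          # one training cycle
    learn.save(f'cycle_{i}')        # writes models/cycle_{i}.pth

# Average the predictions of the saved snapshots on the validation set.
all_preds = []
for i in range(n_cycles):
    learn.load(f'cycle_{i}')
    preds, targs = learn.get_preds()  # defaults to the validation set
    all_preds.append(preds)

ensemble_preds = torch.stack(all_preds).mean(dim=0)
```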

You can either use each CNN to predict a score for every row of the Kaggle dataset, adding columns model1_score, …, modelN_score to the columns that XGBoost will use, or you can export each model's embeddings as columns: modelN_emb1, …, modelN_embN. So when you are generating a CSV in fastai, you just load the CSV that XGBoost already uses, add your newly predicted column(s), save, and then run XGBoost on the augmented file. The hope is that some of your new columns predict different parts of the dataset better than the existing features do, so that the ensembled model is better than any of the individual models.
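
A minimal sketch of the first option (score columns), assuming hypothetical file names `train_features.csv` and `cnn_preds.npy` and a `target` column; the CNN scores would come from something like `learn.get_preds()`, exported in the same row order as the CSV:

```python
import numpy as np
import pandas as pd
import xgboost as xgb

# Existing tabular features plus target, and per-row scores from one CNN.
df = pd.read_csv('train_features.csv')
cnn_scores = np.load('cnn_preds.npy')   # must align row-for-row with the CSV

# Add the CNN output as an extra feature column (repeat for each model).
df['model1_score'] = cnn_scores
df.to_csv('train_features_with_cnn.csv', index=False)

# Train XGBoost on the augmented feature set.
X = df.drop(columns=['target'])
y = df['target']
booster = xgb.XGBClassifier(n_estimators=500, learning_rate=0.05)
booster.fit(X, y)
```

The embedding variant works the same way, except you write one column per embedding dimension instead of a single score column.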