Applying fastai to Kaggle's Ames housing competition



I am new to machine learning (ML) and am currently at Lesson 4 of fastai’s ML course. I have been applying what I have learned to Kaggle’s Ames housing competition, where the goal is to predict the sale price of homes in Ames, Iowa.

While my score is among the best of the submitted random forest models, I am only in the top 56% of the leaderboard overall. :sweat:
So there is a lot of room for improvement, but I feel like my model has ‘plateaued’, as I have already done a lot of feature engineering and hyperparameter tuning.
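For anyone curious what I mean by hyperparameter tuning, here is a minimal sketch of the kind of random search I have been running. The data here is synthetic stand-in data (the actual competition uses the Ames feature matrix with log sale price as the target), and the parameter ranges are just illustrative:

```python
# Hypothetical sketch: random-search tuning of a RandomForestRegressor.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

# Stand-in data; in the competition this would be the engineered
# Ames features and (typically) the log of SalePrice.
X, y = make_regression(n_samples=200, n_features=10, noise=0.1, random_state=0)

# Illustrative search space, not the exact values I used.
param_dist = {
    "n_estimators": [100, 300, 500],
    "max_features": [0.5, "sqrt", 1.0],
    "min_samples_leaf": [1, 3, 5],
}

search = RandomizedSearchCV(
    RandomForestRegressor(random_state=0),
    param_distributions=param_dist,
    n_iter=5,          # sample 5 of the 27 combinations
    cv=3,              # 3-fold cross-validation per candidate
    random_state=0,
)
search.fit(X, y)
best_rf = search.best_estimator_
```

`search.best_params_` then shows which combination won, and `best_rf` is refit on the full training set.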

Looking at other kernels (which gave me some ideas for feature engineering), successful teams have used stacking and other techniques I do not know about. :sweat_smile:
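From what I understand, stacking means training several base models and then training a meta-model on their out-of-fold predictions. A minimal sketch with scikit-learn’s `StackingRegressor` (on synthetic stand-in data, with arbitrarily chosen base models) would look something like this:

```python
# Hypothetical sketch of stacking: base models feed a meta-model.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Stand-in data; replace with the Ames feature matrix and target.
X, y = make_regression(n_samples=200, n_features=10, noise=0.1, random_state=0)

stack = StackingRegressor(
    estimators=[
        ("rf", RandomForestRegressor(n_estimators=100, random_state=0)),
        ("ridge", Ridge()),
    ],
    final_estimator=Ridge(),  # meta-model trained on out-of-fold predictions
    cv=3,
)
score = cross_val_score(stack, X, y, cv=3).mean()
```

The `cv` argument is what makes this proper stacking rather than simple blending: each base model’s predictions for the meta-model come from folds it was not trained on, which avoids leaking the training labels.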

So I’d be happy to talk to anyone to see if I can push my random forest model a bit further. :+1:
And if you are also doing this competition we can share insights and explore other ways to get more information from the data.


(Jinu Daniel) #2

I also participated in the same competition after going through the lessons, and my submission took me to the top 60% of the leaderboard. I also tried xgboost, which improved my ranking marginally, but I would like to know whether the score can be improved in some other way, with less feature engineering.
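For anyone who has not tried gradient boosting yet, here is the general shape of it, sketched with scikit-learn’s `GradientBoostingRegressor` as a stand-in for xgboost (the API of `xgboost.XGBRegressor` is very similar), on synthetic data rather than the actual Ames features:

```python
# Hypothetical sketch: gradient boosting as a stand-in for xgboost.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Stand-in data; replace with the Ames feature matrix and target.
X, y = make_regression(n_samples=300, n_features=10, noise=0.1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Illustrative settings: many shallow trees with a small learning rate.
gbm = GradientBoostingRegressor(
    n_estimators=200,
    learning_rate=0.05,
    max_depth=3,
    random_state=0,
)
gbm.fit(X_tr, y_tr)
rmse = mean_squared_error(y_te, gbm.predict(X_te)) ** 0.5
```

In my (limited) experience the main knobs are the number of trees, the learning rate, and the tree depth, traded off against each other, which is much the same with xgboost.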