I am new to machine learning (ML) and am currently at Lesson 4 of Fastai's ML course. I have been applying what I have learned to Kaggle's Ames competition, where the goal is to predict the sale price of homes in Ames, Iowa.
While my score is among the best of the submitted random forest models, I am still only in the top 56% of the leaderboard.
So there is plenty of room for improvement, but I feel like my model has plateaued, as I have already done a lot of feature engineering and hyperparameter tuning.
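For context, my hyperparameter tuning so far looks roughly like the sketch below: a randomized search over the usual random forest knobs with scikit-learn. The data here is synthetic just to keep the example self-contained; the parameter grid is only an illustration, not the exact one I used.

```python
# Sketch of random forest hyperparameter tuning with RandomizedSearchCV.
# Synthetic regression data stands in for the Ames features.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

X, y = make_regression(n_samples=500, n_features=20, noise=10.0, random_state=0)

# Illustrative search space: tree count, feature subsampling, leaf size.
param_dist = {
    "n_estimators": [50, 100, 200],
    "max_features": ["sqrt", 0.5, 1.0],
    "min_samples_leaf": [1, 3, 5],
}

search = RandomizedSearchCV(
    RandomForestRegressor(random_state=0),
    param_distributions=param_dist,
    n_iter=5,          # sample 5 parameter combinations
    cv=3,              # 3-fold cross-validation per combination
    random_state=0,
)
search.fit(X, y)
print(search.best_params_)
```

On the real competition data you would of course fit on the engineered features and score with the competition metric (RMSE of log prices) rather than the default R².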
Looking at other kernels (which gave me some ideas for feature engineering), successful teams have used stacking and other techniques I do not know about.
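From what I understand, stacking means feeding several base models' cross-validated predictions into a meta-model that learns how to combine them. A minimal sketch with scikit-learn's StackingRegressor, again on synthetic data so it runs standalone (the choice of base models and meta-model here is just an assumption for illustration):

```python
# Sketch of stacking: base models' out-of-fold predictions become
# the inputs of a meta-model (here a simple Ridge regression).
from sklearn.datasets import make_regression
from sklearn.ensemble import (
    GradientBoostingRegressor,
    RandomForestRegressor,
    StackingRegressor,
)
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=400, n_features=15, noise=5.0, random_state=0)

stack = StackingRegressor(
    estimators=[
        ("rf", RandomForestRegressor(n_estimators=100, random_state=0)),
        ("gb", GradientBoostingRegressor(random_state=0)),
    ],
    final_estimator=Ridge(),  # meta-model combining base predictions
    cv=5,                     # out-of-fold predictions to avoid leakage
)

score = cross_val_score(stack, X, y, cv=3, scoring="r2").mean()
print(f"stacked CV R^2: {score:.3f}")
```

The appeal, as I understand it, is that a diverse set of base models (tree ensembles plus linear models) often beats any single one of them.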
So I’d be happy to talk to anyone to see if I can push my random forest model a bit further.
And if you are also working on this competition, we could share insights and explore other ways to get more information out of the data.