Lesson 7 - Official topic

Can you comment on real-time applications of Random Forests? In my experience they tend to be too slow for real-time (latency-bound) use cases, like a real recommender system. An NN is much faster when run on the right hardware.

The only other option I found that is good from a performance perspective is XGBoost or CatBoost (boosted decision trees).

Note: Jeremy was once the President of Kaggle and, at one point, the top-ranked data scientist on Kaggle! :slight_smile:

Yes, but my training set only has outcomes for whether sales happened or not. Basically, yes/no. But I’d like to know whether all conditions are very favorable for making this big sale. I guess my outcome variable will have to be yes or no for my test set, then.

I agree. Try all of them. There’s an argument I recommend changing in random forests; I’m not sure if it’s there in XGBoost. Try setting the class_weight argument to “balanced” to deal with the class imbalance. That’s what I use. In addition, the F1 score is a better metric for evaluating the model.
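
For reference, here’s a minimal sketch of that setup, assuming scikit-learn (the post doesn’t name a library); the toy dataset and parameter values are illustrative only:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Imbalanced toy data: roughly 90% negatives, 10% positives
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, stratify=y, random_state=0
)

# class_weight="balanced" reweights each class inversely to its frequency
model = RandomForestClassifier(
    n_estimators=100, class_weight="balanced", random_state=0
)
model.fit(X_train, y_train)

# F1 balances precision and recall, which matters more than raw accuracy
# when one class dominates
print(f1_score(y_valid, model.predict(X_valid)))
```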

Have you looked at the current Abstraction and Reasoning Challenge competition on Kaggle, which asks whether a computer can learn complex, abstract tasks from just a few examples? Can you share some thoughts on it?

There are implementations of RF that are optimized for the right hardware as well; for example, see https://github.com/rapidsai/cuml
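
As an illustration, here’s a minimal sketch assuming cuML’s scikit-learn-style RandomForestClassifier (the data is random and purely illustrative):

```python
import numpy as np
from cuml.ensemble import RandomForestClassifier

# cuML generally prefers float32 inputs
X = np.random.rand(10_000, 20).astype(np.float32)
y = np.random.randint(0, 2, 10_000).astype(np.int32)

model = RandomForestClassifier(n_estimators=100)
model.fit(X, y)           # training runs on the GPU
preds = model.predict(X)  # GPU inference, relevant for the latency question above
```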

Regarding Kaggle: I’m trying to use fastai2 on TPUs (PyTorch support for TPUs was released March 25) as part of Kaggle’s “Flower Classification with TPUs” competition, in case anyone wants to join me: https://www.kaggle.com/c/flower-classification-with-tpus/overview

Jeremy, I heard that you won every Kaggle competition for 5 years straight. Is this true? Do you have any favorite stories of Kaggle competitions you were involved in?

fastai2 won’t work directly with TPUs at this point (even with the PyTorch TPU library). There is ongoing development for this though.

You’ll find a few answers here: https://youtu.be/205j37G1cxw :tea:

Here is the fastai community’s competition using GPUs:
https://forums.fast.ai/t/fastgarden-a-new-imagenette-like-competition-just-for-fun/65909

It seems this algorithm is only for categorical variables, correct?

If I understand correctly, decision trees also work with continuous (numeric) variables. Is this true? If so, how does that work?

If we are splitting only on categorical variables, then what do we do with the continuous variables?

We are just talking about the cleaning for now.

For continuous variables, we split on a threshold value: rows less than the threshold go one way, and rows greater than it go the other.
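
To make that concrete, here’s a toy sketch (not fastai’s actual implementation) of scoring every candidate threshold for one continuous column:

```python
import numpy as np

def best_threshold(x, y):
    """Return the threshold on continuous feature x that minimizes the
    weighted standard deviation of the target y across the two groups."""
    best_t, best_score = None, float("inf")
    for t in np.unique(x):
        left, right = y[x <= t], y[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        # Weighted impurity: smaller means each group has more similar targets
        score = len(left) * left.std() + len(right) * right.std()
        if score < best_score:
            best_t, best_score = t, score
    return best_t

x = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
print(best_threshold(x, y))  # 3.0: everything <= 3.0 goes left
```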

Oh I was referring to the section describing “The basic steps to train a decision tree can be written down very easily:”

Did I miss something?

Oh sorry, see my other answer above.

Does fastai use any default data augmentation or create synthetic data for tabular datasets?
Do such techniques exist?

I don’t know if such a technique exists, and there is nothing in fastai for this. Such a thing is probably domain-dependent.

@ilovescience I’m giving it a try and seeing what happens. If it doesn’t work, I’ll either join the GPU competition or try to get the best of both worlds, i.e., data augmentation with fastai2 and training with TensorFlow.