Is it possible to implement cross-validation in fastai?

(guitar) #1

I tried to implement cross-validation in fastai, but I failed.

1 Like

(Esteban J Guillen) #2

I have used sklearn to generate the folds, then trained a fastai classifier on each train/validation dataset combination.
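A minimal sketch of that fold-generation step, assuming scikit-learn's `KFold` (the variable names here are illustrative, not from the original post):

```python
from sklearn.model_selection import KFold
import numpy as np

# Toy stand-in for a list of 20 image files/labels.
items = np.arange(20)

# 5 shuffled folds, seeded for reproducibility.
kf = KFold(n_splits=5, shuffle=True, random_state=42)

folds = []
for train_idx, valid_idx in kf.split(items):
    # Each split yields disjoint train/validation index arrays;
    # a fastai learner would be trained on each combination.
    folds.append((train_idx, valid_idx))

print(len(folds))  # → 5
```

Each `(train_idx, valid_idx)` pair can then be used to build the fastai datasets for that fold.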

0 Likes

(guitar) #3

Can you explain in more detail? I split the dataset into 5 folds for training, but on the second iteration I got an error.

0 Likes

(Esteban J Guillen) #4

Here is how I iterate over my folds.

I save off the results after each learner is trained.
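The post's actual code isn't reproduced above, so here is a generic sketch of that loop, with the fastai-specific training stubbed out as a hypothetical `train_one_fold` function (not from the original post):

```python
from sklearn.model_selection import KFold
import numpy as np

items = np.arange(100)  # toy dataset indices

def train_one_fold(train_idx, valid_idx):
    # Hypothetical stand-in: the real version would build fastai
    # data from the two index sets, create a learner, fit it,
    # and return the validation metric.
    return len(valid_idx) / len(items)

results = []
kf = KFold(n_splits=5, shuffle=True, random_state=1)
for fold, (train_idx, valid_idx) in enumerate(kf.split(items)):
    score = train_one_fold(train_idx, valid_idx)
    # Save off each fold's result, as described in the post.
    results.append({"fold": fold, "score": score})

print(len(results))  # → 5, one saved entry per fold
```

Averaging the saved per-fold scores at the end gives the cross-validated estimate.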

0 Likes

(Fernando A.) #5

This is my way to implement Stratified K-Fold cross-validation with fast.ai and scikit-learn.

I think it's OK and easy, but I'm a newbie.
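The notebook itself isn't shown above; a hedged sketch of the stratified splitting step with scikit-learn's `StratifiedKFold`, which keeps class proportions equal in every fold (the fastai training inside the loop is elided):

```python
from sklearn.model_selection import StratifiedKFold
import numpy as np

# Toy labels: 12 of class 0 and 6 of class 1 (a 2:1 imbalance).
y = np.array([0] * 12 + [1] * 6)
X = np.arange(len(y))

skf = StratifiedKFold(n_splits=3, shuffle=True, random_state=7)
valid_class_counts = []
for train_idx, valid_idx in skf.split(X, y):
    # fastai training on (train_idx, valid_idx) would go here;
    # instead, record the class balance of each validation fold.
    valid_class_counts.append(np.bincount(y[valid_idx]))

print(valid_class_counts)  # every fold keeps the 2:1 ratio: [4, 2]
```

Unlike plain `KFold`, `StratifiedKFold` needs the labels `y` passed to `split`, since it uses them to balance the folds.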

8 Likes

#6

Have you ever gotten an out-of-memory error? Thanks in advance!

0 Likes

#7

I guess you can probably reduce the batch size (`bs`) to avoid that error.

0 Likes

#8

I was just asking myself: what is the purpose of cross-validation in this situation? My guess: to get a better estimate of the model's accuracy on the test set?

Also, what about early stopping when the error starts increasing on the validation set? How can we do that?

0 Likes

#9

Regarding your second question, you should probably have a look at the `EarlyStoppingCallback` in the docs.
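Conceptually, that callback tracks a monitored metric and stops training once it has failed to improve for `patience` epochs. A minimal pure-Python sketch of that patience logic (not fastai's actual implementation, just the idea behind it):

```python
def early_stop_epoch(valid_losses, patience=3, min_delta=0.0):
    """Return the 1-based epoch at which training would stop,
    or None if the run completes. Stops after `patience`
    consecutive epochs without an improvement of at least
    `min_delta` in the monitored loss."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(valid_losses, start=1):
        if loss < best - min_delta:
            best = loss   # improvement: reset the counter
            wait = 0
        else:
            wait += 1     # no improvement this epoch
            if wait >= patience:
                return epoch
    return None

# Validation loss improves, then rises for 3 straight epochs.
print(early_stop_epoch([0.9, 0.7, 0.6, 0.65, 0.7, 0.8]))  # → 6
```

In fastai you get this behavior by attaching `EarlyStoppingCallback` to the learner, configuring the monitored metric and patience instead of writing the loop yourself.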

0 Likes