Leave-One-Out Cross Validation

I need to build a random forest classifier on a data set where P >> N (far more features than observations). No separate test set is available, so after carving one out of my data I don't have enough observations left in the training set to hold out a validation set as well. Would leave-one-out cross-validation be a good idea in this case?

Start with regular (e.g. 5-fold) CV. Leave-one-out CV is very computationally expensive: it requires fitting the model once per observation, i.e. N fits instead of 5.
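To make the cost difference concrete, here is a minimal sketch using scikit-learn (assuming that's your stack; the synthetic data and parameter choices are purely illustrative). 5-fold CV fits the forest 5 times, while leave-one-out would fit it N times:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, StratifiedKFold, LeaveOneOut

# Illustrative P >> N setup: 40 observations, 1000 features
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 1000))
y = rng.integers(0, 2, size=40)

clf = RandomForestClassifier(n_estimators=100, random_state=0)

# 5-fold stratified CV: the model is fit 5 times
kfold_scores = cross_val_score(clf, X, y, cv=StratifiedKFold(n_splits=5))
print("5-fold mean accuracy:", kfold_scores.mean())

# Leave-one-out CV would fit the model N = 40 times instead,
# so its cost grows linearly with the number of observations:
# loo_scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
```

With N = 40 the difference is 5 fits versus 40; on a data set of a few thousand rows it becomes 5 versus thousands, which is why plain k-fold is the usual starting point.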


Very TRUE! Thanks!