I’m getting an `OSError: [Errno 24] Too many open files` when running `get_preds` on a large test set, like so: `learn.get_preds(ds_type=DatasetType.Test)`. I’m using this for collaborative filtering, so the test set is the cross product of all possible users and items (north of 5 million rows in some cases). The error occurs whenever I reach 1000 batches, regardless of batch size.
I tried raising the `ulimit`, but that isn’t a reliable fix. What I’m trying now is to loop over the test set and get predictions one batch at a time with `pred_batch`.
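For what it’s worth, here’s a minimal sketch of the batch-at-a-time idea in plain PyTorch (bypassing `get_preds` entirely, since the model and dataloader are accessible from the learner). The function name `predict_in_batches` and the toy model/data are mine, just to illustrate the shape of the loop:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

def predict_in_batches(model, dataloader):
    """Run the model over the dataloader one batch at a time,
    moving each batch of predictions to the CPU immediately and
    concatenating at the end, so nothing accumulates per batch."""
    model.eval()
    preds = []
    with torch.no_grad():
        for xb, _ in dataloader:
            preds.append(model(xb).cpu())
    return torch.cat(preds)

# Tiny stand-in for the real learner and test set, just for illustration.
model = nn.Linear(4, 1)
ds = TensorDataset(torch.randn(10, 4), torch.zeros(10))
dl = DataLoader(ds, batch_size=3)
all_preds = predict_in_batches(model, dl)
```

With a real learner this would be something like `predict_in_batches(learn.model, learn.data.test_dl)`, though I’m not sure whether this sidesteps the file-handle issue or just moves it.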
Is this a good way to solve this? Has anyone else encountered a similar problem and found a solution?