@oguiza Thank you for the reply. Suppose I had 4 years of data and wanted to predict the next half year: could this approach be used with a DataBunch for epoch/minibatch training as well? Even if I can perform computations on the data, can it be piped into the learner without causing memory issues? My understanding of memory usage during an epoch is that the whole dataset sits in main memory, and only the current minibatch is sent to the GPU to update the weights.
If I only had 8 GB of RAM and wanted to train on, say, a 50 GB file, could this method still be used with epochs and minibatches to train the tabular learner? I'm still fairly new to this.
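To make my question concrete, here is a minimal sketch of what I have in mind by "streaming" minibatches from disk instead of loading the whole file. `stream_minibatches` is a hypothetical helper I made up for illustration, not part of fastai; I'm asking whether something like this can be wired into the learner.

```python
import csv
import itertools

def stream_minibatches(path, batch_size):
    """Yield minibatches of rows from a CSV file without loading it all.

    Only ``batch_size`` rows are held in memory at a time, so the file
    size is limited by disk, not RAM.  (Hypothetical helper, not a
    fastai API.)
    """
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        while True:
            batch = list(itertools.islice(reader, batch_size))
            if not batch:
                break
            yield batch

# Tiny demo: write a 10-row file, then stream it in batches of 4.
import os
import tempfile

tmp = tempfile.NamedTemporaryFile("w", suffix=".csv", delete=False, newline="")
tmp.write("x,y\n")
for i in range(10):
    tmp.write(f"{i},{i * i}\n")
tmp.close()

batches = list(stream_minibatches(tmp.name, batch_size=4))
print([len(b) for b in batches])  # → [4, 4, 2]
os.unlink(tmp.name)
```

In other words, each epoch would read the 50 GB file chunk by chunk, and only the current chunk would ever occupy RAM before being handed to the GPU.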