My machine has 64GB of memory, and I am following @jeremy 's lesson3: I run np.array(trn, dtype=np.float32) first, before fitting a RandomForestRegressor to train the Grocery prediction model (roughly as in the snippet below). However, Python (or Jupyter) quits while executing that line whenever I use more than about 70 million rows of the training set; I found this threshold by trial and error. If I don't run the np.array conversion separately, the out-of-memory problem just surfaces later, during the regressor's fit() instead.
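For reference, here is a minimal, self-contained version of what I am running (trn and y here are small synthetic stand-ins I made up for this post; the real Grocery frame is what actually crashes):

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-ins for the real training frame and target;
# the actual data is 70M+ rows, which is where it dies.
trn = pd.DataFrame(np.random.rand(1_000_000, 10))
y = np.random.rand(1_000_000)

x = np.array(trn, dtype=np.float32)  # Python/Jupyter quits here on the full data
m = RandomForestRegressor(n_estimators=10, n_jobs=-1)
m.fit(x, y)
```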
I am pretty sure this is an out-of-memory issue: watching the output of top, I can see used memory climb steadily to about 64GB, at which point Python/Jupyter suddenly quits.
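To sanity-check the numbers, I have been estimating the footprint like this (a sketch, assuming the trn from the snippet above). My understanding is that np.array builds a complete float32 copy while the original float64 frame is still resident, so the peak is roughly 1.5x the frame itself before fit() even starts:

```python
# Size of the DataFrame alone, in GB; peak usage during the np.array
# call is this plus a half-size float32 copy on top.
print(trn.memory_usage(deep=True).sum() / 2**30, "GB in the DataFrame")
```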
I doubt I am the only person running into this. Should I move to a machine with more memory, or is there some other solution?
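One workaround I am considering (sketched below, and assuming trn contains only numeric columns) is to downcast in place, column by column, so the full float64 frame and its float32 copy never coexist. Would something like this help, or do I just need a bigger machine?

```python
# Downcast one column at a time, so only a single column is ever
# duplicated in memory instead of the whole frame at once.
for col in trn.columns:
    trn[col] = trn[col].astype(np.float32)

x = trn.values  # already float32, so this should avoid another full-size copy
```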