My query is related to the ML course offered by fast.ai. In Lesson 3, Jeremy started the class by explaining this Kaggle competition.
So I tried to load the train.csv file, but I found that the file itself is around 4.65 GB, which I believe would not fit in my system's memory.
My system specs are: Core i7, 2 GB Nvidia GeForce 840M, 8 GB RAM. I use this system only for ML, and I really do not want the hassle of setting up fastai on another system in the cloud. So is there any way I can load the dataset in smaller pieces so that I can work with it using pandas?
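One way to do what you're asking is `pd.read_csv`'s `chunksize` parameter, which returns an iterator of DataFrames instead of one big frame, so only one chunk is ever in memory at a time. A minimal sketch (using an in-memory buffer here as a stand-in for the real train.csv path, which is an assumption about your setup):

```python
import io
import pandas as pd

# Stand-in for the real file; replace with pd.read_csv("train.csv", ...)
csv_data = io.StringIO("a,b\n" + "\n".join(f"{i},{i * 2}" for i in range(10)))

# chunksize yields DataFrames of at most that many rows, one at a time,
# so the full file never has to fit in RAM at once.
total = 0
for chunk in pd.read_csv(csv_data, chunksize=4):
    # process each piece (aggregate, filter, downcast dtypes), then drop it
    total += chunk["a"].sum()

print(total)  # same result as summing the whole file at once
```

On a real 4.65 GB file you'd use a much larger chunk size (e.g. a million rows) and also consider passing explicit `dtype`s to `read_csv`, since pandas' defaults (int64/float64/object) often use far more memory than the data needs.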
I once had a problem after merging all the data: I ran out of memory when writing to a compressed file format, because the compression step itself needed additional memory. In such cases the feather format solved the problem (I guess it writes the in-memory data directly to the file without doing anything additional).