Overflow error

Hey guys, this is my first post in the fast.ai forums.

I’m having trouble with my databunch. Any function that uses it fails, such as showing the data or creating a learner.

The problem is an overflow error: cannot serialize a bytes object larger than 4 GiB.

If anyone could help me with this, that would be great.

Thanks in advance.

Could you give some more information? OS, versions, some code, etc. Then it may be easier to help you.

Hi,
I was using the latest version of fastai; I had updated it with conda a day earlier, on Windows 10, on my local machine.

I was using a dataset given to me in a hackathon which was huge (around 14 GB when loaded into memory as a pandas DataFrame). I compressed it using the handy code from the fastai forums that downcasts the int and float columns to make the dataset smaller, since it wasn’t fitting into my 32 GB of memory when I tried to create a databunch.
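The snippet was roughly along these lines (quoting from memory, so the exact dtype handling may differ): walk the columns and downcast each numeric one to the smallest dtype that fits its values.

```python
import numpy as np
import pandas as pd

def reduce_mem_usage(df: pd.DataFrame) -> pd.DataFrame:
    """Downcast numeric columns to the smallest dtype that fits their values."""
    for col in df.columns:
        col_type = df[col].dtype
        if np.issubdtype(col_type, np.integer):
            df[col] = pd.to_numeric(df[col], downcast="integer")
        elif np.issubdtype(col_type, np.floating):
            df[col] = pd.to_numeric(df[col], downcast="float")
    return df
```

Downcasting float64 to float32 alone halves the memory used by the float columns.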

Afterwards I created the databunch with no problems whatsoever, but when I tried to show the data with any number of rows, or to create a learner, it gave me the overflow error.

Which Python version did you use? This problem seems to be resolved in 3.8. See the following link:

https://bugs.python.org/issue17560
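
The limit comes from pickle protocol 3, whose bytes length field is 4 bytes; protocol 4 (the default from Python 3.8) uses 8-byte sizes. You can reproduce the error outside fastai with something like this (warning: it allocates well over 8 GiB of RAM):

```python
import pickle

big = b"x" * (4 * 1024 ** 3 + 1)  # just over 4 GiB of raw bytes

# pickle.dumps(big, protocol=3)   # OverflowError: cannot serialize a bytes
#                                 # object larger than 4 GiB

data = pickle.dumps(big, protocol=4)  # fine: protocol 4 has 8-byte sizes
```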

I used Python 3.6.

Did anybody figure this out? I am running into the issue with Python 3.7. I changed the protocol to 4 in reduction.py, but I am still getting the error.
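
For reference, what I changed should be roughly equivalent to forcing protocol 4 from user code, something like this sketch (multiprocessing pickles in more than one place, so patching a single spot in reduction.py may not cover everything):

```python
import multiprocessing.reduction as reduction

# Keep references to the originals before patching.
_orig_dumps = reduction.ForkingPickler.dumps  # bound classmethod
_orig_dump = reduction.dump

def _dumps_v4(obj, protocol=None):
    # Ignore the requested protocol and force 4, which allows > 4 GiB objects.
    return _orig_dumps(obj, 4)

def _dump_v4(obj, file, protocol=None):
    _orig_dump(obj, file, 4)

reduction.ForkingPickler.dumps = staticmethod(_dumps_v4)
reduction.dump = _dump_v4
```

Even with protocol 4, Python before 3.8 can still overflow the 32-bit size header of a multiprocessing pipe when sending more than 2 GiB to a worker (that is what the bpo-17560 fix linked above addresses), so upgrading to 3.8, or avoiding worker processes with num_workers=0 when creating the databunch, may be the more reliable route.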