Why good accuracy already at first epoch on Tabular data?

Hey guys!

After chapter 5 of the course I have been training a simple neural net on the Adult dataset.
The target variable is binary, and its distribution is roughly 0.76 / 0.24 (76% 0s and 24% 1s).
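For reference, with that split a model that always predicts the majority class already gets 0.76 accuracy, so that is the baseline any trained model should beat. A quick sketch (the fractions are the ones from the split above):

```python
# Majority-class baseline for the split above (76% 0s, 24% 1s).
# A "model" that always predicts class 0 is right 76% of the time.
class_fractions = {0: 0.76, 1: 0.24}
baseline_accuracy = max(class_fractions.values())
print(baseline_accuracy)  # 0.76
```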

Here is how I trained the model:

And I have a question about the following:

As you can see, at epoch 0 I already have 0.85 accuracy. Why is it already good at the very beginning? When we trained on images we started from a pretrained model with good weights, but for tabular data we don't have pretrained models, so the starting weights are just random.

Oh, after a bit of thinking I actually figured out why this happens (I think). I'll post it here for anyone who has the same question.
It's because I have a large training set (32,561 rows) and my batch size is 256. So during the first epoch alone the weights get updated around 32561 / 256 ≈ 127 times, which is apparently already enough to reach good weights.
When I tested with batch size 20000, accuracy was bad for the first few epochs, since with only 2 batches per epoch the weights have been updated just 2 * 3 = 6 times after the first 3 epochs:
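To make that arithmetic concrete, here is a small sketch (the row count and batch sizes are the ones from my runs; whether the final partial batch counts as an extra update depends on the DataLoader's drop_last setting):

```python
import math

def updates_per_epoch(n_rows: int, batch_size: int) -> int:
    """Number of optimizer steps (weight updates) in one epoch,
    counting the final partial batch."""
    return math.ceil(n_rows / batch_size)

# Batch size 256: ~128 updates happen in the very first epoch.
print(updates_per_epoch(32561, 256))        # 128

# Batch size 20000: only 2 updates per epoch,
# so just 6 updates after the first 3 epochs.
print(updates_per_epoch(32561, 20000))      # 2
print(updates_per_epoch(32561, 20000) * 3)  # 6
```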

P.S. Thanks for reading! I'd love to hear if I'm missing something.