TabNet with fastai v2

There is now :wink:

Edit:

@grankin I believe something may be wrong. Nope, they just train for a very long time.

> learning rate of 0.02 (decayed 0.9 every 10k iterations with an exponential decay) for 71k iterations.
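That schedule maps directly onto a PyTorch `StepLR` (the optimizer and dummy parameter below are just illustrative scaffolding, not the paper's actual training loop):

```python
import torch

# A dummy parameter so the optimizer has something to manage.
params = [torch.nn.Parameter(torch.zeros(1))]
opt = torch.optim.Adam(params, lr=0.02)

# Multiply the LR by 0.9 every 10k iterations, stepping once per batch.
sched = torch.optim.lr_scheduler.StepLR(opt, step_size=10_000, gamma=0.9)

for _ in range(71_000):
    # ...forward / backward / opt.step() would go here...
    sched.step()

# After 71k iterations the LR has decayed 7 times: 0.02 * 0.9**7
print(opt.param_groups[0]["lr"])
```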

Sweet holy moley. 71,000 batches?! Let me try again…

By the way, if my math is right, that is the equivalent of roughly 11,600 epochs.

(They use a 4096 batch size: 4096 × 71,000 iterations / 25,000 rows.)
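Checking that back-of-the-envelope number (row count as quoted above):

```python
batch_size = 4096
iterations = 71_000
rows = 25_000

# One "epoch" = one full pass over the rows, so total epochs is:
epochs = batch_size * iterations / rows
print(f"{epochs:,.0f} epochs")  # roughly 11,633
```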

Running those now

Surpassed fastai at ~1600 epochs

Just had a quick thought too, let me see…

Well, it won’t let me post, so let me post an exciting update here:

Alright, so halfway through training I got a harebrained idea. This arch doesn’t change much, right? Can I use transfer learning here? I ran a quick experiment: 50 epochs on the Poker Hand dataset, then 50 epochs transferred onto Adult. Here is what I found.

To reach above 80% accuracy:

  • Non-Transfer learning: epoch 31
  • Transfer learning: epoch 11

Finishing accuracy:

  • Non-Transfer: 78.5-79%
  • Transfer: 81-82.5%

There is certainly promise here.
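For the curious, the weight-copying idea behind that experiment can be sketched in plain PyTorch. The toy layer sizes below stand in for the real models, and none of this is the exact code I ran — it just shows copying every weight whose name and shape match between two instances of the "same" architecture:

```python
import torch.nn as nn

# Toy stand-ins for the same architecture on two datasets: identical
# hidden layers, but different (hypothetical) input/output sizes.
def make_model(n_in, n_out):
    return nn.Sequential(
        nn.Linear(n_in, 64), nn.ReLU(),
        nn.Linear(64, 64), nn.ReLU(),
        nn.Linear(64, n_out),
    )

poker = make_model(n_in=10, n_out=10)
# ...train `poker` on the first dataset here...

adult = make_model(n_in=14, n_out=2)

# Copy every tensor whose name AND shape match; skip the rest
# (the input and output layers differ between the datasets).
src = poker.state_dict()
dst = adult.state_dict()
transferred = {k: v for k, v in src.items() if k in dst and v.shape == dst[k].shape}
dst.update(transferred)
adult.load_state_dict(dst)
print(sorted(transferred))  # only the shape-compatible middle tensors
```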

Next is to try the reverse.

The reverse didn’t make much of a difference; I'm unsure why that’s the case, but it is!

Still need to test freezing layers: Tabular Transfer Learning and/or retraining with fastai - #20 by Jumonji

So I got the code for freezing and whatnot figured out. I’m training the poker model tonight, then I'll go from there tomorrow morning. (Or this morning? I think it’s like 3:30 am.)
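The freezing part boils down to flipping `requires_grad` off for the body and leaving the head trainable. A minimal sketch with a toy model (the layer sizes are made up, not the real arch):

```python
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(10, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 2),
)

# Freeze everything, then re-enable gradients on the final layer only.
for p in model.parameters():
    p.requires_grad_(False)
for p in model[-1].parameters():
    p.requires_grad_(True)

trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(trainable)  # only the head's weight and bias remain trainable
```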

Hmmmm… I could not recreate the accuracy they achieved on Poker Hand… :frowning:

@grankin even running their source code (PyTorch), I was unable to match it; I got early stopping at ~epoch 207, which had an accuracy of 54%. (I still wasn’t able to achieve this on our version, but that is a far cry from 99%.)
