Activation Function Experimentation Tabular

I’m wondering what the best approach would be to redefine TabularModel to experiment with different activation functions.

I see nn.ReLU used in fastai.tabular.models.

Maybe just grab the whole thing, dump it in my notebook, and swap in different activation functions?

As far as I can tell, there doesn't appear to be a built-in way to do this in the code, so I just wanted to confirm that before hacking away.
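For context, here's roughly what I have in mind with the copy-and-edit approach. This is a minimal sketch, not the actual fastai code: SimpleTabularModel and its act_cls parameter are hypothetical stand-ins for TabularModel, just to show where the activation would be parameterized.

```python
import torch
import torch.nn as nn

class SimpleTabularModel(nn.Module):
    """Simplified stand-in for fastai's TabularModel with a
    configurable activation class (hypothetical, not the fastai API)."""
    def __init__(self, emb_szs, n_cont, out_sz, layers, act_cls=nn.ReLU):
        super().__init__()
        # One embedding per categorical variable: (cardinality, emb dim)
        self.embeds = nn.ModuleList([nn.Embedding(ni, nf) for ni, nf in emb_szs])
        n_emb = sum(nf for _, nf in emb_szs)
        sizes = [n_emb + n_cont] + layers
        blocks = []
        for n_in, n_out in zip(sizes, sizes[1:]):
            # act_cls() is where nn.ReLU would normally be hard-coded
            blocks += [nn.Linear(n_in, n_out), act_cls(), nn.BatchNorm1d(n_out)]
        blocks.append(nn.Linear(sizes[-1], out_sz))
        self.layers = nn.Sequential(*blocks)

    def forward(self, x_cat, x_cont):
        x = torch.cat([e(x_cat[:, i]) for i, e in enumerate(self.embeds)], dim=1)
        x = torch.cat([x, x_cont], dim=1)
        return self.layers(x)

# e.g. model = SimpleTabularModel([(10, 5)], n_cont=3, out_sz=2,
#                                 layers=[200, 100], act_cls=nn.GELU)
```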

Thanks


I’d say hack away. I also do not see any built-in way.

I've done many similar hacks with CNN models. You can copy the code and rewrite it in the notebook, or create the model, enumerate its layers, and replace some of them. It all just works, thanks to PyTorch's parameter tracking.
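If you go the layer-replacement route, a minimal sketch in plain PyTorch could look like the following; the helper name and the ReLU-to-GELU swap are just illustrative choices:

```python
import torch.nn as nn

def replace_activations(module, old=nn.ReLU, new=nn.GELU):
    """Recursively swap activation modules in place."""
    for name, child in module.named_children():
        if isinstance(child, old):
            # setattr replaces the child module on its parent,
            # so the change propagates through the whole model
            setattr(module, name, new())
        else:
            replace_activations(child, old, new)
```

After building the learner, something like replace_activations(learn.model) should swap every ReLU for a GELU while leaving the rest of the architecture untouched.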
