I’m currently using a fastai tabular learner, and I can print its structure:
TabularModel(
  (embeds): ModuleList(
    (0): Embedding(4, 3)
    (1): Embedding(8, 5)
    (2): Embedding(4, 3)
    (3): Embedding(3, 3)
  )
  (emb_drop): Dropout(p=0.0, inplace=False)
  (bn_cont): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (layers): Sequential(
    (0): LinBnDrop(
      (0): Linear(in_features=4, out_features=5, bias=False)
      (1): ReLU(inplace=True)
      (2): BatchNorm1d(5, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    )
    (1): LinBnDrop(
      (0): Linear(in_features=5, out_features=2, bias=True)
    )
  )
)
The model is overfitting, so I want to add a dropout layer for regularisation, but I can’t seem to find the correct syntax to do so. Ideally, I’d like to place it after the ReLU step. Any help would be greatly appreciated, and apologies in advance for the newbie question.
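To make it concrete, here is a sketch in plain PyTorch of the block I’m trying to end up with (layer sizes copied from the model summary above; the dropout probability 0.25 is just an example value). I just don’t know how to get fastai’s tabular API to produce this:

```python
import torch
import torch.nn as nn

# The first LinBnDrop block from the summary above, with the extra
# nn.Dropout I want inserted right after the ReLU.
block = nn.Sequential(
    nn.Linear(4, 5, bias=False),
    nn.ReLU(inplace=True),
    nn.Dropout(p=0.25),  # example probability, not a tuned value
    nn.BatchNorm1d(5),
)

x = torch.randn(8, 4)
print(block(x).shape)  # torch.Size([8, 5])
```

Is there a parameter I should be passing when building the learner, or do I have to modify the model after construction like this?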