How to Add Layers to Learner

I’m currently using a tabular learner, whose structure I can print:

  (embeds): ModuleList(
    (0): Embedding(4, 3)
    (1): Embedding(8, 5)
    (2): Embedding(4, 3)
    (3): Embedding(3, 3)
  )
  (emb_drop): Dropout(p=0.0, inplace=False)
  (bn_cont): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (layers): Sequential(
    (0): LinBnDrop(
      (0): Linear(in_features=4, out_features=5, bias=False)
      (1): ReLU(inplace=True)
      (2): BatchNorm1d(5, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    )
    (1): LinBnDrop(
      (0): Linear(in_features=5, out_features=2, bias=True)
    )
  )

I want to add a dropout layer for regularisation, as the model is overfitting, but I can’t seem to find the correct syntax to do so. Ideally, I’d like to place it after the ReLU step. Any help would be greatly appreciated, and apologies in advance for the newbie question.

Ideally you’d do:

learn = tabular_learner(dls, [5], ps=[0.5])

The ps argument is what controls the dropout: it sets the dropout probability used in each hidden-layer block.
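As a rough illustration of what ps=[0.5] asks for, here is a minimal sketch of pairing hidden-layer sizes with dropout probabilities. The function name dropout_per_layer is hypothetical, not fastai’s API; fastai’s real handling lives inside TabularModel, so treat this purely as an analogy.

```python
# Hypothetical helper (NOT fastai code): pair hidden-layer sizes with
# dropout probabilities the way a ps argument could be interpreted.
def dropout_per_layer(layers, ps):
    """Broadcast a single float to every layer, or pair a list one-to-one."""
    if isinstance(ps, float):
        ps = [ps] * len(layers)          # one value -> same p for all layers
    if len(ps) != len(layers):
        raise ValueError("need one dropout value per hidden layer")
    return list(zip(layers, ps))

print(dropout_per_layer([5], [0.5]))     # [(5, 0.5)]
```

With a single hidden layer of 5 units, ps=[0.5] means one Dropout(p=0.5) associated with that layer; you can confirm the placement in your own model by printing learn.model after construction.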


Fantastic, thank you!

Will fastai match up the values passed to this parameter with the layers? So if I have, say, [10, 5] as my layers and [0.5] as my dropout, will the dropout only apply after the activation of the first hidden layer?

I believe so, but you can try it and see. If not, it wants one value for every layer, so you should pass [0.5, 0].
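A quick sketch of that suggestion, assuming one dropout value per hidden layer: with layers [10, 5] and ps=[0.5, 0], only the first hidden layer’s block would carry dropout. This is illustrative bookkeeping, not fastai internals.

```python
# Illustrative only: pairing layers [10, 5] with ps=[0.5, 0] so that
# dropout (p=0.5) is attached only to the first hidden layer.
layers, ps = [10, 5], [0.5, 0]
plan = list(zip(layers, ps))             # [(10, 0.5), (5, 0)]
for size, p in plan:
    tag = f"Dropout(p={p})" if p > 0 else "no dropout"
    print(f"hidden layer with {size} units -> {tag}")
```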
