How to specify dropout for Learner class

I (a beginner) was playing around with EfficientNet-PyTorch by lukemelas for classifying food images, and it is working really well. However, I had a tough time dealing with underfitting on b4. I have tried different combinations of weight decay, epochs, and learning rates. The only thing I haven't played with is dropout. I couldn't figure out how to define dropout for the Learner class, as it is not in the attribute list. I was wondering if there is a way/best practice to define the dropout percentage (like ps for cnn_learner) when using the Learner class with external libraries like the one I am using.
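For context, the workaround I have been considering is to swap the model's dropout module directly after construction. This is a minimal sketch of that pattern on a tiny stand-in network, not EfficientNet itself; I believe the lukemelas library stores its dropout as `model._dropout` and that recent versions accept `EfficientNet.from_pretrained("efficientnet-b4", dropout_rate=0.5)`, but both are assumptions worth checking against the library source.

```python
import torch
import torch.nn as nn

# Stand-in for a pretrained backbone whose dropout rate we want to change.
# The same attribute-replacement idea would apply to an EfficientNet
# instance (assumed attribute: model._dropout).
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Linear(8, 8)
        self.dropout = nn.Dropout(p=0.2)  # default rate
        self.head = nn.Linear(8, 2)

    def forward(self, x):
        return self.head(self.dropout(self.features(x)))

model = TinyNet()
# Raise the dropout probability before wrapping the model in a Learner.
model.dropout = nn.Dropout(p=0.5)

# Sanity check: the forward pass still works with the new module in place.
out = model(torch.randn(4, 8))
print(model.dropout.p, out.shape)
```

Since `nn.Dropout` has no learned parameters, replacing it does not disturb the pretrained weights.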

Any other ideas to tackle underfitting in the code below would also be very helpful. :slight_smile:

model = EfficientNet.from_pretrained("efficientnet-b4")    
data = ImageDataBunch.from_csv(path, folder=".", valid_pct=0.2,
                              csv_labels='/content/models/cleaned.csv', ds_tfms=ds_tfms, bs=bs, 
                              size=img_size, num_workers=2).normalize(imagenet_stats)

top_5 = partial(top_k_accuracy, k=5)
learn = Learner(data, model, metrics=[accuracy, top_5], callback_fns=ShowGraph).to_fp16()
learn.split( lambda m: (model._conv_head,) )

lr = 1e-4
learn.fit_one_cycle(5, max_lr=slice(1e-6, lr), pct_start=0.1, div_factor=100, final_div=100, wd=1e-3)
learn.save('/content/models/food-101-train-epoch-13')