Why is my learning rate the worst?

I attempted to create a learning rate plot with the following code:

def get_data(sz):
    tfms = tfms_from_model(f_model, sz, aug_tfms=transforms_top_down, max_zoom=1.05)
    return ImageClassifierData.from_csv(PATH, 'boneage-training-dataset', label_csv,
                                        bs=64, tfms=tfms, val_idxs=val_idxs, suffix='.png',
                                        test_name=None, continuous=False, skip_header=True,
                                        num_workers=4)


data = get_data(256)
learn = ConvLearner.pretrained(arch,data,precompute=True)


100%|██████████| 158/158 [02:46<00:00, 1.05s/it]
100%|██████████| 40/40 [00:42<00:00, 1.07s/it]

Why does this code do two passes, one with 158 iterations and one with 40?

learn.sched.plot_lr()

It looks realistic, but why does my loss curve have such a weird shape? Would you choose a learning rate of 0.1 in this case?
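For what it's worth, the usual rule of thumb for reading a learning-rate-finder plot is not to pick the rate where the loss is lowest, but one roughly an order of magnitude below it, where the loss is still falling steeply. A minimal sketch of that heuristic on a synthetic curve (not this thread's actual data):

```python
import numpy as np

# Synthetic stand-in for an lr_find sweep (illustrative only)
lrs = np.logspace(-5, 0, 100)
losses = (np.log10(lrs) + 2) ** 2 + 0.1   # toy curve with its minimum near lr = 1e-2

lr_at_min = lrs[np.argmin(losses)]
# Rule of thumb: choose a rate about an order of magnitude
# below where the loss bottoms out, not the minimum itself
suggested = lr_at_min / 10
print(suggested)
```

By that heuristic, whether 0.1 is reasonable depends on where the loss on your plot starts climbing back up, not on the shape of the curve before it.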

I think the first run is over the training data and the second over the validation data. With `precompute=True`, the pretrained network's activations get precomputed for both sets, which is what those two progress bars are showing.
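If that's right, the two iteration counts should just be the dataset sizes divided by the batch size. A quick sanity check, assuming the RSNA bone-age training set's ~12,611 images and a 20% validation split (both of which are assumptions, not stated in the thread):

```python
import math

# Assumed numbers: ~12,611 images total, val_idxs holding out 20%
n_total = 12611
n_val = int(n_total * 0.20)   # 2522 validation images
n_train = n_total - n_val     # 10089 training images
bs = 64                       # batch size from the from_csv call

print(math.ceil(n_train / bs))  # 158 batches
print(math.ceil(n_val / bs))    # 40 batches
```

Those match the 158 and 40 in the progress bars, which supports the train/validation reading.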

@Judywawira, can you say why you think your loss curve looks weird? It doesn’t look too far off from the ones we’ve seen in the lectures.