Unable to find an LR with a structured data model


(Michael) #1

I’m attempting to build a structured data model similar to the Rossmann approach, but with a completely different data set. When I try to use lr_find, it looks like it’s not able to find any path to reduce the loss (at least that’s how I’m reading it). See the image below.

How should I interpret this, and what are the most likely causes when a learning rate can’t be found?
My dependent variable is a vector of true/false values that I have converted to 1 and 0 as float64. Rather than trying to predict the sales value as in the Rossmann case, I’m simply interested in whether the sales in the next month will have increased by more than 50%. If they do, I set y = 1; if they don’t, I set y = 0. I’m trying to build a model that gives me the probability of the event occurring, e.g. p = 0.95 that the event will occur.
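In case it helps to make the setup concrete, here is a minimal sketch of how such a 0/1 float64 target can be built. The column names and the sales figures are hypothetical, invented for illustration; they are not from my actual data set.

```python
import numpy as np
import pandas as pd

# Hypothetical monthly sales for a handful of stores (illustrative only).
df = pd.DataFrame({
    "sales_this_month": [100.0, 200.0, 80.0, 50.0],
    "sales_next_month": [160.0, 210.0, 130.0, 40.0],
})

# y = 1.0 when next month's sales grew by more than 50%, else 0.0,
# stored as float64 so it can be fed to a regression-style head.
y1 = (df["sales_next_month"] > 1.5 * df["sales_this_month"]).astype(np.float64)
print(y1.tolist())  # [1.0, 0.0, 1.0, 0.0]
```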

md = ColumnarModelData.from_data_frame(PATH, val_idx, df, y1.astype(np.float64),
                                       cat_flds=cat_vars, bs=128)
m = md.get_learner(emb_szs, n_cont=len(df.columns) - len(cat_vars),
                   emb_drop=0.04, out_sz=1, szs=[1000, 500],
                   drops=[0.001, 0.01], use_bn=True)

#2

You should try m.sched.plot(100, 1): the first argument is how many points to skip at the beginning (default 10), and the second is how many points to skip at the end (default 5). Your graph’s values are very concentrated (0.052 to 0.06), but the last loss when the LR Finder stopped is 0.263, so the loss spiked extremely quickly toward the end.
My guess is that the spike is around 5e-2 (the last point we can see) and so 5e-3 would be a good choice.
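To illustrate the heuristic numerically, here is a small numpy sketch on a synthetic loss curve (an assumed shape, not your actual LR Finder data): the loss drifts down to a shallow minimum around 5e-2 and then blows up, and we back off from the minimum by an order of magnitude.

```python
import numpy as np

# Synthetic LR-finder sweep: 101 learning rates from 1e-5 to 1.
lrs = np.logspace(-5, 0, 101)

# Assumed curve: loss declines gently toward a minimum near 5e-2,
# then spikes once the learning rate gets too large.
losses = np.where(lrs <= 5e-2,
                  0.060 - 0.002 * np.log10(lrs / 1e-5),
                  0.052 + 2.0 * (lrs - 5e-2))

# Heuristic: find the LR at the lowest loss, then divide by ~10
# so training starts on the steep-but-stable part of the curve.
best = lrs[np.argmin(losses)]
suggested = best / 10
print(best, suggested)  # roughly 5e-2 and 5e-3
```

With this curve the minimum sits near 5e-2, so the heuristic lands on roughly 5e-3, matching the suggestion above.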