Code reading


(Jeremy Howard) #41

Yes it is. I believe we should do np.exp before we take the mean. Or use a different kind of mean.

Well spotted!


(Aditya) #42

This function is defined in dataset.py (near the bottom):

def split_by_idx(idxs, *a):
    mask = np.zeros(len(a[0]),dtype=bool)
    mask[np.array(idxs)] = True
    return [(o[mask],o[~mask]) for o in a]

What happens to that *a parameter when it’s called like this, for example (from column_data.py)?

def from_data_frame(cls, path, val_idxs, df, y, cat_flds, bs, test_df=None):
    ((val_df, trn_df), (val_y, trn_y)) = split_by_idx(val_idxs, df, y)
    return cls.from_data_frames(path, trn_df, val_df, trn_y, val_y, cat_flds, bs, test_df=test_df)

Is *a equivalent to a list like [df, y]?


(Jeremy Howard) #43

This is described here: https://stackoverflow.com/questions/36901/what-does-double-star-asterisk-and-star-asterisk-do-for-parameters
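In short, `*a` packs any extra positional arguments into a tuple, so inside the function `a == (df, y)`. A minimal runnable sketch of the same function, with toy NumPy arrays standing in for df and y:

```python
import numpy as np

def split_by_idx(idxs, *a):
    # *a collects every positional argument after idxs into a tuple,
    # so split_by_idx(val_idxs, df, y) gives a == (df, y) here.
    mask = np.zeros(len(a[0]), dtype=bool)
    mask[np.array(idxs)] = True
    # each element of a is split into (validation part, training part)
    return [(o[mask], o[~mask]) for o in a]

x = np.arange(5)       # toy stand-in for df
y = np.arange(5) * 10  # toy stand-in for y
(val_x, trn_x), (val_y, trn_y) = split_by_idx([0, 1], x, y)
# val_x == [0, 1], trn_x == [2, 3, 4]
# val_y == [0, 10], trn_y == [20, 30, 40]
```

So the same index mask is applied to every array you pass in, which is why df and y end up split consistently.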


(Anup) #45

This is precious !! Thanks a ton !!


(Santhanam Elumalai) #46

@jeremy
I got this code from the fast.ai library’s fit function:

avg_loss = avg_loss * avg_mom + loss * (1-avg_mom)
debias_loss = avg_loss / (1 - avg_mom**batch_num)

I figured out that avg_loss is a linear interpolation, but what exactly does it do and why are we using it? Also, what is debias_loss and how does it differ from the regular loss?


(khan) #47

Some more URLs I found useful while reading the code:

https://www.programcreek.com/python/example/4684/collections.Iterable


(Jeremy Howard) #48

It’s an exponentially weighted moving average, to make the reported loss more stable.

Take a look at the Adam optimizer paper to learn about debiasing in this way.
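A minimal sketch of what those two lines do, assuming avg_loss starts at 0 and avg_mom is the smoothing factor (0.98 here is just an illustrative value):

```python
avg_mom = 0.98
avg_loss, batch_num = 0.0, 0
losses = [2.0, 2.0, 2.0]  # pretend the true loss is constant at 2.0

for loss in losses:
    batch_num += 1
    # exponentially weighted moving average of the batch losses
    avg_loss = avg_loss * avg_mom + loss * (1 - avg_mom)
    # correct for the average having been initialised at 0
    debias_loss = avg_loss / (1 - avg_mom ** batch_num)

print(avg_loss)     # ~0.1176, dragged towards 0 by the initialisation
print(debias_loss)  # ~2.0, the true loss recovered
```

Early in training the plain moving average is biased towards its zero initialisation; dividing by (1 - avg_mom**batch_num) undoes exactly that bias, which is the same correction the Adam paper applies to its moment estimates.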


(Santhanam Elumalai) #49

https://arxiv.org/abs/1412.6980 Is this the paper you are referring to?


(Jeremy Howard) #50

Yes. :slight_smile:


(Santhanam Elumalai) #51

Gotcha !!
[image]

Is it possible to explain these two lines, just like the ReLU, dropout, and back-prop?


(Santhanam Elumalai) #52

In fast.ai, self.opt or layer_opt means optimizer or layer optimizer, but I don’t know why I keep reading them as “option” and “layer options” :blush: Very hard to get used to reading it as optimizer :slight_smile: Why can’t we have optim instead of opt?


(Jeremy Howard) #54

Seems reasonable - PRs welcome (but please grep the full source to ensure all usages are updated, including notebooks!) :slight_smile:
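A hedged sketch of that kind of grep, run against a throwaway directory so it works anywhere (the file and variable names are illustrative, not the real fastai layout):

```shell
# Build a throwaway tree with one old-style usage, then locate it.
tmp=$(mktemp -d)
printf 'layer_opt = 1\n' > "$tmp/a.py"
# -r: recurse into the tree, -n: show line numbers for each hit
hits=$(grep -rn 'layer_opt' "$tmp")
echo "$hits"
rm -rf "$tmp"
```

Against the real repo you would point grep at the library and notebook directories instead, and rename only once it reports no remaining hits for the old name.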


(Santhanam Elumalai) #55

Changing a variable name is fine with me, but testing the functionality after the rename is the critical part. How can I validate it?