Making fit_one_cycle reproducible

Hey guys,
I use the fit_one_cycle method with a tabular learner.
I want to make its results reproducible. I already did the following:

import random
import numpy as np
import torch

def random_seed(seed_value, use_cuda):  # gleaned from multiple forum posts
    np.random.seed(seed_value)  # NumPy RNG (cpu)
    torch.manual_seed(seed_value)  # PyTorch RNG (cpu)
    random.seed(seed_value)  # Python's built-in RNG
    if use_cuda: torch.cuda.manual_seed_all(seed_value)  # PyTorch RNG (gpu)

random_seed(42, True)

However, I still get different results if I define the learner again and run fit_one_cycle again.
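
Could it be that the cuDNN backend also needs to be pinned down? As far as I understand, cuDNN picks the fastest (possibly non-deterministic) kernels by default, so something along these lines might be needed on top of the seeding above (just a sketch, I have not confirmed it is sufficient):

import torch

# ask cuDNN to use only deterministic kernels and to stop auto-tuning,
# trading some GPU speed for repeatability
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False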

Can anybody help me with that?

Thanks a lot!

I know what you are trying to do, though I do not know how to do it exactly. My approach has been to generate the weights and save them off, and to also save off the specific inputs I want to make reproducible. Definitely not as simple as we would want it to be, though. This would help me as well, if someone has found a way to do it…

Personally, I do run-based reproducibility, though. Inputs and weights are random between runs, but within a single run I make sure the same weights are loaded into different models (same architecture) and the same inputs are used across models. This has kept enough randomness that I still hit interesting failure cases that have tripped me up (e.g. being off by one in my vocab length), yet within each run I can compare everything.
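
Roughly, a simplified sketch of what I mean (illustrative only, not my actual pipeline; the toy model and file names are made up):

import torch
import torch.nn as nn

def make_model():
    # same architecture every time
    return nn.Sequential(nn.Linear(10, 50), nn.ReLU(), nn.Linear(50, 2))

# at the start of a run: save the freshly initialised weights and a fixed batch of inputs
model_a = make_model()
torch.save(model_a.state_dict(), 'init_weights.pth')
fixed_batch = torch.randn(64, 10)
torch.save(fixed_batch, 'fixed_batch.pth')

# any other model in the same run (same architecture) gets the identical starting point
model_b = make_model()
model_b.load_state_dict(torch.load('init_weights.pth'))
shared_inputs = torch.load('fixed_batch.pth')

# now model_a and model_b can be compared on exactly the same weights and inputs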


Ahh ok, hmm, that sounds a little bit difficult. Do you have a code snippet from your actual setup that you could show?

My understanding is that since any new layer is initially filled with random weights, you can never achieve exactly the same results every time; but if we’ve got our hyperparameters right, it shouldn’t matter, as we’ll get almost the same results every time.

If it’s wildly different each time, then there’s an element of luck involving the initial weights, which points back to the hyperparameters not being optimal.

To be fair, one set of random numbers is as good as another, so I’d be very happy if we could seed the initial weights and have pseudo-random values instead.

Yeah, that is true. However, I need it for documenting my work, so that somebody who reads my paper can reproduce it. There must be a way somehow, I guess. Or how do other people deal with this problem?

Does this work?
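
Something along these lines, as a minimal PyTorch-only sketch of the idea (the toy model just stands in for the tabular learner): the key part seems to be re-seeding immediately before the learner is re-created, since both the weight initialisation and the data shuffling consume the random state.

import random
import numpy as np
import torch
import torch.nn as nn

def random_seed(seed_value, use_cuda):
    np.random.seed(seed_value)
    torch.manual_seed(seed_value)
    random.seed(seed_value)
    if use_cuda: torch.cuda.manual_seed_all(seed_value)

def make_learner():
    # stand-in for "define the learner again": re-seed right before
    # the weights are initialised, every single time
    random_seed(42, torch.cuda.is_available())
    return nn.Sequential(nn.Linear(10, 50), nn.ReLU(), nn.Linear(50, 2))

m1, m2 = make_learner(), make_learner()
# identical initial weights across the two "runs"
print(all(torch.equal(a, b) for a, b in zip(m1.state_dict().values(),
                                            m2.state_dict().values())))

With fastai, the same idea would presumably mean calling random_seed(42, True) again right before building the DataBunch and the tabular learner each time, combined with the cuDNN flags mentioned above for the GPU side.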