Why Do We Get Different Results on Different Runs?

I agree with what @nareshr8 wrote. On top of that, random weight initialization and dropout are additional sources of randomness in the training procedure of neural networks.
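To see the effect of weight initialization concretely, here is a minimal toy sketch (plain Python, no deep learning library; `init_weights` is a hypothetical helper, not part of any framework): unseeded draws differ between "runs", while fixing the seed makes them reproducible.

```python
import random

def init_weights(n, seed=None):
    # Toy initializer: draws n Gaussian weights, mimicking the
    # random weight initialization of a neural network layer.
    rng = random.Random(seed)
    return [rng.gauss(0.0, 0.02) for _ in range(n)]

# Without a fixed seed, two "runs" almost certainly produce
# different weights, so training starts from a different point.
run_a = init_weights(4)
run_b = init_weights(4)
print(run_a == run_b)

# Fixing the seed makes both runs start from identical weights.
run_c = init_weights(4, seed=42)
run_d = init_weights(4, seed=42)
print(run_c == run_d)  # True
```

In a real PyTorch/fastai setup you would additionally need to seed the framework's own RNGs (and control other nondeterminism sources like dropout and data shuffling), which is exactly what the thread linked below discusses.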

@jimmiemunyi I also recommend having a look at this thread: [Solved] Reproducibility: Where is the randomness coming in?
