Meet Mish: New Activation function, possible successor to ReLU?

Hi @mdp777 - here’s a screenshot from @Diganta’s Mish GitHub repo with some basic comparison testing against GELU.

GitHub - digantamisra98/Mish: Official Repository for "Mish: A Self Regularized Non-Monotonic Neural Activation Function" [BMVC 2020]
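For anyone who just wants the function itself: the paper defines Mish as f(x) = x · tanh(softplus(x)). Here’s a minimal NumPy sketch of that formula (my own illustration, not code taken from the repo):

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + e^x)
    return np.logaddexp(0.0, x)

def mish(x):
    # Mish activation: x * tanh(softplus(x))
    return x * np.tanh(softplus(x))

x = np.linspace(-5.0, 5.0, 11)
print(mish(x))
```

It’s smooth and non-monotonic: near zero it dips slightly negative, and for large positive inputs it approaches the identity, which is where the comparisons with GELU come from.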
