Hi,
Where can I find the implementation of General ReLU, as mentioned in the article excerpt below?
General ReLU — Introduced by FastAI, this is ReLU plus a leakiness option to address ReLU's lack of handling for negative values, along with a 'mean shift' and 'clamping'.
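From that description, I'd guess the module looks roughly like the minimal PyTorch sketch below, but I'd like to find the actual source. Note the class and parameter names (`GeneralRelu`, `leak`, `sub`, `maxv`) are my guesses based on the excerpt, not confirmed against the fastai code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GeneralRelu(nn.Module):
    """Sketch of a 'General ReLU': leaky ReLU + mean shift + clamping.
    Names and defaults are assumptions, not the official fastai implementation."""
    def __init__(self, leak=None, sub=None, maxv=None):
        super().__init__()
        self.leak = leak  # negative slope for the leaky variant (None = plain ReLU)
        self.sub = sub    # constant subtracted to shift the output mean toward 0
        self.maxv = maxv  # upper bound on activations ('clamping')

    def forward(self, x):
        # Leaky ReLU if a slope is given, otherwise plain ReLU
        x = F.leaky_relu(x, self.leak) if self.leak is not None else F.relu(x)
        if self.sub is not None:
            x = x - self.sub            # 'mean shift'
        if self.maxv is not None:
            x = x.clamp_max(self.maxv)  # 'clamping'
        return x

# Example usage (hypothetical parameter values):
# act = GeneralRelu(leak=0.1, sub=0.4, maxv=6.0)
# y = act(torch.randn(8))
```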
Thanks in advance