Shifted ReLU (-0.5)

Your input data should have mean zero and variance one, and the weights should be initialized to keep the activations that way! :slight_smile: Plain ReLU zeroes out the negative half of its input, so its outputs end up with a positive mean; subtracting 0.5 after the ReLU pulls the activations back toward zero mean.
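A minimal sketch of the idea in NumPy (the function name `shifted_relu` and the demo are mine, not from any particular library): on roughly standard-normal input, a plain ReLU's output has mean around 0.4, and subtracting 0.5 brings the mean much closer to zero.

```python
import numpy as np

def shifted_relu(x, shift=0.5):
    # ReLU followed by subtracting a constant to re-center the activations
    return np.maximum(x, 0.0) - shift

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)          # input: mean ~0, variance ~1

relu_mean = np.maximum(x, 0.0).mean()     # positive mean (~0.4 for standard normal)
shifted_mean = shifted_relu(x).mean()     # much closer to zero

print(relu_mean, shifted_mean)
```

The 0.5 shift is a rough heuristic rather than the exact correction (for standard-normal input the ReLU output mean is about 0.4), but it keeps activations near zero mean as they pass through the layers.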