BatchNormalization axis for Convolution layers in Keras?

I have a few questions about BatchNormalization in Keras with the TensorFlow backend.

  1. The Keras 2 documentation says to use axis = 1 with “th” dimension ordering. What happens if we use the same axis with “tf” dimension ordering? Are we going to get wrong results?

    https://keras.io/layers/normalization/

  2. I read a post on this forum about BatchNormalization(axis = 1) in Conv layers. It said that when using batch normalization with Conv2D in Keras, we need to set the axis depending on whether the input tensor is
    [b, h, w, c] or [b, c, h, w]. So for Theano the channels are in the second dimension, while for TensorFlow they are in the last. Then what does the axis = -1 default in the Keras documentation do?

  3. When should we use axis = -1, 1, 2, or 3 when using the TensorFlow backend? (See the sketch below for the kind of setup I mean.)
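
To make the questions concrete, here is a minimal sketch of the two setups I am asking about (the layer sizes and input shapes are just placeholders I made up):

    from keras.models import Sequential
    from keras.layers import Conv2D, BatchNormalization

    # TensorFlow-style "channels_last" input: [batch, height, width, channels]
    model_tf = Sequential([
        Conv2D(32, (3, 3), input_shape=(64, 64, 3), data_format='channels_last'),
        BatchNormalization(axis=-1),  # channels sit on the last axis, so -1 (or 3)
    ])

    # Theano-style "channels_first" input: [batch, channels, height, width]
    model_th = Sequential([
        Conv2D(32, (3, 3), input_shape=(3, 64, 64), data_format='channels_first'),
        BatchNormalization(axis=1),  # channels sit on axis 1
    ])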

Thank you.