Normalizing input data

I want to use a custom dataset with a CNN-type network.
The dataset comes in 3 channels, shaped (3, 20).
The first 2 channels can have values between 1 and 50,000 (but likely a limited subset of these).
The 3rd channel contains characters.

How do I properly scale/normalize the data before sending it to the model?

Subtract the mean and divide by the stdev.

Thanks Jeremy,
do I do it per batch, or pre-calculate this first for the whole dataset?

Whole dataset.

Thanks Jeremy.

Something like this, I guess?:

import numpy as np

num_of_elements = 500
num_of_channels = 3
input_vector_length = 20

a = np.zeros((num_of_elements, num_of_channels, input_vector_length))
for i in range(num_of_elements):  # build the array
    for j in range(num_of_channels):
        a[i, j] = i * 3 + j  # fill each channel with a constant value: 0,0,...  1,1,...  2,2,...

# Actual normalization of array 'a': per-channel mean and stdev
mean  = np.mean(a, axis=(0, 2))
stdev = np.std(a, axis=(0, 2))

for i in range(num_of_channels):
    a[:, i, :] -= mean[i]
    a[:, i, :] /= stdev[i]

I don’t really follow your code since I’m not sure about the details of your data, but you can easily check it worked by simply checking that the mean and stdev are (0, 1) after you’re done.
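
For instance, a quick check along those lines (assuming the normalized array is still called a, as in the code above) might be:

np.mean(a, axis=(0, 2))  # should be ~0 for every channel
np.std(a, axis=(0, 2))   # should be ~1 for every channel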

I suspect you can simplify and speed up your code using broadcasting, BTW.
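
A rough sketch of what the broadcasting version could look like (using keepdims=True so the (1, 3, 1) statistics broadcast over the (elements, channels, length) array, replacing the explicit per-channel loop):

mean  = a.mean(axis=(0, 2), keepdims=True)   # shape (1, 3, 1): one value per channel
stdev = a.std(axis=(0, 2), keepdims=True)
a = (a - mean) / stdev                       # broadcasting normalizes all channels at once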

Regarding the input batch normalization, the notes from Lesson 4, part 1 say that we can plug a BatchNormalization layer in as the first layer of a model and it will do the normalization. But does this normalization happen per batch, or does it use global parameters? If it happens per batch instead of using a global mean and std, I think this trick will have a different outcome than normalizing beforehand and feeding the model a normalized dataset, especially for smaller batch sizes.
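
To make the question concrete, the setup I mean is roughly this (a minimal sketch, assuming Keras; the layer sizes are made up, and axis=1 is my assumption for channel-first (3, 20) inputs):

from keras.models import Sequential
from keras.layers import BatchNormalization, Flatten, Dense

model = Sequential([
    # BatchNormalization as the very first layer, fed raw (unnormalized) data;
    # during training it normalizes with per-batch statistics and keeps
    # running averages that are used at inference time.
    BatchNormalization(axis=1, input_shape=(3, 20)),
    Flatten(),
    Dense(10, activation='softmax'),
])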
