ActivationStats.plot_layer_stats always shows 400 activations

Why does the x axis always run to 400? From the summary I thought the number of activations per layer goes down as we get deeper into the network?

```
Sequential (Input shape: 64)
============================================================================
Layer (type)         Output Shape         Param #    Trainable
============================================================================
                     64 x 4 x 14 x 14
Conv2d                                    40         True
ReLU
____________________________________________________________________________
                     64 x 8 x 7 x 7
Conv2d                                    296        True
ReLU
____________________________________________________________________________
                     64 x 16 x 4 x 4
Conv2d                                    1168       True
ReLU
____________________________________________________________________________
                     64 x 32 x 2 x 2
Conv2d                                    4640       True
ReLU
____________________________________________________________________________
                     64 x 2 x 1 x 1
Conv2d                                    578        True
____________________________________________________________________________
```

Any pointers on the above? I must be misunderstanding something, as I'm struggling to figure out why the count on the x axis of the graphs doesn't match the summary output:

4 × 14 × 14 = 784
8 × 7 × 7 = 392
16 × 4 × 4 = 256
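The per-image activation counts above follow directly from the output shapes in the summary (dropping the batch dimension of 64). A quick sketch of that arithmetic for all five layers:

```python
# Output shapes from the summary, minus the batch dimension of 64:
# (channels, height, width) for each Conv2d layer's output.
shapes = [(4, 14, 14), (8, 7, 7), (16, 4, 4), (32, 2, 2), (2, 1, 1)]

# Number of activations per image = channels * height * width.
counts = [c * h * w for c, h, w in shapes]
print(counts)  # [784, 392, 256, 128, 2]
```

So the activation count per layer does shrink layer by layer, which is why a constant 400 on the x axis looks wrong at first.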

Ah, breakthrough!

The x axis is the batch number, not the activation index.
So the y axis shows the statistic computed over all activations in that layer for each batch; there were 400 batches, hence x always runs to 400.
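The idea can be illustrated with a small numpy sketch (this is not fastai's actual code, just a mock-up of what its activation-recording hooks do): one statistic is appended per *batch*, so the length of the recorded series, and hence the x axis of the plot, equals the number of batches, regardless of how many activations the layer has.

```python
import numpy as np

rng = np.random.default_rng(0)
n_batches = 400
layer_means = []  # one entry appended per batch, mimicking a forward hook

for _ in range(n_batches):
    # Fake first-layer activations for one batch of 64 images,
    # matching the summary's 64 x 4 x 14 x 14 output shape.
    acts = rng.standard_normal((64, 4, 14, 14))
    # All 64 * 4 * 14 * 14 values collapse into a single statistic.
    layer_means.append(acts.mean())

print(len(layer_means))  # 400: one point per batch, not per activation
```

Plotting `layer_means` would give exactly the kind of curve `plot_layer_stats` shows: 400 points along x, one per batch.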