I am trying to understand the terminology behind the naming of these layers:

[<keras.layers.pooling.MaxPooling2D at 0x11071438>,
 <keras.layers.core.Flatten at 0x110712e8>,
 <keras.layers.core.Dense at 0x11071320>,
 <keras.layers.core.Dropout at 0x11071240>,
 <keras.layers.normalization.BatchNormalization at 0x110714e0>,
 <keras.layers.core.Dense at 0x11071400>,
 <keras.layers.core.Dropout at 0x110715c0>,
 <keras.layers.normalization.BatchNormalization at 0x110715f8>,
 <keras.layers.core.Dense at 0x11071550>]

In Keras, MaxPooling2D(), Flatten(), Dropout(), and BatchNormalization() are all listed as layers. Are they in fact layers, or, in theory, are only the Dense() (fully connected) layers actually layers, with the rest just being intermediate operations performed between Dense layers?
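As a quick sanity check, here is a minimal sketch (using `tf.keras`, and not the actual model from my output above) showing that everything returned by `model.layers` subclasses `keras.layers.Layer`, including the pooling, flatten, dropout, and batch-norm objects:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Toy model containing the same layer types as in the listing above.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8, 8, 1)),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(16),
    layers.Dropout(0.5),
    layers.BatchNormalization(),
    layers.Dense(1),
])

# Every entry in model.layers is a Layer subclass, so Keras treats
# MaxPooling2D, Flatten, Dropout, and BatchNormalization as layers too.
print(all(isinstance(l, layers.Layer) for l in model.layers))
```

So from the framework's point of view, a "layer" is any object implementing the `Layer` interface (a callable with its own weights, possibly empty, and a `call()` method), not just the trainable Dense ones.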

Edit: I just discovered this thread: Dense vs convolutional vs fully connected layers … it answers my question!