Digging into the VGG.py code

I am looking at the vgg16.py file and trying to get more comfortable with some of the Keras code in it.

The method

def ConvBlock(self, layers, filters):
    model = self.model
    for i in range(layers):
        model.add(ZeroPadding2D((1, 1)))
        model.add(Convolution2D(filters, 3, 3, activation='relu'))
    model.add(MaxPooling2D((2, 2), strides=(2, 2)))

uses Convolution2D.
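
For concreteness, a call such as self.ConvBlock(2, 64) (the arguments here are just an example) would append this sequence of layers to the model:

model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(64, 3, 3, activation='relu'))
model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(64, 3, 3, activation='relu'))
model.add(MaxPooling2D((2, 2), strides=(2, 2)))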

The Keras documentation for Convolution2D is pretty miserable:

“Convolution operator for filtering windows of two-dimensional inputs.”

My question is, does a Convolution2D have a linear model in it?

My reasoning is that you can’t have an activation if you don’t have a Dense layer first, and here we have something with an activation:

Convolution2D(filters, 3, 3, activation='relu')

I was looking in the Keras documentation at the different initializations for each layer. The uniform option is the default. Is uniform the Xavier initialization we talked about in class? Or would we write an xavier_init() function to pass to the model, which Keras happily lets you do?

Changing the type of initialization is one of the knobs to turn when tuning a model, yes? If we changed the initialization to another option, say normal/Gaussian, would that correct the “over-confidence” of the model that we adjusted for manually?

Matt, this is the convolution (/correlation) operation that was introduced in lesson zero. Here’s a good tutorial: http://colah.github.io/posts/2014-07-Understanding-Convolutions/. In short: yes, it is a linear model, although it is a very particular type of linear model.
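
If it helps to see the linearity directly, here is a minimal NumPy sketch (just an illustration, not the Keras internals) of the 2D correlation that a single Convolution2D filter computes. Each output value is a dot product between an image patch and the kernel, which is why the whole operation is linear in the input:

import numpy as np

def conv2d_valid(image, kernel):
    # Each output pixel is a dot product (i.e. a linear function of the input)
    # between one patch of the image and the flattened kernel.
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = image[i:i + kh, j:j + kw]
            out[i, j] = np.dot(patch.ravel(), kernel.ravel())
    return out

# Linearity check: conv(2x + 3y) == 2*conv(x) + 3*conv(y)
x, y = np.random.rand(5, 5), np.random.rand(5, 5)
k = np.random.rand(3, 3)
print(np.allclose(conv2d_valid(2 * x + 3 * y, k),
                  2 * conv2d_valid(x, k) + 3 * conv2d_valid(y, k)))  # True

The real layer does this for every filter and sums over the input channels, but that is still just a bigger linear map.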

@melissa.fabros the Keras documentation shows that init='glorot_uniform' is the default. This is the Xavier initialization (the creator was named Xavier Glorot, so the method goes under both names!)
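
For reference, glorot_uniform just draws the initial weights from a uniform distribution whose range depends on the layer’s fan-in and fan-out. A minimal NumPy sketch of the rule (the helper name is mine, not a Keras function):

import numpy as np

def glorot_uniform(fan_in, fan_out):
    # Glorot/Xavier uniform: sample from U(-limit, limit)
    # with limit = sqrt(6 / (fan_in + fan_out))
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return np.random.uniform(-limit, limit, size=(fan_in, fan_out))

W = glorot_uniform(512, 256)
print(W.min(), W.max())  # both stay within +/- sqrt(6 / 768), about 0.088

And if I remember the 1.x API correctly, you can pick a different scheme per layer by name, e.g. Dense(4096, activation='relu', init='normal'), or pass your own initialization function as init.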

Neural networks always have a linear layer before a non-linear layer. However, the linear layer does not have to be dense. A dense layer is just one type of linear layer. There are many other types of linear layers, including convolutional, recurrent, recursive, inception blocks, and more.
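
To make the “linear layer, then non-linearity” pattern concrete, here is a minimal NumPy/SciPy sketch (toy shapes, not Keras code) showing that a dense layer and a convolutional layer are both linear maps of their input followed by the same ReLU:

import numpy as np
from scipy.signal import correlate2d

def relu(z):
    return np.maximum(z, 0)

# Dense layer: a full affine map W.x + b, then the non-linearity
x = np.random.rand(100)
W = np.random.randn(10, 100)
b = np.zeros(10)
dense_out = relu(W.dot(x) + b)

# Convolutional layer: also a linear map of the input (a sparse,
# weight-shared one), followed by the same non-linearity
image = np.random.rand(28, 28)
kernel = np.random.randn(3, 3)
conv_out = relu(correlate2d(image, kernel, mode='valid'))

print(dense_out.shape, conv_out.shape)  # (10,) and (26, 26)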
