I was looking into the Keras documentation on the different initializations for each layer. The uniform option is the default. Is uniform the Xavier initialization we talked about in class, or would we write an xavier_init() function ourselves and pass it to the model, which Keras happily lets us do?
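For what it's worth, here is a minimal library-free sketch of what such an xavier_init() could compute. It assumes the Glorot/Xavier *uniform* variant, which samples from U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out)); the function name and shapes are just illustrative.

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, rng=None):
    """Glorot/Xavier uniform init: U(-limit, limit), limit = sqrt(6/(fan_in+fan_out))."""
    rng = np.random.default_rng() if rng is None else rng
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

# Hypothetical layer sizes, e.g. 784 inputs -> 128 hidden units
W = xavier_uniform(784, 128)
print(W.shape)                                      # (784, 128)
print(bool(np.abs(W).max() <= np.sqrt(6.0 / (784 + 128))))  # True
```

In tf.keras you would not usually need to write this yourself: a callable (or an `Initializer` subclass) can be passed as `kernel_initializer`, and a built-in `glorot_uniform` initializer already exists.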
Changing the type of initialization is one of the knobs to turn when tuning a model, yes? If we changed the initialization to another option, say normal/Gaussian, would that correct the “over-confidence” of the model that we adjusted for manually?
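One thing worth noting when comparing the two: the Glorot uniform and Glorot normal variants are deliberately calibrated to the same weight variance, 2/(fan_in + fan_out), so swapping the distribution shape changes less than one might expect. A quick check (layer sizes here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
fan_in, fan_out = 784, 128

# Glorot uniform: U(-limit, limit), limit = sqrt(6/(fan_in+fan_out))
limit = np.sqrt(6.0 / (fan_in + fan_out))
W_uniform = rng.uniform(-limit, limit, size=(fan_in, fan_out))

# Glorot normal: N(0, sigma^2), sigma = sqrt(2/(fan_in+fan_out))
sigma = np.sqrt(2.0 / (fan_in + fan_out))
W_normal = rng.normal(0.0, sigma, size=(fan_in, fan_out))

# Both target the same variance, 2/(fan_in+fan_out)
target = 2.0 / (fan_in + fan_out)
print(round(W_uniform.var() / target, 2), round(W_normal.var() / target, 2))
```

So if the goal is to reduce over-confidence, the initializer's *scale* matters more than whether it is uniform or Gaussian.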
Neural networks always have a linear layer followed by a non-linear layer. However, the linear layer does not have to be dense; a dense layer is just one type of linear layer. There are many other types of linear layers, including convolutional, recurrent, recursive, inception blocks, and more.
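The point above can be demonstrated with a small sketch: both a dense layer and a convolution are linear maps (before the activation), just with different structure. The shapes below are arbitrary.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

rng = np.random.default_rng(1)

# Dense (fully connected) linear layer: y = x @ W + b, then a nonlinearity
x = rng.normal(size=(4, 8))                 # batch of 4, 8 features each
W, b = rng.normal(size=(8, 3)), np.zeros(3)
dense_out = relu(x @ W + b)                 # linear map, then ReLU

# A 1-D convolution is also a linear map: each output is a dot product
signal = rng.normal(size=16)
kernel = rng.normal(size=3)
conv_out = relu(np.convolve(signal, kernel, mode="valid"))

# Linearity check for the convolution (before the nonlinearity):
u, v = rng.normal(size=16), rng.normal(size=16)
lhs = np.convolve(u + v, kernel, mode="valid")
rhs = np.convolve(u, kernel, mode="valid") + np.convolve(v, kernel, mode="valid")
print(bool(np.allclose(lhs, rhs)))  # True: conv(u + v) == conv(u) + conv(v)
```

The nonlinearity (ReLU here) is what breaks linearity; stacking only linear layers, of any type, would collapse to a single linear map.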