Visualizing Spectral Normalization on Conv Weights

To anyone who is interested, I've made a little notebook on Google Colab that helps visualize the effects of spectral normalization on convolution kernel weights. (It's my first time using Colab, so if anyone has tips to improve it, I'm all ears.) Here is the link:

Here's an example:

I was getting some strange results using spectral normalization in combination with pixel shuffle, so I wanted to visualize how a saturated weight in one part of the weight tensor could affect the rest of the weights. To do this, I made a two-layer image inverter, took the weights of the second layer, and cut them up by depth. The kernel size is 3x3, so I decided (kind of arbitrarily) to visualize each slice as a 3D transform matrix. That's maybe not fully representative of how kernels work, but it's easy to understand visually.
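For anyone who wants to poke at this without the notebook, here's a minimal sketch of the setup in PyTorch. The layer sizes and the "inverter" structure are my own illustrative choices (the post doesn't specify them); the only part taken directly from the library is `torch.nn.utils.spectral_norm`, which wraps a layer so its weight is divided by an estimate of its largest singular value on each forward pass. It then grabs the second conv's weight and slices it into the per-channel 3x3 kernels you'd visualize:

```python
import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm

# Hypothetical two-layer "inverter"; channel counts are arbitrary examples.
model = nn.Sequential(
    spectral_norm(nn.Conv2d(3, 16, kernel_size=3, padding=1)),
    nn.ReLU(),
    spectral_norm(nn.Conv2d(16, 3, kernel_size=3, padding=1)),
)

# A dummy forward pass so spectral norm's power-iteration buffers update
# and the normalized .weight tensor is materialized.
x = torch.randn(1, 3, 8, 8)
_ = model(x)

# The second conv's (normalized) weight has shape
# (out_channels, in_channels, 3, 3), so w[o, i] is one 3x3 kernel slice.
w = model[2].weight.detach()
slices = [w[o, i] for o in range(w.shape[0]) for i in range(w.shape[1])]
print(tuple(w.shape), len(slices))
```

Each 3x3 slice in `slices` is what you'd then plot (e.g. as a heatmap, or as the 3D transform view from the notebook) to see how normalization spreads a saturated weight's influence across the rest of the tensor.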