I uploaded the notebook to Bitbucket. However, it is a private repo; I'll be happy to give anyone taking the course access to it. @jeremy, this is also a general question - at what point is it OK to put code on public repos, as long as part 2 is not open sourced yet? The code is originally based on your implementation, though I made numerous additions and changes. I tried to understand your WGAN implementation in PyTorch by "backporting" it to Keras. It's not that I don't like PyTorch - I see its potential - but I am not productive with it yet, and I thought this would be a nice exercise to understand how your implementation works.

As for weight clipping: @rodgzilla pointed out this repo, where Thibault de Boissiere has implemented several GANs and other cool stuff in Keras. Btw, DenseNet is among them.

I used his weight clipping strategy:

```
import numpy as np

def clip_weights(net, clipvalue):
    # Clamp every weight array of every layer into [-clipvalue, clipvalue]
    for l in net.layers:
        weights = l.get_weights()
        weights = [np.clip(w, -clipvalue, clipvalue) for w in weights]
        l.set_weights(weights)
```
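To sanity-check the clipping logic without pulling in Keras, here's a minimal sketch using stand-in objects (the `DummyLayer` and `DummyNet` classes below are hypothetical, not part of Keras) that expose the same `get_weights`/`set_weights` interface that `clip_weights` relies on:

```
import numpy as np

class DummyLayer:
    # Hypothetical stand-in for a Keras layer: just stores weight arrays.
    def __init__(self, weights):
        self._weights = weights
    def get_weights(self):
        return self._weights
    def set_weights(self, weights):
        self._weights = weights

class DummyNet:
    # Hypothetical stand-in for a Keras model with a .layers attribute.
    def __init__(self, layers):
        self.layers = layers

def clip_weights(net, clipvalue):
    # Same logic as above: clamp every weight array into [-clipvalue, clipvalue]
    for l in net.layers:
        weights = [np.clip(w, -clipvalue, clipvalue) for w in l.get_weights()]
        l.set_weights(weights)

net = DummyNet([DummyLayer([np.array([-0.5, 0.005, 0.5])])])
clip_weights(net, 0.01)
# values outside [-0.01, 0.01] are clamped; 0.005 passes through unchanged
```

In the WGAN training loop, the idea would be to call `clip_weights(critic, 0.01)` on the critic after each optimizer step, which is how the original paper enforces the Lipschitz constraint.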

Sorry for the long post; tl;dr: @jeremy, with your permission I would make the repo public so that it's easier for other students to look at it.