Part 2 Lesson 13 Wiki

Is this “keep the last 10 or 20 approximate Hessians”?

1 Like

Is there any work on pre-training networks for generic style learning?

3 Likes

Just thinking of how some of this semi-relates to Feature Visualization from distill.pub

2 Likes

Universal Style Transfer via Feature Transforms

6 Likes

Is @ matrix multiplication or dot product?

Did we skip the part where the VGG model is actually trained on the painting? I feel like we skipped straight to extracting the layers of interest.

1 Like

This uses the pretrained VGG model. Probably no fine-tuning needed, since we’re using the features extracted by some layer in the middle of the network.
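
A minimal sketch of that idea in PyTorch, assuming torchvision’s pretrained VGG16; the layer index used here is just illustrative, not the one picked in the lesson:

```python
import torch
from torchvision import models

# Load a pretrained VGG16 and freeze it -- we only read activations, no fine-tuning.
vgg = models.vgg16(pretrained=True).features.eval()
for p in vgg.parameters():
    p.requires_grad_(False)

# Capture the output of an arbitrary middle layer with a forward hook
# (index 20 is illustrative; the lesson chooses its own layers of interest).
features = {}
vgg[20].register_forward_hook(lambda module, inp, out: features.update(mid=out))

x = torch.randn(1, 3, 224, 224)   # stand-in for a preprocessed image
vgg(x)
print(features['mid'].shape)      # torch.Size([1, 512, 28, 28]) for a 224x224 input
```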

2 Likes

@ is matrix multiplication (new in Python 3.5, PEP 465)
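
A tiny example, just to show the operator (not taken from the lesson notebook):

```python
import numpy as np

a = np.array([[1, 2],
              [3, 4]])
b = np.array([[5, 6],
              [7, 8]])

# @ is the matrix-multiplication operator (PEP 465, Python 3.5+).
print(a @ b)    # [[19 22]
                #  [43 50]]
```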

5 Likes

I am enjoying today’s lesson a lot 🙂

1 Like

I’m really glad about this too. For anyone who might have missed it, I actually started a thread a couple of days back specifically for an open discussion on this subject in the forums, so welcome to anyone who wants to join in or share insights there 🙂

11 Likes

I first came to know about it in your computational linear algebra course 🙂

2 Likes

Please ask him to talk about Multi-GPU

7 Likes

What was the name of the paper that style-transferred Captain America’s shield?

It’s matrix multiplication. It’s a pure Python operator. But dot product and matrix multiply are the same, no?
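
For 2-D arrays they do agree; a small NumPy sketch of where they differ (illustrative only, not from the lesson):

```python
import numpy as np

a = np.arange(6).reshape(2, 3)
b = np.arange(6).reshape(3, 2)

# For 2-D inputs, a @ b and np.dot(a, b) give the same matrix product.
print(np.allclose(a @ b, np.dot(a, b)))   # True

# For higher-dimensional inputs they differ: @ treats the inputs as stacks
# of matrices and broadcasts, while np.dot sums over the last axis of the
# first argument and the second-to-last axis of the second.
x = np.ones((2, 3, 4))
y = np.ones((2, 4, 5))
print((x @ y).shape)        # (2, 3, 5)
print(np.dot(x, y).shape)   # (2, 3, 2, 5)
```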

2 Likes

Deep Painterly Harmonization: https://arxiv.org/abs/1804.03189

6 Likes

I CANNOT get enough of Jeremy’s frustration with math notation. I would watch a friggin’ Netflix special about that.

Can we make this happen?

13 Likes

Is there a name for the type of operation where you just multiply each corresponding pair of elements without summing, like this:
[[a, b], [c, d]] * [[e, f], [g, h]] = [[a*e, b*f], [c*g, d*h]]

There is a convention (especially loved by physicists) called Einstein summation. NumPy can do it: https://docs.scipy.org/doc/numpy/reference/generated/numpy.einsum.html
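
For the element-wise case above, the trick is to repeat the same indices on both inputs and the output, so nothing gets summed (a small sketch, not from the lesson):

```python
import numpy as np

a = np.array([[1, 2],
              [3, 4]])
b = np.array([[5, 6],
              [7, 8]])

# Same indices everywhere -> nothing is summed; each pair is just multiplied.
print(np.einsum('ij,ij->ij', a, b))   # [[ 5 12]
                                      #  [21 32]]

# Summing over a repeated middle index gives matrix multiplication instead.
print(np.einsum('ij,jk->ik', a, b))   # same as a @ b
```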

2 Likes

I guess it is the Hadamard product.

3 Likes

Element-wise multiplication. And yeah, that’s just * in PyTorch/NumPy/TensorFlow.
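
A quick illustrative check in NumPy and PyTorch:

```python
import numpy as np
import torch

a = np.array([[1, 2], [3, 4]])
b = np.array([[5, 6], [7, 8]])

print(a * b)    # [[ 5 12]
                #  [21 32]]  -- the Hadamard product

# Same element-wise behaviour for tensors in PyTorch.
print(torch.tensor(a) * torch.tensor(b))
```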

1 Like