Part 2 Lesson 13 Wiki


(Brian Muhia) #101

Is this “keep the last 10 or 20 approximate Hessians”?


(blake west) #105

Is there any work on pre-training networks for generic style learning?


(Erin Pangilinan) #106

Just thinking about how some of this loosely relates to Feature Visualization from distill.pub


(Brian Muhia) #107

Universal Style Transfer via Feature Transforms


(Hiromi Suenaga) #109

Is @ matrix multiplication or dot product?


(Kyler Connelly) #110

Did we skip the part where the VGG model is actually trained on the painting? I feel like we skipped straight to extracting the layers of interest.


(Brian Muhia) #112

This uses the pretrained VGG model. Probably no fine-tuning needed, since we’re using the features extracted by some layer in the middle of the network.
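For reference, a minimal PyTorch sketch of that idea, assuming you just want the activations of some middle layer of a pretrained VGG (the layer index here is illustrative, not the lesson’s exact choice):

    import torch
    from torchvision import models

    # Pretrained VGG16 used purely as a frozen feature extractor
    vgg = models.vgg16(pretrained=True).features.eval()
    for p in vgg.parameters():
        p.requires_grad_(False)

    # Capture the activations of a middle layer with a forward hook
    features = {}
    def hook(module, inp, out):
        features['mid'] = out

    vgg[20].register_forward_hook(hook)  # index 20 is just an example

    x = torch.randn(1, 3, 224, 224)  # stand-in for a preprocessed image
    vgg(x)
    print(features['mid'].shape)

No weights are updated; in style transfer the gradients flow into the input image, not into the network.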


(Rachel Thomas) #114

@ is matrix multiplication (new in Python 3.5, via PEP 465)
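A quick demo (NumPy shown, but the same operator works on PyTorch tensors):

    import numpy as np

    A = np.array([[1, 2], [3, 4]])
    B = np.array([[5, 6], [7, 8]])

    # @ is the matrix-multiplication operator from PEP 465 (Python 3.5+)
    print(A @ B)            # [[19 22], [43 50]]
    print(np.matmul(A, B))  # same result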


(rkj) #115

I am enjoying today’s lesson a lot :slight_smile:


(James Requa) #116

I’m really glad about this too. For anyone who might have missed it, I started a thread a couple of days back specifically for an open discussion on this subject in the forums, so welcome to anyone who wants to join in or share insights there :slight_smile:


(Nafiz Hamid) #117

I first came to know about it in your Computational Linear Algebra course :slight_smile:


(Bart Fish) #118

Please ask him to talk about multi-GPU training.


(Mohammad Saad) #119

What was the name of the paper that style-transferred Captain America’s shield?


(blake west) #120

It’s matrix multiplication. It’s a pure Python operator. But dot product and matrix multiply are the same, no?
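Checking in NumPy: they agree for 2-D arrays but diverge on stacks of matrices:

    import numpy as np

    A = np.random.randn(2, 3)
    B = np.random.randn(3, 4)
    # For 2-D arrays, @ and np.dot compute the same matrix product
    assert np.allclose(A @ B, np.dot(A, B))

    # For 3-D arrays they differ: @ broadcasts over the leading batch
    # dimension, while np.dot sums over different axes entirely
    a = np.random.randn(2, 3, 4)
    b = np.random.randn(2, 4, 5)
    print((a @ b).shape)       # (2, 3, 5) -- batched matrix multiply
    print(np.dot(a, b).shape)  # (2, 3, 2, 5)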


(William Horton) #121

Deep Painterly Harmonization: https://arxiv.org/abs/1804.03189


(Brian Holland) #122

I CANNOT get enough of Jeremy’s frustration with math notation. I would watch a friggin’ Netflix special about that.

Can we make this happen?


(Quan Tran) #123

Is there a name for the type of operation where you just multiply each pair without adding, like this:
[[a, b], [c, d]] * [[e, f], [g, h]] = [[a*e, b*f], [c*g, d*h]]


(Emil) #124

There is a convention (especially loved by physicists) called Einstein summation. NumPy can do it: https://docs.scipy.org/doc/numpy/reference/generated/numpy.einsum.html
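For example, einsum can express both the elementwise product asked about above and a plain matrix multiply:

    import numpy as np

    A = np.random.randn(2, 2)
    B = np.random.randn(2, 2)

    # Elementwise (Hadamard) product: no index is summed away
    assert np.allclose(np.einsum('ij,ij->ij', A, B), A * B)

    # Matrix multiplication: the repeated index j is summed over
    assert np.allclose(np.einsum('ij,jk->ik', A, B), A @ B)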


(Ananda Seelan) #125

I guess it is the Hadamard product.


(blake west) #126

Element-wise multiplication. And yeah, that’s just * in PyTorch/NumPy/TensorFlow.
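For example, using the 2x2 matrices from the question above (PyTorch shown; NumPy and TensorFlow behave the same for *):

    import torch

    X = torch.tensor([[1., 2.], [3., 4.]])
    Y = torch.tensor([[5., 6.], [7., 8.]])

    print(X * Y)  # Hadamard: tensor([[ 5., 12.], [21., 32.]])
    print(X @ Y)  # matrix product, for contrast: tensor([[19., 22.], [43., 50.]])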