New Style Transfer Technique Examples


(Vincent Marron) #1

Stream here:
https://vimeo.com/212990452

Or better yet download the non-compressed .avi (libx264 codec) from here and watch fullscreen:
https://www.dropbox.com/s/f9cw5oa0m5kclzw/final.avi?dl=0

I will be providing a lot of detail on my technique shortly… but it's pretty technical, so it will likely be in a longer-form piece.


(Even Oldridge) #2

Vincent this is AMAZING! It’s easily the best style transfer example I’ve seen to date, at least for that style. Does it generalize to other styles or is it best suited for images of that type?

Either way I’m super excited. My artwork is very busy just like the style image and I’ve been fooling around with style transfer but haven’t been satisfied. I can’t wait to give your method a try.

Looking forward to seeing more results, the write-up, and hopefully the source as well if you're willing to share it.

Incredible work man!


(Brendan Fortuner) #3

Truly great!


(Jeremy Howard (Admin)) #4

This is so great. Let’s work together to share this with the highest impact we can. I think this will be huge. My suggestion is to not share too much, other than teasers (like this video), until we’ve got a really great paper/post/whatever done, and then do a big release. In my experience that’s the best way to get great coverage (see DeepMind for an excellent role model on how to do this effectively).

I’ve got some good media relationships who I suspect will want to cover this kind of news. It’s exciting, inspiring, and visual.


(RENJITH MADHAVAN) #5

Awesome !!!


(Rachel Thomas) #6

These are fantastic Vincent!


(Sahil Singla) #7

Vincent: I am curiously waiting for a blog/paper describing the technique you used for this.


(Vincent Marron) #8

Working on it… Still waters run deep.


(Vincent Marron) #9

Sorry for the delay - I finally put up a repo on github with tensorflow code and some explanation: https://github.com/VinceMarron/style_transfer

This is still a work in progress – I’m having difficulty explaining the math/intuition behind the loss function (wasserstein distance in feature space). Hopefully I will be able to improve the explanation with time. Let me know if you have any suggestions.


(lateralplacket) #10

This is 404ing:

https://github.com/VinceMarron/style_transfer/blob/master/problems_gatys_fullEMD.ipynb


(Vincent Marron) #11

This should be fixed.


(Adam) #12

I think this should be included in the next cycle of Part 2. It is an improvement over the most advanced technique covered in the course for this (the Gram matrix): he uses the Wasserstein metric instead, and the results are significantly better.
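For concreteness, here is a minimal numpy sketch of the two losses being compared. The Gram-matrix loss is the Gatys et al. formulation from the course; the second function is my sketch of the closed-form squared 2-Wasserstein distance between Gaussians fitted to the layer's feature distributions (the repo's exact loss may differ; function names and normalization choices are mine):

```python
import numpy as np

def gram_loss(F1, F2):
    # Gram-matrix style loss (Gatys et al.): compare feature correlations.
    # F1, F2: (channels, pixels) feature maps from one conv layer.
    G1 = F1 @ F1.T / F1.shape[1]
    G2 = F2 @ F2.T / F2.shape[1]
    return np.sum((G1 - G2) ** 2)

def wasserstein_loss(F1, F2):
    # Squared 2-Wasserstein distance between Gaussians N(m1, C1), N(m2, C2)
    # fitted to the features:
    #   |m1 - m2|^2 + tr(C1 + C2 - 2 (C2^1/2 C1 C2^1/2)^1/2)
    m1, m2 = F1.mean(1), F2.mean(1)
    C1, C2 = np.cov(F1), np.cov(F2)
    E, V = np.linalg.eigh(C2)
    root_C2 = V @ np.diag(np.sqrt(np.clip(E, 0, None))) @ V.T
    M = root_C2 @ C1 @ root_C2
    E2 = np.linalg.eigvalsh(M)          # clip guards tiny negative eigenvalues
    trace_term = np.trace(C1 + C2) - 2 * np.sum(np.sqrt(np.clip(E2, 0, None)))
    return np.sum((m1 - m2) ** 2) + trace_term

F = np.random.default_rng(0).normal(size=(8, 200))
print(wasserstein_loss(F, F) < 1e-6)    # True: distance of a distribution to itself
```

Unlike the Gram matrix, the Wasserstein form keeps the mean term and uses a matrix square root, which is where the implementation difficulty discussed below comes from.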


(Rohit Singh) #13

This is so groundbreaking - why isn't this more popular?


(Vincent Marron) #14

Made some animations of the technique:




(Karl) #15

This is super cool.

I briefly tried to implement this in pytorch, but found that torch.eig() doesn't have a derivative implemented. Then I tried to implement the closed-form calculation in the referenced paper:

Unfortunately M1 tends to have negative values, so taking the square root of it leads to complex values. I assume this is why an eigenvalue approach was used in the first place. Clamping the values of M1 to a minimum of 0 results in the final term in the expression becoming overwhelmingly negative relative to the other terms, so the entire expression evaluates to the square root of a negative number. I suppose it makes sense that clamping the matrix values causes issues, as it fundamentally changes the math.

Also on this subject - I’m assuming the square root operation is element-wise. Am I correct in thinking this? Or does it refer to decomposing M1 into a matrix X such that X @ X = M1?

Can anyone think of a way forward for doing this in pytorch other than waiting for torch.eig() to get a backward function?


(Vincent Marron) #16

Hey Karl - The square root operation is not element-wise; it is a matrix square root. This means, as you note, finding X such that XX = M1. The eigendecomposition is a convenient way to do this since, for positive semi-definite M1 = V L V^T, sqrt(M1) = V sqrt(L) V^T.
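That identity is a one-liner in numpy; a minimal sketch (not from the thread, and clipping tiny negative eigenvalues is my own numerical guard):

```python
import numpy as np

def matrix_sqrt(M1):
    # PSD matrix square root via eigendecomposition:
    # M1 = V L V^T  =>  sqrt(M1) = V sqrt(L) V^T
    L, V = np.linalg.eigh(M1)       # symmetric eigendecomposition
    L = np.clip(L, 0, None)         # guard against tiny negative eigenvalues
    return V @ np.diag(np.sqrt(L)) @ V.T

A = np.random.default_rng(0).normal(size=(4, 4))
M1 = A @ A.T                        # positive semi-definite by construction
X = matrix_sqrt(M1)
print(np.allclose(X @ X, M1))       # True: X is the matrix square root
```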


(Karl) #17

Thanks for the reply. I came back to this with pytorch 1.0, which has a functioning symeig backward, and the technique works really well. It's been interesting to play around with different conv layers and other parameters.
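For anyone landing here on a recent PyTorch: symeig has since been replaced by torch.linalg.eigh, which is likewise differentiable. A minimal sketch of the differentiable matrix square root (my own code, not from the repo):

```python
import torch

def matrix_sqrt(M):
    # Differentiable PSD matrix square root; gradients flow through eigh.
    L, V = torch.linalg.eigh(M)
    L = L.clamp(min=0)              # guard against tiny negative eigenvalues
    return V @ torch.diag(L.sqrt()) @ V.T

torch.manual_seed(0)
A = torch.randn(5, 5, requires_grad=True)
M = A @ A.T                         # positive semi-definite by construction
X = matrix_sqrt(M)
X.sum().backward()                  # backward works through eigh
print(A.grad is not None)           # True: the sqrt is differentiable
```

One caveat: the eigh backward is ill-conditioned when eigenvalues are (nearly) degenerate, which random feature covariances rarely are in practice.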