I’m a big fan of the Julia language for scientific programming, especially the Convex.jl library for optimisation and HPAT.jl for parallel programming, both of which are best of breed. A key advantage of Julia is speed: it can be an order of magnitude faster than Python for some tasks. That said, it’s a very new language (currently at version 0.5) with a relatively tiny support base, which can make it extremely challenging to work with. On the bright side, it has really nice Python and C interoperability, which makes it easy to write Julia wrappers for Python libraries.
Because I use Julia a lot, I’m interested in finding out whether it’s possible to do deep learning in Julia by wrapping Python libraries such as Theano, TensorFlow, and Keras.
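For anyone curious what the wrapping mechanism looks like, PyCall.jl is the standard route. Here’s a minimal sketch using Python’s built-in math module as a stand-in (the Keras line at the end is just to illustrate the pattern; I haven’t verified that exact import here):

```julia
using PyCall            # install with Pkg.add("PyCall")

# @pyimport binds a Python module to a Julia name
@pyimport math as pymath

# Python functions can then be called like ordinary Julia functions;
# PyCall converts arguments and return values automatically
x = pymath.sin(pi / 4)   # returns a Julia Float64

# The same pattern should apply to any installed Python package, e.g.
#   @pyimport keras.models as kmodels
# (assuming Keras is installed in the Python environment PyCall sees)
```

The nice part is that the conversion is automatic for common types (numbers, strings, arrays), so a thin Julia wrapper around a Python deep learning library mostly just forwards calls.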
As a feasibility study, I’m thinking of trying to replicate the part 1 notebooks in Julia (and building some libraries along the way).
I’m keen to hear whether anyone else finds this interesting, and what approaches they would try. I’ll share my experiences here for those who may find them useful in future.