Theory
Spatial Transformer Networks (https://arxiv.org/abs/1506.02025) (5)
How does a fully connected neural network learn features compared to convolutional neural networks? (7)
"Differentiable programming" - is this why we switched to PyTorch? (12)
New Blog post - Graph embeddings 2017 (6)
Two types of Transfer Learning (4)
Deep gates in RNNs? (1)
Skip connections in RNNs (1)
Stuck while implementing Gradient Boosting from scratch (1)
Google's AI creates AI (1)
DeepLearners paper reading Dec. 2nd Meetup (11)
Great post: A Year in Computer Vision (1)
Is fast.ai in a grand movement of revolution?: software vs learnware (3)
New Nature journal: Nature Machine Intelligence (1)
Some really nice looking ML/DL cheat sheets (3)
DeepLearners Nov. 18th 6PM on GANs (1)
Estimating distribution parameters using RNNs (1)
How to understand “individual neurons are the basis directions of activation space”? (3)
Crazy Thoughts -- Residuals, Transfer, Capsules (1)
New AlphaGo Zero reinventing Go without data (4)
Why do we use accuracy as metric rather than something like f1 score/AUC? (4)
Does deep learning require dense data and work poorly with sparse data? (2)
3D face reconstruction using a single image (1)
Theoretical ML book with solutions (2)
Can we calculate the impact of an image to the model? (6)
Loss on hierarchical categories (1)
Ridge regression for model ensembling - why do we want to use it? (1)
SWISH: Google researchers found a new activation function to replace ReLU (7)
Vocabulary Complexity? (1)
How good is "Differentiable Neural Computers"? (1)
How to deal with missing labels when you have multiple losses? (5)