Ever struggled to understand the role of ReLU? Mathematically it is extremely simple, but without such layers neural nets become useless. So, here is my take on the importance of non-linear layers:
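A quick way to see the point: without a non-linearity between them, two stacked linear layers collapse into a single linear layer, so depth buys nothing. A minimal NumPy sketch (all names and sizes here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two weight matrices and an input, with no activation in between:
# W2 @ (W1 @ x) is just (W2 @ W1) @ x, i.e. one linear map.
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=3)

stacked = W2 @ (W1 @ x)
collapsed = (W2 @ W1) @ x
assert np.allclose(stacked, collapsed)  # the stack is equivalent to one layer

# Inserting ReLU between the layers breaks this collapse,
# which is what lets deeper nets represent non-linear functions.
relu = lambda z: np.maximum(z, 0.0)
nonlinear = W2 @ relu(W1 @ x)
```

However many purely linear layers you stack, the result can always be pre-multiplied into one matrix; the ReLU in between is what prevents that.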
Enjoy, Christian
A very enjoyable and concise post. I’m not on Medium, so here’s a c .
So what we do is implement a learning algorithm?
Cheers mrfabulous1
Thanks for the feedback. Deep learning is kind of weird. What we implement and run is indeed a classical algorithm - the learning algorithm. We can write down every step: importing data, matrix / tensor multiplications, and so on. But this algorithm looks basically the same for any learning task. Internally, a kind of secondary, hidden algorithm is formed by the learning process. Through this process the model parameters acquire a meaning: they carry the information about which patterns to extract from the data. But we mostly don’t understand this meaning (and, due to the non-linear character of the net, will never have a real chance to do so). Otherwise we would be able to extract this information and write a classical algorithm that performs the task the neural net has learned (object identification, etc.)
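To make that distinction concrete, here is a minimal sketch (toy NumPy code; the function names, sizes, and hyperparameters are all hypothetical, not from the post): the same classical training loop runs unchanged for two different tasks, and only the learned parameters - the "hidden algorithm" - differ.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def train(X, y, hidden=8, lr=0.1, steps=500, seed=0):
    """The same classical recipe for any (X, y): forward, loss, backward, update."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    for _ in range(steps):
        h = relu(X @ W1)                              # forward pass
        err = h @ W2 - y                              # squared-error residual
        gW2 = h.T @ err / len(X)                      # gradient w.r.t. W2
        gW1 = X.T @ ((err @ W2.T) * (h > 0)) / len(X) # gradient w.r.t. W1
        W1 -= lr * gW1                                # gradient-descent update
        W2 -= lr * gW2
    return W1, W2

# The identical loop learns two different "hidden algorithms";
# what changes between tasks lives only in the returned weights.
X = np.linspace(-1, 1, 64).reshape(-1, 1)
W1a, W2a = train(X, np.abs(X))   # task 1: absolute value
W1b, W2b = train(X, X ** 2)      # task 2: squaring
```

The loop itself is fully transparent, but the resulting weight matrices are not: reading off *why* `W1a` solves its task is exactly the part we cannot do.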
Cheers, Christian