Has anyone created a simple feed-forward backpropagation neural network without using any external libraries (not even numpy)? If so, please share it with me. I need to create one with 2 inputs and 2 outputs, but I'm just starting in this domain (and unsure how to do it), so I would love to see if someone has already implemented one.
- There is karpathy's micrograd, which is very simple and small (under 100 lines of code).
- There is also geohot's tinygrad, which is kept under 1000 lines. It is more powerful and uses numpy as the vector library underneath. He also has videos on YouTube showing the full process of building the library (they are very good, in my opinion).
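- Since you asked for something with no libraries at all, here is a minimal sketch of what such a network can look like: pure Python lists, one hidden layer, sigmoid activations, mean-squared-error loss, and hand-written backprop. The 2-in/2-out mapping it trains on (a, b) → (a AND b, a OR b), the hidden size of 3, and the learning rate are all arbitrary choices for the demo, not anything from micrograd or tinygrad.

  ```python
  import math
  import random

  random.seed(0)  # make the demo deterministic

  def sigmoid(x):
      return 1.0 / (1.0 + math.exp(-x))

  class TinyNet:
      """Feed-forward net: 2 inputs -> 3 hidden -> 2 outputs, plain Python."""

      def __init__(self, n_in=2, n_hidden=3, n_out=2):
          self.w1 = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
          self.b1 = [0.0] * n_hidden
          self.w2 = [[random.uniform(-1, 1) for _ in range(n_hidden)] for _ in range(n_out)]
          self.b2 = [0.0] * n_out

      def forward(self, x):
          # cache activations; backward() needs them for the gradients
          self.x = x
          self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                    for row, b in zip(self.w1, self.b1)]
          self.y = [sigmoid(sum(w * hi for w, hi in zip(row, self.h)) + b)
                    for row, b in zip(self.w2, self.b2)]
          return self.y

      def backward(self, target, lr=0.5):
          # output-layer deltas: dE/dz = (y - t) * y * (1 - y) for MSE + sigmoid
          d_out = [(y - t) * y * (1 - y) for y, t in zip(self.y, target)]
          # hidden-layer deltas: propagate the output deltas back through w2
          d_hid = [h * (1 - h) * sum(d_out[k] * self.w2[k][j] for k in range(len(d_out)))
                   for j, h in enumerate(self.h)]
          # plain gradient-descent updates
          for k, dk in enumerate(d_out):
              for j in range(len(self.h)):
                  self.w2[k][j] -= lr * dk * self.h[j]
              self.b2[k] -= lr * dk
          for j, dj in enumerate(d_hid):
              for i in range(len(self.x)):
                  self.w1[j][i] -= lr * dj * self.x[i]
              self.b1[j] -= lr * dj

  # toy training set: (a, b) -> (a AND b, a OR b)
  data = [([0, 0], [0, 0]), ([0, 1], [0, 1]), ([1, 0], [0, 1]), ([1, 1], [1, 1])]

  net = TinyNet()
  for _ in range(5000):
      for x, t in data:
          net.forward(x)
          net.backward(t)

  for x, t in data:
      print(x, [round(v, 2) for v in net.forward(x)])
  ```

  After training, the outputs should sit close to 0 or 1 for each input pair. Swap in your own 4-tuples (and adjust the epoch count or learning rate) for whatever 2-in/2-out function you actually need; for anything beyond toy sizes, though, micrograd is a much better thing to study, since it shows how to get the gradients automatically instead of deriving them by hand per layer.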