Looking for collaborators for Neural Network Library

I am working on a simple library meant to be used as a learning aid to understand how neural networks work internally.

It supports arbitrary operations and can compute the gradients for any op (just like TensorFlow, PyTorch, etc.).
[image: an output from one such computation graph]
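For anyone new to the idea, a computation graph just automates the chain rule. Here is the same bookkeeping done by hand in plain numpy (this example is mine, not the library's API):

import numpy as np

# forward pass through a tiny graph: y = relu(x @ w)
x = np.array([[1.0, 2.0]])
w = np.array([[0.5], [1.5]])
z = x @ w                  # matrix multiply
y = np.maximum(z, 0.0)     # ReLU

# backward pass: apply the chain rule in reverse
dy = np.ones_like(y)       # seed gradient: dy/dy = 1
dz = dy * (z > 0)          # back through the ReLU
dw = x.T @ dz              # back through the matmul; gradient w.r.t. w

The library performs this traversal automatically for any graph you build.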

I have implemented the most popular operations, like:
Add
Subtract
Matrix multiply
ReLU
MSE (mean squared error)

and a few more

Now I am looking for more collaborators to implement additional operations and to work on documentation and tests.

More ops to implement include (see the sketch after this list):
Convolution
Sigmoid
Binary cross-entropy loss
I hope to find people who are interested in working on this.


Hi harveyslash,
I’ve taken a quick look at your library and I think you can easily improve the performance. For example, in Add.py:

def forward(self):
    left = self.children[0].forward()
    right = self.children[1].forward()
    add = np.add(left, right)
    return add

def backward(self, respect_to_node, parent_grads=None, **kwargs):
    back = parent_grads
    if back is None:
        back = np.ones_like(self.forward())   # re-runs the forward pass

    if respect_to_node == self.children[0]:
        child = self.children[0].forward()    # and again for the child
    elif respect_to_node == self.children[1]:
        child = self.children[1].forward()

    # the gradient of addition w.r.t. either operand is 1
    return np.ones_like(child) * back
In the backward method you are calling the forward method. You can assume that a forward pass always happens before a backward pass, so you can save self.add = np.add(left, right) during the forward pass and reuse self.add in the backward pass. With this approach you won't need to recompute the whole graph.
You can do the same with the children's outputs.
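
A minimal sketch of that change (the attribute names here are my own, not necessarily the library's):

def forward(self):
    self.left = self.children[0].forward()
    self.right = self.children[1].forward()
    self.add = np.add(self.left, self.right)   # cache the result
    return self.add

def backward(self, respect_to_node, parent_grads=None, **kwargs):
    back = parent_grads
    if back is None:
        back = np.ones_like(self.add)          # reuse the cached sum
    # the gradient of addition w.r.t. either operand is the incoming
    # gradient itself, so no child recomputation is needed
    return back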


Hi!

Thanks for showing interest in my library!

The forward() method computes its value only once, when it is first required. On any later call the cached value is returned instead of being recomputed.
The cached values can be removed manually by calling clear() on the graph.

You can find the logic to do that here:

It's the same for backward().
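
Roughly, the pattern is the usual memoisation idiom (this is a simplified sketch with assumed attribute names, not the exact implementation):

def forward(self):
    if self._value is None:             # compute only on the first call
        self._value = self._compute()   # op-specific computation
    return self._value                  # every later call hits the cache

def clear(self):
    self._value = None                  # drop the cache
    for child in self.children:
        child.clear()                   # recursively clear the graph

clear() has to walk the whole graph so that the next forward pass recomputes everything from scratch.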


Oops, I told you it was a quick look :grimacing:
Good luck with your library, it’s a fantastic way to learn.