# MAE doesn't change after tweaking the parameters

I’ve watched lecture 3 of the fastai course and am toying around with the notebook “How does a neural network work?” provided in the lesson resources.

The notebook ends with a DIY prompt asking you to build a neural net out of three ReLUs to approximate a quadratic function.

Here’s my code snippet:

```python
def triple_relu(m1,b1,m2,b2,m3,b3,x):
    return rectified_linear(m1,b1,x) + rectified_linear(m2,b2,x) + rectified_linear(m3,b3,x)

@interact(m1=-1.5, b1=-1.5, m2=1.5, b2=1.5, m3=1.5, b3=1.5)
def plot_triple_relu(m1, b1, m2, b2, m3, b3):
    plt.scatter(x,y)
    loss = mae(f(x), y)
    plot_function(partial(triple_relu, m1,b1,m2,b2,m3,b3), ylim=(-2,14), title=f"MAE: {loss:.2f}")
```

My concern is that the MAE doesn’t change when I tweak the parameters.

I want the plot function to behave the way it does in Jeremy’s lecture: the MAE changes as you move the sliders, giving you a sense of whether you’re sliding in the right direction.

Thanks for the help!

Hi,

I think you have to redefine `f` inside `plot_triple_relu`, the way Jeremy did in his version:

```python
@interact(a=1.1, b=1.1, c=1.1)
def plot_quad(a, b, c):
    f = mk_quad(a, b, c)
    plt.scatter(x,y)
    loss = mae(f(x), y)
    plot_function(f, ylim=(-3,12), title=f"MAE: {loss:.2f}")
```

I hope this helps.

Isn’t that what we’re doing in this line: `plot_function(partial(triple_relu, m1,b1,m2,b2,m3,b3), ylim=(-2,14), title=f"MAE: {loss:.2f}")`?
We’re passing `triple_relu` to `plot_function`.

That’s how it’s done in the lesson, right?

Only here, instead of storing the function in a variable `f` and then passing it to the plot function, we’re passing `triple_relu` directly.

Please point out any gaps in my understanding.

Hi,

So, let’s take a look at how the loss is calculated. You compute it with `loss = mae(f(x), y)`, but inside your function you never define `f`, so it refers to whatever global `f` you defined outside `plot_triple_relu`. Meanwhile, you are plotting a *new* function (`partial(triple_relu, m1,b1,m2,b2,m3,b3)`), which has nothing to do with how you are calculating the loss (that still uses the old global `f`). That is why the loss doesn’t update when you change the graph.
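To see this outside the notebook, here’s a minimal sketch with plain NumPy stand-ins for the notebook’s helpers (`rectified_linear`, `mae`, and hypothetical sample data `x`, `y`). The "slider" builds a new partial function, but the stale global `f` is what the loss keeps measuring:

```python
import numpy as np
from functools import partial

def rectified_linear(m, b, x):
    # ReLU applied to a line: max(m*x + b, 0)
    return np.clip(m * x + b, 0.0, None)

def triple_relu(m1, b1, m2, b2, m3, b3, x):
    return (rectified_linear(m1, b1, x)
            + rectified_linear(m2, b2, x)
            + rectified_linear(m3, b3, x))

def mae(preds, acts):
    return np.abs(preds - acts).mean()

# Hypothetical target data: a quadratic, as in the notebook
x = np.linspace(-2, 2, 20)
y = x ** 2

# A stale global f, fixed at some initial parameters
f = partial(triple_relu, 1, 1, 1, 1, 1, 1)

# Moving the sliders builds a *new* function...
g = partial(triple_relu, -1.5, -1.5, 1.5, 1.5, 1.5, 1.5)

# ...but mae(f(x), y) still evaluates the old global f,
# so the title's MAE never reflects the plotted curve
print(mae(f(x), y), mae(g(x), y))  # the two losses differ
```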


Makes sense. Here’s how I updated my plot function:

```python
@interact(m1=-1.5, b1=-1.5, m2=1.5, b2=1.5, m3=1.5, b3=1.5)
def plot_triple_relu(m1, b1, m2, b2, m3, b3):
    f = partial(triple_relu, m1,b1,m2,b2,m3,b3)
    plt.scatter(x,y)
    loss = mae(f(x), y)
    plot_function(f, ylim=(-2,14), title=f"MAE: {loss:.2f}")
```

Thanks for the help!
