I have made a 2-layer linear net from two 2x2 tensors. Very simple network. It is based on an actual equation, so I roughly know how the weights should look and what rules should hold in the weight matrix, e.g. w1==w2, w3==0, etc. The problem is, I can't seem to figure out where or how to apply these constraints.
I did create nn.Parameters in my model:
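(For context, a minimal sketch of what such a model might look like — the original code was not included, so the layer setup below is an assumption; only the name m1 comes from the thread:)

```python
import torch
import torch.nn as nn

class TwoLayerNet(nn.Module):
    def __init__(self):
        super().__init__()
        # two 2x2 linear layers; their weights are registered as nn.Parameters
        self.m1 = nn.Linear(2, 2)
        self.m2 = nn.Linear(2, 2)

    def forward(self, x):
        return self.m2(self.m1(x))
```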

But I have no idea when or how to access them later. Should it be a custom loss function, optimizer, or callback?
Help will be much appreciated.
Cheers!

I am not sure I understand your needs exactly, but you can access the weights with model.m1.weight

To change a weight: model.m1.weight.data[1,0] = 99

Analogously for model.m1.bias. You can change the parameters at any point that makes sense, though I’d be cautious about changing them between model.forward() and optimizer.step(). It might mess up the gradient calculations.
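One common pattern (a plain-PyTorch sketch, not specific to any training loop) is to re-apply the constraints right after each optimizer.step(), inside torch.no_grad() so the in-place edits don't interfere with autograd. The exact indices below are assumptions standing in for your w1==w2, w3==0 rules:

```python
import torch
import torch.nn as nn

model = nn.Linear(2, 2)           # stands in for your 2x2 layer
opt = torch.optim.SGD(model.parameters(), lr=0.1)

def apply_constraints(m):
    # enforce e.g. w1 == w2 and w3 == 0 on the 2x2 weight matrix;
    # which entries these are depends on your equation
    with torch.no_grad():
        m.weight[0, 1] = m.weight[0, 0]   # w2 := w1
        m.weight[1, 0] = 0.0              # w3 := 0

x, y = torch.randn(8, 2), torch.randn(8, 2)
loss = ((model(x) - y) ** 2).mean()
loss.backward()
opt.step()
apply_constraints(model)          # constraints re-imposed after the update
```

Doing this after the step (rather than between forward() and step()) avoids touching tensors that the gradient calculation still depends on.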

As I am using learn.fit_one_cycle, I can't easily insert a model.m1.weight update between my training steps unless I make the whole Learner code custom (so that I can edit what is being run and when).
So my problem is rather how to access that model variable inside a loss or optimizer function. I'm not sure how to do that. For example, the optimizer object only holds the parameters:
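For what it's worth, the optimizer's param_groups hold references to the very same Parameter tensors as the model, not copies, so you don't need to reach the model through the optimizer: mutating the model's parameters is enough for the optimizer to see the change. A quick check in plain PyTorch (assuming a single linear layer, where parameters() yields the weight first):

```python
import torch
import torch.nn as nn

model = nn.Linear(2, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

# the optimizer stores the same tensor objects as the model
same = opt.param_groups[0]['params'][0] is model.weight
print(same)  # True

# so a change made through the model is visible through the optimizer too
with torch.no_grad():
    model.weight[1, 0] = 99.0
print(opt.param_groups[0]['params'][0][1, 0].item())  # 99.0
```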