Obtain parameter values at training time using Hook Callback

Dear TomB, thanks for your very helpful reply. I waited a bit longer before answering because I wanted to have a working version of your suggestions.

I also drew some inspiration from https://forums.fast.ai/t/callbacks-in-fast-ai/31655, https://forums.fast.ai/t/help-understanding-and-writing-custom-callbacks/28762 and https://github.com/sgugger/Deep-Learning/blob/master/Using%20the%20callback%20system%20in%20fastai.ipynb (unfortunately the last one does not run on my computer).

So now I got what I need if I use the following code:

import torch
from dataclasses import dataclass
from fastai.vision import *  # Learner, Callback, untar_data, URLs, simple_cnn, ...

@dataclass
class MyCallback(Callback):
    def __init__(self, learn: Learner):
        super().__init__()
        self.imparo = learn

    def on_batch_begin(self, num_batch, **kwargs):
        # Save the weights of the second conv layer just before each batch.
        file_name = f'file_{num_batch}.pt'
        torch.save(self.imparo.model[1][0].weight.data, file_name)

path = untar_data(URLs.MNIST_SAMPLE)
data = ImageDataBunch.from_folder(path)
learn = Learner(data, simple_cnn((3, 16, 16, 2)), callback_fns=MyCallback)
learn.fit(1)

I get file_0.pt, file_1.pt, etc., containing the weights of the second CNN layer just before the first batch, the second batch, and so on. In fact, learn.model[1][0] corresponds to Conv2d(16, 16, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1)).
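The saved files can then be read back in the notebook with torch.load. Here is a minimal round-trip sketch, using a dummy tensor with the same (16, 16, 3, 3) shape as that conv weight (the real files of course come from the training run):

```python
import torch

# Stand-in for what the callback writes: a conv weight of shape (16, 16, 3, 3).
w = torch.randn(16, 16, 3, 3)
torch.save(w, 'file_demo.pt')

# Back in the notebook, load the snapshot and check it survived intact.
w_loaded = torch.load('file_demo.pt')
assert w_loaded.shape == (16, 16, 3, 3)
assert torch.equal(w, w_loaded)
```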

This is very good because it solves my problem. Nevertheless, I am looking for a way to store the results in a variable I can use directly in the notebook. Something like:

@dataclass
class MyCallback(Callback):
    def __init__(self, learn: Learner):
        super().__init__()
        self.imparo = learn
        self.stuff = []

    def on_batch_begin(self, num_batch, **kwargs):
        # clone() takes a snapshot; appending weight.data directly would
        # store a reference that keeps changing as training updates it.
        self.stuff.append(self.imparo.model[1][0].weight.data.clone())

In this case, I append the weights from the different batches to the attribute stuff. Unfortunately, I cannot find that attribute anywhere afterwards. I have already searched inside learn, but without success.
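As an aside, while testing this I realised that appending weight.data directly stores a reference to the live tensor, so after training every entry in the list would show the final weights; appending weight.data.clone() snapshots the values instead. A pure-PyTorch demonstration of the pitfall (with a small dummy tensor standing in for the layer weight):

```python
import torch

w = torch.zeros(2, 2)  # stands in for a layer's weight tensor
history_ref = []
history_clone = []

for step in range(3):
    history_ref.append(w.data)            # stores a reference to the same tensor
    history_clone.append(w.data.clone())  # stores an independent snapshot
    w.data += 1                           # simulate a training update

# Every "reference" entry aliases the final state of w...
assert all(torch.equal(t, torch.full((2, 2), 3.0)) for t in history_ref)
# ...while the clones preserve the value at each step.
assert torch.equal(history_clone[0], torch.zeros(2, 2))
assert torch.equal(history_clone[2], torch.full((2, 2), 2.0))
```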

Am I overlooking something?
Thanks a lot!