Read values of weights and biases


(Ehab Ibrahim) #1

Hey all,

I’ve been training a model for a classification problem. The inputs and outputs are in the form of tabular data.
My code looks something like this:

After calling m.fit(), I need to read the values of the trained weights and biases. I have looked through the methods on the "m" and "md" objects and searched the forums, but I didn't find anything helpful. I've also tried saving my model with the m.save() function and reading the resulting MODEL.h5 file with the h5py library, but to no avail.

Sorry if this seems like a simple question, but can anyone help me out?
I'd also like to know which activation function is used by default. I assume it's ReLU, but is there any way to check or change it in my code?


(Nick) #2
list(m.model.parameters()) # get the weights and biases
m.summary() # to see model architecture

Actually, m.summary() will probably throw an exception; I opened a PR for that: https://github.com/fastai/fastai/issues/529

For now you can just run

m.model

It will show you less info, but you can figure out which activations are used in every layer.
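To add a bit of detail for anyone who lands here later: `m.model` is just a regular PyTorch `nn.Module`, so the standard PyTorch inspection calls work on it. A minimal sketch using a stand-in `nn.Sequential` (not the actual fastai model, just something with the same shape of API):

```python
import torch.nn as nn

# Stand-in for m.model; fastai's learner wraps a plain nn.Module,
# so the same calls apply to the real thing.
model = nn.Sequential(
    nn.Linear(10, 50),
    nn.ReLU(),
    nn.Linear(50, 3),
)

# Printing the module shows each layer, including activation layers.
print(model)

# named_parameters() yields (name, tensor) pairs for weights and biases,
# which is a bit friendlier than the bare list(model.parameters()).
for name, p in model.named_parameters():
    print(name, tuple(p.shape))
```

The names make it easy to tell which tensor belongs to which layer (e.g. `0.weight`, `0.bias` for the first Linear).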


(Ehab Ibrahim) #3

Thanks for the response, it worked!

Is there any way to know the activation function used from code?


(Nick) #4

Take a look at the MixedInputModel class in column_data.py. Given your parameters is_multi=True and is_reg=False, the output activation must be sigmoid:

    x = self.outp(x)
    if not self.is_reg: 
        if self.is_multi:
            x = F.sigmoid(x)
        else:
            x = F.log_softmax(x)
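For anyone who wants to see what that branch does concretely, here's a small standalone sketch of the same output-activation logic in plain PyTorch (not fastai code; `is_reg`/`is_multi` are stand-ins for the learner flags, and I've used torch.sigmoid, the non-deprecated spelling of F.sigmoid):

```python
import torch
import torch.nn.functional as F

def output_activation(x, is_reg, is_multi):
    # Mirrors the branch in MixedInputModel.forward:
    # regression -> raw output, multi-label -> sigmoid (independent
    # per-class probabilities), single-label -> log_softmax.
    if is_reg:
        return x
    return torch.sigmoid(x) if is_multi else F.log_softmax(x, dim=-1)

x = torch.randn(4, 3)
probs = output_activation(x, is_reg=False, is_multi=True)   # each value in (0, 1)
logp  = output_activation(x, is_reg=False, is_multi=False)  # rows exp-sum to 1
```

The difference matters: sigmoid lets several classes be "on" at once (multi-label), while log_softmax forces the classes to compete for one label.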

(Ehab Ibrahim) #5

Again, thanks a lot!


(Ehab Ibrahim) #6

I have looked into the MixedInputModel class in column_data.py, and found that all linear layers (input and hidden) use ReLU as the activation function. Only the last output layer's activation can be either sigmoid or log_softmax.
Thought I'd post it here in case someone reads this post.
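If you'd rather confirm this from code than by reading column_data.py, one way is to walk model.modules() and collect the activation layers. A sketch with a hypothetical stand-in model of the structure described above (Linear layers with ReLU in between):

```python
import torch.nn as nn

# Hypothetical stand-in with the structure described above.
model = nn.Sequential(
    nn.Linear(8, 32), nn.ReLU(),
    nn.Linear(32, 16), nn.ReLU(),
    nn.Linear(16, 2),
)

# Collect any activation modules registered in the model.
activations = [m for m in model.modules()
               if isinstance(m, (nn.ReLU, nn.Sigmoid, nn.LogSoftmax))]
print(activations)
```

One caveat: this only finds activations registered as modules. The output sigmoid/log_softmax in MixedInputModel is applied functionally (F.sigmoid / F.log_softmax inside forward()), which is exactly why it doesn't show up when you print m.model — for that one you still have to read the forward() source, as above.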