No activation function in Rossmann

Hi there,
I just tried to use the Rossmann notebook on another dataset.
When I got the learner, I realized the network contains linear layers but no activation function at all! No ReLU, no tanh, nothing, just intermediate dropout. Isn't that crazy?

Now… does anyone know how I can add some through the fastai interface?

Never mind.
It turns out the activation is applied directly in the forward pass, with no mention of it in `m.summary`.
Code here for those interested, from the `MixedInputModel` class:
"def forward(self, x_cat, x_cont):
if self.n_emb != 0:
x = [e(x_cat[:,i]) for i,e in enumerate(self.embs)]
x = torch.cat(x, 1)
x = self.emb_drop(x)
if self.n_cont != 0:
x2 = self.bn(x_cont)
x = torch.cat([x, x2], 1) if self.n_emb != 0 else x2
for l,d,b in zip(self.lins, self.drops, self.bns):
x = F.relu(l(x))
if self.use_bn: x = b(x)
x = d(x)
x = self.outp(x)
if not self.is_reg:
if self.is_multi:
x = F.sigmoid(x)
else:
x = F.log_softmax(x)
elif self.y_range:
x = F.sigmoid(x)
x = x*(self.y_range[1] - self.y_range[0])
x = x+self.y_range[0]
return x
"