First conv without ReLU?

This is in Lesson 7. Around the batch norm part, Jeremy mentioned that a modern practice is to place a conv layer with a 5x5, 7x7, or 11x11 kernel first. In the code, there doesn't seem to be a ReLU or any other activation after that conv layer. I just want to double check that I understand correctly. Below is the code: in the forward method, x = self.conv1(x) is not followed by a ReLU.

import torch.nn as nn
import torch.nn.functional as F

class ConvBnNet(nn.Module):
    def __init__(self, layers, c):
        super().__init__()
        # first conv with a larger 5x5 kernel
        self.conv1 = nn.Conv2d(3, 10, kernel_size=5, stride=1, padding=2)
        # BnLayer is defined earlier in the lesson notebook
        self.layers = nn.ModuleList([BnLayer(layers[i], layers[i + 1])
                                     for i in range(len(layers) - 1)])
        self.out = nn.Linear(layers[-1], c)

    def forward(self, x):
        x = self.conv1(x)  # no ReLU applied here
        for l in self.layers: x = l(x)
        x = F.adaptive_max_pool2d(x, 1)
        x = x.view(x.size(0), -1)
        return F.log_softmax(self.out(x), dim=-1)
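
For comparison, here is my own sketch (not the lesson's code) of what the forward method would look like if an activation were applied after the first conv, which is what I expected to see:

    def forward(self, x):
        x = F.relu(self.conv1(x))  # hypothetical: ReLU added after the first conv
        for l in self.layers: x = l(x)
        x = F.adaptive_max_pool2d(x, 1)
        x = x.view(x.size(0), -1)
        return F.log_softmax(self.out(x), dim=-1)

So is leaving the ReLU out after conv1 intentional?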