No, these layers run in parallel. You can see this in the fastai code here: https://github.com/fastai/fastai/blob/8013797e05f0ae0d771d60ecf7cf524da591503c/fastai/layers.py#L176-L184
Basically this does: `torch.cat([self.mp(x), self.ap(x)], 1)`, where `mp` and `ap` are the two pooling layers. Notice how both layers work on the same input, `x`?
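To make the parallel wiring concrete, here's a minimal standalone sketch of the same idea (the class name `ConcatPool2d` and the example tensor shapes are just for illustration, not the exact fastai code at that link):

```python
import torch
import torch.nn as nn

class ConcatPool2d(nn.Module):
    """Sketch: max pool and average pool run on the same input in parallel,
    and their outputs are concatenated along the channel dimension."""
    def __init__(self, size=1):
        super().__init__()
        self.mp = nn.AdaptiveMaxPool2d(size)  # max-pooling branch
        self.ap = nn.AdaptiveAvgPool2d(size)  # average-pooling branch

    def forward(self, x):
        # Both branches receive the same x; concatenating doubles the channels.
        return torch.cat([self.mp(x), self.ap(x)], 1)

x = torch.randn(2, 512, 7, 7)   # e.g. a CNN feature map
out = ConcatPool2d()(x)
print(out.shape)                # torch.Size([2, 1024, 1, 1])
```

The output has twice the channels of the input precisely because the two pooling results are stacked side by side, not fed one into the other.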
Just because they’re shown one after the other in the summary doesn’t mean that’s how they’re actually connected.