SelfAttention appears to differ from the paper

The SelfAttention layer in https://github.com/fastai/fastai/blob/75653d571060d2af03e145e5c623bb854811a223/fastai/layers.py appears to lack the final v(x) 1x1 convolution, i.e. the learned projection v(x) = W_v x that the SAGAN paper applies to the attention output in Fig. 2 and eqn. (2) of https://arxiv.org/pdf/1805.08318.pdf.
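For reference, a minimal sketch of what the attention block looks like with the v(x) projection included (class and variable names are my own, not fastai's; channel widths C/8 and C/2 follow the paper):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def conv1d(ni, no):
    "1x1 convolution over the flattened spatial dimension (no bias)."
    return nn.Conv1d(ni, no, kernel_size=1, bias=False)

class SelfAttentionWithV(nn.Module):
    "SAGAN-style self-attention including the final v(x) = W_v x projection from eqn. (2)."
    def __init__(self, n_channels):
        super().__init__()
        self.f = conv1d(n_channels, n_channels // 8)   # query: f(x) = W_f x
        self.g = conv1d(n_channels, n_channels // 8)   # key:   g(x) = W_g x
        self.h = conv1d(n_channels, n_channels // 2)   # value: h(x) = W_h x
        self.v = conv1d(n_channels // 2, n_channels)   # W_v -- the step missing in fastai
        self.gamma = nn.Parameter(torch.zeros(1))      # learned residual weight, init 0

    def forward(self, x):
        # x: (B, C, H, W) -> flatten spatial dims to (B, C, N)
        size = x.size()
        x = x.view(*size[:2], -1)
        f, g, h = self.f(x), self.g(x), self.h(x)
        # beta[b, i, j]: attention of output position j to input position i
        beta = F.softmax(torch.bmm(f.transpose(1, 2), g), dim=1)
        # eqn. (2): o_j = v(sum_i beta_{j,i} h(x_i))
        o = self.v(torch.bmm(h, beta))
        return (self.gamma * o + x).view(*size)

# e.g. SelfAttentionWithV(64)(torch.randn(2, 64, 16, 16))
```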

Was this done intentionally, and if so, why?