The following is a snippet from the fastai source code
(see https://github.com/fastai/fastai/blob/master/fastai/layers.py, line 290):
import torch
import torch.nn as nn
from fastai.torch_core import trunc_normal_  # fastai's truncated-normal initializer

def embedding(ni:int, nf:int) -> nn.Module:
    "Create an embedding layer."
    emb = nn.Embedding(ni, nf)
    # See https://arxiv.org/abs/1711.09160
    with torch.no_grad(): trunc_normal_(emb.weight, std=0.01)
    return emb
Now if we check emb.weight.requires_grad (an attribute, not a method), it returns True. What is the purpose of torch.no_grad() here if gradient calculation on the weight is still active?
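
To make the observation concrete, here is a minimal check of the same behavior (a sketch that swaps in PyTorch's built-in Tensor.normal_ for fastai's trunc_normal_, so it runs without fastai installed):

import torch
import torch.nn as nn

emb = nn.Embedding(10, 3)
print(emb.weight.requires_grad)     # True: the parameter is created tracking gradients

with torch.no_grad():
    print(torch.is_grad_enabled())  # False: autograd recording is off inside the block
    emb.weight.normal_(std=0.01)    # in-place init, not recorded in any graph
    print(emb.weight.requires_grad) # still True: no_grad() does not change this flag

print(emb.weight.requires_grad)     # True: unchanged after the block as well

So torch.no_grad() evidently controls whether operations are recorded by autograd, while requires_grad stays True the whole time, which is exactly the behavior I am asking about.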