Need more context on this:

It is really important for you to commit to memory and practice these bits of tensor jargon: *rank* is the number of axes or dimensions in a tensor; *shape* is the size of each axis of a tensor.

Watch out because the term “dimension” is sometimes used in two ways. Consider that we live in “three-dimensional space,” where a physical position can be described by a 3-vector `v`. But according to PyTorch, the attribute `v.ndim` (which sure looks like the “number of dimensions” of `v`) equals one, not three! Why? Because `v` is a vector, which is a tensor of rank one, meaning that it has only one *axis* (even if that axis has a length of three). In other words, sometimes dimension is used for the size of an axis (“space is three-dimensional”); other times, it is used for the rank, or the number of axes (“a matrix has two dimensions”).
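To see the passage's distinction concretely, here is a minimal pure-Python sketch of what PyTorch's `ndim` counts. The helper `rank` is hypothetical (not a PyTorch function); it mimics `torch.Tensor.ndim` by counting nesting depth in a nested-list “tensor”:

```python
def rank(x):
    """Number of axes (what PyTorch calls ndim): count list-nesting depth."""
    n = 0
    while isinstance(x, list):
        n += 1
        x = x[0]
    return n

v = [1.0, 2.0, 3.0]            # a 3-vector: ONE axis of length three
print(rank(v))                  # 1 -- one axis, even though its length is 3

m = [[1, 2, 3], [4, 5, 6]]      # a 2x3 matrix: TWO axes
print(rank(m))                  # 2 -- "a matrix has two dimensions"
```

The same counts would come out of `torch.tensor(v).ndim` and `torch.tensor(m).ndim`: the rank never looks at how long an axis is, only at how many axes there are.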

So `v` is a 3-vector. But according to PyTorch, `v.ndim` equals 1, not 3, because `v` is a tensor of rank 1. Yet the definition of rank given above is “the number of axes or dimensions in a tensor,” which in this case seems to be 3. So why is `v` a tensor of rank 1 and not a tensor of rank 3?

It’s confusing.
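One way to untangle the confusion in the question: the 3 lives in the *shape*, not in the rank, and the rank is just the length of the shape. A pure-Python sketch (the helper `shape` is hypothetical, mimicking what `torch.Tensor.shape` reports for a nested-list “tensor”):

```python
def shape(x):
    """Sizes along each axis of a nested-list 'tensor', outermost first."""
    sizes = []
    while isinstance(x, list):
        sizes.append(len(x))
        x = x[0]
    return sizes

v = [1.0, 2.0, 3.0]
print(shape(v))        # [3]  -> one axis whose SIZE is three
print(len(shape(v)))   # 1    -> rank = number of axes = len(shape)
```

So “three-dimensional space” refers to the size of `v`'s single axis (`shape == [3]`), while `v.ndim == 1` refers to how many axes `v` has.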