# Neural_net_foundations (chapter 4 on GitHub)

Need more context on this:
It is really important for you to commit to memory and practice these bits of tensor jargon: rank is the number of axes or dimensions in a tensor; shape is the size of each axis of a tensor.

Watch out because the term “dimension” is sometimes used in two ways. Consider that we live in “three-dimensional space” where a physical position can be described by a 3-vector `v`. But according to PyTorch, the attribute `v.ndim` (which sure looks like the “number of dimensions” of `v`) equals one, not three! Why? Because `v` is a vector, which is a tensor of rank one, meaning that it has only one axis (even if that axis has a length of three). In other words, sometimes dimension is used for the size of an axis (“space is three-dimensional”); other times, it is used for the rank, or the number of axes (“a matrix has two dimensions”).
So `v` is a 3-vector. But according to PyTorch, `v.ndim` equals 1, not 3, because `v` is a tensor of rank 1. Yet the definition of rank is “the number of axes or dimensions in a tensor”, which in this case seems to be 3. So how is `v` a tensor of rank 1 and not a tensor of rank 3?
It’s confusing.
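The distinction can be checked directly in PyTorch. A minimal sketch (variable names are illustrative):

```python
import torch

# A physical position in 3-D *space*, stored as a vector of three numbers.
v = torch.tensor([1.0, 2.0, 3.0])

print(v.ndim)   # 1 -- the tensor has ONE axis, so its rank is 1
print(v.shape)  # torch.Size([3]) -- that single axis has length 3
```

The "3" of three-dimensional space shows up in `v.shape` (the length of the axis), not in `v.ndim` (the number of axes).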

I think the key is to define the data structure itself, not what the data represents.
Think in terms of the rank of the data structure.

A vector is a rank-1 tensor, no matter how long it is: `A[3]` and `B[10]` are both rank-1 tensors.

A matrix is a rank-2 tensor: `A[2][30]` is a matrix with shape (2, 30) and is a rank-2 tensor. We generally call a matrix a two-dimensional array.
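Both points can be verified in PyTorch; a small sketch (the tensor names are just placeholders):

```python
import torch

A = torch.zeros(3)      # a vector of length 3
B = torch.zeros(10)     # a vector of length 10
print(A.ndim, B.ndim)   # 1 1 -- both rank 1, regardless of length

M = torch.zeros(2, 30)  # a matrix with 2 rows and 30 columns
print(M.ndim)           # 2 -- two axes, so rank 2
print(M.shape)          # torch.Size([2, 30])
```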

Three numbers representing the 3 dimensions of the physical world are not the same thing as a 3-dimensional array: the former is a rank-1 tensor with one axis of length 3; the latter is a tensor with three axes.
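To make the contrast concrete, here is a sketch comparing the two cases side by side (shapes chosen arbitrarily for illustration):

```python
import torch

v = torch.tensor([1.0, 2.0, 3.0])  # 3 numbers on ONE axis: rank 1
t = torch.zeros(3, 3, 3)           # THREE axes: a rank-3 tensor (a "3-dimensional array")

print(v.ndim, tuple(v.shape))      # 1 (3,)
print(t.ndim, tuple(t.shape))      # 3 (3, 3, 3)
```

So “3-dimensional” describes the *size of an axis* for `v`, but the *number of axes* for `t`; PyTorch's `ndim` always means the latter.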