Lesson 2 In-Class Discussion ✅

I think it is making x an n×2 tensor (array), then setting the first column to values sampled from a uniform distribution on [-1, 1]

Right, but I don’t understand the x[:,0] syntax.

In that case I believe it’s just a matter of time…

Specifically in the first column.

Jeremy is explaining it now.

: means select all rows, 0 selects the first column

Reads to me as follows:
First, create a tensor of 1s with n rows and 2 cols.
Then fill the first col with values uniformly distributed between -1 and 1 (hence overwriting it; _ is an in-place operation suffix).
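A minimal sketch of those two steps (assuming PyTorch is installed; n = 100 is just an example value):

```python
import torch

n = 100                      # arbitrary; the lesson uses 100
x = torch.ones(n, 2)         # step 1: n rows, 2 cols, all ones
x[:, 0].uniform_(-1., 1.)    # step 2: overwrite col 0 in place with U(-1, 1) samples

print(x.shape)               # torch.Size([100, 2])
print((x[:, 1] == 1).all())  # tensor(True) -- column 1 is untouched
```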

all rows in column 0

what’s the difference between np.ones and torch.ones, if there is any?

We use PyTorch because NumPy can’t run on a GPU.

It’s being explained now.

x = torch.ones(100, 2) creates a 100x2 matrix where every single value is a 1

The x[:,0] means “every single row of column zero” - and the uniform_ function ends in a _, which means it performs the replacement in place.

x[:,0].uniform_(-1.,1) = replace column zero of every row with a uniform value selected between -1 and 1
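The trailing-underscore convention is easy to check with a method that exists in both forms, e.g. add vs add_ (a sketch, assuming PyTorch):

```python
import torch

t = torch.zeros(3)
print(t.add(1))   # tensor([1., 1., 1.]) -- returns a new tensor...
print(t)          # tensor([0., 0., 0.]) -- ...t itself is unchanged
t.add_(1)         # trailing _ : mutates t in place
print(t)          # tensor([1., 1., 1.])
```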

np.ones belongs to the numpy library, torch.ones belongs to the torch library (as far as I understand). Maybe they share the same code-base - I don’t know :smiley:

np.ones returns a numpy array, torch.ones returns a pytorch tensor
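A quick check of the two return types (a sketch; note that np.ones takes the shape as a tuple, while torch.ones takes the sizes as separate arguments):

```python
import numpy as np
import torch

a = np.ones((3, 2))   # NumPy: shape passed as a tuple
t = torch.ones(3, 2)  # torch: sizes passed as separate arguments

print(type(a))        # <class 'numpy.ndarray'>
print(type(t))        # <class 'torch.Tensor'>
```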

NumPy syntax (common to matrix programming languages in general).

thanks. that’s a much better answer. are they (np array and torch tensor) structurally (axes, ndim, whatever) different?
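One way to check structurally: a CPU tensor and the array it converts to have the same ndim and shape, and t.numpy() even shares memory with the tensor rather than copying (a sketch, assuming PyTorch):

```python
import torch

t = torch.ones(4, 2)
a = t.numpy()                   # CPU tensor -> ndarray, no copy (shared memory)

print(t.ndim, a.ndim)           # 2 2
print(tuple(t.shape), a.shape)  # (4, 2) (4, 2)

a[0, 0] = 5                     # mutating the array...
print(t[0, 0])                  # tensor(5.) -- ...shows up in the tensor too
```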

torch.ones(n,2)
is it a rank 2 tensor?

PyTorch has implemented some of the same functionality as NumPy. As Sylvain said, however, PyTorch can take advantage of the GPU (offering speedups), which NumPy cannot.

yes it is

Sylvain being the first name of @sgugger :wink:

does it mean that even if I do matrix-vector multiplication using numpy, it will not take advantage of the GPU?
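Right - NumPy has no GPU backend. In PyTorch you move the tensor to the GPU explicitly; a sketch (the cuda branch only runs if a CUDA-enabled GPU is present, otherwise everything stays on the CPU):

```python
import torch

x = torch.ones(8, 8)
if torch.cuda.is_available():  # only true with a GPU and a CUDA build of torch
    x = x.to('cuda')           # NumPy arrays have no equivalent of this
y = x @ x                      # matmul runs on whichever device x lives on

print(y.shape)                 # torch.Size([8, 8]); every entry is 8.0
```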