 # Tensors Quick Chart - PyTorch to/from S4TF

(Stefano Giomo) #1

# Tensor

## Indexing

| Operation | PyTorch | S4TF | Note |
| --- | --- | --- | --- |
| single dimension | `T[1]` | `T[1]` | |
| multi dimension | `T[1,2]` | `T[1,2]` | |
| closed range | `T[0:2]` | `T[0..<2]` | |
| open range | `T[:2]` | `T[..<2]` | |
| TODO | `TBD` | `TBD` | TODO: add more… |
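A quick way to internalize the correspondence: Python's `a:b` slices and Swift's `a..<b` ranges are both half-open (the end index is excluded). The sketch below uses plain Python lists as stand-ins for tensors, since the slicing semantics are the same as PyTorch's.

```python
# Illustrative only: plain Python lists stand in for tensors.
t = [10, 20, 30, 40]

# "closed range" T[0:2]  <->  Swift T[0..<2]: half-open, end index excluded
assert t[0:2] == [10, 20]

# "open range" T[:2]  <->  Swift T[..<2]: start defaults to 0
assert t[:2] == t[0:2] == [10, 20]
```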

## Properties

| Operation | PyTorch | S4TF | Note |
| --- | --- | --- | --- |
| shape | `T.shape` | `T.shape` | |
| TODO | `TBD` | `TBD` | TODO: add more… |

## Operations

| Operation | PyTorch | S4TF | Note |
| --- | --- | --- | --- |
| broadcasting | implicit | implicit | automatic broadcast where applicable |
| reshape | `T.reshape([1,2])` | `T.reshaped(to: [1,2])` | |
| unsqueeze | `T.unsqueeze(1)` | `T.expandingShape(at: 1)` | works with single or multiple dimensions |
| squeeze one dimension | `T.squeeze(1)` | `T.squeezingShape(at: 1)` | works with single or multiple dimensions |
| sum | `T.sum()` | `T.sum()` | |
| | `T.sum(dim=0)` | `T.sum(alongAxes: 0)` | |
| mean | `T.mean()` | `T.mean()` | |
| | `T.mean(dim=0)` | `T.mean(alongAxes: 0)` | |
| standard deviation | `T.std()` | `T.standardDeviation()` | |
| | `T.std(dim=0)` | `T.standardDeviation(alongAxes: 0)` | |
| | `T.std()` / `T.std(dim=0)` | `T.std()` / `T.std(alongAxes: 0)` | using the fast.ai extensions |
| TODO | `TBD` | `TBD` | TODO: add more… |
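To see what "along an axis" means in the reductions above, here is a pure-Python analogue (no PyTorch or S4TF required): reducing along axis 0 collapses the rows, producing one result per column.

```python
# Pure-Python analogue of T.sum(dim=0) / T.sum(alongAxes: 0).
rows = [[1, 2, 3],
        [4, 5, 6]]

# zip(*rows) iterates over columns, so each sum collapses axis 0.
col_sums = [sum(col) for col in zip(*rows)]    # like T.sum(dim=0)
col_means = [s / len(rows) for s in col_sums]  # like T.mean(dim=0)

assert col_sums == [5, 7, 9]
assert col_means == [2.5, 3.5, 4.5]
```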

## Boolean operations

| Operation | PyTorch | S4TF | Note |
| --- | --- | --- | --- |
| all elements 'operator' | `(TA<TB).all()` | `TA<TB` | returns `Bool`; operators `<`, `==`, `!=`, `>` |
| elementwise 'operator' | `TA<TB` | `TA .< TB` | returns `Tensor<Bool>`; operators `<`, `==`, `!=`, `>` |
| TODO | `TBD` | `TBD` | TODO: add more… |

Note that all-elements operators in PyTorch return zero-dimensional tensors (empty shape), not a Python boolean!

```python
>>> res = (t1 == t2).all()
>>> print(res, res.shape)
tensor(0, dtype=torch.uint8) torch.Size([])
```

## Initializers

| Operation | PyTorch | S4TF | Note |
| --- | --- | --- | --- |
| zeros | `torch.zeros((1,2,3))` | `Tensor(zeros: [1,2,3])` | PyTorch uses a factory function, S4TF an initializer |
| ones | `torch.ones((1,2,3))` | `Tensor(ones: [1,2,3])` | |
| normally distributed | `torch.zeros([8,9]).normal_()` | `Tensor(randomNormal: [8,9])` | |
| TODO | `TBD` | `TBD` | TODO: add more… |

## Interoperability

| Operation | PyTorch | S4TF | Note |
| --- | --- | --- | --- |
| numpy | `T.numpy()` | `T.makeNumpyArray()` | |
| TBD | `TBD` | `TBD` | TODO: add more… |

LEGEND

- Same
- Similar
- Missing*
- Dunno / To Be Defined

NOTE

CONTRIBUTORS

- Try to be essential: this should be a quick reference, not a complete explanation of concepts.

(Joseph Catanzarite) #2

Thank you @ste! This is brilliant!


(Chris Lattner) #3

Super awesome!

(Stephen Johnson) #4

Very nice!!

This one is missing the right bracket ]

``T.mean(alongAxes: [1)``

(Richard Wei (Swift for TensorFlow team)) #5

Awesome comparison of APIs! I hope we can do this more down the road and survey more frameworks/languages to make sure Swift APIs are not only Swifty but also consistent with terms of art.


(Ilia) #6

At first glance, S4TF seems (in general) a bit more verbose, though I guess that comes from Swift's way of naming and constructing things. It seems that the S4TF API doesn't precisely match TF for Python, right?

Sounds very reasonable. Probably as time goes on we could build a kind of cheat sheet covering PyTorch/TF/S4TF/Keras/etc. to make transferring between frameworks simpler (especially taking into account how fast things change from one year to the next).


#7

Note that you can edit it yourself. I just made the change; thanks for highlighting it.


(Stefano Giomo) #8

Thank you @rxwei!
My goal was to show that PyTorch and S4TF are essentially different names for the same thing.
As I've learned in this very course: syntax changes, but concepts last. To start, I've focused on the "Tensor" concept because it is the foundation of any "matrix manipulation" framework.
AFAIK, the PyTorch API is heavily inspired by numpy to leverage the "muscle memory" of numpy users… (just as numpy probably tried to leverage MATLAB users, but 0-based).

## A SLICE OF THE SLICING HISTORY:

```
1968:  Algol 68 -> a[:2, :2]   # leading 2-by-2 submatrix "slice"
1970s: MATLAB   -> A(1, :, 3)  % single-dimension array along second dimension
..
Kenneth Iverson's APL (1957) had very flexible multi-dimensional array slicing, which contributed much to the language's expressive power and popularity.
```

(The last one was for @jeremy )

## MY OPINIONATED OPINION

The only thing I would change in the current syntax is to standardize the "`..<`" slicing operator by creating an "alias" with "`:`" (i.e. `A[:5]` should be the same as `A[..<5]`).
Maybe @clattner can help us with that. I have two reasons for it:

1. PERSONAL REASON: I love types because I'm a "distracted" person, having a "lott oF typo s" (a lot of typos) in all my code.
I'm pretty sure that, thinking of `A[:N]`, I'll often write `A[...N]` in place of `A[..<N]`. This mistake is very dangerous because your code usually won't break, but will carry a silent bug… (and types can't help me! For now.)

2. HORIZONTAL SPACE REASON: more characters for slicing (I'm joking, of course).

```
(A[1:N]/B[1:N]) + C[5:N+4]        // Classic PyTorch, numpy
(A[1..<N]/B[1..<N]) + C[5..<N+4]  // Standard Swift
```
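The `A[...N]` vs `A[..<N]` danger described above is a classic off-by-one: the closed range silently includes one extra element. A Python sketch of the two semantics (using ordinary list slices to emulate Swift's range operators):

```python
# Swift's A[..<N] is half-open (N elements); A[...N] is closed (N+1 elements).
a = list(range(10))
N = 5

half_open = a[:N]      # emulates Swift A[..<N]
closed = a[:N + 1]     # emulates Swift A[...N] -- one extra element!

assert len(half_open) == N
assert len(closed) == N + 1
assert closed[-1] == a[N]  # the silently included extra element
```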

## PERSONAL EXPERIENCE

I had a "similar" experience in the past (more than 10 years ago) working on a project that required a lot of linear algebra. At the time my workflow was:

1. experimenting and prototyping in MATLAB: concise, and it comes with all the math functions you need
2. once things work: port the "formulas" to plain STL/C++ (no option for fancy libraries; it was a 32 MB "embedded" device…).

The big problem was that MATLAB has 1-based arrays and C++ has 0-based. At the beginning I stuck to each language's convention (i.e. MATLAB code 1-based, C++ code 0-based), converting indices all the time and producing a lot of bugs!
Eventually I wrote my own C++ PointsLibrary, overriding `operator[]` and forcing it to be 1-based (pretending C++ was MATLAB). This choice let me iterate much faster, with fewer bugs, "feeling more secure" copying and pasting formulas from one language to the other.

Coming back to my opinionated (not requested) opinion: I think that with a "`:`"-like syntax we would reduce the number of bugs when porting existing code from Python to Swift.
