Lesson 9 Discussion & Wiki (2019)

I seem to remember that in fastai there’s a xSequential or something like that. If so, what does it do?

nn.Parameter is used to register a parameter inside a model. This is for basic layers like Linear.
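To make the registration point concrete, here's a minimal sketch (the `ScaleLayer` name and the module itself are made up for illustration): wrapping a tensor in `nn.Parameter` is what makes it show up in `.parameters()` so the optimizer can update it.

```python
import torch
from torch import nn

class ScaleLayer(nn.Module):
    def __init__(self):
        super().__init__()
        # nn.Parameter registers this tensor with the module, so it
        # appears in .parameters() and gets gradients/optimizer updates
        self.scale = nn.Parameter(torch.ones(1))

    def forward(self, x):
        return x * self.scale

m = ScaleLayer()
print(list(m.parameters()))  # contains the registered scale tensor
```

A plain `torch.ones(1)` attribute would not be registered, which is exactly why basic layers like `nn.Linear` wrap their weights in `nn.Parameter`.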

1 Like

Thanks @sgugger! Makes sense.

1 Like

I think DummyModule's parameters method calls layer.parameters() - that's a bit like using a word in its own definition
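It's less circular than it looks: a sketch from memory of the lesson's `DummyModule` (details may differ from the actual notebook) delegates to each *child's* `parameters()`, which for `nn.Linear` is PyTorch's own implementation, so there's no recursion back into `DummyModule.parameters`.

```python
from torch import nn

class DummyModule():
    def __init__(self, n_in, nh, n_out):
        self._modules = {}
        self.l1 = nn.Linear(n_in, nh)
        self.l2 = nn.Linear(nh, n_out)

    def __setattr__(self, k, v):
        # remember every non-private attribute as a child module
        if not k.startswith("_"): self._modules[k] = v
        super().__setattr__(k, v)

    def parameters(self):
        # delegates to each child layer's parameters(), i.e. PyTorch's
        # nn.Module.parameters, not DummyModule's own method
        for l in self._modules.values():
            yield from l.parameters()

dm = DummyModule(3, 4, 2)
print(len(list(dm.parameters())))  # 4: weight and bias for each Linear
```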

1 Like

In the training model section, how was the backpropagation calculated? Was PyTorch's autograd used? The derivative of the new loss function was not defined manually, so I assume "we are allowed" to use PyTorch's autograd then.

Yes I believe we have been allowed to use it from the start of lesson 2 :slight_smile:

Yes we are allowed to do so since notebook 02 :wink:
You do not want to see the gradients of a softmax.

2 Likes

It's SequentialEx … and yeah, I'm hoping that comes up soon too :slight_smile:

2 Likes

Yes; it was created by fastai around the start of part 1 last year.

WHAT IS TORCH.NN REALLY?

by Jeremy Howard, fast.ai. Thanks to Rachel Thomas and Francisco Ingham.

This is like taking a walk through PyTorch development, a tiny but important part of it… Nice!

1 Like

What are the default initializations for the fast.ai learners? I must be missing it.

Do similar things exist in other libraries like MXNet, TF, etc.? These are the literal building blocks.

We are not only creating fast.ai from scratch but PyTorch too :slight_smile:

That's a very broad question, could you refine it? Do you mean model initialization?
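On the PyTorch side at least, the layer defaults are documented: `nn.Linear` initializes its weight with `kaiming_uniform_(a=sqrt(5))` in its `reset_parameters`. A small sketch showing that default and the common alternative you'd pick for a ReLU net (fastai's learner-level defaults may do more on top of this):

```python
import math
import torch
from torch import nn

lin = nn.Linear(100, 50)
# PyTorch's own default for nn.Linear: kaiming uniform with a=sqrt(5);
# re-applying it explicitly just to make the default visible
nn.init.kaiming_uniform_(lin.weight, a=math.sqrt(5))
# for a ReLU network you'd typically prefer standard Kaiming instead:
nn.init.kaiming_normal_(lin.weight, nonlinearity='relu')
print(lin.weight.std().item())  # roughly sqrt(2 / fan_in) = sqrt(2/100)
```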

Sounds like a good use case for property-based testing.

2 Likes

Understanding how things developed gives much more context and information about them. Learning to develop this kind of programming mindset is much more useful than only calling library APIs. :slight_smile:

3 Likes

What's the degree of support of fast.ai towards TensorFlow? I'd think that some classes (e.g. all the data management bits) could also be very helpful for training in TF.

2 Likes

Is it fair to think of generators (the output of yield) as task graphs?
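They're not task graphs exactly, but the comparison is tempting because generators defer work until it's demanded. A minimal sketch (the `batches` helper is made up for illustration, in the spirit of the lesson's data loading):

```python
def batches(items, bs):
    # nothing runs until someone iterates; each next() resumes here
    for i in range(0, len(items), bs):
        yield items[i:i + bs]

gen = batches(list(range(10)), 4)
print(next(gen))  # [0, 1, 2, 3] -- the first batch, computed on demand
```

Each `next()` call resumes the function where it left off, which is closer to a coroutine than to a dependency graph of tasks.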

There is no support towards TensorFlow. For now, fast.ai is built on PyTorch only.

1 Like

I believe that TF has a ton of its own iterators and datasets already. As well as high-level estimators. Not sure about their data augmentation though.

1 Like