SOURCE CODE: Mid-Level API

Not to my knowledge, no.

Thanks @init_27, enjoy the weekend then. Best regards

Np! You too! :tea:

Hi all – sorry, I ended up being a bit out of the loop with this in the end.

Something I’m trying to work out: say you had an NN written fully in NumPy but still wanted to use the DataLoader or DataBlock API. Is it possible to apply something like np.asarray() as a batch transform? Or is there a better way to do this?
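To make the question a bit more concrete, here is the rough sketch I have in mind (the dataset and the NumPy network are placeholders I made up): let the DataBlock build batches as tensors as usual, and convert at the batch boundary rather than inside the transform pipeline.

```python
import numpy as np
from fastai.vision.all import *

path = untar_data(URLs.MNIST_SAMPLE)  # placeholder dataset
dblock = DataBlock(
    blocks=(ImageBlock, CategoryBlock),
    get_items=get_image_files,
    get_y=parent_label,
)
dls = dblock.dataloaders(path, bs=64)

for xb, yb in dls.train:
    # batches come out as torch tensors; convert them just before the NumPy model
    xb_np = np.asarray(xb.cpu())
    yb_np = np.asarray(yb.cpu())
    # preds = my_numpy_net.forward(xb_np)  # hypothetical NumPy network
```

Not sure whether that is the intended way, hence the question.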

I was going through the DataLoader documentation and found that batch_sampler is mutually exclusive with batch_size, shuffle, sampler, and drop_last. Can someone explain to me how this works?
PS: mutually exclusive means the options cannot be used together.
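To make it concrete, here is how I currently read it (assuming this refers to the standard torch.utils.data.DataLoader): a batch_sampler already yields complete batches of indices, so the other four arguments would be describing the batching a second time.

```python
import torch
from torch.utils.data import BatchSampler, DataLoader, SequentialSampler, TensorDataset

ds = TensorDataset(torch.arange(10).float())

# Option A: DataLoader builds the batches itself from these arguments
dl_a = DataLoader(ds, batch_size=4, shuffle=True, drop_last=True)

# Option B: a batch_sampler decides the batching entirely on its own
sampler = BatchSampler(SequentialSampler(ds), batch_size=4, drop_last=True)
dl_b = DataLoader(ds, batch_sampler=sampler)  # no batch_size/shuffle/sampler/drop_last here

# Mixing the two, e.g. DataLoader(ds, batch_sampler=sampler, batch_size=4),
# raises a ValueError because the batching would be specified twice.
```

Is that the right way to think about it?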

I am reading chapter 7 of fastbook and found that in creating the DataBlock, the splitter was not specified, but training was able to proceed as expected, as shown below:


Is there a default value that is being assigned here and what is it?

It uses RandomSplitter by default.
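If I remember correctly, leaving splitter unset amounts to passing RandomSplitter(), which holds out 20% of the items at random for validation. An explicit version of what I believe the default does (the blocks here are just my own example):

```python
from fastai.vision.all import *

dblock = DataBlock(
    blocks=(ImageBlock, CategoryBlock),
    get_items=get_image_files,
    get_y=parent_label,
    splitter=RandomSplitter(valid_pct=0.2, seed=None),  # what the default amounts to
)
```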

Great, thanks!

So, still on chapter 7: I read this paragraph and honestly, I don’t understand the problem it’s talking about. I would be glad if anyone could break it down for me.

One issue with this, however, is that Mixup is “accidentally” making the labels bigger than zero, or smaller than one. That is to say, we’re not explicitly telling our model that we want to change the labels in this way. So if we want to make the labels closer to, or further away from, zero and one, we have to change the amount of Mixup, which also changes the amount of data augmentation, which might not be what we want. There is, however, a way to handle this more directly, which is to use label smoothing.
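The only part I can make somewhat concrete so far is the last sentence. As far as I understand it (my own sketch, not from the book), label smoothing rewrites the targets directly instead of relying on Mixup’s side effect: it moves a small amount eps of probability mass off the true class and spreads it evenly over all classes.

```python
import torch

def smooth_labels(targets, num_classes, eps=0.1):
    "Keep 1-eps on the true class and spread eps evenly over all classes."
    one_hot = torch.zeros(len(targets), num_classes)
    one_hot.scatter_(1, targets.unsqueeze(1), 1.0)
    return one_hot * (1 - eps) + eps / num_classes

print(smooth_labels(torch.tensor([2]), num_classes=4))
# tensor([[0.0250, 0.0250, 0.9250, 0.0250]])
```

(As far as I know, fastai’s LabelSmoothingCrossEntropy does the equivalent inside the loss rather than on the targets.)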

Guys, when is the next meeting? It seems the meetings have stopped.

There’s another group where the meetings have now shifted. It’s the Fastbook study group which you can find on the forums.

ok thanks

This seems like it was an awesome study group. Did you guys by any chance have notebooks of the lessons? Can I have access to them?

Hi @jimmiemunyi.
All of the resources that we created while doing the course were posted in comments in this thread. There are no other resources to speak of.
Thanks for asking!
