Lesson 2 Clarifications

Hi,
I have some doubts after listening to the first 45 minutes of lesson 2.

  1. I'm confused about the concept of batch/mini-batch. I recall this was mentioned in part 1, but I can't remember where or how we used it in the code.
  2. Jeremy talks about weights, and I wonder what exactly they are. Are they the outputs of each hidden layer in a NN?

I recall Jeremy mentioning that we should go through the course even if we don't understand each line. Does that still hold for the 2018 course? I reckon I spend a lot of time understanding each line, when that may not be the best approach.

Weights are what a layer uses to perform its calculation. It takes the inputs, combines them somehow with the weights it stores, and out comes the output.
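
To make that concrete, here is a minimal sketch (my own illustration, not code from the lesson) of a linear layer doing exactly that: combining its inputs with the weights it stores to produce an output.

```python
import numpy as np

# A toy linear layer: the weights are values the layer stores and learns.
# Forward pass: combine the inputs with the weights (matrix multiply + bias).
class LinearLayer:
    def __init__(self, n_in, n_out):
        self.weights = np.random.randn(n_in, n_out) * 0.01  # learned during training
        self.bias = np.zeros(n_out)

    def forward(self, inputs):
        return inputs @ self.weights + self.bias

layer = LinearLayer(n_in=4, n_out=3)
x = np.random.randn(10, 4)   # a batch of 10 examples, 4 features each
out = layer.forward(x)       # shape (10, 3): the layer's output
```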

batch / minibatch - you present your training data to your model in batches. If you have 100 training examples and a batch size of 10, the model will first learn from the first 10 examples, then the next 10, and so forth.
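
A quick sketch of that slicing (hypothetical, just to illustrate the idea):

```python
# 100 training examples, batch size 10: the model learns from the first 10,
# then the next 10, and so forth. That makes 10 batches in total.
train_data = list(range(100))   # stand-in for 100 training examples
batch_size = 10

for i in range(0, len(train_data), batch_size):
    batch = train_data[i:i + batch_size]
    # one training step would happen here, using just these 10 examples
    print(f"batch {i // batch_size + 1}: examples {batch[0]}..{batch[-1]}")
```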

Yes, it does, I think :slight_smile: Or even to a greater degree, I would say.

Where is the example of mini-batch/batch?

I am not sure I understand your question. The training algorithm uses batching so that on each iteration of training our model only sees a batch-size worth of examples. Might be 64, might be 256, any arbitrary number.
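
In plain PyTorch (the course wraps this in fastai, but the mechanism is the same, and the data here is made up just for the sketch), a DataLoader is what hands the model that many examples per iteration:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

xs = torch.randn(1000, 3)          # 1000 made-up examples, 3 features each
ys = torch.randint(0, 2, (1000,))  # made-up binary labels
dl = DataLoader(TensorDataset(xs, ys), batch_size=64, shuffle=True)

# Each iteration of training sees one batch of 64 examples.
xb, yb = next(iter(dl))
print(xb.shape)  # torch.Size([64, 3])
```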

The dogs vs cats dataset has, I think, 25k images, half cats and half dogs. So say we use 5k for validation; that leaves us with 20k in the train set. We divide them up into batches and feed them to the model.
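
Back-of-the-envelope, that tells you how many batches (iterations) make up one pass through the data. Assuming a batch size of 64, say:

```python
n_train, batch_size = 20_000, 64
n_batches = -(-n_train // batch_size)  # ceiling division
print(n_batches)  # 313 batches per epoch, the last one partially filled
```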

An epoch, on the other hand, is one pass through the entire train set. In the example I gave earlier, with 100 images in the train set, one epoch sees all 100 images. The batch size can be 10, 20, 1, whatever we pick.
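
Putting the two together (again just a sketch): an epoch is the outer loop, batches are the inner one.

```python
train_data = list(range(100))  # 100 examples, as in the example above
batch_size = 10

for epoch in range(3):         # 3 epochs = 3 full passes over the data
    for i in range(0, len(train_data), batch_size):
        batch = train_data[i:i + batch_size]
        # one training step per batch goes here
    print(f"epoch {epoch + 1}: saw all {len(train_data)} examples")
```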

I don't think there is any practical difference between a batch and a minibatch in how they are used in this course. As far as I can tell, they both stand for the same thing.

Unless a minibatch is a very small batch, but then it is all in the eye of the beholder :slight_smile:

(I get that you might want to make the distinction for some online algorithms, but I don't think we use it anywhere in the course discussion, at least as far as my memory serves.)

Thanks for the clarification