Part 1, online study group

Just a reminder, the meetup will be held today in ~1h :grinning:

1 Like

The meetup is on :grinning:

There should be a reminder one day prior to the meeting!!!

1 Like

Thanks for the feedback. If there’s an automated way to do that, let the group know.

It’s much easier for us as community members to create a personal calendar event from the information shared in the wiki. @shahnoza is volunteering the time to host this and is kind enough to provide Zoom for this study group; I try not to ask the host to do more work than needed.

1 Like

Thanks for the feedback. Currently, @shahnoza does remind the group about the meetups. However, based on the feedback, I have now added an automated reminder to the Slack group; we should be getting a reminder on Fridays. As @msivanes pointed out, it would also be easier for members to create a personal calendar event.

2 Likes

Meeting Notes 11-01-2020

  • Announcement about the restart of the cadence (starting from Lesson 2) by @shahnoza

  • Lesson 2 review by @gagan

    • Classifier for pen vs. pencil, followed by questions. @gagan timed it end to end, from data collection to inference: 23 minutes, demonstrating that fastai really is FAST AI :slight_smile: (@gagan++) Colab. A rough sketch of the workflow follows these notes.

    • Conceptual Framework of Supervised Learning (Gradient, Parameters, Loss, Model, Observations, Targets) by @msivanes, for Lesson 2 (SGD).

  • Projects Showcase

    • Bengali.AI Kaggle competition by @danny Kaggle
    • Car classifier, along with a demo of the EarlyStopping & SaveBestModels callbacks during training, by @tendo. Colab
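
For anyone who missed it, the end-to-end workflow @gagan demonstrated looks roughly like this sketch, assuming fastai v1 (the course version at the time) and a pen_vs_pencil/ folder of downloaded images; the paths and file names are only illustrative:

from fastai.vision import *

# images downloaded into pen_vs_pencil/pen and pen_vs_pencil/pencil
path = Path('pen_vs_pencil')
data = ImageDataBunch.from_folder(path, train='.', valid_pct=0.2,
                                  ds_tfms=get_transforms(), size=224).normalize(imagenet_stats)

# transfer learning from a pretrained resnet34
learn = cnn_learner(data, models.resnet34, metrics=error_rate)
learn.fit_one_cycle(4)

# inference on a single image
pred_class, pred_idx, probs = learn.predict(open_image(path/'pen'/'example.jpg'))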

Advice

  • Stacked transfer learning: fine-tune first on smaller 224px images, then fine-tune again on the actual image-size data (rough sketch below).
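
In fastai v1 this could look like the following sketch; the get_data helper and the sizes are my own stand-ins, not code from the lesson:

from fastai.vision import *

def get_data(size, bs):
    # build a databunch at the given image size (illustrative helper)
    return (ImageDataBunch.from_folder(Path('data'), valid_pct=0.2,
            ds_tfms=get_transforms(), size=size, bs=bs).normalize(imagenet_stats))

learn = cnn_learner(get_data(224, 64), models.resnet34, metrics=error_rate)
learn.fit_one_cycle(4)           # first pass: fine-tune on small 224px images

learn.data = get_data(352, 32)   # swap in the larger images
learn.fit_one_cycle(4)           # second pass: fine-tune again at the actual size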

Discussion

  • Class Imbalance: is it still a problem when we use transfer learning? It might be mitigated by the fine-tuning. The best thing to do is to try it out, as Jeremy says.

  • num_workers: the number of CPU worker processes used to speed up data loading. If you get an out-of-memory error, reduce num_workers to a smaller number or reduce the batch size (bs); see the sketch below.
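
In plain PyTorch terms (fastai forwards these arguments to torch.utils.data.DataLoader), the two knobs look like this; train_ds stands for whatever Dataset you already have:

from torch.utils.data import DataLoader

# num_workers = CPU worker processes fetching batches in parallel
# batch_size (bs) = samples per batch held in (GPU) memory at once
train_dl = DataLoader(train_ds, batch_size=64, num_workers=4)

# if you hit out-of-memory errors, dial either one down
train_dl = DataLoader(train_ds, batch_size=32, num_workers=2)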

Resources

4 Likes

Hi @msivanes
How do I take part in the discussion?

@AjayStark
The top post (wiki) has all the information you need to participate in the study group and in the discussions. Let us know if you face any difficulties with anything specific.

Lesson 5

I shared the image below in our previous meetup. This is an updated version, along with annotated code and notes.

Building a minimal Neural Network (Logistic Regression with no hidden layer) from scratch

Let’s walk through it step by step and also refer to how we code each block from the image below.


Source: Natural Language Processing with PyTorch by Delip Rao et al.

  • Predictions: y_hat = model(x); here we are using our own model.
  • Loss function: loss_func(y_hat, y). In addition, we add the weight-decay term w2*wd to it.
  • Parameter update: parameter.sub_(learning_rate * gradient) performs an in-place subtraction of learning_rate * gradient from each parameter. Since our model has multiple parameter tensors (weights, biases), we loop through them using PyTorch's model.parameters().
  • Extras:
    • Weight Decay:
      • a) w2: the sum of squared weights over all parameters: for p in model.parameters(): w2 += (p**2).sum()
      • b) wd: a constant (1e-5)
      • multiply w2 by wd and add the product to the regular loss_func result
  • Combined
    • We calculate the loss for each minibatch by calling update(x, y, lr) on it: losses = [update(x,y,lr) for x,y in data.train_dl]
    • .item() turns a one-element tensor into a plain Python number so that we can plot the losses and inspect them visually.
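
The full update function is below. Note that it assumes a model and a loss_func already in scope; in the lesson these come from the MNIST notebook. A minimal stand-in so the snippet runs on its own (my assumption, not the exact lesson code):

import torch
from torch import nn

# logistic regression = a single linear layer, no hidden layer
model = nn.Linear(784, 10)         # e.g. flattened 28x28 MNIST digits -> 10 classes
loss_func = nn.CrossEntropyLoss()
lr = 2e-2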
def update(x, y, learning_rate):
  wd = 1e-5
  # prediction
  y_hat = model(x)
  # sum of squared weights (for weight decay)
  w2 = 0.
  for p in model.parameters():
    w2 = w2 + (p**2).sum()
  # regular loss plus the weight-decay penalty
  loss = loss_func(y_hat, y) + w2*wd
  # compute gradients of the loss w.r.t. every parameter
  loss.backward()
  # don't record the parameter updates themselves in the autograd graph
  with torch.no_grad():
    for p in model.parameters():
      # gradient step: subtract learning_rate * gradient, in place
      p.sub_(learning_rate * p.grad)
      # reset gradients so they don't accumulate into the next minibatch
      p.grad.zero_()
  return loss.item()

Resources

Feedback is welcome.

5 Likes

Hi msivanes, hope you're having a splendid day.
Thanks for a great post.

Cheers mrfabulous1 :smiley::smiley:

1 Like

Just a reminder, there is a meetup tomorrow at 2pm GMT, dedicated to Lesson 3 :slightly_smiling_face:

1 Like

The meetup starts in about an hour. Link (Always the same): zoom.us/j/226775879 .

The meetup is on :slight_smile:

Meeting Notes 18-01-2020

  • Multilabel classification review by @gagan
  • Segmentation classification notebook walk-through by @gagan
  • Image Regression notebook walk-through by @Shahnoza

Advice

Discussion

  • Class Imbalance: doesn't appear to be a problem for images when fine-tuning. Research suggests it might still matter for tabular data. Further exploration required.
  • Partial functions: concept review by @gagan (used in multi-class classification); see the sketch below.
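
For reference, functools.partial pre-fills some arguments of a function and returns a new callable. A toy sketch; the fastai usage in the last comment is quoted from the lesson notebook from memory:

from functools import partial

def power(base, exponent):
    return base ** exponent

square = partial(power, exponent=2)   # pre-fill exponent
print(square(5))                      # 25

# the same trick builds a metric with a fixed threshold in the
# multi-label notebook: acc_02 = partial(accuracy_thresh, thresh=0.2)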

Resources

4 Likes

We are deciding the timing for the next meetup; please write your suggestions in the comments or vote in the Slack :slight_smile: The meetup will be dedicated to Lesson 4.

A general question out of curiosity: how do you keep track of changes in your models? For instance, when you change hyperparameters or switch the resnet variant, do you save the graphs in a Word document, use software to compare different runs, or something else?

Personally, I have been trying to use Wandb to compare different models, but I'm still learning how to use the different features; for instance, how to grab the graph of a certain run and compare it against the others to see how it changes.
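
The basic pattern I have working so far looks roughly like this; the project name, config keys, and train_one_epoch are just my placeholders:

import wandb

# one run per experiment; config records what changed between runs
wandb.init(project='fastai-study-group',
           config={'arch': 'resnet34', 'lr': 1e-3})

for epoch in range(5):
    train_loss, valid_loss = train_one_epoch()   # stand-in for your training step
    wandb.log({'epoch': epoch,
               'train_loss': train_loss,
               'valid_loss': valid_loss})

wandb.finish()

Each run then shows up as a line in the wandb dashboard, and you can select several runs to overlay their loss curves.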

1 Like

Highly interested, expecting an invitation. Thanks!

You can join our open Slack group; the link is in the first post of this thread. The timings of the next meetup will be updated soon. :slight_smile:

Hi hi, just a reminder, the meetup is today at 4PM GMT (8AM PST = 9:30PM IST = 11AM EST) :slight_smile: It is dedicated to NLP and based on Lesson 4. Click the Zoom link when it is time.

1 Like

The meetup is starting in ~11 mins, the Zoom link is on!