Lesson 2 In-Class Discussion

If we fix a seed for random generation, we should be able to get the same results.
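As a rough sketch of what "fixing a seed" means, here is a minimal helper that seeds the common RNG sources (the `set_seed` name and the use of plain numpy are my own illustration; a real fastai/PyTorch pipeline would also seed torch, as noted in the comment):

```python
import random
import numpy as np

def set_seed(seed=42):
    """Seed the common RNG sources so a run is repeatable."""
    random.seed(seed)
    np.random.seed(seed)
    # In a PyTorch pipeline you would also seed torch, e.g.:
    # torch.manual_seed(seed); torch.cuda.manual_seed_all(seed)

set_seed(0)
first = np.random.rand(3)
set_seed(0)
second = np.random.rand(3)
# re-seeding gives identical draws, so results become reproducible
```

Note that even with fixed seeds, GPU kernels can introduce non-determinism, so exact reproducibility is not always guaranteed.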

Every time you run a deep learning optimization there is randomness involved almost everywhere. For example, we use randomness in the cropping of images, in the initialization of weights, etc. I think 0.993 is a very good result. You can run

plot_confusion_matrix

and see how it differs from Jeremy’s.
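For readers who want to see what a confusion matrix actually counts, here is a minimal standalone version (this is an illustration of the concept, not fastai's `plot_confusion_matrix` implementation):

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Count (actual, predicted) pairs: rows = actual class, cols = predicted."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

# e.g. classes 0 = cat, 1 = dog; one cat was misclassified as a dog
cm = confusion_matrix([0, 0, 1, 1], [0, 1, 1, 1], n_classes=2)
# cm is [[1, 1], [0, 2]]
```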

1 Like

@jeremy something changed between last week and this week.
Before, the learner gave feedback while it was downloading the model from fast.ai.
Now it takes a long time to start the fit process.
It looks like the whole thing is using all the memory on the system (16 GB).

Is that normal?

1 Like

Make sure you’ve git-pulled the repo.
That might correct any issues.

The weights are now available under /datasets/fast.ai/models/weights.

2 Likes

Thank you for your suggestion. I got some additional comments on this thread too.

– You can use cross-validation; one of the notebooks has an example of how to use it.
– You can add your own metrics. Look for the argument “metrics”.
– As Jeremy mentioned, the best approach is to oversample the minority class. You can also under-sample the majority class.
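To make the oversampling suggestion concrete, here is one way it can be sketched with plain numpy (the `oversample_minority` helper is my own illustration, not a fastai function): duplicate minority-class rows at random until every class matches the largest one.

```python
import numpy as np

def oversample_minority(X, y, seed=0):
    """Resample each class (with replacement) up to the size of the largest class."""
    rng = np.random.RandomState(seed)
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    idx = []
    for c in classes:
        c_idx = np.where(y == c)[0]
        idx.extend(c_idx)                                        # keep originals
        idx.extend(rng.choice(c_idx, n_max - len(c_idx), replace=True))  # duplicates
    idx = np.array(idx)
    return X[idx], y[idx]

X = np.arange(10).reshape(5, 2)
y = np.array([0, 0, 0, 0, 1])          # class 1 is the minority
X_bal, y_bal = oversample_minority(X, y)
# both classes now have 4 samples
```

Under-sampling the majority class is the mirror image: draw only `counts.min()` rows from each class instead.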

2 Likes

@yinterian which notebook shows an example of how to use cross-validation? Jeremy had mentioned that it wasn’t needed unless the dataset was so small that we couldn’t afford to set aside the validation set. Is there a good benchmark/threshold for deciding when it’s “too small”? Or is it simply a matter of checking whether or not k-fold CV improves accuracy?

I think it was also mentioned that after validation you can run training one last time on the whole dataset (validation included) to get better accuracy, which again suggests CV isn’t needed in that case.
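For anyone unsure what k-fold CV does mechanically, here is a minimal index-splitting sketch (the `kfold_indices` helper is my own illustration; libraries like scikit-learn provide `KFold` with the same behaviour): each sample lands in the validation set exactly once across the k folds.

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    """Yield (train_idx, val_idx) pairs for k-fold cross-validation over n samples."""
    rng = np.random.RandomState(seed)
    order = rng.permutation(n)
    folds = np.array_split(order, k)
    for i in range(k):
        val = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, val

splits = list(kfold_indices(10, 5))
# 5 splits; the validation folds together cover all 10 samples exactly once
```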

Take a look at this notebook.

3 Likes

Hi, I am getting this error from running:

from fastai.transforms import *

File "fastai/torch_imports.py", line 26
    if pre: load_model(m, f'{path}/weights/{fn}.pth')
    ^
SyntaxError: invalid syntax

Is there something I am missing here?

It is because VGG has fully connected layers, which have a fixed number of weights that depends on the required input size.
These links are helpful:


https://www.quora.com/How-is-Fully-Convolutional-Network-FCN-different-from-the-original-Convolutional-Neural-Network-CNN
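The dependence on input size is just arithmetic. Assuming VGG's usual 32× total downsampling (five 2× pooling stages) and 512 channels in the last feature map — the helper below is my own illustration of that back-of-the-envelope calculation:

```python
def fc_input_features(input_size, downsample=32, channels=512):
    """Flattened feature count feeding the first fully connected layer."""
    spatial = input_size // downsample   # side length of the final feature map
    return spatial * spatial * channels

n224 = fc_input_features(224)   # 7 * 7 * 512  = 25088 features
n448 = fc_input_features(448)   # 14 * 14 * 512 = 100352 features
```

The pretrained FC weight matrix is sized for one of these counts, so a different input size no longer fits — which is why fully convolutional networks, having no FC layers, don't share this constraint.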

3 Likes

That makes sense! I did try printing out what zip returned, and it just told me that it was a zip object :sweat_smile:

Thank you for a great lesson yesterday!

@jeremy will the Dog Breeds walkthrough also be set up on AWS so one can step through it?

When center cropping an image (see red border), we may lose important details (head and paws in this case). Should we use image resizing from rectangular to square instead?
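To make the trade-off concrete, here is what a center crop does on a plain array (a hypothetical `center_crop` sketch, not the fastai transform itself): everything outside the central window is discarded, which is exactly how edge details get lost.

```python
import numpy as np

def center_crop(img, size):
    """Crop the central size×size window of an H×W(×C) array."""
    h, w = img.shape[:2]
    top = (h - size) // 2
    left = (w - size) // 2
    return img[top:top + size, left:left + size]

img = np.arange(6 * 8).reshape(6, 8)   # a 6x8 "image"; content near the edges is cut
crop = center_crop(img, 4)
# crop is 4x4, taken from rows 1..4 and columns 2..5 of the original
```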

8 Likes

We’ve had this question a few times already - please do a ‘search’ on the forum before posting. This error means that you’re not using python 3.6.
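The failing line uses an f-string, which was added in Python 3.6; older interpreters stop parsing at the `f'...'` literal, producing exactly that `SyntaxError`. A quick way to check, with the pre-3.6 equivalent shown for comparison (the `path`/`fn` values below are made up for illustration):

```python
import sys

# f-strings require Python 3.6+; earlier interpreters raise SyntaxError on them.
assert sys.version_info >= (3, 6)

path, fn = "data", "resnet34"                      # hypothetical values
new_style = f"{path}/weights/{fn}.pth"             # 3.6+ f-string
old_style = "{}/weights/{}.pth".format(path, fn)   # pre-3.6 equivalent
# both build the same string
```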

1 Like

Excellent question. This type of resizing is what Keras does by default. I’ve found that it seems to generally work less well, since the model has to learn how images look different depending on how they’re squeezed. But we do have the ability to use this squeezing approach in fastai — maybe @yinterian could show an example?
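For intuition about what "squeezing" means, here is a toy nearest-neighbour resize on a plain array (my own `squeeze_resize` sketch — real pipelines use proper interpolation via PIL or OpenCV): a rectangular image is forced to a square, distorting the aspect ratio instead of discarding pixels.

```python
import numpy as np

def squeeze_resize(img, size):
    """Nearest-neighbour resize to size×size, distorting the aspect ratio."""
    h, w = img.shape[:2]
    rows = np.arange(size) * h // size   # which source row each output row samples
    cols = np.arange(size) * w // size   # which source column each output column samples
    return img[rows][:, cols]

img = np.arange(4 * 8).reshape(4, 8)   # a wide 4x8 "image"
sq = squeeze_resize(img, 4)            # squeezed to 4x4: every other column is kept
```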

8 Likes

No, because it’s an active competition, so I’m not allowed to under kaggle rules. But replicating it yourself would be a great exercise.

1 Like

This is going to depend on the problem. For many problems, center cropping will be fine; for other problems, you may want to resize. You can do both with the fast.ai library.

1 Like

Curious if anyone has tried a ResNet-style architecture for language models. It would be interesting to see what kind of features the initial layers would capture, and how starting with smaller sequences and retraining on larger ones (like how we started with smaller images and moved to larger ones in class) would affect the model.

1 Like

A recent architecture called ‘Transformer’ uses ResNet-style blocks for NLP. It worked really well.
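The "ResNet-style block" here refers to the residual (skip) connection: the sublayer only has to learn a correction on top of its input. A one-line sketch of the idea (illustrative only — the real Transformer also applies layer normalization around each sublayer):

```python
import numpy as np

def residual_block(x, sublayer):
    """ResNet-style skip connection: output = x + f(x)."""
    return x + sublayer(x)

x = np.ones(4)
out = residual_block(x, lambda v: 0.5 * v)   # identity path plus the learned residual
# out is 1.5 everywhere: the input passes through unchanged, plus the sublayer's output
```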

5 Likes