Rebelling against conventions - Jeremyisms

Just wanted to collate the different mentions & approaches that @Jeremy says go against conventional methods.

Lesson 8:

Using Jupyter notebooks for iterative module/library development.

On the chain rule: differentiation and the calculus of infinitesimals (see the illustration below).
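
For context, the conventional objection is that derivatives are not literal fractions; the infinitesimal view treats them as ratios whose intermediate terms cancel. A minimal rendering of that idea (my own illustration, not a quote from the lesson):

```latex
% Chain rule in Leibniz notation: in the calculus-of-infinitesimals
% view, the du terms behave like quantities that cancel.
\[
  \frac{dy}{dx} \;=\; \frac{dy}{du} \cdot \frac{du}{dx}
\]
% Worked example: y = sin(u) with u = x^2 gives
% dy/dx = cos(u) * 2x = 2x cos(x^2).
```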

Please add more if I have missed any.

Though the entire fastai approach is top-down and anti-conventional, these are some of the Jeremyisms that would be nice to capture.

5 Likes

Overfitting:

  • It's when validation metrics are no longer improving
  • Not when training loss is lower than validation loss
1 Like

To be more precise, he said it's when validation loss is getting worse. For instance, sometimes your validation loss stays the same but your metrics still improve.

4 Likes
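
A minimal sketch of the distinction the last two posts are drawing, in code. The loss and accuracy histories below are made-up numbers purely for illustration, not real training results:

```python
# Flag overfitting when validation loss starts getting worse,
# not when training loss dips below validation loss.
val_losses = [0.90, 0.70, 0.55, 0.50, 0.50, 0.52, 0.56]
val_accs   = [0.60, 0.70, 0.78, 0.81, 0.83, 0.84, 0.84]

best = float("inf")
for epoch, (loss, acc) in enumerate(zip(val_losses, val_accs)):
    note = ""
    if loss < best:
        best = loss
    elif loss > best:
        # Validation loss is worsening: the overfitting signal,
        # even though the metric may still be flat or improving.
        note = "  <- val loss worsening: likely overfitting"
    print(f"epoch {epoch}: val_loss={loss:.2f} val_acc={acc:.2f}{note}")
```

Note that at the plateau epochs the accuracy keeps improving while the loss sits still, which is exactly the case the refinement above is pointing at.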

TBH, the “notebook library” seems like an old idea. When I did that in 2017, I just needed to copy-paste a few lines from the Jupyter docs, probably because other people had the idea before me and had even written the documentation needed to do just that.
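
For anyone curious, a minimal sketch of that kind of export step, assuming code cells to keep are marked with a leading `# export` comment (the marker and the file names are my choices for illustration, not the actual snippet from the Jupyter docs):

```python
import json
from pathlib import Path

def notebook_to_module(nb_path, out_path):
    """Copy code cells whose source starts with '# export' into a .py file."""
    nb = json.loads(Path(nb_path).read_text(encoding="utf-8"))
    exported = []
    for cell in nb["cells"]:
        src = "".join(cell["source"])  # cell source is stored as a list of lines
        if cell["cell_type"] == "code" and src.lstrip().startswith("# export"):
            exported.append(src)
    Path(out_path).write_text("\n\n".join(exported) + "\n", encoding="utf-8")

notebook_to_module("dev.ipynb", "mylib.py")  # hypothetical file names
```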

And I don’t know who invented that definition of overfitting, but I think A. Karpathy had it in his Stanford CS whatever course.

Best regards

Thomas

1 Like

It’s not a new idea, but it’s still highly controversial. I’ve been assured, many times, that it is impossible to build up modules using Jupyter!

It’s many, many decades older than that. However, most advice about it still recommends checking whether training loss is lower than validation loss, unfortunately.

4 Likes

I think you mean validation accuracy, or validation metrics.

2 Likes

“The cutting edge of deep learning is about engineering, not papers.” --Jeremy

10 Likes

I keep hearing ‘deep learning’ and ‘engineering’ a lot recently.
First Leslie, then Jeremy… deeplearning.engineering, hmm… :thinking: :slight_smile:
Are we moving to a new field within the AI space? If so, what would we need to change in current workflows, and how?

2 Likes