Wiki: Lesson 1

Check your understanding of lesson 1

Check your understanding of lesson 2 >>>

(original post in Portuguese at Deep Learning Brasília - Lição 1)

Hi guys,

I watched the lesson 1 video (part 1) again to get the whole picture, and I took notes of the vocabulary used by @jeremy.

Let’s play! OK? :wink:
Can you give a definition / a URL / an explanation for each of the following terms and expressions?

If yes, you are done with the first lesson !!! :sunglasses: :sunglasses: :sunglasses:

PS: you don’t want to test yourself, or you want to check your answers? Go to the blog post “Deep Learning 2: Part 1 Lesson 1” by @hiromi: " great work !!! :slight_smile: "

  • Fastai course
  • Fastai forum
  • GPU
  • CUDA
  • NVIDIA
  • Crestle / PaperSpace
  • jupyter notebook
  • Data Science
  • SHIFT + ENTER in a jupyter notebook
  • python 3
  • wget
  • exclamation mark in a cell (e.g.: !ls)
  • bash command
  • python variable in curly brackets (see the notebook sketch after this list)
  • training set
  • validation set
  • Fastai Machine Learning course : prerequisite or not ?
  • image Classifier
  • label
  • keras
  • plt.imread
  • plt.imshow
  • python 3.6 formatted string (f-string)
  • img.shape
  • 3 dimensional array (rank 3 tensor)
  • Red Green Blue (RGB) pixel values between 0 and 255 (see the image sketch after this list)
  • Kaggle competition
  • pre-trained model
  • resnet34
  • ImageNet competition
  • Convolutional Neural Network (CNN)
  • accuracy
  • train a model
  • 3 lines of code (see the training sketch after this list)
  • epoch
  • testing set
  • learning rate
  • loss function
  • cross entropy loss
  • validation and testing set accuracy
  • Fastai library
  • transfer learning
  • pytorch
  • tensorflow
  • network architecture
  • data augmentation
  • validation set dependent variable val_y
  • data.classes
  • classes
  • object data
  • object learn
  • the model
  • prediction on validation set
  • learn.predict()
  • log of the predictions : log_preds
  • get the predictions on the validation set : np.argmax(log_preds, axis=1)
  • get probabilities on dogs : np.exp(log_preds[:,1]) (see the predictions sketch after this list)
  • numpy
  • top-down, the whole game
  • code driven approach
  • world class neural network
  • satellite images
  • structured data
  • NLP classifier
  • recommendation system
  • text generator
  • create our own architecture from scratch
  • download a pre-trained model and precompute
  • AlphaGo
  • image classifier for fraud detection
  • machine learning
  • Arthur Samuel, 1950s, father of Machine Learning
  • IBM mainframe
  • play checkers
  • traditional Machine Learning
  • feature engineering
  • domain experts and specialists
  • algorithm (Deep Learning) :
    ** infinitely flexible function
    ** all-purpose parameter fitting
    ** fast and scalable
  • neural network : a number of simple linear layers interspersed with non linear layers
  • universal approximation theorem
  • fit parameters with Gradient Descent (how good are they ? find a minimum on the loss function curve, watch out for local minima)
  • minimum training time : a GPU is about 10 times faster than a CPU
  • hidden layer
  • increasing the number of parameters per layer is a problem, but increasing the number of layers is the solution
  • DL = neural network with multiple hidden layers
  • Google starts using DL in 2012
  • Geoffrey Hinton, father of Deep Learning
  • Andrej Karpathy
  • Inbox by Gmail
  • Skype Translator
  • Semantic Style Transfer
  • cancer detection
  • true/false positive/negative
  • CNN, Convolutional Neural Network
  • convolutional
  • find edges
  • multiplication of pixel values by a kernel (filter)
  • linear operation
  • linear layer
  • non linear layer
  • sigmoid
  • ReLU
  • element wise multiplication (see the convolution sketch after this list)
  • Michael Nielsen
  • Stochastic Gradient Descent (see the gradient descent sketch after this list)
  • derivative
  • small step
  • learning rate
  • combine convolution, non linearity, gradient descent
  • picture of what each layer learns
  • parameters of the kernels are learnt using gradient descent
  • learn.fit()
  • learning rate : not too high, but not too low either
  • choosing a learning rate
  • learn.lr_find()
  • best improvement of the loss before it gets worse
  • learn.sched.plot_lr()
  • learn.sched.plot() (see the learning rate sketch after this list)
  • mini batches
  • training loss
  • validation loss
  • validation accuracy
  • overfitting : know when to stop fitting your model
  • TAB to get a list of functions
  • SHIFT + TAB (once : parameters, twice : documentation, 3 times : pops up a window with source code)
  • 1 question mark (?) : documentation
  • 2 question marks (??) : pops up source code
  • H : pops up the list of keyboard shortcuts in jupyter notebooks
  • Stop your Crestle or PaperSpace machine !
  • use the fastai forum !
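
To make a few of these concrete, here are some minimal sketches. They are illustrative, not the lesson’s exact code: file names marked as hypothetical are my own, and the fastai snippets use the v0.7 library from the 2018 course, not the current API.

The notebook sketch (bash commands, python variables in curly brackets, f-strings; PATH is the data folder from the lesson notebook):

```python
# Jupyter / IPython only: '!' runs a bash command from a cell,
# and {...} interpolates a Python variable into it.
PATH = "data/dogscats/"     # data folder used in the lesson notebook
!ls {PATH}                  # bash `ls` on the value of a Python variable

# Python 3.6 f-string: expressions in braces are evaluated inline.
sz = 224
print(f"images will be resized to {sz}x{sz}")
```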
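The image sketch (plt.imread, plt.imshow, img.shape; the file name below is a hypothetical example):

```python
import matplotlib.pyplot as plt

# plt.imread loads an image file into a numpy array.
img = plt.imread("data/dogscats/valid/cats/cat.10.jpg")  # hypothetical file name
plt.imshow(img)      # display the array as an image
plt.show()

print(img.shape)     # e.g. (198, 179, 3): height x width x RGB -> a rank 3 tensor
print(img[:2, :2])   # for a JPEG: uint8 Red/Green/Blue values between 0 and 255
```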
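The training sketch, i.e. the “3 lines of code” as they appear in the lesson 1 notebook (fastai v0.7; resnet34 is the pre-trained ImageNet model):

```python
from fastai.conv_learner import *   # fastai v0.7, as used in the 2018 course

PATH = "data/dogscats/"
sz = 224                            # resnet34 expects 224x224 inputs

# The "3 lines of code": a data object, a learn object, then fit.
data = ImageClassifierData.from_paths(PATH, tfms=tfms_from_model(resnet34, sz))
learn = ConvLearner.pretrained(resnet34, data, precompute=True)
learn.fit(0.01, 3)                  # learning rate 0.01, 3 epochs
```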
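The predictions sketch, continuing from the training sketch above (in fastai v0.7, learn.predict() returns the log of the predictions on the validation set):

```python
import numpy as np

log_preds = learn.predict()            # log probabilities, one row per validation image
preds = np.argmax(log_preds, axis=1)   # predicted class index per image
probs = np.exp(log_preds[:, 1])        # probability of class 1 ('dogs' in data.classes)
```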
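The convolution sketch: my own toy numpy code, not the lesson’s, showing a kernel slid over an image (element-wise multiplication then sum, a linear operation) followed by a non-linearity:

```python
import numpy as np

def conv2d(img, kernel):
    """Slide the kernel over the image; at each position, take the
    element-wise multiplication with the patch and sum it (linear)."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (img[i:i + kh, j:j + kw] * kernel).sum()
    return out

# A classic kernel that finds horizontal edges.
kernel = np.array([[ 1.,  1.,  1.],
                   [ 0.,  0.,  0.],
                   [-1., -1., -1.]])

def relu(x):    return np.maximum(x, 0.)        # non linear layer
def sigmoid(x): return 1. / (1. + np.exp(-x))   # an older non-linearity

img = np.random.rand(8, 8)                      # stand-in for a grayscale image
activations = relu(conv2d(img, kernel))         # linear layer + non linear layer
```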
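The gradient descent sketch: a toy one-parameter loss of my own, showing the derivative, the small step, and the learning rate (SGD is the same idea with the gradient computed on mini batches of data):

```python
# Toy loss with its minimum at w = 3, and its derivative.
def loss(w):  return (w - 3.) ** 2
def dloss(w): return 2. * (w - 3.)

w, lr = 0.0, 0.1                 # starting guess and learning rate
for step in range(50):
    w -= lr * dloss(w)           # small step against the derivative
print(round(w, 4))               # ~3.0; too high an lr diverges, too low crawls
```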
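The learning rate sketch, continuing from the training sketch (fastai v0.7 names): the finder increases the learning rate each mini batch while recording the loss, and you pick the rate where the loss is still clearly improving, before it gets worse:

```python
lrf = learn.lr_find()     # grow the lr each mini batch, tracking the loss
learn.sched.plot_lr()     # learning rate vs. iteration
learn.sched.plot()        # loss vs. learning rate: choose a point where the
                          # loss is still dropping steeply, before it blows up
```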