Lesson 8 in-class

@mo.shakirma I had to install these to get the notebooks running

pip install xgboost
pip install gensim
pip install keras-tqdm


@mo.shakirma pip install xgboost


I shared a link on the part 1 forum. Anyway, here it is: https://github.com/dunovank/jupyter-themes


for me it was:

pip install matplotlib
pip install pandas
pip install xgboost
pip install bcolz
pip install gensim
pip install nltk
pip install keras_tqdm

from utils2 import * pulled in all of those dependencies


I found out about Part I of this course from Import AI.


If you import the VGG model that is built into keras (keras.applications), do you still have to re-order the channels, etc.?
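For context on what that preprocessing involves: the classic VGG weights expect BGR channel order with the per-channel ImageNet mean subtracted (keras.applications ships a preprocess_input helper for this). A minimal numpy sketch of that step; the helper name vgg_preprocess here is hypothetical, but the mean values are the ones published with VGG's Caffe weights:

```python
import numpy as np

# Per-channel ImageNet means for VGG, in BGR order (from the original Caffe release)
VGG_MEAN_BGR = np.array([103.939, 116.779, 123.68], dtype=np.float32)

def vgg_preprocess(img_rgb):
    """Turn an RGB image array (H, W, 3) into what VGG expects:
    BGR channel order, with the per-channel mean subtracted."""
    img_bgr = img_rgb[..., ::-1].astype(np.float32)  # reverse last axis: RGB -> BGR
    return img_bgr - VGG_MEAN_BGR                    # broadcasts over H and W

# A tiny all-zeros "image" just to show the shapes work out
img = np.zeros((2, 2, 3), dtype=np.float32)
out = vgg_preprocess(img)
```

If you load the model via keras.applications, the matching preprocess_input function in the same module does this for you.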

https://docs.scipy.org/doc/numpy/user/basics.broadcasting.html

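The broadcasting rules from that page in one small example: trailing dimensions are compared right to left, and a size-1 dimension stretches to match.

```python
import numpy as np

a = np.arange(6).reshape(2, 3)  # shape (2, 3): [[0, 1, 2], [3, 4, 5]]
row = np.array([1, 2, 3])       # shape (3,)
col = np.array([[10], [20]])    # shape (2, 1)

# (2,3) + (3,): the row is added to every row of a -> [[1, 3, 5], [4, 6, 8]]
by_row = a + row

# (2,3) + (2,1): the column stretches across columns -> [[10, 11, 12], [23, 24, 25]]
by_col = a + col
```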

I like the Import AI and Wild ML newsletters: http://www.wildml.com/newsletter/


Shouldn’t we use something like ResNet instead of VGG (with avg pooling), since the residual blocks carry more context?

Should/do we put in batch normalization like we did in lesson 5?

Will the pre-trained weights change if we’re using average pooling instead of max pooling?


How about vgg16_bn_avg?

Does the MaxPool → AvgPool change still preserve the “pretrained” weights?

Do you have to retrain VGG on ImageNet when you change max pooling to avg pooling?
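On the pooling questions above: pooling layers have no trainable parameters, so swapping max for average pooling leaves every pretrained conv/dense weight intact; only how features are aggregated changes. A small numpy sketch of 2×2, stride-2 pooling (the helper name pool2x2 is made up for illustration):

```python
import numpy as np

def pool2x2(x, reduce_fn):
    """Apply 2x2, stride-2 pooling with the given reducer.
    Note there are no weights here at all -- just a fixed reduction."""
    h, w = x.shape
    blocks = x.reshape(h // 2, 2, w // 2, 2)  # group into 2x2 tiles
    return reduce_fn(blocks, axis=(1, 3))     # reduce within each tile

x = np.array([[1., 2., 5., 6.],
              [3., 4., 7., 8.],
              [0., 0., 1., 1.],
              [0., 2., 1., 3.]])

max_out = pool2x2(x, np.max)   # [[4, 8], [2, 3]]
avg_out = pool2x2(x, np.mean)  # [[2.5, 6.5], [0.5, 1.5]]
```

Because the swap changes the features that later layers see, the model's outputs do change, so some fine-tuning can still help even though no retraining is strictly required to keep the weights valid.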

Is it advisable/recommended to learn Tensorflow?

@sakiran Yes

Can Jeremy scroll down a bit on the screen when he goes over code snippets, so they’re centered on the board?

Can you walk us through [loss]+grads? Why is there a plus sign there, and what does it do?


It’s python syntax to concatenate lists. (Make loss into a list, then concatenate it w/ grads.)


So it’s just [loss, grads] flattened?
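Right, that's all it is. A quick sketch with stand-in values (plain Python objects here, not the actual Keras tensors):

```python
loss = 0.25           # stand-in for the loss tensor
grads = ['g1', 'g2']  # stand-ins for the gradient tensors

# [loss] wraps the single value in a one-element list;
# + on lists is concatenation, giving one flat list.
combined = [loss] + grads
print(combined)  # -> [0.25, 'g1', 'g2']
```

So [loss] + grads is exactly the flat list [loss, g1, g2] — handy because K.function wants a single flat list of outputs.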