Lesson 1 discussion

This is what my path looks like:

This is what the code that accesses the valid folder looks like:

To me it seems like it should work, since it would access data/redux/valid/, which has two folders (cats and dogs) in it.

What does ‘%pwd’ show?

ubuntu@ip-10-0-0-8:~/nbs/data/redux/sample/valid$ pwd
/home/ubuntu/nbs/data/redux/sample/valid

Try %pwd within your notebook.

%pwd
u'/home/ubuntu/nbs'
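
So the working directory is /home/ubuntu/nbs, and a relative path like data/redux/valid/ should resolve against that. A quick sanity check to run in a cell (a minimal sketch, assuming path is set to 'data/redux/' as in the lesson notebook):

import os

path = 'data/redux/'  # relative to the notebook's working directory
print(os.path.abspath(path + 'valid'))  # expect /home/ubuntu/nbs/data/redux/valid
print(os.listdir(path + 'valid'))       # expect exactly ['cats', 'dogs']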

Wow - I’m stuck! Any chance you can come in a little early? We can take a look for you.

Oh one more thing. Try ‘ls -la’ in bash. Just in case there’s a hidden directory…

Oh that may have been it!

How do I remove .ipynb_checkpoints?

rm -rf .ipynb_checkpoints
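
Note that this only removes the checkpoint folder in the current directory - Jupyter recreates one inside whatever folder you save a notebook in, and Keras’s flow_from_directory treats every subdirectory as a class, which is how a stray .ipynb_checkpoints can break things. A sketch that sweeps them all out of the data tree from Python (assuming you run it from the nbs directory):

import os, shutil

# walk the data tree and delete every hidden Jupyter checkpoint directory
for root, dirs, files in os.walk('data/redux'):
    if '.ipynb_checkpoints' in dirs:
        shutil.rmtree(os.path.join(root, '.ipynb_checkpoints'))
        dirs.remove('.ipynb_checkpoints')  # don't descend into the deleted directory
        print('removed', os.path.join(root, '.ipynb_checkpoints'))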

Thanks! I’m running it; hopefully there’s no error! :smiley:

I just created a page and copy-pasted the relevant info from the Windows install. I already had a Python environment up and running, so I’m not entirely clear on what the full steps will be for others. Hopefully the page can get fleshed out as more people get set up.

The vgg.fit is working perfectly. However, when I run vgg.model.save_weights(path+'results/ft1.h5') and then open ft1.h5 in my “sample” folder, it says this:

Error! /home/ubuntu/nbs/data/redux/results/ft1.h5 is not UTF-8 encoded
Saving disabled.
See Console for more details.

I know that UTF-8 is a character encoding… Does anyone have ideas on how to solve this? Perhaps I need to somehow convert the file to UTF-8? (http://stackoverflow.com/questions/4182603/python-how-to-convert-a-string-to-utf-8)

Thanks!

Can you show a screenshot so we can see exactly what you typed and what happened?

The fit works perfectly. Then I run the save-weights command:

It saves this file into my “samples” folder:

And the file looks like this:

That last screen that says “the file looks like this” - what is that screen? How did you get to it? If you’re trying to open the .h5 file directly in Jupyter (which it looks like you are), that’s not going to work. You can only open the file using ‘model.load_weights(filename)’, or with an HDF5 viewer.
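
If you just want to peek at what’s inside a weights file, here is a minimal sketch using the h5py library (which Keras itself uses to write these files); the path is the one from the error message, relative to the notebook:

import h5py

# open the saved weights read-only and list the top-level groups (one per layer)
with h5py.File('data/redux/results/ft1.h5', 'r') as f:
    for name in f.keys():
        print(name)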

Ah, yes I was opening straight from Jupyter. Thanks!

Now I found another error, which I’ve spent the last 45 minutes playing around with. The code is:

Jupyter looks like this under the redux folder; the “test” folder is there.

The notebook itself is under nbs, and path is set to data/redux/.

What I’ve noticed is that even if I change the code to say (path+‘randomword’…), the error always says "AttributeError: Vgg16 instance has no attribute ‘test’". So the missing attribute is the test method itself, not anything to do with the path - Python raises the AttributeError on the vgg.test lookup before the argument string is ever used, which is why the message never says “… no attribute ‘randomword’”.

Thanks!

@ethan I added the test() method later - I showed in lesson 2 how to create it. I think if you download the latest from platform.ai you should find it - although better still would be to watch the video and see if you can create it yourself! :slight_smile:
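
If you get stuck, here is a sketch of what such a test() method can look like as a method on the Vgg16 class (Keras 1 API assumed; get_batches is the helper the class already uses to wrap flow_from_directory):

def test(self, path, batch_size=8):
    # class_mode=None: the test folder has no labels, so the generator yields images only;
    # shuffle=False keeps predictions aligned with the generator's filenames
    test_batches = self.get_batches(path, shuffle=False, batch_size=batch_size, class_mode=None)
    return test_batches, self.model.predict_generator(test_batches, test_batches.nb_sample)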

I was revisiting lesson1.ipynb and have a question about the step Creating a VGG Model from Scratch in Keras.

In this step, why do we need to import the mappings from ImageNet?

I opened the json file and it does contain the ImageNet category names. However, I could not figure out what the other field indicates (for example: n01440764).
Here is a snapshot of imagenet_class_index.json:

My question is: when we fine-tune, the model is trained to predict categories based on the data (here, the different folders in train).
So why do we need a mapping of the ImageNet classes?

Great question. We import this just so we can print out the predicted classes later:

Those class names at the bottom couldn’t have been printed if we didn’t have those mappings loaded. So you can use the VGG network for fine-tuning without these mappings - you only need them for printing predicted class names. That’s why the notes say “for display purposes” when we import that file.

The fields with numeric codes prefixed by ‘n’ are WordNet IDs, which ImageNet uses to categorize the classes.
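
Concretely, the json maps each output index to a [wordnet_id, class_name] pair, and we just pull out the names. A minimal sketch (assuming the file is in the working directory):

import json

with open('imagenet_class_index.json') as f:
    class_dict = json.load(f)

# each entry maps an output index to [wordnet_id, class_name],
# e.g. class_dict['0'] == ['n01440764', 'tench']
classes = [class_dict[str(i)][1] for i in range(len(class_dict))]
print(classes[:3])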
