# Lesson 2 discussion

(Rachel Thomas) #22

@mattobrien415 can you give more details about your setup and what you’ve tried? See http://wiki.fast.ai/index.php/How_to_ask_for_Help for tips on the info we need to be able to help.

You can always run `which python` to check if you are using anaconda’s python or the system python, and `which pip` to check which pip you’re using. Here’s some info on installing TF with conda: https://anaconda.org/jjhelmus/tensorflow
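For example (the channel name in the `conda install` line is just read off the anaconda.org URL above, so treat it as an assumption):

```shell
# Check whether the python and pip on your PATH are Anaconda's or the system's
which python   # an Anaconda install shows a path like .../anaconda2/bin/python
which pip

# Install TF from the conda channel linked above (channel inferred from the URL)
# conda install -c jjhelmus tensorflow
```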

(sravya8) #23

@jeremy Your description of error message was useful. It turned out I had a third sub folder in valid/ which was causing this. Deleting that fixed the problem. Thanks Jeremy!

(Jeremy Howard) #24

Sounds like the exact same issue that @ethan had!

(sravya8) #25

I am trying to understand how correlate function works. (I watched the first lecture on the USF page).

In the first example, I understand where the 6 came from, but I'm unsure how the other cells are filled. With the 1x2 cells, the pattern seems to make sense with the given formula. For the 1x3 cells example, I understand where the 23 came from, but I'm trying to understand how the other cells are filled - 14? 27?

(sravya8) #26

In other words, I am trying to understand how “reflect” mode works: https://docs.scipy.org/doc/scipy-0.16.0/reference/generated/scipy.ndimage.filters.correlate.html

(Jeremy Howard) #27

Great question. Here’s a spreadsheet showing how it works. Column I contains a sumproduct of columns C thru E for the 2 rows to its left. For each of the 3 calculations, I’ve moved the array (in italics) left one position, and then right one position.

As you can see, the trick is that the first and last values are duplicated. (‘reflect’ simply copies the whole array in reverse to the start and end.)

(sravya8) #28

Now I understand where the 14 and 27 came from. With respect to your explanation:

I get the duplicated part of it, but I'm not sure about the "reverse to the start and end" part of it.

(Jeremy Howard) #29

Well, since we only needed one more item, it's harder to see. But if we needed more, the array would have been: `3 2 1 1 2 3 3 2 1`. As you can see, the array is reflected to create the additional items.
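The reflected array above can be reproduced directly; a minimal sketch with numpy/scipy (note that `np.pad`'s `'symmetric'` mode is what `scipy.ndimage` calls `'reflect'`):

```python
import numpy as np
from scipy.ndimage import correlate

a = np.array([1, 2, 3])

# np.pad's 'symmetric' mode mirrors the array (edge value included) at each
# end - this is the padding scipy.ndimage's mode='reflect' uses
padded = np.pad(a, 3, mode='symmetric')
print(padded)  # [3 2 1 1 2 3 3 2 1]

# correlate with mode='reflect' slides the filter over that padded array
result = correlate(a, np.array([1, 1, 1]), mode='reflect')
print(result)  # [4 6 8]  i.e. (1+1+2), (1+2+3), (2+3+3)
```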

(vedshetty) #30

Great explanation, Jeremy! But what is the purpose of choosing the ‘reflect’ mode - handling the edges or corners of the pic, and/or the edges of the object of interest in the image?
Also, what is convolution trying to achieve vs. correlation? (For comparison between the two):

(Jeremy Howard) #31

@vshets look closely at the code at the top of the image. It shows that:

`convolve(arr, np.rot90(filt, 2))`

gives identical results to:

`correlate(arr, filt)`

Since they are totally identical, we can’t say that one is better than the other in any way!

Regarding your first question - convolutions are used for a lot of applications: https://en.wikipedia.org/wiki/Convolution#Applications . I’m certainly not familiar with most of these, so I couldn’t say in which situations they would use reflection.

I haven’t seen people using it for deep learning, although I’m not sure if that just means no-one has tried it, or if people have tried it and found it unhelpful.

(vedshetty) #32

Hmm … ok. Any chance you meant one of your code lines to be
`correlate(arr, np.rot90(filt, 2))`? Otherwise they would be identical.
Also, in the image comparing the two above, there is a slight difference between the two: the darker edges are on top in correlate vs. convolution.
So in essence one could use either, as long as they are able to detect features using those filters.

(Jeremy Howard) #33

No - `convtop` has the `rot90()`, `corrtop` doesn't. The key line is the `np.allclose()` line. Your examples are not the same, because they both have the `rot90()` call.

(vedshetty) #34

ok … got it … so basically they are identical if, as in the above example, one of the filters is rotated by 180 degrees (`np.rot90(filt, 2)` rotates by 90 degrees twice).

(Jeremy Howard) #35

Precisely

(vedshetty) #36

A small correction to your point earlier:
"look closely at the code at the top of the image. It shows that:

`convolve(arr, np.rot90(filt, 2))`

gives identical results to:

`convolve(arr, np.rot90(filt, 2))` >> should instead be >> `correlate(arr, filt)`

"
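The corrected identity is easy to check numerically; a sketch with `scipy.ndimage` (the array and filter here are made up for illustration):

```python
import numpy as np
from scipy.ndimage import convolve, correlate

arr = np.arange(16, dtype=float).reshape(4, 4)    # made-up 4x4 "image"
filt = np.array([[1., 0., -1.],
                 [2., 0., -2.],
                 [1., 0., -1.]])                  # an asymmetric 3x3 filter

# np.rot90(filt, 2) rotates the filter 180 degrees (flips both axes),
# which is exactly what turns correlation into convolution
same = np.allclose(convolve(arr, np.rot90(filt, 2)), correlate(arr, filt))
print(same)  # True

# without the rotation, the two generally differ for an asymmetric filter
differ = not np.allclose(convolve(arr, filt), correlate(arr, filt))
print(differ)  # True
```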

(Jeremy Howard) #37

Oops! Thanks. Will edit it now.

(jbrown81) #38

I'm getting to this a week later, so you probably already solved this issue, but for others who may encounter it:
I had to install tensorflow through the jupyter notebook, by:

1. connecting to the main jupyter page
2. clicking the ‘Conda’ link at the top
3. searching for ‘tensorflow’ in the bottom left
4. clicking the right arrow to get it to install

Now I can run through the convolution-intro.ipynb and tensorflow imports properly.

(anamariapopescug) #39

Quick question - I was revisiting all the notebooks, and for the lesson2.ipynb notebook I get the error message below. The valid/ and train/ subdirectories have the required structure/content, so I was wondering if anyone has run into this. Thanks!

## Message

Notebook line:

```
val_data = get_data(val_batches)
```

## Error:

```
TypeError                                 Traceback (most recent call last)
in ()
----> 1 val_data = get_data(val_batches)

/home/ubuntu/nbs/utils.pyc in get_data(path)
     77
     78 def get_data(path):
---> 79     batches = get_batches(path, shuffle=False, batch_size=1, class_mode=None)
     80     return np.concatenate([batches.next() for i in range(batches.nb_sample)])
     81

/home/ubuntu/nbs/utils.pyc in get_batches(dirname, gen, shuffle, batch_size, class_mode)
     69 def get_batches(dirname, gen=image.ImageDataGenerator(), shuffle=True, batch_size=4, class_mode='categorical'):
     70     return gen.flow_from_directory(dirname, target_size=(224,224),
---> 71         class_mode=class_mode, shuffle=shuffle, batch_size=batch_size)
     72
     73

/home/ubuntu/anaconda2/lib/python2.7/site-packages/keras/preprocessing/image.pyc in flow_from_directory(self, directory, target_size, color_mode, classes, class_mode, batch_size, shuffle, seed, save_to_dir, save_prefix, save_format)
    288             dim_ordering=self.dim_ordering,
    289             batch_size=batch_size, shuffle=shuffle, seed=seed,
--> 290             save_to_dir=save_to_dir, save_prefix=save_prefix, save_format=save_format)
    291
    292     def standardize(self, x):

/home/ubuntu/anaconda2/lib/python2.7/site-packages/keras/preprocessing/image.pyc in __init__(self, directory, image_data_generator, target_size, color_mode, dim_ordering, classes, class_mode, batch_size, shuffle, seed, save_to_dir, save_prefix, save_format)
    553         if not classes:
    554             classes = []
--> 555             for subdir in sorted(os.listdir(directory)):
    556                 if os.path.isdir(os.path.join(directory, subdir)):
    557                     classes.append(subdir)

TypeError: coercing to Unicode: need string or buffer, DirectoryIterator found
```

(Jeremy Howard) #40

get_data() has been changed to take a path, rather than a generator. So it should be something like:

```
get_data(path+'train')
```

Search the forum for ‘get_data’ to learn about the reason behind this change, if you’re interested.
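The traceback bottoms out in `os.listdir()`, which needs a string path, not a `DirectoryIterator`. A minimal sketch of the failure mode (the `DirectoryIterator` class here is just a stand-in for Keras's iterator, and the Python 3 error message wording differs slightly from the Python 2 one in the traceback):

```python
import os

class DirectoryIterator:
    """Stand-in for keras.preprocessing.image.DirectoryIterator."""
    pass

# Passing an iterator where a path string is expected fails, just like
# calling get_data(val_batches) did:
try:
    os.listdir(DirectoryIterator())
except TypeError as e:
    print("TypeError:", e)

# Passing an actual path string works:
print(type(os.listdir(".")))  # <class 'list'>
```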

(anamariapopescug) #41

great, thanks !