GTX 1060Ti learn.fit() 10x slower than GTX 1080Ti?

Good evening! I am working through the lesson 1 notebook and just ran the code below to train and evaluate the dogs vs. cats classifier:
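(Roughly the lesson 1 training cell; a minimal sketch assuming the fastai 0.7 API used in the course, with the standard path, image size, and epoch count from the notebook rather than my exact cell:)

```python
from fastai.conv_learner import *

PATH = "data/dogscats/"   # assumed dataset location from the course notebook
sz = 224                  # image size used in lesson 1
arch = resnet34           # pretrained backbone

# Build the data object and a pretrained learner with precomputed activations
data = ImageClassifierData.from_paths(PATH, tfms=tfms_from_model(arch, sz))
learn = ConvLearner.pretrained(arch, data, precompute=True)

# Train the classifier head for a few epochs
learn.fit(0.01, 3)
```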

I know that my 1060Ti is not going to match Jeremy's results on a 1080Ti, but I was wondering whether it is really 10x slower, or if there are optimizations I might have missed.

Thanks!

Sorry! I didn’t realize this was already being discussed in this thread:

This is a separate issue - the first time you run it, it has to precompute the activations. If you run it again, it'll be fast.
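In practice the difference looks like this (a sketch assuming the fastai 0.7 `precompute=True` behaviour from the course library):

```python
# With precompute=True, the first fit runs every image through the frozen
# pretrained backbone once and caches those activations; that one-off pass
# dominates the first run's wall-clock time.
learn = ConvLearner.pretrained(arch, data, precompute=True)
learn.fit(0.01, 3)   # slow the first time: includes precomputing activations

# Subsequent calls reuse the cached activations, so only the small
# classifier head is trained and each epoch finishes quickly.
learn.fit(0.01, 3)   # fast: activations are already cached
```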