Trying to fix: Lesson 10 IMDb training is slow. Also: has anybody used snakeviz + port forwarding successfully?

I’m trying to shorten the training time for IMDb classification.
I’ve gotten it down to 15 minutes per epoch, but that’s still very slow.
There seems to be a bottleneck that has nothing to do with model complexity.

Any pointers on how to figure out what’s taking so long?

I tried %prun, but the output is too messy.
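(For anyone else stuck at the messy-%prun stage: one way to get a readable summary is to dump the profile to a file and print just the top entries with pstats. A rough sketch, where train.py stands in for whatever script actually runs the training:)

```shell
# Profile the training run and write the raw stats to a file
# (train.py is a placeholder for the real training script).
python -m cProfile -o train.prof train.py

# Print only the 15 most expensive calls, sorted by cumulative time,
# instead of the full unsorted dump %prun gives you.
python -c "import pstats; pstats.Stats('train.prof').sort_stats('cumulative').print_stats(15)"
```

Inside a notebook, `%prun -D train.prof train_my_model()` writes the same kind of stats file, which snakeviz can also open later.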
So I wanted to use %snakeviz instead, but port forwarding doesn’t seem to be working.
Can somebody take a look at my ssh settings and the error message and tell me how to set things up correctly?

ssh -L 9999:localhost:9999 -L 8080:localhost:8080 paperspace@mypaperspacebox

where 9999 is used by Jupyter and 8080 is reserved for snakeviz.

This is what I get when I run %snakeviz train_my_model:
Usage: snakeviz [options] filename
snakeviz: error: no web browser found: could not locate runnable browser
snakeviz web server started on; enter Ctrl-C to exit

But when I open the printed address, or even http://localhost:8080/snakeviz/%2Ftmp%2Ftmpdz4pglqa, nothing’s there.
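(A possible workaround, offered as a sketch since I haven’t verified it end to end on Paperspace: the “no web browser found” error means snakeviz is trying to launch a browser on the headless remote box, and its server-only flag sidesteps that. train.prof is a hypothetical profile dump, e.g. from %prun -D train.prof train_my_model():)

```shell
# From the local machine: forward both ports over one connection,
# one -L flag per port.
ssh -L 9999:localhost:9999 -L 8080:localhost:8080 paperspace@mypaperspacebox

# On the remote box: serve the profile in server-only mode (-s) so
# snakeviz never tries to open a browser, bound to the forwarded port.
snakeviz -s -H localhost -p 8080 train.prof
```

Then open the http://localhost:8080/snakeviz/... URL it prints in a browser on the local machine, where the tunnel carries it back to port 8080 on the remote box.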

Thoughts and ideas please. :’(

(Hello from the future!)

I’m running into this slow-training issue with the language model now. Did you manage to find a solution?

I also encounter very slow training: fine-tuning the language model on IMDb takes me about 1 hour 25 minutes per epoch with a batch size of 48.