The Life of Py (Torch 0.4) - Beta experience

What’s it about?

I’ve started this topic because PyTorch 0.4 is coming and some of us are now experimenting with it and fastai. I’ve run into some issues with it, and since most of us are not yet on 0.4, it doesn’t seem like a good idea to post about it in the other forums. So please post your questions, comments, etc. about PyTorch 0.4 here.



IMDB hangs on Tokenizer().proc_all_mp(partition_by_cores(texts)). In side-by-side tests using the IMDB notebook (PyTorch 0.3.1 vs. 0.4) on my Linux box, proc_all_mp hangs and never completes on 0.4, while it works fine on 0.3.1. I know this is native Python multiprocessing and not pytorch itself, but it happens, and it can be “fixed” by reducing the number of cores passed into the function.
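For anyone wanting to try that workaround in isolation, here is a minimal sketch of capping the worker count in a multiprocessing pool. The helper names below (tokenize_chunk, partition) are stand-ins for illustration, not fastai’s actual implementation:

```python
from multiprocessing import Pool

def tokenize_chunk(texts):
    # Stand-in for Tokenizer.proc_all: just lowercase and split on whitespace.
    return [t.lower().split() for t in texts]

def partition(a, sz):
    # Split list `a` into chunks of at most `sz` items.
    return [a[i:i + sz] for i in range(0, len(a), sz)]

if __name__ == '__main__':
    texts = ['Hello World', 'PyTorch point four'] * 4
    ncpus = 2  # deliberately fewer workers than the machine has cores
    chunks = partition(texts, max(1, len(texts) // ncpus))
    with Pool(ncpus) as pool:
        # One token list per input text, flattened back into a single list
        tokens = sum(pool.map(tokenize_chunk, chunks), [])
```

Dropping ncpus below the machine’s core count is the same knob the post describes passing into proc_all_mp.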


Is nn.Embedding missing in 0.4?

Running the imdb notebook, I ran into an error when the forward function of RNN_Encoder called the EmbeddingDropout.forward() method. At the line:

    X = self.embed._backend.Embedding.apply(
        words, masked_embed_weight, padding_idx, self.embed.max_norm,
        self.embed.norm_type, self.embed.scale_grad_by_freq, self.embed.sparse)

it crashes with a NotImplemented error. Setting a breakpoint shows that self.embed._backend.Embedding is NotImplemented in 0.4, even though it IS implemented in PyTorch 0.3.1. I’m wondering if this is a bug or if somehow I built PyTorch 0.4 incorrectly. Has anyone else seen this?
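For what it’s worth, a sketch of the same lookup through the public functional API, torch.nn.functional.embedding, which 0.4 exposes, rather than the private _backend hook. The random weight and index tensors here are stand-ins for EmbeddingDropout’s masked weight and input words:

```python
import torch
import torch.nn.functional as F

# Stand-in for the dropout-masked embedding matrix: vocab 10, embedding dim 5
masked_embed_weight = torch.randn(10, 5)
words = torch.tensor([[1, 2, 4]])  # a batch of token indices

# Same arguments EmbeddingDropout passes, via the public API
X = F.embedding(words, masked_embed_weight, padding_idx=0,
                max_norm=None, norm_type=2,
                scale_grad_by_freq=False, sparse=False)
# X has shape (1, 3, 5): one embedding vector per input index
```

This avoids depending on the _backend internals that changed between 0.3.1 and 0.4.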

The AWD LSTM stuff relies on some pytorch internals. I’m not surprised to hear those internals are changing - we’ll need to update fastai for that.