Much like the thread here: Can't import QRNN or QRNNLayer, I'm having issues importing the QRNN model (same error, ImportError: No module named 'forget_mult_cuda'). I've tried reinstalling fastai and manually importing the QRNN function from the source code, without success.
I'm starting a new thread since I'm on Google Colab, and the issue may come up again for other users who depend on the Colab environment.
There is no module named forget_mult_cuda in the library itself. My guess is that the JIT step that compiles the custom CUDA kernels is failing on Colab; if that's the case, there isn't much we can do about it.
I noticed that the AWD-LSTM version (https://github.com/fastai/fastai/blob/master/fastai/text/models/awd_lstm.py) has QRNN baked into it, but I'm trying to import the separate QRNN model since that's what the part 2 translation notebook indicates. I'll try to work around it by using a modified AWD-LSTM instead.
I ran into this problem as well, and looking at the PyTorch code, it says it uses Ninja to build the module. Ninja seems to be missing from the Colab package list, so you need to run !pip install ninja in a cell before trying to load the learner.
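If the root cause really is a missing ninja binary (as suggested above), a quick way to confirm from a notebook cell is to look for it on the PATH. This is just a diagnostic sketch; the helper name is mine:

```python
import shutil


def ninja_available() -> bool:
    # PyTorch's JIT extension builder shells out to the `ninja` binary
    # when compiling custom CUDA kernels like forget_mult_cuda, so the
    # import fails if ninja cannot be found on the PATH.
    return shutil.which("ninja") is not None


print(ninja_available())
```

If this prints False, installing ninja before loading the learner should be the first thing to try.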
After that I was able to get the QRNN working in Colab.
You need to do a pip install ninja and then a pip install -U fastai to get the QRNN to work. It seems that Ninja needs to be present before the fastai package is installed so that fastai detects it.
I found a fix based on @nsecord's reply when I ran into this error on Google Colab: I ran pip install ninja and then pip install --upgrade --force-reinstall fastai. Colab then prompts you to restart the runtime, which solved the issue for me. @matheuscosta hope this helps!
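Putting the replies together, here is a minimal sketch of the fix as a Python helper (assuming a standard Colab runtime; the function name is mine, and the key point from this thread is the ordering: ninja before fastai, then a runtime restart):

```python
import shutil
import subprocess
import sys


def qrnn_fix_commands():
    """Build the pip commands to run, in order, before importing fastai's QRNN.

    Per the replies above, ninja must be present before fastai is
    (re)installed so that fastai detects it; after running the commands,
    restart the Colab runtime.
    """
    cmds = []
    if shutil.which("ninja") is None:
        # Install the ninja build tool first (used by PyTorch's JIT builder).
        cmds.append([sys.executable, "-m", "pip", "install", "ninja"])
    # Force-reinstall fastai so it picks up ninja during installation.
    cmds.append([sys.executable, "-m", "pip", "install",
                 "--upgrade", "--force-reinstall", "fastai"])
    return cmds


# In a notebook cell you would actually run them, e.g.:
# for cmd in qrnn_fix_commands():
#     subprocess.check_call(cmd)
```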