Has anyone tried to reimplement the seq2seq translation code using the material presented in v3 of the course? It seems all I can find is about fastai v0.7 or thereabouts…
See the NLP course, there is a seq2seq notebook github.com/fastai/course-nlp
It uses fastai 1.0
Yes, https://github.com/fastai/course-nlp/blob/master/7-seq2seq-translation.ipynb is what I’d checked before posting. But it doesn’t use the modularized preprocessing that’s used in https://github.com/fastai/course-v3/blob/master/nbs/dl2/12_text.ipynb
The latter feels much cleaner to me.
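For anyone else comparing the two: the 12_text.ipynb notebook builds preprocessing out of small, composable processor steps (tokenize, then numericalize) applied in sequence. A rough, self-contained sketch of that pattern — names simplified and the tokenizer is a hypothetical stand-in, not the notebook's actual code:

```python
class Processor:
    # Base step: takes a list of items, returns a transformed list.
    def process(self, items): return items

class TokenizeProcessor(Processor):
    # Stand-in tokenizer; the real notebook uses spaCy plus special tokens.
    def process(self, items):
        return [item.lower().split() for item in items]

class NumericalizeProcessor(Processor):
    # Builds a vocab on first use, then maps each token to its id.
    def __init__(self): self.vocab = None
    def process(self, items):
        if self.vocab is None:
            self.vocab = sorted({tok for toks in items for tok in toks})
            self.otoi = {t: i for i, t in enumerate(self.vocab)}
        return [[self.otoi[tok] for tok in toks] for toks in items]

def process_all(items, processors):
    # Apply each processor in order, feeding its output to the next.
    for proc in processors:
        items = proc.process(items)
    return items

texts = ["Hello world", "Hello again"]
ids = process_all(texts, [TokenizeProcessor(), NumericalizeProcessor()])
print(ids)  # → [[1, 2], [1, 0]]  (vocab sorted: ['again', 'hello', 'world'])
```

The nice part is that the same pipeline can be reused for both the source and target languages of a translation dataset, which is what makes it feel cleaner than the inline preprocessing in the course-nlp notebook.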