Seq2Seq translation based on v3 course?

(Andriy) #1

Has anyone tried to reimplement the seq2seq translation code using the material presented in v3 of the course? It seems all I can find is for fastai v0.7 or something…


(Zachary Mueller) #2

See the NLP course; there is a seq2seq notebook at github.com/fastai/course-nlp

It uses fastai 1.0
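
In case it helps: the model in that notebook is, at heart, an encoder-decoder RNN. A rough sketch of the idea in plain PyTorch (my own simplification, not the notebook’s actual classes; names and sizes are illustrative):

```python
import torch
import torch.nn as nn

class TinySeq2Seq(nn.Module):
    """Minimal encoder-decoder GRU trained with teacher forcing."""
    def __init__(self, src_vocab, tgt_vocab, emb_sz=256, hid_sz=512):
        super().__init__()
        self.enc_emb = nn.Embedding(src_vocab, emb_sz)
        self.encoder = nn.GRU(emb_sz, hid_sz, batch_first=True)
        self.dec_emb = nn.Embedding(tgt_vocab, emb_sz)
        self.decoder = nn.GRU(emb_sz, hid_sz, batch_first=True)
        self.out = nn.Linear(hid_sz, tgt_vocab)

    def forward(self, src, tgt_in):
        # Encode the source sentence; keep only the final hidden state.
        _, h = self.encoder(self.enc_emb(src))
        # Decode the shifted target sequence conditioned on that state.
        dec_out, _ = self.decoder(self.dec_emb(tgt_in), h)
        return self.out(dec_out)  # (batch, tgt_len, tgt_vocab)
```

The notebook itself wraps this kind of model in fastai 1.0 data and training machinery, plus attention in the later sections.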


(Andriy) #3

Yes, https://github.com/fastai/course-nlp/blob/master/7-seq2seq-translation.ipynb is what I had checked before posting. But it doesn’t use the modularized preprocessing from https://github.com/fastai/course-v3/blob/master/nbs/dl2/12_text.ipynb

The latter feels much cleaner to me.
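
What I mean by modularized preprocessing, roughly, is the notebook’s pattern of small stateful processor classes composed in a list. A heavily simplified sketch from memory (the real processors are spacy-backed and handle special tokens and parallel tokenization):

```python
from collections import Counter

class Processor:
    """Base class: a reusable, stateful preprocessing step."""
    def process(self, items): return [self.proc1(item) for item in items]

class TokenizeProcessor(Processor):
    """Stand-in for the real spacy-backed tokenizer in 12_text.ipynb."""
    def proc1(self, text): return text.lower().split()

class NumericalizeProcessor(Processor):
    """Builds a vocab the first time it runs, then maps tokens to ids."""
    def __init__(self, max_vocab=60000, min_freq=2):
        self.vocab, self.max_vocab, self.min_freq = None, max_vocab, min_freq

    def process(self, items):
        if self.vocab is None:  # fit the vocab on the first (training) pass
            freq = Counter(t for toks in items for t in toks)
            self.vocab = ['xxunk', 'xxpad'] + [
                t for t, c in freq.most_common(self.max_vocab) if c >= self.min_freq]
            self.otoi = {t: i for i, t in enumerate(self.vocab)}
        return super().process(items)

    def proc1(self, toks): return [self.otoi.get(t, 0) for t in toks]

# The same processor list can then be applied to any split, in order:
procs = [TokenizeProcessor(), NumericalizeProcessor(min_freq=1)]
items = ["Hello world", "hello seq2seq world"]
for proc in procs: items = proc.process(items)
print(items)  # token ids, e.g. [[2, 3], [2, 4, 3]]
```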


(Andriy) #4

I still can’t reproduce it.

Can someone take a look at https://github.com/akhavr/fastai-lectures-v3-seq2seq/blob/master/seq2seq.ipynb? I might be missing something stupid, since the model doesn’t train at all.
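
For context, the kind of sanity check I’d expect to pass is training on one fixed batch until the loss goes to ~0; if that fails, the bug is in the model/loss wiring rather than the data or schedule. The `model(src, shifted_tgt)` signature and batch-first shapes here are assumptions, so adjust to your notebook:

```python
import torch
import torch.nn.functional as F

def overfit_one_batch(model, xb, yb, lr=1e-3, steps=200):
    """Train on a single fixed batch. If the loss won't drop towards
    zero, suspect the model/loss wiring (shapes, target shifting,
    padding) before anything else."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for i in range(steps):
        logits = model(xb, yb[:, :-1])             # feed shifted targets in
        loss = F.cross_entropy(
            logits.reshape(-1, logits.size(-1)),   # (batch*time, vocab)
            yb[:, 1:].reshape(-1))                 # predict the next token
        opt.zero_grad(); loss.backward(); opt.step()
        if i % 50 == 0: print(i, loss.item())
```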


(Andriy) #5

I’m becoming convinced that something is wrong with the optimization routines in the v3 course.
For now I’ve given up on trying to reproduce seq2seq on fastai-dev v2.
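
If anyone wants to test that hypothesis, the simplest isolation I can think of is swapping the course’s hand-rolled optimizer for stock `torch.optim.Adam` in an otherwise identical loop. In this sketch `model`, `train_dl`, and `loss_func` are placeholders for whatever your notebook builds:

```python
import torch

def train_epoch(model, train_dl, loss_func, opt):
    """One epoch with a standard PyTorch loop; only the optimizer varies."""
    model.train()
    for xb, yb in train_dl:
        loss = loss_func(model(xb, yb[:, :-1]), yb[:, 1:])
        loss.backward()
        opt.step()
        opt.zero_grad()

# If training recovers with this, the hand-rolled optimizer is the
# culprit; if it still flatlines, look at the data pipeline or model.
# opt = torch.optim.Adam(model.parameters(), lr=3e-4)
# train_epoch(model, train_dl, loss_func, opt)
```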


(WG) #6

I’m working on major updates to this, including transformer implementations and Hugging Face integration.
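
To give a flavour of the Hugging Face side (just an illustration with a public pretrained checkpoint via the `transformers` library, not my actual work in progress):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# A public pretrained Marian checkpoint (French -> English) as an example.
name = "Helsinki-NLP/opus-mt-fr-en"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSeq2SeqLM.from_pretrained(name)

batch = tokenizer(["Bonjour tout le monde"], return_tensors="pt", padding=True)
out = model.generate(**batch)
print(tokenizer.batch_decode(out, skip_special_tokens=True))
```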


(Andriy) #7

Is there a “work in progress” snapshot somewhere? Right now the course-nlp repo uses fastai v1, and there’s already a fastai-v1 seq2seq notebook here: https://github.com/fastai/course-nlp/blob/master/7-seq2seq-translation.ipynb


(WG) #8

Not at the moment.
