Reproducibility Challenge 2020 - fastai folks interested

@DanielLam Probably keeping much of the boilerplate in fastai, I think. For example, if the paper proposes a new model architecture, it can probably be dropped into fastai pretty easily without too much concern about whether we use the exact same PyTorch DataLoader, Dataset, etc. (as long as the preprocessing/augmentation etc. are the same, of course)
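A minimal sketch of that idea, in plain PyTorch (all names here are hypothetical, not from any specific paper): the Dataset/DataLoader boilerplate stays standard, and only the model class is swapped out for the paper's proposed architecture.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical stand-in for a paper's proposed architecture --
# this is the only piece that changes when replicating.
class PaperModel(nn.Module):
    def __init__(self, n_in=8, n_out=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_in, 16), nn.ReLU(), nn.Linear(16, n_out)
        )

    def forward(self, x):
        return self.net(x)

# Standard PyTorch data-loading boilerplate -- unchanged between
# the original codebase and a fastai-based replication, as long as
# the preprocessing/augmentation matches.
xs, ys = torch.randn(32, 8), torch.randint(0, 2, (32,))
dl = DataLoader(TensorDataset(xs, ys), batch_size=8)

model = PaperModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One pass over the data, just to show the pieces fit together.
for xb, yb in dl:
    loss = loss_fn(model(xb), yb)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In a fastai replication, the same `PaperModel` would be handed to a `Learner` along with fastai's data pipeline; the point is that the architecture and the data plumbing are independent pieces.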

@tyoc213 ASR/TTS are super interesting alright. I’ve been focussed on NLP recently but would enjoy dipping into another area too

@stefan-ai Yes, Reformer is very cool, it would be a lot of fun to implement alright. Just wondering whether, when it comes to replicating the experiments, the GPU compute needed would be too much? But happy to give it a shot if you think it's manageable! HuggingFace also had a nice blog explaining it: The Reformer - Pushing the limits of language modeling

(@Richard-Wang you should definitely enter your ELECTRA work to the reproducibility challenge too, you’ve done phenomenal work on it so far!)

I’ll have a look around at a few NLP/ASR/TTS papers and try to find some interesting, useful & low-resource ones to replicate. I’ll share here when I do. The Good Readings threads probably have a few contenders:
