It’s not exactly a GAN for text generation, but maybe you’ll find ELECTRA interesting. It follows a similar idea, using a generator and a discriminator for language model pre-training.
@Richard-Wang recently did some amazing work reimplementing ELECTRA from scratch. He probably can say more about it. See the thread below:
Yes, as @stefan-ai said, ELECTRA is actually not a GAN (the discriminator’s loss is never backpropagated through the generator, since sampling replacement tokens is a discrete step), but it is inspired by GANs and successfully applies a GAN-like idea to NLP pre-training.
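To make the “GAN-like but not a GAN” point concrete, here is a toy sketch of ELECTRA’s replaced-token-detection setup. This is not the real implementation: the actual generator is a small masked LM, whereas here a random sampler stands in for it. The point is just the data flow: the corrupted input and the discriminator’s binary labels are built from discrete token samples, so no gradient could flow from the discriminator back into the generator.

```python
import random

random.seed(0)

VOCAB = ["the", "cat", "sat", "on", "mat", "dog", "ran"]

def corrupt(tokens, replace_prob=0.3):
    """Mimic ELECTRA's corruption step: a generator proposes replacement
    tokens at some positions. Here random.choice stands in for sampling
    from a small masked LM; sampling is discrete either way."""
    out = []
    for tok in tokens:
        if random.random() < replace_prob:
            out.append(random.choice(VOCAB))  # generator's sampled token
        else:
            out.append(tok)
    return out

def discriminator_labels(original, corrupted):
    """Replaced-token-detection targets: 1 where the token differs from
    the original, else 0. A sampled token that happens to equal the
    original counts as 'real', as in the ELECTRA paper."""
    return [int(o != c) for o, c in zip(original, corrupted)]

sentence = ["the", "cat", "sat", "on", "the", "mat"]
corrupted = corrupt(sentence)
labels = discriminator_labels(sentence, corrupted)
print(corrupted)
print(labels)
```

The discriminator is then just a per-token binary classifier trained on these labels, while the real generator is trained with its own masked-LM loss; the two losses are summed rather than played adversarially.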
The authors also mention the difficulty of applying GANs to text, citing Language GANs Falling Short. I haven’t found time to read it yet, but it should be related to your question.