Lesson 4 In-Class Discussion ✅

http://academictorrents.com/details/a4fee5547056c845e31ab952598f43b42333183c

3 Likes


In my experience with the wiki103 LM, I have noticed it favors past tense sentences over present tense, which I suspect is because most sentences in Wikipedia articles are in past tense.

3 Likes

What would be a good source for transfer learning to medical notes models?
edit: physicians use a lot of abbreviations

Are LSTMs involved in making the language model?

7 Likes

Sounds like WikiText-103 might be a good start for a transfer learning model

2 Likes

yes, you’ll see them shortly

1 Like

Yes, the backbone of ULMFiT is an AWD-LSTM

1 Like

Rachel’s mic didn’t seem to be tied into the live stream.

And then you’d need to fine-tune your model on your medical scripts corpus.

4 Likes

Has transfer learning for NLP been proven useful for other languages than English?

5 Likes

Is there bias in language models, such as gender or race bias in the embeddings? How do we deal with it?

5 Likes

I will look into this next time.

There are many algorithms, statistical as well as neural (e.g. LSTMs), that can be used to build language models.

1 Like
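To illustrate the statistical side, here is a minimal count-based bigram model that estimates P(next word | previous word). Everything in it (the toy corpus, the function name) is made up for illustration, not from any particular library:

```python
from collections import Counter, defaultdict

def train_bigram_lm(tokens):
    """Count bigrams, then normalize counts into P(next | prev)."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    probs = {}
    for prev, nxt_counts in counts.items():
        total = sum(nxt_counts.values())
        probs[prev] = {w: c / total for w, c in nxt_counts.items()}
    return probs

tokens = "the cat sat on the mat".split()
lm = train_bigram_lm(tokens)
# After "the", the corpus contains "cat" and "mat" once each.
print(lm["the"])  # {'cat': 0.5, 'mat': 0.5}
```

Neural models like LSTMs play the same role (predicting the next token), but share statistical strength across contexts through learned embeddings instead of raw counts.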

Do the wiki-text model and the fine-tuned version share the same vocabulary?
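For context on how the vocabularies can differ: fine-tuning setups typically remap the pretrained embedding matrix onto the new corpus vocabulary, copying rows for shared tokens and initializing unseen tokens from the mean embedding. A rough pure-Python sketch of that idea (the function and data are hypothetical, not any library's actual API):

```python
def remap_embeddings(old_vocab, old_vectors, new_vocab):
    """Build one embedding row per new-vocab token: copy the pretrained
    row when the token is shared, else fall back to the mean vector."""
    old_index = {tok: i for i, tok in enumerate(old_vocab)}
    dim = len(old_vectors[0])
    mean_vec = [sum(v[d] for v in old_vectors) / len(old_vectors)
                for d in range(dim)]
    return [old_vectors[old_index[tok]] if tok in old_index else mean_vec
            for tok in new_vocab]

old_vocab = ["the", "cat", "sat"]
old_vectors = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
new_vocab = ["the", "stethoscope"]  # medical token absent from pretraining
new_vectors = remap_embeddings(old_vocab, old_vectors, new_vocab)
# "the" keeps its pretrained row; "stethoscope" starts at the mean row.
```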

As a matter of fact yes! Check the language model zoo topic :wink:

9 Likes

I can hear you clearly.

1 Like

Curious to see if a similar model for Russian exists. Articles on Wikipedia in other languages can be much more limited in quantity and length.

Yes, always. There are debiasing techniques…

2 Likes

Can transfer learning from wiki text be used for different domains for which the wiki text does not have any context?