Hi! Sorry if this is a n00b question. I’m using @lesscomfortable’s Spanish LM, which he has graciously shared via a GDrive link on the linked GitHub repo. In general, though, is a model like fwd_wt103.h5 useful without the corresponding itos_wt103.pkl?
That is, without mapping the classification task’s vocab to the LM’s vocab, would we get any benefit from the pretrained weights?
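For context on what I mean by the mapping: as I understand it, the itos file defines which row of the LM’s embedding matrix corresponds to which token, so without it there’s no way to line up a new corpus’s vocab with the pretrained rows. A rough sketch of that remapping step (the function and variable names here are my own, not fastai’s actual API):

```python
import numpy as np

def remap_embeddings(pretrained_emb, itos_pretrained, itos_new):
    """Map rows of a pretrained embedding matrix onto a new vocab.

    itos_* are index-to-string lists (the role itos_wt103.pkl plays).
    Tokens missing from the pretrained vocab fall back to the mean row.
    """
    stoi_pre = {s: i for i, s in enumerate(itos_pretrained)}
    mean_row = pretrained_emb.mean(axis=0)
    new_emb = np.zeros((len(itos_new), pretrained_emb.shape[1]),
                       dtype=pretrained_emb.dtype)
    for i, tok in enumerate(itos_new):
        j = stoi_pre.get(tok, -1)
        new_emb[i] = pretrained_emb[j] if j >= 0 else mean_row
    return new_emb

# Toy example: 3-token pretrained vocab, 2-dim embeddings.
pre = np.array([[1., 0.], [0., 1.], [2., 2.]])
emb = remap_embeddings(pre, ["the", "cat", "dog"], ["dog", "the", "xyzzy"])
# "dog" and "the" reuse their pretrained rows; "xyzzy" gets the mean row.
```

Without the itos list, I don’t see how this alignment could be done at all, which is what prompted the question.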