How do I load a pretrained encoder (AWD_LSTM) when the model structure is the same but the layer names differ?

I modified the pretrained AWD_LSTM model, keeping the same structure but renaming some of its layers. Then I defined my own classification model on top of it:

```python
encoder = SentenceEncoder(AWD_LSTM_edited(vocab_sz, **config), bptt)
model = SequentialRNN(encoder, AttentionDecoder())
```

How can I load the pretrained encoder weights, given that the encoder structure is identical but `learn.model.state_dict().keys()` returns different names?
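To frame what I'm after: since the architectures match module-for-module, I'm considering pairing the pretrained values with my model's key names by position. Here is a minimal sketch of that idea (`remap_state_dict` is a hypothetical helper I wrote, not a fastai/PyTorch API; it assumes the two models have their parameters in the same order, differing only in key strings):

```python
def remap_state_dict(pretrained_sd, new_keys):
    """Pair pretrained values with the new model's key names, in order.

    Assumes both models have identical structure, so iterating their
    state dicts yields corresponding parameters at each position.
    """
    if len(pretrained_sd) != len(new_keys):
        raise ValueError("Models do not have the same number of parameters")
    # Keep the pretrained values, swap in the new key names positionally.
    return {new_k: v for new_k, v in zip(new_keys, pretrained_sd.values())}


# Toy illustration with plain dicts (real state dicts map names to tensors):
pretrained = {"rnn.weight_hh_l0": 1, "rnn.bias_hh_l0": 2}
new_names = ["lstm.weight_hh_l0", "lstm.bias_hh_l0"]
remapped = remap_state_dict(pretrained, new_names)
# remapped == {"lstm.weight_hh_l0": 1, "lstm.bias_hh_l0": 2}
```

With the encoder above, I imagine the usage would be roughly: load the pretrained weights with `torch.load(...)`, remap them against `list(encoder.state_dict().keys())`, then call `encoder.load_state_dict(remapped)`. Is positional remapping like this safe, or is there a more idiomatic way in fastai?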