How to use the intermediate output of one model as input to another model in the same Learner?

I'm trying to build an autoencoder trained on the CIFAR-10 dataset and feed the encoded features into a classifier for classification. My issue is that I want to train both models in the same training loop, as described in this thread:
https://discuss.pytorch.org/t/autoencoder-and-classification-inside-the-same-model/36248

This means I need to write a custom training loop, but I want to wrap it in a fastai Learner so I can get all the nice features that come with it.

Strangely, I am not finding much information on either the forums or in the docs.

I understand that if we have one model, which takes some input and gives some output, it's pretty convenient to wrap it in a learner by doing learn = Learner(data=data, model=model). But in my case, as shown in the link, I have three models: the encoder, the decoder, and the classifier. The output of the encoder serves as the input to both the classifier and the decoder, and the two branches have different losses and update steps. How can I wrap this training loop in a Learner?
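
For concreteness, here is roughly the structure I have in mind (just a sketch on my part; Encoder, Decoder and Classifier stand for my own modules, and alpha is a made-up weight on the classification loss):

import torch.nn as nn
import torch.nn.functional as F

class AutoEncoderClassifier(nn.Module):
    def __init__(self, encoder, decoder, classifier):
        super().__init__()
        self.encoder = encoder
        self.decoder = decoder
        self.classifier = classifier

    def forward(self, x):
        z = self.encoder(x)           # shared encoded features
        recon = self.decoder(z)       # reconstruction branch
        preds = self.classifier(z)    # classification branch
        return recon, preds, x        # also return x so the loss can see it

def combined_loss(outputs, y, alpha=1.0):
    # reconstruction loss against the input plus a weighted classification loss
    recon, preds, x = outputs
    return F.mse_loss(recon, x) + alpha * F.cross_entropy(preds, y)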

How can I use callbacks to do optimization steps for two models at once? I'm a beginner in PyTorch, and this is kind of frustrating. Please help.

You could do something like this:

Net = nn.Sequential(model3, model2, model1)
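
If you wrap everything in a single module that returns both outputs (like the sketch in the first post), then one model plus a custom loss_func is enough, and a single optimizer updates the encoder, decoder and classifier together, so you shouldn't need callbacks for the basic case. Roughly (untested, assuming a fastai v1 style DataBunch called data and the AutoEncoderClassifier/combined_loss sketch above):

learn = Learner(data, AutoEncoderClassifier(encoder, decoder, classifier),
                loss_func=combined_loss)
learn.fit_one_cycle(5, 1e-3)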

So far, I think this chops off part of the network and appends a new head:

custom_head = nn.Sequential(AdaptiveConcatPool2d(), Flatten())
new_model = nn.Sequential(*list(children(old_model[:-20])), custom_head)

But I have no idea how to use new_model. Does anyone know how to “put back” the model into a fastai cnn_learner?
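
Would something along these lines work, using the generic Learner instead of cnn_learner (just guessing here, with data being my DataBunch)?

learn = Learner(data, new_model, loss_func=nn.CrossEntropyLoss(), metrics=accuracy)
learn.fit_one_cycle(3)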

I was able to extract part of a model, but the model object is not the same as the ones used by default, like models.resnet34.

Might be better to create your own thread?

Have you been through part 2? This will be explained in more detail there.