If the inputs to my model are x, y, z and my outputs are continuous variables a, b, c, d, I can obviously use the model to predict the vector [a, b, c, d] from [x, y, z]. However, what happens if I find myself in a situation where I already know, say, the value of b before inference? Can I run the network in a manner such that I am predicting [a, c, d] based on [x, y, z, b]?

Real-world example: I have an image with pixel locations (x, y) and pixel values (R, G, B). I build a neural implicit model to predict pixel values from pixel locations. Say I now have the green values for some pixels, as well as their locations. Can I use these green values as a prior with my original network to get an improved result? Note that I am not interested in training a new network on said data.

In mathematical terms: I have a network f(x, y, z) → (a, b, c, d); how can I perform f(x, y, z | b) → (a, c, d)?

I have not tried much here; I've thought of maybe passing the prior value back through the network, but I'm kinda lost.


I'm a novice, just throwing out an answer so I can prune my divergent thinking when someone corrects me. Based on what I've picked up over the last six months: you are changing the inputs from [x, y, z] to [x, y, z, b]. You would need to reshape the inputs** to add a new input node. That node would have random weights linked to it, so the model would pay no attention to the new input without further training.

**P.S. I'm guessing "reshaping the inputs" might be done similarly to how fine-tuning reshapes the output layer of a pretrained model, but I've not seen it discussed anywhere. I'm curious, but have no time right now to search.
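For what it's worth, here is a rough sketch of what that reshaping could look like in PyTorch. The toy network, sizes, and variable names are all made up for illustration; one tweak on the "random weights" idea is to zero-initialize the new input column instead, so the expanded model behaves identically to the original until it is fine-tuned:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-in for the original trained network f: (x, y, z) -> (a, b, c, d).
old_first = nn.Linear(3, 16)
rest = nn.Sequential(nn.Tanh(), nn.Linear(16, 4))

# New first layer with one extra input for the known value b.
new_first = nn.Linear(4, 16)
with torch.no_grad():
    new_first.weight[:, :3] = old_first.weight  # keep the trained weights
    new_first.weight[:, 3] = 0.0                # new input is ignored at first
    new_first.bias.copy_(old_first.bias)

xyz = torch.rand(5, 3)
b = torch.rand(5, 1)
old_out = rest(old_first(xyz))
new_out = rest(new_first(torch.cat([xyz, b], dim=1)))
```

With the zero-initialized column, `old_out` and `new_out` match exactly, so nothing is lost before fine-tuning begins.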

While your approach could work, it is exactly the approach I am trying to avoid, as it requires training a new network on a new input dataset, i.e. going from (x, y, z) to (x, y, z, b). I am trying to see whether there are any differentiable programming techniques that can solve my problem while keeping the original network unchanged.
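One direction I have been considering looks like this (a hedged PyTorch sketch with a toy stand-in for the trained implicit model; `f`, `coords`, `g_prior`, and the perturbation idea are all assumptions, not a known-good recipe): keep the network frozen and instead run gradient descent over a small perturbation of the input coordinates, so that the predicted green channel matches the prior, then read off the other channels at the perturbed inputs:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-in for a pretrained implicit model f: (x, y) -> (R, G, B).
# In practice this would be the already-trained network, loaded and frozen.
f = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 3))
for p in f.parameters():
    p.requires_grad_(False)  # the original network stays untouched

coords = torch.rand(10, 2)   # known pixel locations (x, y)
g_prior = torch.rand(10)     # known green values at those locations

# Optimize a small input perturbation so the frozen network's green
# output matches the prior, while staying close to the true coordinates.
delta = torch.zeros_like(coords, requires_grad=True)
opt = torch.optim.Adam([delta], lr=1e-2)

for step in range(500):
    opt.zero_grad()
    out = f(coords + delta)
    green_loss = ((out[:, 1] - g_prior) ** 2).mean()
    loss = green_loss + 1e-3 * (delta ** 2).mean()  # proximity penalty
    loss.backward()
    opt.step()

rgb = f(coords + delta).detach()  # R and B read off at the perturbed inputs
```

Whether the red and blue channels actually improve this way presumably depends on how correlated the channels are in the learned function; I have not verified that part.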