Let’s say I want to do a regression task with an MSE loss function, but I also want to give more weight to certain observations. How could I pass the weights for a batch to the loss function in fastai?
If you write a custom collate function that returns input, [target, weight], you can have a custom loss function handle this. I don’t know when your weights are determined, but when building the batch seems like a good time.
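As a rough illustration of this suggestion (the function names `weighted_collate` and `weighted_mse_loss` are made up here, and this assumes each dataset item is an `(x, y, w)` tuple with the weights known up front), the collate/loss pair could look something like:

```python
import torch

def weighted_collate(batch):
    # Each item in `batch` is assumed to be an (input, target, weight) tuple;
    # return the inputs as one tensor and [targets, weights] as the "label".
    xs, ys, ws = zip(*batch)
    return torch.stack(xs), [torch.stack(ys), torch.stack(ws)]

def weighted_mse_loss(pred, target, weight):
    # Per-observation squared error, scaled by each observation's weight.
    return (weight * (pred - target) ** 2).mean()
```

The key point is that whatever the collate function packs alongside the target is what the loss function will have available to unpack.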
Let’s say I already know the weights of my observations in advance; for example, I want to put less weight on observations further in the past. So I know all the weights before putting the data through fastai.
In this situation, I am not sure I understand how I would do that with a custom collate function.
The only way I can think of right now is that I would write a custom Module for my model where the forward method contains two tensor parameters, one for the actual data and another for the weights:
def forward(self, data: Tensor, weights: Tensor):
Then I guess I would have access to that in the loss function?
Or you could add them to your targets. The only downside is that each target would then be a tensor of [value, weight] instead of just the value, but you can extract the first/last element in your loss function.
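A minimal sketch of that idea, assuming each target is a tensor whose last column is the weight (the function name `mse_with_embedded_weight` is just for illustration):

```python
import torch

def mse_with_embedded_weight(pred, target_with_weight):
    # Each target row is assumed to be [value, weight]; split them apart,
    # then compute the weight-scaled MSE against the prediction.
    target = target_with_weight[..., 0]
    weight = target_with_weight[..., 1]
    return (weight * (pred - target) ** 2).mean()
```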
Makes sense! I’m trying to add the weights to my targets using label_from_func, but I’m realizing that the func gets called with the index of the dataframe instead of a row of the dataframe. I see in the fastai code that there’s a _label_from_list method, but since it begins with an _, I guess it is a private method. What would be the best way for me to add weights to my targets? Here is my code for reference.
Got this working using the private method _label_from_list, but I was wondering: should I use one of the public methods instead, or maybe create a new public method to handle this scenario?
How would this work for classes/categorical targets?
Then I can’t have a FloatList; also, I think I need a CategoryList so that the classes are determined correctly?
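One way the categorical case could work (a sketch, not a confirmed fastai pattern: keep the class index in the CategoryList as usual, pass the per-observation weights alongside, and scale an unreduced cross-entropy; the name `weighted_ce_loss` is hypothetical):

```python
import torch
import torch.nn.functional as F

def weighted_ce_loss(pred, target, weight):
    # Compute per-sample cross-entropy (no reduction), then scale each
    # sample's loss by its observation weight before averaging.
    per_sample = F.cross_entropy(pred, target, reduction='none')
    return (per_sample * weight).mean()
```

This keeps the targets as plain class indices, so the class list is still determined correctly; only the loss needs to see the weights.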