I am trying to evaluate a trained model in v1, which I assume should be done by calling the `validate(...)` function (possibly with a different set of metrics or a different dataloader than the ones used during training). I was only able to make it work with a call of the form `validate(learn.model, learn.data.valid_dl, metrics=[accuracy], loss_fn=learn.loss_fn, cb_handler=CallbackHandler(learn.callbacks))`. A few observations:
- It took me a while to figure out that `validate(...)` should not be called with `loss_fn=None`, as this would lead `loss_batch` to return the data in a different format. Is this the intended behavior?
- The `metrics` argument has to be passed as a list, unlike in `fit(...)`, which seems to provide a workaround if one passes just a single metric.
- Calling `validate` without a `cb_handler` did not work for me (the shape of the input tensor seems to be modified by one of the callbacks, although I didn't figure out where exactly).
- Given these issues, is there a reason why there is no `validate` member function in the `Learner` class, very much like the `get_preds` member function? Ideally it would also allow passing a custom dataloader and a custom list of metrics to evaluate.
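To make the first point concrete, here is an illustrative stub of the behavior described above. This is an assumption about what `loss_batch` does, not the actual fastai source: with a loss function it returns a loss value, and without one it hands back the raw predictions and targets, so downstream code sees a different format depending on `loss_fn`.

```python
# Hypothetical sketch of the loss_fn=None behavior (not fastai's real code).

def loss_batch(model, xb, yb, loss_fn=None):
    """Run one batch; return a loss if loss_fn is given, else (preds, yb)."""
    preds = model(xb)
    if loss_fn is None:
        # No loss to compute: return predictions and targets instead,
        # which is a different shape of result than a scalar loss.
        return preds, yb
    return loss_fn(preds, yb)
```

This is why passing `loss_fn=None` silently changes what the caller gets back, rather than raising an error.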
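To sketch what such a member function could look like: the following is a hypothetical, self-contained illustration, not the fastai API. The `validate` function and `CallbackHandler` below are simplified stand-ins (the stub only averages metrics and merely accepts `loss_fn`/`cb_handler` to mirror the real signature); the point is the wrapper pattern of defaulting the dataloader, loss function, and callback handler from the learner, and normalizing a single metric into a list the way `fit(...)` appears to.

```python
# Hypothetical sketch of a Learner.validate member function.
# All names below are stand-ins for the real fastai objects.

class CallbackHandler:
    """Simplified stand-in for fastai's CallbackHandler."""
    def __init__(self, callbacks):
        self.callbacks = callbacks

def validate(model, dl, metrics, loss_fn=None, cb_handler=None):
    """Stand-in for the top-level validate(): averages each metric over
    the batches of the dataloader. loss_fn/cb_handler are accepted only
    to mirror the real signature; this stub does not use them."""
    totals = [0.0] * len(metrics)
    count = 0
    for xb, yb in dl:
        preds = model(xb)
        for i, metric in enumerate(metrics):
            totals[i] += metric(preds, yb)
        count += 1
    return [t / count for t in totals]

class Learner:
    def __init__(self, model, data, loss_fn=None, callbacks=()):
        self.model, self.data = model, data
        self.loss_fn, self.callbacks = loss_fn, list(callbacks)

    def validate(self, dl=None, metrics=None):
        """Hypothetical member function: fills in sensible defaults so
        the caller need not wire up loss_fn/cb_handler by hand."""
        dl = dl if dl is not None else self.data.valid_dl
        # Normalize a single metric into a list, like fit(...) seems to do.
        if metrics is not None and not isinstance(metrics, (list, tuple)):
            metrics = [metrics]
        return validate(self.model, dl, list(metrics or []),
                        loss_fn=self.loss_fn,
                        cb_handler=CallbackHandler(self.callbacks))
```

With something along these lines, `learn.validate(metrics=accuracy)` would cover the common case without the caller manually constructing a `CallbackHandler` or remembering to wrap a single metric in a list.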
I think the new library is a great step forward. Thank you for your hard work so far.