I am trying to evaluate a trained model in v1, which I assume should be done by calling the `validate(...)` function (possibly with a different set of metrics or a different dataloader than the one used during training). I was only able to get it working with a fairly specific call, and I ran into a few issues along the way:
- It took me a while to figure out that `validate(...)` should not be called with `loss_fn=None`, as this causes `loss_batch` to return the data in a different format. Is this the intended behavior?
- The `metrics` argument has to be passed as a list, unlike in `fit(...)`, which seems to provide a workaround if one passes just a single metric.
- Passing a `cb_handler` did not work for me: the shape of the input tensor seems to be modified by one of the callbacks, although I didn't figure out where exactly.
- Given these issues, is there a reason why there is no `validate` member function in the `Learner` class, much like the `get_preds` member function? Ideally it would also allow passing a custom dataloader and a custom list of metrics to evaluate.
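To make the request concrete, here is a minimal sketch of what such a member function could look like. This is **not** actual fastai code: the class, attribute names, and evaluation loop below are simplified stand-ins I made up for illustration, loosely following fastai v1 naming conventions (`valid_dl`, `metrics`).

```python
class Learner:
    """Toy stand-in for a learner, used only to sketch the proposed API."""

    def __init__(self, model, valid_dl, metrics):
        self.model = model
        self.valid_dl = valid_dl
        self.metrics = metrics  # default metrics, stored as a list

    def validate(self, dl=None, metrics=None):
        """Evaluate the model on `dl` (default: the validation dataloader)
        with `metrics` (default: the learner's own metrics)."""
        dl = dl if dl is not None else self.valid_dl
        metrics = metrics if metrics is not None else self.metrics
        # Accept a single metric as well as a list, avoiding the
        # list-only restriction described above.
        if not isinstance(metrics, (list, tuple)):
            metrics = [metrics]
        totals = [0.0] * len(metrics)
        count = 0
        for xb, yb in dl:
            preds = self.model(xb)
            n = len(yb)
            # Weight each batch metric by batch size for a correct average.
            for i, m in enumerate(metrics):
                totals[i] += m(preds, yb) * n
            count += n
        return [t / count for t in totals]


# Toy usage: an "identity" model and a simple accuracy metric on lists.
def accuracy(preds, targs):
    return sum(p == t for p, t in zip(preds, targs)) / len(targs)

dl = [([1, 2], [1, 0]), ([3], [3])]  # two batches of (inputs, targets)
learn = Learner(model=lambda xb: xb, valid_dl=dl, metrics=[accuracy])
result = learn.validate()                    # uses stored dl and metrics
result_single = learn.validate(metrics=accuracy)  # single metric also works
```

The point is just the interface: default to the learner's own dataloader and metrics, but allow overriding both per call, mirroring how `get_preds` lets you choose which dataset to run on.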
I think the new library is a great step forward. Thank you for your hard work so far.