Error while doing: learn.get_preds(DatasetType.Train, with_loss=True)

Doing
learn.get_preds(DatasetType.Train);
runs fine, but if I do
learn.get_preds(DatasetType.Train, with_loss=True);
I get
TypeError: mean_squared_logarithmic_error() got an unexpected keyword argument 'reduction'

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-40-2f4cff9693e6> in <module>()
----> 1 preds, loss= learn.get_preds(DatasetType.Train, with_loss=True);

1 frames
/usr/local/lib/python3.6/dist-packages/fastai/basic_train.py in get_preds(model, dl, pbar, cb_handler, activ, loss_func, n_batch)
     44            zip(*validate(model, dl, cb_handler=cb_handler, pbar=pbar, average=False, n_batch=n_batch))]
     45     if loss_func is not None:
---> 46         with NoneReduceOnCPU(loss_func) as lf: res.append(lf(res[0], res[1]))
     47     if activ is not None: res[0] = activ(res[0])
     48     return res

TypeError: mean_squared_logarithmic_error() got an unexpected keyword argument 'reduction'
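
The root cause seems to be how fastai builds the per-sample losses: with with_loss=True, get_preds wraps the loss function in NoneReduceOnCPU, and for a plain Python function (one without a reduction attribute) that wrapper simply calls loss_func(preds, targets, reduction='none'). The stock PyTorch/fastai losses accept that keyword; a hand-written mean_squared_logarithmic_error does not, hence the TypeError. One fix is to give the custom loss a reduction argument. The sketch below assumes the usual MSLE formula, since the original definition of the function isn't shown here:

import torch

def mean_squared_logarithmic_error(pred, targ, reduction='mean'):
    # element-wise squared difference of log(1 + x); adapt to match the original definition
    loss = (torch.log1p(pred) - torch.log1p(targ)) ** 2
    if reduction == 'mean': return loss.mean()
    if reduction == 'sum':  return loss.sum()
    return loss  # reduction='none': one loss value per prediction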

I found this workaround:

ypred, yreal = learn.get_preds(DatasetType.Train)
loss = mean_squared_logarithmic_error(ypred, yreal)  # the MSLE over all predictions together
L = [mean_squared_logarithmic_error(yp, yr) for yp, yr in zip(ypred, yreal)]  # individual losses
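
With a reduction-aware loss like the sketch above, the original call should work again, and the per-sample losses come back directly as the third element (assuming the fix is applied to the loss the learner was created with):

preds, targets, losses = learn.get_preds(DatasetType.Train, with_loss=True)  # losses: one value per training row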

I’d like to know if anyone knows a better way to do this, or what I am doing wrong here. Specifically: how do I get the individual predictions and losses on a new dataset without doing it row by row?

I know, this is a bit messy.

  from torch import Tensor  # usually already in scope via the fastai imports

  # Predict one row at a time; [0] keeps the predicted item
  P = [learn.predict(train.iloc[i])[0] for i in train.index]
  # Convert each item to a plain number
  P2 = [(i.data)[0] for i in P]
  P2_Tensor = Tensor(P2)
  COLUMN = Tensor(train[dep_var].tolist())
  # The loss function needs tensors, so pair the two tensors element by element
  L = [mean_squared_logarithmic_error(yp, yr).tolist() for yp, yr in zip(P2_Tensor, COLUMN)]  # individual losses
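
For a genuinely new dataframe, the row-by-row learn.predict loop can usually be avoided by attaching the new rows as a test set and letting get_preds run over them in batches. This is only a sketch under assumptions: it presumes a tabular learner (the snippet above indexes a dataframe); new_df, cat_names, cont_names, procs and valid_idx stand in for however the original DataBunch was built; and it reuses the reduction-aware loss sketched earlier.

from fastai.tabular import *

data = (TabularList.from_df(train, cat_names=cat_names, cont_names=cont_names, procs=procs)
        .split_by_idx(valid_idx)
        .label_from_df(cols=dep_var)
        .add_test(TabularList.from_df(new_df, cat_names=cat_names, cont_names=cont_names))
        .databunch())
learn.data = data                             # point the existing learner at the new DataBunch
preds, _ = learn.get_preds(DatasetType.Test)  # batched predictions for every row of new_df
targets = Tensor(new_df[dep_var].tolist())
individual = mean_squared_logarithmic_error(preds.squeeze(), targets, reduction='none')  # per-row losses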

In any case, I am sure I will end up back here at some point when searching for this error.