I was wondering if the problem with sampling is due to the fact that this is a time series, so there is a time/order dependency between the rows, and with shuffling or random sampling we would lose this order (I don't know the right name for this).
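For what it's worth, here is a minimal sketch of what a chronological (non-shuffled) split could look like, using plain NumPy rather than any fastai splitter. The array and the 80/20 cut point are made up for illustration; the point is just that the validation rows all come after the training rows in time:

```python
import numpy as np

# Stand-in for rows that are already in chronological order.
n_rows = 1000
rows = np.arange(n_rows)

# Split by position instead of sampling at random, so no "future"
# rows leak into the training set.
cut = int(n_rows * 0.8)          # first 80% for training
train_idx = rows[:cut]           # earliest rows
valid_idx = rows[cut:]           # latest rows

# Every training index precedes every validation index.
assert train_idx.max() < valid_idx.min()
```

With a split like this you would skip shuffling entirely for the validation set, which is the usual practice for forecasting-style problems (whether it matters for whole-series classification is a separate question).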
I'm using this dataset from the timeseries classification website. I checked the predictions of my model and I see it's outputting some 1s, so it probably learned something:
>> learner.get_preds(DatasetType.Valid)
[tensor([[9.9425e-01, 5.7537e-03],
[7.2724e-01, 2.7276e-01],
[9.9526e-01, 4.7443e-03],
[9.9658e-01, 3.4218e-03],
[6.0188e-01, 3.9812e-01],
[9.9603e-01, 3.9694e-03],
[9.8937e-01, 1.0634e-02],
[9.8595e-01, 1.4051e-02],
[6.0619e-01, 3.9381e-01],
[9.5675e-01, 4.3249e-02],
[9.9305e-01, 6.9478e-03],
[9.5982e-01, 4.0185e-02],
[8.4680e-01, 1.5320e-01],
[9.7382e-01, 2.6183e-02],
[9.9200e-01, 8.0023e-03],
[7.2946e-01, 2.7054e-01],
[9.0504e-01, 9.4956e-02],
[9.9728e-01, 2.7198e-03],
[9.6200e-01, 3.7998e-02],
[8.4773e-01, 1.5227e-01],
[8.2070e-01, 1.7930e-01],
[9.9535e-01, 4.6532e-03],
[9.0245e-01, 9.7546e-02],
[9.6488e-01, 3.5121e-02],
[9.6708e-01, 3.2921e-02],
[9.1114e-01, 8.8860e-02],
[6.7380e-01, 3.2620e-01],
[9.9140e-01, 8.6026e-03],
[9.9147e-01, 8.5279e-03],
[9.9574e-01, 4.2569e-03],
[6.7916e-01, 3.2084e-01],
[8.1221e-01, 1.8779e-01],
[7.2926e-01, 2.7074e-01],
[6.7195e-01, 3.2805e-01],
[9.7636e-01, 2.3637e-02],
[9.8936e-01, 1.0645e-02],
[9.8023e-01, 1.9773e-02],
[9.8331e-01, 1.6693e-02],
[9.8056e-01, 1.9441e-02],
[8.0585e-01, 1.9415e-01],
[8.5409e-01, 1.4591e-01],
[9.6576e-01, 3.4244e-02],
[5.7844e-01, 4.2156e-01],
[8.5839e-01, 1.4161e-01],
[8.3605e-01, 1.6395e-01],
[7.9539e-01, 2.0461e-01],
[9.9599e-01, 4.0103e-03],
[8.9708e-01, 1.0292e-01],
[9.9833e-01, 1.6663e-03],
[9.9922e-01, 7.7785e-04],
[9.3736e-01, 6.2635e-02],
[9.7506e-01, 2.4941e-02],
[9.9683e-01, 3.1704e-03],
[9.6329e-01, 3.6711e-02],
[9.9205e-01, 7.9503e-03],
[8.9988e-01, 1.0012e-01],
[3.4490e-01, 6.5510e-01],
[9.9277e-01, 7.2321e-03],
[9.4736e-01, 5.2636e-02],
[8.4546e-01, 1.5454e-01],
[9.8601e-01, 1.3993e-02],
[9.6343e-01, 3.6571e-02],
[9.6380e-01, 3.6198e-02],
[5.7684e-01, 4.2316e-01],
[7.6970e-01, 2.3030e-01],
[5.4828e-01, 4.5172e-01],
[9.5975e-01, 4.0253e-02],
[6.9527e-01, 3.0473e-01],
[8.5458e-01, 1.4542e-01],
[9.9969e-01, 3.0637e-04],
[9.5228e-01, 4.7721e-02],
[9.5492e-01, 4.5078e-02],
[9.8068e-01, 1.9323e-02],
[7.1458e-01, 2.8542e-01],
[5.6506e-01, 4.3494e-01],
[9.8045e-01, 1.9546e-02],
[9.2896e-01, 7.1041e-02],
[9.9604e-01, 3.9574e-03],
[9.8500e-01, 1.4995e-02],
[9.3539e-01, 6.4615e-02],
[6.9669e-01, 3.0331e-01],
[8.9084e-01, 1.0916e-01],
[9.2574e-01, 7.4258e-02],
[9.9943e-01, 5.7221e-04],
[9.5959e-01, 4.0410e-02],
[9.5426e-01, 4.5743e-02],
[9.8531e-01, 1.4690e-02],
[9.9888e-01, 1.1250e-03],
[6.8742e-01, 3.1258e-01],
[9.9715e-01, 2.8496e-03],
[7.7061e-01, 2.2939e-01],
[6.5534e-01, 3.4466e-01],
[4.4688e-01, 5.5312e-01],
[8.8147e-01, 1.1853e-01],
[9.9980e-01, 1.9565e-04],
[7.6115e-01, 2.3885e-01],
[9.9205e-01, 7.9482e-03],
[6.2418e-01, 3.7582e-01],
[9.5457e-01, 4.5432e-02],
[9.3219e-01, 6.7809e-02],
[9.7844e-01, 2.1556e-02],
[8.5520e-01, 1.4480e-01],
[9.2151e-01, 7.8488e-02],
[9.9145e-01, 8.5452e-03],
[6.2208e-01, 3.7792e-01],
[9.5117e-01, 4.8829e-02],
[5.7008e-01, 4.2992e-01],
[9.8578e-01, 1.4219e-02],
[9.9276e-01, 7.2404e-03],
[6.2644e-01, 3.7356e-01],
[9.4103e-01, 5.8970e-02],
[2.6042e-01, 7.3958e-01],
[8.4114e-01, 1.5886e-01],
[9.9983e-01, 1.7048e-04],
[9.8680e-01, 1.3196e-02],
[8.4676e-01, 1.5324e-01],
[9.6020e-01, 3.9797e-02],
[8.7532e-01, 1.2468e-01],
[9.8866e-01, 1.1343e-02],
[8.3955e-01, 1.6045e-01],
[8.9132e-01, 1.0868e-01],
[9.9871e-01, 1.2883e-03],
[9.8665e-01, 1.3352e-02],
[7.1213e-01, 2.8787e-01],
[9.3852e-01, 6.1476e-02],
[2.2901e-01, 7.7099e-01],
[9.4070e-01, 5.9303e-02],
[9.7276e-01, 2.7241e-02],
[7.0276e-01, 2.9724e-01],
[9.7210e-01, 2.7895e-02],
[9.9671e-01, 3.2880e-03],
[5.3830e-01, 4.6170e-01],
[8.5895e-01, 1.4105e-01],
[8.7945e-01, 1.2055e-01],
[9.9073e-01, 9.2734e-03],
[9.6120e-01, 3.8799e-02],
[5.0640e-01, 4.9360e-01],
[5.9930e-01, 4.0070e-01],
[9.7728e-01, 2.2716e-02]]),
tensor([0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0,
1, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 0,
0, 0, 1, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1,
0, 1, 0, 0, 0, 0, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0,
0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0,
0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 1, 1, 0, 1, 0, 1, 0, 0])]
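To sanity-check the output above: the first tensor holds per-class probabilities (column 0 for class 0, column 1 for class 1) and the second tensor is the targets, so the predicted class for each row is just the argmax over the columns. A small sketch with a few made-up rows shaped like the ones above:

```python
import torch

# Hypothetical rows in the same shape as the get_preds output:
# column 0 = P(class 0), column 1 = P(class 1).
preds = torch.tensor([[0.9943, 0.0057],
                      [0.3449, 0.6551],
                      [0.2604, 0.7396]])

# Predicted class per row: index of the larger probability.
pred_classes = preds.argmax(dim=1)
print(pred_classes)  # tensor([0, 1, 1])
```

Comparing `pred_classes` against the target tensor (e.g. with `(pred_classes == targets).float().mean()`) would give the validation accuracy directly.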