Using mining GPUs for deep learning

I am wondering if anyone here has tried using, for instance, MSI P106-100 mining cards for deep learning.
I think the P106 is equivalent to the GTX 1060 according to this site:

Since there is an aftermarket with quite cheap cards, it would be interesting to build a deep learning rig from unused mining gear.
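
A minimal sketch of how one might check whether a mining card like the P106-100 is even usable, assuming the standard NVIDIA driver and a CUDA build of PyTorch are installed (this is just a smoke test I'd run, not something confirmed for the P106 specifically):

```python
import torch

# List every CUDA device the driver exposes; a P106-100 should show up
# as a GP106 board (same chip as the GTX 1060) if the driver accepts it.
if not torch.cuda.is_available():
    print("No CUDA device visible -- check driver / CUDA install")
else:
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, "
              f"{props.total_memory / 1024**3:.1f} GB, "
              f"compute capability {props.major}.{props.minor}")

    # Tiny smoke test: run a matmul on the first GPU to confirm it computes.
    x = torch.randn(1024, 1024, device="cuda:0")
    y = x @ x
    torch.cuda.synchronize()
    print("Matmul OK:", y.shape)
```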

I’m doing my DL research on a rig of 1080 Tis :slight_smile:


I wonder if four P106s (think GTX 1060) would be better than one 2080 Ti.

1060: 6 GB VRAM
2080 Ti: 11 GB VRAM

Go figure…
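
One thing worth keeping in mind with four small cards vs. one big one: the memory does not pool. With plain data parallelism each replica of the model still has to fit in a single card's 6 GB, only the batch gets split. A rough sketch of what that looks like, assuming PyTorch and a toy model I made up for illustration:

```python
import torch
import torch.nn as nn

# Toy convnet (hypothetical), replicated across the available GPUs.
# Each replica must fit in one card's 6 GB -- four P106s do not behave
# like a single 24 GB device.
model = nn.Sequential(
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 10),
)

if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)   # replicas on cuda:0, cuda:1, ...
model = model.cuda()

# A batch of 256 images gets sliced into 4 chunks of 64, one per GPU.
images = torch.randn(256, 3, 224, 224).cuda()
logits = model(images)
print(logits.shape)  # torch.Size([256, 10])
```

So the four-card rig can win on aggregate throughput for models that fit in 6 GB, but the 2080 Ti lets you train bigger models (or bigger per-GPU batches) that simply won't fit on a P106.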