Hey folks,
Has anyone tried (or is thinking about) putting together an external GPU kit so you won’t have to use AWS for training models?
I know it’s possible but haven’t dug deep enough myself yet.
You’d be better off buying a cheap desktop and putting a GPU in it rather than adding an external GPU to a MacBook. One eGPU option is here: https://bizon-tech.com/us/bizonbox2s-egpu.html/ . As you can see there, macOS doesn’t support recent GPUs, and you’d definitely want a recent GPU, because they are much better for deep learning.
For the same price you can buy a powerful used desktop, e.g.: http://www.used-pcs.com/dell-xps-8700-twr-core-i7-4770-cpu-3ghz-16gb-ram-160gb-hd-cosmetic-grade-b-22840151/ . Install Linux on it and add a recent GPU (e.g. a GTX 1070 or 1080); a quick sanity check for the setup is sketched below.
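Once Linux and the NVIDIA driver/CUDA stack are installed, a minimal sanity check looks something like this (assuming you use PyTorch; any framework has an equivalent):

```python
# Quick check that the driver and CUDA toolkit actually see the card.
# Assumes PyTorch is installed; this is a sketch, not a full setup guide.
import torch

print(torch.cuda.is_available())      # True if the driver/CUDA stack works
print(torch.cuda.get_device_name(0))  # e.g. "GeForce GTX 1070"
```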
It’s possible now with the Razer Core; however, I found it much cheaper to just go with an EC2 instance on Amazon (a p2.xlarge has a Tesla K80 for $0.90 an hour…).
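For anyone weighing this up, the break-even arithmetic is simple. The $0.90/hour rate is from the post above; the build cost is my own rough assumption (used desktop plus a recent GPU):

```python
# Break-even: how many p2.xlarge hours the local build would buy.
build_cost = 1100.00  # USD, assumed: used desktop + GTX 1070-class GPU
ec2_rate = 0.90       # USD/hour, p2.xlarge on-demand (from the post above)

breakeven_hours = build_cost / ec2_rate
print(f"Break-even after ~{breakeven_hours:.0f} GPU-hours")  # ~1222 hours
```

So cloud wins unless you expect well over a thousand hours of training time (and storage/data-transfer fees shift the balance further in either direction).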
That has changed now: Pascal drivers are available: http://www.nvidia.com/download/driverResults.aspx/117854/en-us . I’m a laptop user as well and would love to have a fast GPU for deep learning available locally, like the 1080 Ti.
Some folks mentioned that the throughput of a Thunderbolt 3 cable is inferior to putting the GPU in a normal desktop’s PCIe slot. I wonder if this really is an argument against eGPUs for deep learning, since the data is transferred only once and then the number crunching starts (see the sketch below).
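One way to test that intuition is to time the host-to-GPU transfer against the actual compute for a batch. This is a rough sketch (the model and batch sizes are arbitrary assumptions, just to illustrate the ratio being discussed, and it needs a CUDA GPU to run):

```python
# Compare once-per-batch transfer time vs. forward/backward compute time.
import time
import torch

model = torch.nn.Linear(4096, 4096).cuda()
batch = torch.randn(256, 4096)  # in host RAM, like a data-loader batch

torch.cuda.synchronize()
t0 = time.time()
batch_gpu = batch.cuda()        # the PCIe/Thunderbolt transfer in question
torch.cuda.synchronize()
t1 = time.time()
loss = model(batch_gpu).sum()   # the number crunching
loss.backward()
torch.cuda.synchronize()
t2 = time.time()

print(f"transfer: {(t1 - t0) * 1e3:.2f} ms, compute: {(t2 - t1) * 1e3:.2f} ms")
```

If compute dominates transfer for your workload, the Thunderbolt bandwidth penalty should matter much less; models that stream large batches of data relative to their compute would feel it more.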