Second Hand Hardware Purchase Guide/Discussion

Hi Everyone!

I've created yet another discussion thread about hardware, even though the previous ones are pretty great, because I think this topic warrants a discussion of its own.

Earlier @balnazzar had suggested that I look for second hand hardware. I was completely dismissive of it, mostly since India doesn't have a great GPU market even for new cards, forget about second hand ones.

After reading up on the prices and the expected ranges, I think it could actually be a good bet if you're travelling to the US and can bring one back with you.

Here are my thoughts: even if I get a GPU for around $300-500 and it dies in a year or two, comparing that with AWS costs it's still a good deal.

So I'll invite experts on this matter to share their ideas: how to judge prices, how to evaluate a card, where to look (websites/stores/etc.) when getting one, and any warnings about particular GPUs or other hardware.

Thanks in Advance!
PS: I'll keep adding ideas or caveats that I find on reddit or on the fastai forums themselves.


Tagging (and annoying) @balnazzar, @Adrian, @shoof for suggestions. Thanks!

Hi @init_27. I think the problem of choosing a second hand GPU revolves around its ability to do 16-bit computation correctly. As you may have noticed, I posted a specific question in the main RTX thread. I tagged Jeremy in the hope he would clarify the issue, since I expect he and his team routinely run experiments on a wide range of hardware. Let's wait and see if he sheds some light on it.

If Pascal, as it seems, is capable of doing fp16 without sacrificing convergence, I think you should buy 2-3 1080 Tis. For a fair price, these cards will allow you to train even a big transformer or large convolutional networks for vision, e.g. EfficientNet in its larger versions.
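For reference, turning on mixed precision is a one-liner once you have the card. Here's a minimal sketch, assuming the fastai v2 API and using PETS purely as a stand-in dataset:

```python
from fastai.vision.all import *

# Minimal mixed-precision sketch (fastai v2 API assumed): to_fp16() keeps
# fp32 master weights while running most of the math in half precision,
# which is what lets Tensor Cores (and fast fp16 paths) pay off.
path = untar_data(URLs.PETS)
dls = ImageDataLoaders.from_name_re(
    path, get_image_files(path/"images"),
    pat=r"(.+)_\d+.jpg", item_tfms=Resize(224), bs=64)

learn = vision_learner(dls, resnet34, metrics=accuracy).to_fp16()  # enable fp16
learn.fine_tune(1)
```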

Note that it is quite unlikely that a card would simply "die". However, since you're getting a used card, you should repaste the die and make sure the thermal pads make proper contact with the VRMs and the memory chips. Also, use a good power supply. A 1080 Ti can peak at almost 300W. You need a 1000W power supply to safely run two of them, or a 1200W PSU to run three, estimating some 200-300W of power draw for the rest of your system.
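To make that arithmetic explicit, here's a back-of-the-envelope sketch; the figures are the rough estimates above, not measurements:

```python
# Rough PSU sizing sketch using the estimates above: ~300W peak per 1080 Ti
# plus roughly 200-300W for the rest of the system. The PSU rating should
# sit comfortably above the resulting peak draw.
GPU_PEAK_W = 300
REST_OF_SYSTEM_W = (200, 300)

for n_gpus in (1, 2, 3):
    low = n_gpus * GPU_PEAK_W + REST_OF_SYSTEM_W[0]
    high = n_gpus * GPU_PEAK_W + REST_OF_SYSTEM_W[1]
    print(f"{n_gpus} GPU(s): peak draw roughly {low}-{high}W")
```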


Thanks so much! @balnazzar

That's a great point. I'll look for it on reddit/elsewhere (I'm sure you already have, but I'm personally curious so I'll search as well).

This might be a stupid question, please excuse me if it is: to train a transformer, do you need these cards to be in SLI? Or is it parallelisable across 3 cards? I own 1 GPU and I've never run experiments longer than 12 hours, so I don't know the pain.

For a 3x GPU setup, one might also have to look at blower editions or liquid-cooled ones.

Great points! Thanks.
I'd be much less scared re-pasting a second hand card vs a newer one. :smiley: Maybe I'm too scared when it comes to hardware (I didn't even dare to assemble my first PC myself).


SLI is just an interface for the cards to exchange synchronization signals. It's for gaming; you won't need it for deep learning. Even NVLink on high-end cards provides just a marginal speedup over the PCI-Express bus. Be sure, though, to provide at least 8 gen3 PCIe lanes per card (max 2 cards for a mainstream i7/i9 or Ryzen 3000; for 3-4 cards you'll need an i9 on the LGA2066 socket, a Xeon E5, a Xeon W-2xxx, a Xeon Scalable, or a Threadripper/Epyc).
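To make the point concrete, here's a minimal PyTorch sketch: data parallelism happens in software, splitting each batch across whatever GPUs are visible, with no SLI bridge involved (torchvision's resnet50 is just a stand-in model):

```python
import torch
import torch.nn as nn
from torchvision import models

# Minimal multi-GPU sketch: nn.DataParallel replicates the model on every
# visible GPU, splits each batch across them over PCIe, and gathers the
# outputs back on the default device. No SLI/NVLink involved.
model = models.resnet50()
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)
model = model.cuda()

x = torch.randn(64, 3, 224, 224, device="cuda")  # dummy batch
out = model(x)                                    # forward pass split across GPUs
print(out.shape)                                  # torch.Size([64, 1000])
```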


I agree wrt having your own card vs cloud compute. It's so handy to have your own rig, and GPUs hold their prices well, so you probably won't lose more than 1/3 to 1/2 of the cost (depending on how long you keep it).

Having 2 cards (or 1) in a machine is fairly straightforward, but if you plan to have more than 1 GPU and you can, get a motherboard that has 3+ double-spaced PCIe slots so you have some room between the cards. (Alternatively, 2 blower-style cards (each 2 PCIe slots wide) can fit right next to each other.)

GPUs that are 2 PCIe slots wide are more manageable than the 'extreme' editions that can be 2.5 slots wide (unless you find those really cheap).

Having 3 or 4 GPUs in one machine is a headache. Blower-style cooling may be fine, but you still need a PSU with sufficient power and a decent CPU. Air cooling can handle 3 with a big case and good fans; 4 probably gets too hot with air cooling unless you use PCIe risers. Custom water cooling is a time-consuming project; AIOs may be a good option, but you need enough room for the radiators. I would do a 3 or 4 GPU setup as a second build once you've built a 1 or 2 GPU box. If you do want a 3/4 GPU rig, I would select the motherboard first, then plan the build around it. If air cooling, negative pressure in the case has worked best for me.

CPU-wise, 4 cores are fine for 1 GPU; another 2 or 4 cores would be handy if you have a second GPU. 32 GB of RAM per GPU is nice to have, but you can get by with less.
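A rough illustration of why those extra cores help: each DataLoader worker is a separate CPU process doing decoding/augmentation, so a common heuristic (hypothetical, tune for your setup) is a few workers per GPU, capped by your core count:

```python
import os
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical heuristic, not a rule: ~4 DataLoader workers per GPU,
# capped by the number of CPU cores actually available.
n_gpus = max(torch.cuda.device_count(), 1)
n_workers = min(4 * n_gpus, os.cpu_count() or 1)

ds = TensorDataset(torch.randn(512, 3, 224, 224), torch.randint(0, 10, (512,)))
dl = DataLoader(ds, batch_size=64, num_workers=n_workers, pin_memory=True)
print(f"GPUs: {n_gpus}, DataLoader workers: {n_workers}")
```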

GTX 1070/1080/1080 Ti and RTX 2060/2070/2080/2080 Ti are all decent choices depending on your budget and their price. I found an online store here in AU selling a superseded Galax 2080 Ti model at a great price.

Re used GPUs: I prefer to buy a used GPU from a source where I can see it running and pick it up from the owner, rather than having it posted.
CPUs I have bought used online via eBay.
Motherboards I prefer new (eBay), but I have also bought used locally.
RAM I mostly get used from eBay.
HD/SSD/NVMe I'd only buy new (local store).
PSU I go with a gold or better rating, ideally new.
Case and fans: buying used can save a fair bit.

If you are after a decent used box that can fit 1 or 2 GPUs, you may be able to find a Lenovo S30/D30, HP 420?/820, or a Dell equivalent at a good price (though they all have motherboards with 2x PCIe spacing and can only fit 2 cards max, so they limit which GPUs you can get) - or get a used gamer box without a GPU.

I have only run GPUs on their own (each training its own model) or using PyTorch DataParallel; I haven't tried NVLink yet.

I have had no trouble with any GPUs I have owned, and I think it is reasonably unlikely your GPU will die unless you buy one without seeing it running and checking its temperature under load, or without some knowledge of its history.
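If you do get to see a used card running, a quick load-plus-temperature check can be as simple as the sketch below (assumes PyTorch with CUDA and nvidia-smi on the PATH; the duration and what counts as "too hot" are up to you):

```python
import subprocess
import time
import torch

# Rough sanity check for a used card: keep it under sustained matmul load
# and poll temperature/power/clock via nvidia-smi every so often.
a = torch.randn(8192, 8192, device="cuda")
b = torch.randn(8192, 8192, device="cuda")

start, i = time.time(), 0
while time.time() - start < 120:      # ~2 minutes of load; longer is better
    a @ b                              # keep the GPU busy
    i += 1
    if i % 50 == 0:
        torch.cuda.synchronize()
        stats = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=temperature.gpu,power.draw,clocks.sm",
             "--format=csv,noheader"],
            capture_output=True, text=True)
        print(stats.stdout.strip())
```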

In order of items that hold their price, I think: GPU > CPU/motherboard/RAM > case/fans > drives/PSU.


I am not an expert on these things by any means, but I also found the cloud prohibitive in how much of its compute power one can actually use (Colab, Kaggle). I then took my chances with Amazon and found it very good, until I saw my weekly bill!

I am still a learner (in the UK), but I invested in a second hand rig from eBay (an HP Z440) for around £350 and then bought an RTX 2070 directly from NVIDIA for under £400. I know there are limitations on the number of GPUs I can have with this box, but for me 1 is sufficient for the next year or 2.

Apart from the upfront cost (~£750), my non-tech background made it difficult to set the box up. It took a week, but after that I had everything ready.

I can only say that if your goal is to excel at deep learning, having your own box is better, considering the few years you will be spending becoming an expert.

The only thing missing (which I did not know about while buying) is a WiFi adaptor. This means I had to connect my box to the modem through a LAN cable and then create a sort of LAN inside the house that I can now access from anywhere. In hindsight, I would have asked about that too, but I guess it's not very expensive to add on if needed.

Regards,
Dev.


Any recommendations for a MacBook Pro external GPU setup? I would like to spend not much more than £500 including all the bits that need to be included, and I don't mind taking chances on eBay.

From my reading:

Thus, the 2060S got an amazing price/performance ratio, the best among the cards with 8Gb.

I'm against the 2x GPU setup, as the money potentially saved would then need to be invested in a water cooling solution or a blower edition GPU.

TLDR #1 :despite half its VRAM, and half its retail price, the RTX 2060 can blast past the 1080Ti in Computer Vision, once its Tensor Cores are activated with ‘FP16’ code
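If you want to sanity-check that claim yourself on whatever card you can borrow, a rough fp32 vs fp16 matmul timing like the sketch below is enough to see whether the fp16 path is actually faster (numbers vary a lot with card, driver, and matrix size):

```python
import time
import torch

# Rough throughput check: time a big matmul in fp32 vs fp16.
# On cards with Tensor Cores (or fast fp16), the fp16 time should drop sharply.
def bench(dtype, n=8192, iters=20):
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    torch.cuda.synchronize()
    t0 = time.time()
    for _ in range(iters):
        a @ b
    torch.cuda.synchronize()
    return (time.time() - t0) / iters

print(f"fp32: {bench(torch.float32)*1000:.1f} ms per matmul")
print(f"fp16: {bench(torch.float16)*1000:.1f} ms per matmul")
```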

From this I’m thinking for £624 new:

Have I overlooked anything?

*had to remove links on quotes due to being a new user

MacOS is an issue, and it seems like maybe I have overlooked something.

From the Razer Core X link above

Macs require macOS High Sierra 10.13.4 or later and compatible AMD graphics card.

And best-egpu-mac

Unfortunately, Apple and Nvidia don't seem to get along these days, so macOS currently only works with a limited selection of graphics cards from AMD. Apple's website currently lists these GPUs as being compatible with macOS:

  • AMD Radeon RX 470 and RX 570
  • AMD Radeon RX 480 and RX 580
  • etc

Does seem possible with some minor hacking and running in Win10 using bootcamp:

macOS 10.14.6 is loaded but cannot drive the eGPU, as Apple and Nvidia provide no RTX drivers for macOS.