GTX 2080/2080Ti RTX for Deep Learning?

Your GPU issue in cats and dogs is being discussed here: PyTorch v1.0 stable is now working on Windows, but fastai v1 needs some tweaks to get it working on Windows

Just edited my post with a link, sorry for the confusion.

@PeterKelly Hi Peter, not a problem at all! Everyone learns at their own pace :smiley: I was struggling with out-of-memory errors and things like that for a long time. So, you know, not too much progress here :smile:

Would be great to finally try an RTX architecture in the field. I have one 2080 (plain) card and still haven’t had enough time to update the drivers and check how fp16 precision works. I have this point on my list and am very glad that you and many other people here found this topic worthy of discussion! I guess this forum is one of the best places to talk with Deep Learning practitioners about related software/hardware.

1 Like

@devforfu
Hi Ilia,
I did a little comparison test of 2080Ti vs 1080Ti run times using to_fp16()

If you’re interested, we could have a “fastai fp_16 leaderboard” to compare our scores?
I know @EricPB is already doing some interesting tests on a 2060

Link to my writeup, Test code

2 Likes

For those interested in fp16 and mixed precision, NVIDIA has a recent series on Tensor Cores and mixed precision:

And an upcoming webinar:

Webinar: Accelerate Your AI Models with AMP - NVIDIA Tools for Automatic Mixed-Precision Training in PyTorch

Join us on February 20th:

• Learn from NVIDIA engineers about real-world use cases with significant speed-ups from mixed-precision training
• Walk through a use case with NVIDIA’s toolkit for Automatic Mixed-Precision (AMP) in PyTorch
• Live Q&A

REGISTER NOW
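For those who can’t wait for the webinar, here is a minimal sketch of a mixed-precision training step using PyTorch’s built-in AMP utilities (torch.autocast and GradScaler). The tiny linear model, random data, and hyperparameters are placeholders; on a CPU-only machine, autocast falls back to bfloat16 and the gradient scaler is disabled:

```python
# A minimal sketch of mixed-precision training steps with PyTorch AMP.
# The tiny linear model, random data, and hyperparameters are placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Linear(32, 2).to(device)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

# GradScaler guards fp16 gradients against underflow; disabled on CPU.
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))
amp_dtype = torch.float16 if device == "cuda" else torch.bfloat16

x = torch.randn(64, 32, device=device)
y = torch.randint(0, 2, (64,), device=device)

for _ in range(3):
    opt.zero_grad()
    # autocast runs eligible ops (matmuls, convs) in half precision
    with torch.autocast(device_type=device, dtype=amp_dtype):
        loss = F.cross_entropy(model(x), y)
    scaler.scale(loss).backward()  # scale loss, backprop scaled gradients
    scaler.step(opt)               # unscale gradients, then optimizer step
    scaler.update()
```

The scaler multiplies the loss before backward() so small fp16 gradients don’t underflow to zero, then unscales them before the optimizer step.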

4 Likes

I created a post with a link to some quick and dirty benchmarks, training Cifar-10 & Cifar-100 on two GPUs: the current cheapest RTX (2060) and the last-gen “King of the Hill” GTX (1080Ti).

If you can use FP16 on the RTX’s Tensor Cores, it will most likely be faster than the 1080Ti, despite half the price and half the VRAM.

Now, whether you should get 2x RTX 2060 over a single RTX 2080 for the same price is open for discussion and beyond my pay grade :sunglasses:

Comparing the RTX 2060 vs the GTX 1080Ti, using Fastai for Computer Vision

3 Likes

My take on 2x GPU vs 1x GPU.

I think this should be a completely different discussion, but a few thoughts from me:

My experience: I have a single 2080Ti air-cooled build (case: Cooler Master H500).

I’m against the 2x GPU setup, as the money saved would then need to be invested in a water-cooling solution or blower-style GPUs.

In theory, two blower GPUs would work, but setting them up so that their hot air is exhausted from different locations on the case is a challenge, as most cases don’t assume you’d need to blower-cool your GPUs.

Water cooling/liquid cooling needs custom solutions and requires some hardware expertise.

My take: given how nervous I get when Ubuntu takes two extra seconds to boot and I start glancing at my case, one GPU and some peace of mind was the more relaxing choice over two liquid-cooled ones.

Any new insights on this?

I now have $3000 and want to build a workstation.

Any recommendations on what to build? I want to buy something that will last at least a couple of years.

Thank you

$3000 will make quite a decent DL rig.

If you can add another grand, you could build a workstation with an RTX Titan, which is the most capable consumer GPU for deep learning: almost as capable as the V100, but at a quarter of the price.

But let’s stay with your $3000. I would recommend:

  1. You may want to avoid consumer CPUs, which have just 16 PCIe lanes. Buy a used Xeon E5 (40 lanes) or a new Threadripper (60-64 lanes). Be aware that Intel cripples MKL (the math library) when it runs on a non-Intel CPU. You could try OpenBLAS, but I have no information about its performance.

  2. Buy a motherboard with a sufficient number (>= 3) of PCIe slots for future expansion.

  3. Use NVMe storage.

  4. Buy enough RAM. Twice the effective FP16 GPU memory should be enough (example: if you buy a single 2080ti, you end up with 22GB of “fp16” VRAM, so go with at least 48GB of RAM). Note that the X79/X99/X299 chipsets support dual, triple and quad channel memory, i.e. you can go with 3 or 6 sticks and attain more RAM speed than with 2/4 sticks.

  5. Be aware that having your gpus on 8x gen3 slots won’t penalize you significantly.

  6. The GPU with the best price/performance ratio is still, in my opinion, the 1080ti. It is capable of fp16 (albeit with a very modest speedup), which means you can effectively double its memory.
    That in turn means that with $1000, the price of a single 2080ti, you can buy two of them and have a combined memory of 44GB for vision. Running in parallel, they will be significantly faster than a single 2080ti.
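The memory arithmetic behind points 4 and 6 can be sketched in a few lines (the helper names are mine, purely illustrative):

```python
def effective_fp16_vram_gb(physical_vram_gb, n_gpus=1):
    """Point 6: running in fp16 roughly doubles the usable VRAM."""
    return 2 * physical_vram_gb * n_gpus

def recommended_ram_gb(physical_vram_gb, n_gpus=1):
    """Point 4: system RAM should be about twice the effective fp16 VRAM."""
    return 2 * effective_fp16_vram_gb(physical_vram_gb, n_gpus)

# A single 2080ti (11 GB) gives ~22 GB of "fp16" VRAM...
print(effective_fp16_vram_gb(11))            # 22
# ...so aim for ~44 GB of RAM, rounded up to 48 GB in practice.
print(recommended_ram_gb(11))                # 44
# Two 1080tis (11 GB each) give a combined ~44 GB for vision work.
print(effective_fp16_vram_gb(11, n_gpus=2))  # 44
```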

So, my advice is to go with two 1080tis, spare some money, and add an RTX Titan in the future. That way, you can reckon one grand for the GPUs and another grand for the rest. Buy a good PSU in the 1000-1200W range, and a case with more than 7 slots.

5 Likes

thank you !

I can definitely spare an extra $1000 if that will be better than two 1080tis plus upgrading in the future.

Can you give advice on this one, please? I have the money now and need to spend it on equipment, as it comes from an EU project.

Does anyone know how the Titan RTX performs with multi-GPU setups? As I understand it, there’s no blower style version of the Titan RTX like there is for the Titan V/2080 ti/1080 ti.

It should perform like any other GPU.

In terms of heat dissipation, you can put two of them in non-adjacent slots and/or go liquid (e.g. the NZXT Kraken G12 should be compatible, since it is compatible with the 2080ti).

If you have other blower GPUs and want to buy just one Titan, you can put it in the last slot (counting from above): it should have enough room to breathe.

Do not ever forget to check Tim Dettmer’s advice before building a deep learning rig… https://timdettmers.com/

3 Likes

I strongly agree, Tim is another ‘AI Cool Dude’, a rare breed indeed.
PK OZ

What does everyone think of the Super series?

Seems like 2x 2070 Super GPUs, for a total of 16GB of VRAM, are in roughly the same price range as the 2080 Ti with 11GB.

Seems enticing, especially considering the faster clock speeds for both memory and core.

2 Likes

I did some benchmarking on a couple of 2070 Supers, no NVLINK.

Used Lambda Labs’ benchmarking tool. Results are here.

Any thoughts on why it performs so well on most, but falls off on VGG16 and AlexNet?

2 Likes

Thanks, that is great! Could it be that there are more fully connected layers there? Really not sure lol.

P.S. I went and bought the RTX 2080 Super Aorus! It just came, so I haven’t tried it yet; I’ll benchmark on pix2pix since it’s the model I use the most.

Hi guys, I am having trouble deciding which GPU to pick for my DL rig. Which one would you recommend?

2080 Super $750 (new)
2070 Super $600 (new)
1080 TI $550 (used, never used on crypto-mining)

Thanks

I do not consider the 2080 a cost-effective option: it has the same amount of memory as the 2070 and delivers ~15% more speed while costing substantially more.

That leaves us with the 2070 vs the 1080ti.
The latter is still the best option in terms of memory for money. It will be faster than the 2070 in fp32, but a bit slower in fp16 (note that Pascal cards can operate in fp16 mode, effectively doubling the VRAM, but the performance gains are marginal compared with the RTXs).

Since both of them cost more or less the same, we can summarize as follows:

  • the 2070 is a bit faster in fp16 and draws less power.
  • the 1080ti has almost 50% more memory and is faster in fp32.
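For a rough memory-for-money comparison of the three options above (prices taken from the question, so purely illustrative; the VRAM figures are the standard specs: 8GB for both Supers, 11GB for the 1080 Ti):

```python
# Rough VRAM-per-dollar comparison; prices are from the question above.
cards = {
    "2080 Super": {"price_usd": 750, "vram_gb": 8},
    "2070 Super": {"price_usd": 600, "vram_gb": 8},
    "1080 Ti":    {"price_usd": 550, "vram_gb": 11},
}

gb_per_dollar = {name: c["vram_gb"] / c["price_usd"] for name, c in cards.items()}

# Print best memory-for-money first
for name, ratio in sorted(gb_per_dollar.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {ratio * 100:.2f} GB per $100")
```

By this (admittedly one-dimensional) metric, the used 1080 Ti comes out clearly ahead, matching the advice above.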
1 Like

It scales very well. I forget the page address, but I read a report by Puget Systems in which they benchmarked two Titan RTXs, both with and without NVLink; two cards scale rather well even without NVLink.
If you want a blower version, buy the Quadro RTX 6000, which has identical specs. But bear in mind that just one slot of separation provides the necessary space for two Titans to breathe, given the case has good overall airflow.