GTX 2080/2080Ti RTX for Deep Learning?


(Sanyam Bhutani) #143

Tagging @EricPB.
I think Eric is one of the few, or maybe the only one, on the forums who has tried all of the RTX cards? (He had mentioned that he is currently using a 2060; earlier it was a 2070 and a 2080Ti.)


(Dana Ludwig) #144

This article also supports the idea that the 2080 Ti is about twice as fast as the 1080 Ti (though the 2080 may actually be slower!).

Another consideration is that until we get easy ways to share one training run between multiple GPUs, you will want a single GPU that is fast.

Initially I thought the 2080 Ti was far more expensive than the 1080 Ti, but today I’m seeing street prices where it is only about 2x more expensive ($1,300 vs $500-700).


(Eric Perbos-Brinck) #145

Are you using fastai under Windows or Linux/Ubuntu?

Afaik, the native Linux environment under Windows (WSL) doesn’t support GPU, only CPU computing.


(Robert Salita) #146

At the base level, Windows supports GPUs natively. There’s no GPU support in WSL (bash). Some software, such as pickle (PIL), isn’t fully compatible; pickle is used in some fastai notebooks, and fastai is aware of the issue. PyTorch 1.0 and TensorFlow are officially supported on Windows.

For more info: Pytorch v1.0 stable is now working on Windows but fastai v1 needs some tweaks to get it work on Windows
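
As a quick sanity check (not from the linked thread), something like this shows whether a native Windows PyTorch install actually sees the GPU; under WSL it will report no CUDA device:

```python
# Minimal check that a native Windows PyTorch build sees the GPU.
# Run it in the same environment you use for fastai; under WSL this prints False.
import torch

print(torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
```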


(Robert Salita) #147

Your GPU issue in cats and dogs is being discussed here: Pytorch v1.0 stable is now working on Windows but fastai v1 needs some tweaks to get it work on Windows


(Eric Perbos-Brinck) #148

Just edited my post with a link, sorry for the confusion.


(Ilia) #149

@PeterKelly Hi Peter, not a problem at all! Everyone learns at their own pace :smiley: I was struggling with out-of-memory errors and things like that for a long time. So, you know, not too much progress here :smile:

It would be great to finally try the RTX architecture in the field. I have one 2080 (plain) card and still haven’t had enough time to update the drivers and check how fp16 precision works. I have this point on my list and am very glad that you and many other people here found this topic worthy of discussion! I guess this forum is one of the best places to talk with Deep Learning practitioners about related software/hardware.


(Sanyam Bhutani) #150

@devforfu
Hi Ilia,
I did a little test comparing 2080Ti vs 1080Ti run times using to_fp16().

If you’re interested, we could have a “fastai fp16 leaderboard” to compare our scores?
I know @EricPB is already doing some interesting tests on a 2060.

Link to my writeup, Test code
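
Not the linked test code, just a rough sketch of the kind of comparison meant here: timing one CIFAR-10 epoch in fp32 vs fp16 with fastai v1 (batch size, model and dataset choices are illustrative only):

```python
# Rough sketch: time one training epoch in fp32 vs fp16 with fastai v1.
# (cnn_learner was called create_cnn in earlier v1 releases.)
import time
from fastai.vision import *

path = untar_data(URLs.CIFAR)
data = ImageDataBunch.from_folder(path, valid='test', bs=512,
                                  ds_tfms=get_transforms()).normalize(cifar_stats)

def time_one_epoch(fp16=False):
    learn = cnn_learner(data, models.resnet50, metrics=accuracy)
    if fp16:
        learn = learn.to_fp16()   # mixed precision -> Tensor Cores on RTX
    start = time.time()
    learn.fit_one_cycle(1)
    return time.time() - start

print('fp32:', time_one_epoch(False))
print('fp16:', time_one_epoch(True))
```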


#151

For those interested in fp16 and mixed precision, NVIDIA has a recent series on Tensor Cores and mixed-precision training:

And an upcoming webinar:

Webinar: Accelerate Your AI Models with AMP - NVIDIA Tools for Automatic Mixed-Precision Training in PyTorch

Join us on February 20th:

• Learn from NVIDIA engineers on real-world use cases for significant speed-ups with mixed-precision training
• Walk through a use case with NVIDIA’s toolkit for Automatic Mixed-Precision (AMP) in PyTorch
• Live Q&A

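For anyone who wants to try it before the webinar, here is a minimal sketch of the AMP pattern in plain PyTorch, using NVIDIA’s apex library as it stood at the time (model and data are placeholders; the same functionality was later merged into PyTorch core as torch.cuda.amp):

```python
# Sketch of an NVIDIA Automatic Mixed-Precision (apex.amp) training step.
# Placeholder model/data; opt_level "O1" = fp16 where safe, fp32 master
# weights, dynamic loss scaling.
import torch
from torch import nn, optim
from apex import amp   # https://github.com/NVIDIA/apex

model = nn.Linear(1024, 10).cuda()
optimizer = optim.SGD(model.parameters(), lr=0.01)
model, optimizer = amp.initialize(model, optimizer, opt_level="O1")

criterion = nn.CrossEntropyLoss()
x = torch.randn(64, 1024).cuda()
y = torch.randint(0, 10, (64,)).cuda()

optimizer.zero_grad()
loss = criterion(model(x), y)
with amp.scale_loss(loss, optimizer) as scaled_loss:   # scaled to avoid fp16 underflow
    scaled_loss.backward()
optimizer.step()
```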


(Eric Perbos-Brinck) #152

I created a post with a link to some quick-and-dirty benchmarks, training Cifar-10 & Cifar-100 on two GPUs: the current cheapest RTX (2060) and the last-gen “King of the Hill” GTX (1080Ti).

If you can use FP16 on the RTX’s Tensor Cores, it will most likely be faster than the 1080Ti, despite costing half the price and having half the VRAM.

Now, whether you should get 2x RTX 2060 over a single RTX 2080 for the same price is open for discussion and beyond my pay grade :sunglasses:

Comparing the RTX 2060 vs the GTX 1080Ti, using Fastai for Computer Vision
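
On the 2x 2060 question: one way to share a single training run across two cards today is PyTorch’s nn.DataParallel; a rough sketch with fastai v1 follows (placeholder dataset/model, and real-world scaling is usually well below 2x):

```python
# Sketch: one fastai v1 training run split across two GPUs via nn.DataParallel.
# Each batch is divided across the visible devices (e.g. 2x RTX 2060).
import torch
import torch.nn as nn
from fastai.vision import *

path = untar_data(URLs.CIFAR)
data = ImageDataBunch.from_folder(path, valid='test', bs=512).normalize(cifar_stats)

learn = cnn_learner(data, models.resnet34, metrics=accuracy)
if torch.cuda.device_count() > 1:
    learn.model = nn.DataParallel(learn.model)

learn.fit_one_cycle(1)
```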


(Sanyam Bhutani) #154

My take on 2x GPU vs 1x GPU.

I think this should be a completely different discussion but a few thoughts from me:

My experience: I have a 1x 2080Ti air-cooled build (case: Cooler Master H500).

I’m against the 2x GPU setup, as the potential money saved would then need to be invested in a water-cooling solution or blower-style GPUs.

In theory 2x blower GPUs would work, but setting them up so that their hot air is exhausted from different locations on the case is a challenge, as most cases don’t assume you’ll need to blower-cool your GPUs.

Water/liquid cooling needs custom solutions and would require some hardware expertise.

My take is: given how nervous I get when Ubuntu takes 2 extra seconds to boot and I start glancing at my case, having 1x GPU and some peace of mind was the more relieving choice over 2x liquid-cooled ones.