VRAM amount suggestions

So…I took a long break from fast.ai this year, and now that I’m planning to return to it (including the online courses), things have changed quite a bit. “Back in my day” a P4000 (8GB) would have been fine on Paperspace, but the recommended setup is now a P5000 (16GB).

This poses a bit of a problem for my current machine, which has a 6GB 2060. Using Tim Dettmers’ rough rule that FP16 (mixed-precision) training stretches effective VRAM by about 1.5x (there’s a quick sketch of enabling it in fastai below the list), my card behaves like a ~9GB one. If 8GB Pascal cards are no longer adequate, I don’t know that 9GB effective is going to cut it either. Or will it? I’m asking about both the 2019 courses and general use. A 2080 Ti is out of my price range, so it appears my options are…

- Stick with the 2060 (6GB, so ~9GB effective at FP16)
- Used 1080 Ti (11GB, but FP32 only on Pascal)
- 2070 or 2060 Super (8GB, so ~12GB effective at FP16)
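
For context, the 1.5x figure reflects FP16 roughly halving the memory taken by activations and gradients while a master copy of the weights stays in FP32, so it’s an approximation rather than a hard guarantee. In fastai v1 (what the 2019 course uses), mixed precision is one call on the learner. A minimal sketch along the lines of the lesson-1 Pets example; bs=32 is just an illustrative value to shrink if memory runs out:

```python
from fastai.vision import *  # fastai v1, as used in the 2019 course

# Lesson-1-style setup on the Pets dataset; lower bs if VRAM runs out.
path = untar_data(URLs.PETS)
fnames = get_image_files(path/'images')
data = ImageDataBunch.from_name_re(
    path/'images', fnames, r'/([^/]+)_\d+.jpg$',
    ds_tfms=get_transforms(), size=224, bs=32)

# to_fp16() switches the learner to mixed-precision training:
# activations and gradients in FP16, master weights kept in FP32.
learn = cnn_learner(data, models.resnet34, metrics=error_rate).to_fp16()
learn.fit_one_cycle(1)
```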

If anyone with recent experience on these cards can share what they are and aren’t able to do with them, I’d appreciate it. If even 12GB effective won’t be enough for some things, or if you can do everything on a 4GB card with FP16 and a smaller batch size, that is also valuable input. Thanks.
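
If it helps anyone answer, one way to settle the “will it fit” question empirically is to watch peak allocation during a short run with PyTorch’s memory counters. A minimal sketch; report_peak_vram is just a name I made up, and the dummy tensor stands in for a real training step such as learn.fit_one_cycle(1):

```python
import torch

def report_peak_vram(tag=""):
    # Peak tensor allocation since the last reset, in GiB. The caching
    # allocator's reserved memory will sit somewhat above this figure.
    peak = torch.cuda.max_memory_allocated() / 2**30
    print(f"{tag}: peak {peak:.2f} GiB allocated")

torch.cuda.reset_max_memory_allocated()
x = torch.randn(1024, 1024, 256, device="cuda")  # dummy ~1 GiB FP32 tensor
report_peak_vram("after dummy alloc")
# In practice: reset before a training run, report after it finishes.
```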

Link to Tim’s GPU article: https://timdettmers.com/2019/04/03/which-gpu-for-deep-learning/
Looking at the performance charts, the newer RTX cards don’t seem so great at word-level RNNs, which is somewhat unfortunate because that’s an area I’d like to delve into.
