I’ve been told a good rule of thumb for estimating your memory needs is to double your largest GPU’s memory. On a 2080 Ti (11GB of VRAM), even using FP16 (16-bit precision), that would be about 22GB.
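For a back-of-the-envelope check, that rule of thumb is just arithmetic. A minimal sketch (the 11GB figure is the 2080 Ti's actual VRAM; the parameter count is a made-up example) also shows why FP16 roughly halves the memory footprint of the weights themselves:

```python
# Back-of-the-envelope memory arithmetic (a sketch; the parameter
# count below is hypothetical, chosen only for illustration).

GPU_VRAM_GB = 11                            # GeForce RTX 2080 Ti
system_ram_rule_of_thumb = 2 * GPU_VRAM_GB  # "double your largest GPU's memory"

def weights_gb(num_params, bytes_per_param):
    """Memory used by the model weights alone, in GB."""
    return num_params * bytes_per_param / 1024**3

params = 25_000_000              # hypothetical ~25M-parameter model
fp32 = weights_gb(params, 4)     # 32-bit floats: 4 bytes per parameter
fp16 = weights_gb(params, 2)     # 16-bit floats: 2 bytes per parameter

print(f"rule-of-thumb system RAM: {system_ram_rule_of_thumb} GB")
print(f"weights in FP32: {fp32:.3f} GB, in FP16: {fp16:.3f} GB")
```

Note this only counts weights; activations, gradients, and optimizer state usually dominate during training, which is part of why the rule of thumb doubles rather than matches GPU memory.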
Can you share some examples where you needed more than 32GB with a 2080 Ti (or a smaller card), or a proportionally scaled example?
I use a 2080 Ti with 16GB of RAM and have never found it limiting. I work in vision. Donate another 16GB or 48GB to me and I’ll let you know if performance seems higher.
Are you using FP16 or FP32?
Yes, both at times. I think the answer is to get what RAM you can afford, and add more when you can if you’ve found its lack restricting. Running with 32GB or even 16GB isn’t going to stop you doing good work, unless your use case is very RAM-heavy (like what, massive tabular data?). My most recent purchase was upgrading a 512GB SSD to a 2TB SSD, as it was causing more pain than any RAM limits.
Do you find that you fully utilize the 2080 Ti? Would you be OK with a GPU with 8GB of RAM? I will be getting into augmented reality and other computer vision problems.
3D image segmentation for medical purposes.