Recommendations on new 2 x RTX 3090 setup

Don’t expect the 2x A4500s to appear as one big card with 40 GB of memory after you install the NVLink bridge. That’s not how NVLink works for deep learning. You could potentially get better performance for model-parallel workloads, but it’s not necessarily trivial to set that up and get it working optimally. If you truly need more than 24 GB of VRAM for your models, you’re probably best off stepping up to the A6000, but it’s an expensive card, so you’d want to be sure.
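To make that concrete, here’s a rough PyTorch sketch (my own example, layer sizes made up) of what “model parallel” actually looks like: you split the layers across the two cards yourself, and the bridge only speeds up the hop between them, it never merges them into one device.

```python
import torch
import torch.nn as nn

class TwoGPUModel(nn.Module):
    """Toy model split by hand across two GPUs (cuda:0 and cuda:1)."""
    def __init__(self):
        super().__init__()
        # First chunk of the network lives on GPU 0
        self.part1 = nn.Sequential(nn.Linear(1024, 4096), nn.ReLU()).to("cuda:0")
        # Second chunk lives on GPU 1
        self.part2 = nn.Linear(4096, 10).to("cuda:1")

    def forward(self, x):
        x = self.part1(x.to("cuda:0"))
        # Activations get copied between the cards here
        # (over NVLink if the bridge is installed, otherwise over PCIe)
        x = self.part2(x.to("cuda:1"))
        return x

model = TwoGPUModel()
out = model(torch.randn(8, 1024))  # output tensor ends up on cuda:1
```

You need both GPUs visible for this to run at all, and in a real model you’d also have to think about where the optimizer state and gradients live, which is where it stops being trivial.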

Previously you mentioned the purpose of this machine is personal use for learning and experimenting, yet you’ve spec’d out a > $10,000 machine. If you’re just getting started, I would personally start with something a lot cheaper with 1 or 2 cards (3080 or better). If you’re not worried about the money, then what you’ve spec’d is fine, but it’s probably overkill. If you’re planning on running this constantly for business use, I’d go with the pro A-series cards; if it’s just for learning, I’d go with consumer-grade cards paired with a consumer-grade CPU and motherboard combo, or used workstation-grade gear.

You might find some useful tips in this thread as well: For those who run their own AI box, or want to
