Recommendations on new 2 x RTX 3090 setup

Correct.

Based on a discussion here: https://www.reddit.com/r/nvidia/comments/kdro5w/comment/gg2futw/
it seems NVLink gives a 15-20% increase in speed, since transfers go GPU->GPU directly instead of GPU->CPU->GPU. It also seems that the A6000 counterpart comes with a slightly better (and compatible) bridge for even faster speeds; both cards are GA102 based.
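
(If you want to check what the bridge is actually doing on your own box, `nvidia-smi topo -m` shows the link topology, and a quick PyTorch snippet like the one below tells you whether peer-to-peer access works and roughly what bandwidth you get. Just a rough sketch, assuming two visible GPUs and a recent PyTorch.)

```python
import time
import torch

assert torch.cuda.device_count() >= 2, "this check needs two visible GPUs"

# Can GPU 0 and GPU 1 copy to each other directly (NVLink or PCIe P2P)?
print("GPU0 -> GPU1 peer access:", torch.cuda.can_device_access_peer(0, 1))
print("GPU1 -> GPU0 peer access:", torch.cuda.can_device_access_peer(1, 0))

# Very rough bandwidth test: copy ~1 GiB from GPU 0 to GPU 1 and time it.
x = torch.empty(256 * 1024 * 1024, dtype=torch.float32, device="cuda:0")  # ~1 GiB
torch.cuda.synchronize(0)
t0 = time.time()
y = x.to("cuda:1")
torch.cuda.synchronize(0)
torch.cuda.synchronize(1)
dt = time.time() - t0
print(f"~{(x.numel() * x.element_size() / 2**30) / dt:.1f} GiB/s GPU0 -> GPU1")
```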

At the end of the day, 2x 3090 (used), water-blocked and bridged with NVLink, would very much trump the hell out of a 4090, simply due to the overall VRAM issue, let alone the almost linear scaling via DDP and some adjustments…
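
For reference, the DDP scaling mentioned above doesn't need anything exotic. A minimal sketch of a 2-GPU DDP training script, with a toy model and toy data, launched with `torchrun --nproc_per_node=2 train_ddp.py` (the filename is just a placeholder), looks like this:

```python
# Minimal DDP sketch for 2 GPUs. Model and data are toy placeholders,
# just to show the structure. Launch with:
#   torchrun --nproc_per_node=2 train_ddp.py
import os
import torch
import torch.nn as nn
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

def main():
    dist.init_process_group("nccl")
    rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(rank)

    model = DDP(nn.Linear(128, 10).cuda(rank), device_ids=[rank])  # toy model
    opt = torch.optim.AdamW(model.parameters(), lr=1e-3)

    ds = TensorDataset(torch.randn(4096, 128), torch.randint(0, 10, (4096,)))
    sampler = DistributedSampler(ds)          # each GPU sees a different shard
    dl = DataLoader(ds, batch_size=64, sampler=sampler)

    for epoch in range(3):
        sampler.set_epoch(epoch)
        for xb, yb in dl:
            xb, yb = xb.cuda(rank), yb.cuda(rank)
            loss = nn.functional.cross_entropy(model(xb), yb)
            opt.zero_grad()
            loss.backward()                   # gradients are all-reduced across GPUs
            opt.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```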

To anyone interested, here is a performance comparison between the 3090 and 4090 — and here is a comparison between 3090 and A6000.

The performance gap between 3090 and 4090 is quite noticeable, but I personally will stick to my 3090 and wait for the next generation of cards (and CPU/mobo). I don’t think it is worth chasing the latest hardware innovation, especially considering the several issues with melting power connectors.


Same here. Apart from the sh*tty power plugs, the A6000 is quite capable, even with the Lovelace RTX 6000 already presented. BTW, the amount of VRAM is the same as on Ampere.

Just be careful out there… I’ve seen reports of A6000s with melting power cables as well. To be honest, when I saw this a few months ago, it was enough to make me think twice about continuing this hobby at home. And then the 4090 reports started coming out a couple of weeks ago…

Fire hazard aside, even if it were safe, I’m not even sure which AIB I would buy from now that EVGA is gone, as I’m not a fan of fighting companies over warranty issues. For me, I’m not sure the future is as bright for home-built setups vs. a prebuilt machine with a warranty/service contract from Bizon/Lambda, etc. (or an appropriately sized cloud VM for the workload).

Thank you again @balnazzar! :slight_smile: For now I suspect I have managed to avoid needing to build a data pipeline machine; however, I suspect the time may come due to the sheer amount of data.

This is something I was not aware of, thank you:

This too I didn’t know:

That is a piece of information worth tucking away, and I suspect it will come in handy. Thank you again!


I am trying to buy compute with a total of 48 GB of VRAM and would be running 4 models of 12 GB each (including batch + model memory).

My question is regarding the benchmark on the Lambda website, which says that one A6000 is almost 30% faster than a 3090, but not twice as fast.

What do you suggest for my scenario, and roughly what would it cost?

Thanks in advance!

The man tortured the power supply with 30 days of continuous training, but the connector indeed melted on the PSU side, not on the GPU side.
Note that:

  1. This bloke arranged the cables so that the connectors on the PSU got no ventilation (the cables and connectors heat up by the Joule effect; see the rough sketch after this list).
  2. This has nothing to do with the GPU, or with the connector’s engineering. That’s a standard goddamn PCIe plug. It would have melted even sooner with a 3090, which has a higher TDP than the A6000.
  3. This man doesn’t say/show the most important thing, that is, whether he connected that power output on the PSU side to a SINGLE power plug on the GPU. It’s perfectly possible that he used the Y output on the other end of the cable to connect BOTH power inputs of the A6000 to a single output of the PSU. In that case, it’s a miracle that the connector lasted 30 days at full steam.
  4. This bloke is notoriously sloppy and incompetent, as my comments under other videos on his channel point out (I’m sorry if I sound arrogant, but enough is enough).
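
To make the Joule-effect point in item 1 concrete, here is a rough back-of-the-envelope sketch. The 300 W card power, the 12 V rail and the contact resistance are illustrative assumptions, not measured values; the point is just that pushing the whole load through one cable doubles the current and quadruples the heat dissipated in each contact (P = I²R):

```python
# Back-of-the-envelope: heat dissipated in a connector's contact resistance.
# Illustrative numbers only: 300 W card, 12 V rail, 5 milliohm contact resistance.
CARD_POWER_W = 300.0
RAIL_V = 12.0
CONTACT_R_OHM = 0.005  # made-up ballpark for a worn or poorly seated contact

def connector_heat(power_w, n_cables):
    current_per_cable = (power_w / RAIL_V) / n_cables
    return current_per_cable**2 * CONTACT_R_OHM  # P = I^2 * R

print(f"two cables: {connector_heat(CARD_POWER_W, 2):.2f} W per connector")
print(f"one cable : {connector_heat(CARD_POWER_W, 1):.2f} W per connector")
# One cable carries 2x the current, so it dissipates 4x the heat in the same contact.
```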

Finally, reporting that “the A6000 melts its power connectors” is just very, very BAD information. The fact that none of the professional A6000 users (Lambda, Exxact, etc.) report any of the issues that Heaton encounters, due to his own incompetence, should have told you something.
Those people don’t encounter any issues because they know how to build a GPU workstation, or at the very least know how to connect a couple of godforsaken PSU cables the proper way, whereas Heaton is just a bloke messing around, like many others who infest YouTube and Medium, spreading sh*t without even knowing what they are doing.
The funny thing is that while we spend tons of hard-earned money to buy our GPUs, Nvidia just sends the same, expensive GPUs to him for free.


@matdmiller is it connected via SLI or NVLink, or does it interfere with the case?

Any updates on how it’s going now?

No NVLink on this machine. The Kingpin has a stupid screen on top and would interfere with a wide card directly above it. My case is enormous so it doesn’t interfere with it, but it might in smaller cases. The main thing I like about the Kingpin is that it seems to have a large metal backplate, which is good for cooling, and it seems to run cooler than my other 3090s.

Thanks for the information. I really appreciate your help. One more question: is it possible to run deep learning tasks on both GPUs at full capacity for, say, 7 days?

As long as you cool them properly, yes. Crypto miners would run GPUs at full tilt for much longer than 7 days, so it’s certainly possible.
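
If you do leave a box crunching for days, it’s worth watching thermals from a side script. A minimal sketch using the pynvml bindings (assuming `pip install pynvml`), logging temperature, power and utilization once a minute:

```python
# Minimal sketch: log GPU temperature, power and utilization once a minute.
# Assumes the pynvml bindings are installed (pip install pynvml).
import time
import pynvml

pynvml.nvmlInit()
handles = [pynvml.nvmlDeviceGetHandleByIndex(i)
           for i in range(pynvml.nvmlDeviceGetCount())]

try:
    while True:
        for i, h in enumerate(handles):
            temp = pynvml.nvmlDeviceGetTemperature(h, pynvml.NVML_TEMPERATURE_GPU)
            power = pynvml.nvmlDeviceGetPowerUsage(h) / 1000  # milliwatts -> watts
            util = pynvml.nvmlDeviceGetUtilizationRates(h).gpu
            print(f"GPU{i}: {temp} C, {power:.0f} W, {util}% util")
        time.sleep(60)
finally:
    pynvml.nvmlShutdown()
```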

If you’re doing training, I suspect you’ll be better off just training one model at a time per GPU, even if you have enough GPU RAM for multiple. If you’re just doing inference, then it’s probably fine.
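
The simplest way to do one training run per GPU is to pin each process to its own card via CUDA_VISIBLE_DEVICES. A rough sketch; `train.py` and the config names are placeholders, not anything from this thread:

```python
# Rough sketch: launch one independent training run per GPU by pinning each
# process to a single card. "train.py" and its arguments are placeholders.
import os
import subprocess

jobs = []
for gpu_id, config in enumerate(["config_a.yaml", "config_b.yaml"]):
    env = os.environ.copy()
    env["CUDA_VISIBLE_DEVICES"] = str(gpu_id)  # this process only sees one GPU
    jobs.append(subprocess.Popen(["python", "train.py", "--config", config], env=env))

for p in jobs:
    p.wait()
```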

Hard to say without knowing what you’re trying to accomplish. If this is a personal box for you to learn and experiment on, then price/performance-wise the 3090 is probably the best. If 24 GB of VRAM isn’t enough, then your option is the A6000. If this is for a business, then maybe raw speed or reliability would play more of a factor, or maybe not.

If I were building a personal box for myself today I’d probably go with the 3090, which is what I do have in both my personal and work boxes today.

I am looking for a personal box for myself to learn and experiment on. Do you think one RTX 3090 Ti is sufficient? The FE edition is on sale on the Nvidia website for $1100. I would assume it would be unwise to have a dual system of RTX 3090 Ti FE cards (due to the 350W power consumption per GPU)? [https://store.nvidia.com/en-us/geforce/store/?page=1&limit=9&locale=en-us&category=DESKTOP,GPU&gpu=RTX%203090,RTX%203090%20Ti]

I would probably start with a single GPU, but I personally think it’s a good idea to have the ability to add another one in the future if you decide it’s worth it for you. I find having multiple handy. I have not heard of any issues with the 3090 Ti FE. It’s not out of the realm of possibility to run multiple of these cards in a single machine; I’m running 2x 3090s no problem, and you can buy machines with 3x. You would need to make sure everything in your system supports multiple GPUs, e.g. a large enough PSU, enough slots (ideally at least 1 open slot between cards for better cooling), enough PCIe lanes, a fast enough CPU to keep them fed, a large enough case, etc.
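
On the PSU sizing point specifically, a rough rule-of-thumb sketch (all numbers are illustrative assumptions, not a recommendation; Ampere cards are known for short transient spikes well above TDP):

```python
# Rough PSU-sizing sketch. All numbers are illustrative assumptions.
gpu_tdp_w = 350          # per-card TDP, e.g. a 3090-class card
n_gpus = 2
gpu_spike_factor = 1.5   # headroom for transient power spikes per GPU
cpu_w = 280              # HEDT-class CPU under load
rest_w = 150             # motherboard, drives, fans, pump, etc.

steady_state = gpu_tdp_w * n_gpus + cpu_w + rest_w
with_headroom = gpu_tdp_w * gpu_spike_factor * n_gpus + cpu_w + rest_w
print(f"steady-state estimate: ~{steady_state:.0f} W")
print(f"suggested PSU size:    ~{with_headroom:.0f} W or more")
```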

Thanks, that makes sense. If it’s not too much trouble, could you please tell me your system configuration? (Is it OK to buy parts from Amazon, or did you buy a custom-made machine, or did you build it yourself?)

I would choose your parts with pcpartpicker.com and post here for people to comment. My latest build for work was 2x 3090 with a WRX80E mobo, a TR Pro CPU, a 1600W PSU, the same case as the post below, and 980 Pro SSDs. Probably overkill for a personal build. My personal PC is 5 years old, with 3090s replacing the original 1080 Tis, but everything else is original and you can’t buy those parts anymore.

Here is an example build from another person.

There are a lot of component selection tips in this thread; I’d recommend going back through and reading the ones from at least the last year or so.

As for where to buy parts, Amazon or any other reputable retailer is fine. Most people on this thread bought parts and assembled them rather than buying a prebuilt.


Do you think the 3090 Founders Edition will work just fine with 4-slot spacing? I guess you have the EVGA RTX 3090 Kingpin edition; how does that work out, e.g. is it noisy or hot or inconvenient in some other way? I plan to put it in my room, not in a lab (maybe in a hall).

If I were to go for just a single-GPU RTX 4090 FE setup, how good/bad is the one below:

https://www.bestbuy.com/site/clx-set-gaming-desktop-amd-ryzen-9-7950x-64gb-ddr5-4800-memory-geforce-rtx-4090-1tb-m-2-nvme-ssd-6tb-hdd-white/6523805.p?skuId=6523805