Why you probably want a "blower" GPU for multi-GPU setups

In the past, there have been a lot of discussions/questions about the best type of GPU card for deep learning, mainly around “Does O/C (overclocking) help?”
Sometimes it came down to “What’s the best cooling: dual/triple fans, or a basic blower fan?”

My take was that the more heat the GPU expels from the PC case by itself, the better for performance and stability: it won’t dump extra heat onto the CPU, or pile up even more heat for your single 120mm case fan to expel (an O/C card comes with three 80mm fans of its own, which means a lot of heat generated inside the case).
So go for a blower version (plus it’s often ~10% cheaper).

For those new to the cooling jargon:

  • Dual/Triple fans: these are typical gaming cards, often overclocked, using 2 or 3 large fans to push the heat away from the GPU and into the computer case. The idea is that all that extra heat flows around the case and must then be expelled by the case’s fans.
    Note: these cards tend to be rather quiet and “cooler” in terms of max GPU temperature, and they also deliver higher FPS (frames per second) in gaming tests.

  • Blower fan: these are typically the cheaper/reference version of a new GPU. They use a single internal fan that literally “blows” the heat out of the card, directly outside the computer case, via an opening in the rear bracket next to the HDMI connectors.
    Note: these cards can be really noisy compared to triple-fan versions, because of the air being forced through a small opening in the rear, and they also reach a higher max temperature.
    If you wonder what a blower card looks like, a Google search will provide an immediate and convincing picture.

Now the nice folks at Puget went to some trouble running multi-GPU tests using dual-fan versions. The more tests they ran, the more performance dropped.
It turned out to be a heat-management problem: each GPU was overheating the one next to it.
Once they switched to blower versions, the issue went away.


Well, you could always run without the access cover on the case to cool things a bit. You could also build a custom water loop using components such as those from EK Water Blocks. Better yet, just buy an all-in-one (2080 Tis shown).



EVGA should have one out as well, soon.


Going for liquid cooling versions for Multi-GPUs seems like the logical choice.

Except most PC cases don’t come with any extra fan mounts beyond the original 120mm case fan, so there’s nowhere to install even a single extra liquid-cooling radiator.

If you were to run dual liquid-cooled GPUs in your PC, you’d need a case with three 120mm fan spots (case fan + 2 GPUs).
AFAIK, that’s quite an exclusive and rare product, and most likely not a cheap one.

I run this case, the Corsair 540, which can be had for $100. You can mount fans almost anywhere. Even without a custom cooling loop, if you have 3 radiators (assuming double-wide), you can mount one each on the top, front, and bottom, and still keep the back case fan for exhaust.


For 2x watercooled GPUs, several mid/full towers I can think of would work OK. I made a spreadsheet of case dimensions in case it may be of help to anyone:

For 3+ GPUs, depending on waterblock thickness, the spacing between PCIe slots could limit how many GPUs you can fit, and you may need to resort to PCIe risers and custom-made brackets (as I have found out the hard way…). Some motherboards have extra space between the PCIe slots, which would be handy.


The Lambda guys use blower GPUs in their 4-GPU machines for that reason.
Edit: the above is for the 1080 Ti machines they sell. Not sure if they sell 20xx machines yet…


I have a Sea Hawk 2080 AIO, which is nice and quiet. I do hear a slight clicking noise from the pump when the GPU is training, but the fan doesn’t speed up. This model does have a second fan on the board for cooling components other than the actual GPU, so multiple units could restrict airflow a bit.


Tim Dettmers, who wrote the great post “Which GPU(s) to Get for Deep Learning: My Experience and Advice for Using GPUs in Deep Learning”, just tweeted about the issue.


The jury is still out on whether blowers will be enough for the RTX 2080 Ti for multiple cards in close proximity. It may be better to go with an AIO solution if you can get it to fit.


The Titan RTX is not available with a blower fan, and it probably never will be. It’ll be problematic to stack 3 or 4 of them tightly, or to put them into a 1U GPU server.

Specs on the Quadro RTX 6000 appear to be the same as the Titan RTX, with a blower fan (and only $4k more :grimacing:).

Good to know. I just have 4 grand and no idea how to spend it!

Nvidia is just helpful that way…


Does anyone have both an Asus RTX 2080 Ti Turbo and an open-air RTX 2080 Ti/2080/2070 or GTX 1080 Ti/1080/1070?

What I am really concerned about is the noise of the Turbo (I don’t seem to have a good noise filter in my head). How does the Turbo’s noise compare to an open-air GPU?

To my ears, the Asus RTX 2080 Ti Turbo blower is much louder than the 1080 Ti FE blower and the Titan RTX dual-fan open-air heatsink. I can comfortably have a conversation while the RTX dual fan is at 100%. Not so much with the 1080 Ti FE, and definitely not with the Asus Turbo.

With that said, the Asus Turbo blower moves a lot more air, exhausts that air outside of your case, and is much better at cooling the GPU in airflow-constrained situations. My experience is that the open-air heatsinks will immediately jump to 80C (throttling) unless there are at least 2 adjacent empty PCIe slots for airflow. Open-air cards will also heat up your CPU and other components somewhat. In contrast, the 1080 Ti FE and Asus RTX Turbo blowers can be stacked immediately adjacent to one another without issues.

My advice:

  1. If you plan only 1 or at most 2 GPUs, and each GPU will have at least 2 empty PCIe slots for airflow, then get the open air cooler. You may have to tweak your case fans to deal with the extra hot air.
  2. If you anticipate more than 2 GPUs, do not have good airflow, or don’t care about noise, get the blowers.

Personally I prefer the blowers’ simplicity and freedom to have 3+ GPUs. I have headphones to deal with the noise.



I decided to go with 2x ASUS RTX 2080 Turbos (the 2080 Ti was just a bit too pricey) and sold my GTX 1080 Tis, the thought being that I can fit a waterblock on them if the noise is too much. I was actually really impressed with how little noise they make. The fans never go much past 50%, but temps seem pretty high: mid-90s. I’ll see if I can experiment with settings to keep temps down.

And yes earplugs/noise cancelling headphones pretty useful. Plan is to have flexibility to get 2 more if needed.

Temps in the mid 90s are too high and will lead to substantial throttling. You want to be in the mid 70s.

To get there, set the fan speed higher. On Linux this must be done manually with the NVIDIA driver GUI or with a script from the command line. On Windows, I believe you can create a custom fan curve.
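If you want to script it on Linux, here’s a minimal sketch, assuming the proprietary NVIDIA driver with Coolbits enabled (e.g. via `nvidia-xconfig --cool-bits=4`) and a running X server; it just wraps the `nvidia-settings` command line:

```python
# Sketch: pin a GPU fan to a fixed speed via nvidia-settings.
# Assumes the proprietary NVIDIA driver with Coolbits enabled and X running;
# attribute names may vary by driver version, so verify on your system first.
import subprocess

def fan_speed_cmd(gpu: int, fan: int, percent: int) -> list[str]:
    """Build the nvidia-settings argv that pins one fan to `percent`."""
    if not 0 <= percent <= 100:
        raise ValueError("fan speed must be 0-100%")
    return [
        "nvidia-settings",
        "-a", f"[gpu:{gpu}]/GPUFanControlState=1",      # take manual control
        "-a", f"[fan:{fan}]/GPUTargetFanSpeed={percent}",  # set target speed
    ]

def set_fan(gpu: int, fan: int, percent: int) -> None:
    """Apply the fan speed (requires the hardware/driver described above)."""
    subprocess.run(fan_speed_cmd(gpu, fan, percent), check=True)
```

Note that older driver versions used `GPUCurrentFanSpeed` instead of `GPUTargetFanSpeed`, so it’s worth running `nvidia-settings -q fans` first to see what your driver exposes.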


AFAIK, manual control of GPU fans under Linux is only feasible for the GPU attached to a monitor. And this holds both for the NVIDIA drivers’ GUI and for the command line.

And yes, mid-90s is definitely too high. The hardware shutdown happens at 96C.

If you want to go liquid, look for the Kraken G12: it makes it easy to fit a 120mm AIO to a GPU.

If you have an X server running, you can manually set the GPU fan speed with nvidia-settings on ALL GPUs, regardless of whether they are plugged into a monitor. IMO this is necessary to avoid throttling in any air-cooled multi-GPU setting. There are instructions here:


If you have a completely headless system, I believe there are DisplayPort dummy plugs that can simulate a connected monitor.
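To check whether the fans are actually keeping up, per-GPU temperatures can be polled with `nvidia-smi`’s CSV query interface. A small sketch (the 80C threshold matches the throttle point reported earlier in this thread, but your card’s limits may differ):

```python
# Sketch: poll per-GPU temperatures via nvidia-smi and flag GPUs near the
# throttle point. Assumes nvidia-smi is on PATH; the 80C limit is the rough
# throttle temperature mentioned in this thread, not an official constant.
import subprocess

THROTTLE_C = 80

def parse_temps(csv_output: str) -> list[int]:
    """Parse `nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader`
    output: one integer Celsius reading per line, one line per GPU."""
    return [int(line.strip()) for line in csv_output.splitlines() if line.strip()]

def read_temps() -> list[int]:
    """Query the driver for current temperatures of all GPUs."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_temps(out)

def too_hot(temps: list[int], limit: int = THROTTLE_C) -> list[int]:
    """Return the indices of GPUs at or above the limit."""
    return [i for i, t in enumerate(temps) if t >= limit]
```

Run from cron or inside a training loop, `too_hot(read_temps())` gives you the GPUs that need more fan (or better airflow) before throttling kicks in.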


Thanks for your reply. I actually read that guide, but my problem is that in spite of having Coolbits set on both GPUs, the second GPU answers that its fan-speed attribute is not writable. And indeed, if I open the nvidia-settings GUI the situation is the same: I have the manual fan-speed slider for the GPU connected to the monitor, but no slider for the other. If I disconnect both NVIDIA GPUs and connect the monitor to a little ATI 5450 I wanted to use to drive the monitor, I lose control over both fan speeds.

Note that one of my GPUs (a 1080 Ti FE) hits 90-91C with its fan running at ~80% during unfrozen training, so heavy automatic downclocking kicks in. The other runs marginally cooler, but still well above the 84C throttling limit.

I don’t know what to do. The case has a lot of fans already… I’m seriously thinking about liquid cooling (via the NZXT Kraken G12).