Making your own server

(Tuatini GODARD) #446

For those who are interested, I wrote two articles on setting up your deep learning rig (software part).
Here they are:
How to setup your own environment for deep learning - Locally
How to setup your own environment for deep learning - For remote access

If you have any questions, don’t hesitate to put them in the comment section of the articles; I’m usually very quick to help :wink:


(Jon Gold) #447

OK, here’s what I landed on in the end. :pray: it works out.

Whilst I’m waiting for it to arrive, I’m thinking about the software setup. I’d like Windows 10 on it eventually, but I’d also like to not install Windows 10 now. @slavivanov said to install Windows 10 first if you’re going to dual boot; I’m wondering if it’s possible to sidestep those issues by installing Windows on an external hard drive. Can you dual boot from 2 hard drives? I’d love to be able to just focus on getting Ubuntu running for the time being and work on Windows later, but I won’t do that if it will ruin my Ubuntu partition!

PCPartPicker part list:
Price breakdown by merchant:

CPU: Intel - Core i5-7500 3.4GHz Quad-Core Processor ($188.00 @ Amazon)
CPU Cooler: be quiet! - Pure Rock Slim 35.1 CFM CPU Cooler ($24.90 @ Newegg Marketplace)
Motherboard: ASRock - Z270 Killer SLI/ac ATX LGA1151 Motherboard ($100.98 @ Newegg)
Memory: Crucial - Ballistix Sport LT 32GB (2 x 16GB) DDR4-2400 Memory ($269.95 @ Amazon)
Storage: Samsung - 960 EVO 250GB M.2-2280 Solid State Drive ($117.60 @ Amazon)
Storage: Seagate - Barracuda 2TB 3.5" 7200RPM Internal Hard Drive ($66.88 @ OutletPC)
Video Card: EVGA - GeForce GTX 1080 Ti 11GB FTW3 GAMING iCX Video Card ($804.98 @ Newegg)
Case: Fractal Design - Define C TG ATX Mid Tower Case ($89.99 @ Newegg)
Power Supply: EVGA - SuperNOVA G3 850W 80+ Gold Certified Fully-Modular ATX Power Supply ($91.98 @ Newegg)
Total: $1755.26
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2017-10-09 14:35 EDT-0400


(Abhinav Mathur) #448

Hi, I am also leaning towards buying a gaming desktop from Costco for deep learning projects, especially Kaggle competitions. Could you please share your experience with the Costco computer you purchased?



You can install Windows on an external hard drive later and dual boot that way, provided the external drive is fast enough.


(Danial K) #450

Hi guys,

With the help of this forum and the blog posts I listed below, I built my deep learning PC after a couple of months of research and have been using it for a couple of weeks. It was a very fun process!


Intel - Core i7-6850K 3.6GHz 6-Core Processor
Asus - X99-E ATX LGA2011-3 Motherboard
Corsair - Vengeance LPX 32GB (2 x 16GB) DDR4-2400 Memory
Samsung - 850 Pro Series 512GB 2.5” Solid State Drive
NVIDIA GeForce GTX 1080 Ti Founders Edition
Corsair - Air 540 ATX Mid Tower Case
Seasonic X-1250 1250W 80+ Gold Fully Modular Power Supply

Build photos
You can see the build photos here

I ran a couple of CNNs on it and was very happy with the performance. You can have a look at the code, results, and the time it took here.

Useful Blog Posts


(Pierre Guillou) #451

Hello @jeremy
I just wrote to @nolanchan that this pip install of Keras does not work on my local computer (see Keras 2 Released), and the conda-forge package for Keras 1.x is no longer available online.
Thanks in advance for any ideas to solve my issue.



Hi guys!

Between having: i7 7700 with 32GB or i7 7700K with 16GB, what would you choose?

I’ll pick the 1080 ti and one SSD.


(Corbin Albert) #453

That one. RAM is more important than CPU. Jeremy himself says to make sure you have room for 64GB of RAM.



Hi @corbin

The motherboard supports up to 64GB.

I thought that having the 7700K would be better because I could add more memory later. I mean, it’s easier and cheaper to change the RAM than the processor.

But if the processor doesn’t help much, maybe it’s not worth getting the 7700K.

I was thinking about what @EricPB said, he is using a good CPU and just 16GB and he’s able to participate in Kaggle competitions.

1 Like

(Corbin Albert) #455

Oh wait, are you not getting a GPU?



I’ll go with a 1080 ti.


(Matthijs) #457

More RAM is better, but 16 GB is plenty for many use cases. You may not be able to hold your entire dataset in RAM but for many datasets that wouldn’t be possible anyway.

I’m currently doing the Cdiscount competition on Kaggle, which is about 60GB of (compressed) data. My machine is only using 5 GB of RAM during training.
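The Cdiscount code itself isn’t shown here, but the low RAM usage comes from streaming batches from disk rather than loading the whole dataset up front. A minimal sketch of the idea (the shapes, class count, and random stand-in data are made up for illustration):

```python
# Illustrative sketch: stream batches with a generator so only one batch
# lives in RAM at a time, no matter how large the dataset is on disk.
import numpy as np

def batch_generator(n_samples, batch_size=32):
    """Yield (x, y) batches one at a time instead of loading everything."""
    for start in range(0, n_samples, batch_size):
        size = min(batch_size, n_samples - start)
        # In practice you'd decode these samples from disk here;
        # random arrays stand in for the images in this sketch.
        x = np.random.rand(size, 180, 180, 3).astype(np.float32)
        y = np.random.randint(0, 5270, size)
        yield x, y

# Peak memory is roughly one batch, regardless of dataset size.
batch_mb = 32 * 180 * 180 * 3 * 4 / 1e6
print(f"one batch of float32 images ≈ {batch_mb:.1f} MB")
```

Keras-style `fit_generator` training works exactly this way, which is why a 60 GB dataset can train comfortably in a few GB of RAM.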

Note that many motherboards recommend that you install memory in pairs because this gives a speedup. So you’d buy 2 sticks of 8 GB instead of 1x16 GB. The downside is that now in order to upgrade to 64 GB you can’t use your 2x8 sticks anymore and you’ll have to buy 4 sticks of 16 GB, so it will end up being more expensive. (Installing memory in pairs isn’t required, but check the manual for your motherboard.)
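For the curious, the pairs speedup is dual-channel operation: two channels doubles peak memory bandwidth. A quick back-of-the-envelope calculation with nominal DDR4-2400 rates (theoretical peaks, not measured numbers):

```python
# DDR4-2400 nominally moves 2400 MT/s × 8 bytes per channel.
per_channel_gbs = 2400e6 * 8 / 1e9   # ≈ 19.2 GB/s peak
print(f"single channel: {per_channel_gbs:.1f} GB/s")
print(f"dual channel:   {2 * per_channel_gbs:.1f} GB/s")
```

Real-world gains are workload-dependent and usually smaller than the doubled peak, but the pairing recommendation comes from this.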


(Richard Horton) #458

Hello. After agonizing over this for a very long time, I finally decided to buy a proper deep learning machine. Shown below are some of my experiences which I hope can help others.

PowerSpec G428 from Micro Center:

I went with an open box unit for $1,619.16. It included a 1 year warranty and space for 2 additional GPUs. I will probably need to upgrade the RAM. Don’t forget the surge protector. It did not come with much bloatware other than Windows :blush:.

Partition space in Windows – I tried to skip this step, but Ubuntu did not recognize Windows and I didn’t want to mess up the Windows OS.

While in Windows:

  • Right click start button
  • Select Disk Management
  • Right click on the Windows (C:) box in the middle and select Shrink
  • I decided to split the C: drive in half – 50% Windows and 50% Ubuntu

Download Ubuntu 16.04.3 to a disk or USB:

Boot Ubuntu from the disk or USB.

The first time I did this, Ubuntu started to load, but eventually froze up with the following errors:
5.908731 nouveau fifo sched_error
5.908732 nouveau fifo sched_error

Apparently, this had something to do with the video drivers. To fix this:

  • Reboot Ubuntu
  • Wait for the purple screen with the keyboard logo at the bottom
  • Press any key
  • Select your language
  • Press F6
  • Toggle down to nomodeset and press the spacebar to mark it with an “x”
  • Press esc
  • Select “try Ubuntu without installing” – you can actually eventually install Ubuntu from this option
  • More details here:
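Once Ubuntu is installed, the same nomodeset workaround can be made permanent until the NVIDIA driver is in place. A sketch, shown against a sample file for safety; on a real system you would edit /etc/default/grub itself and run `sudo update-grub` afterwards:

```shell
# Sample of the relevant line in /etc/default/grub:
cat > grub.sample <<'EOF'
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"
EOF

# Append nomodeset to the kernel command line:
sed -i 's/"quiet splash"/"quiet splash nomodeset"/' grub.sample
cat grub.sample
```

After installing the NVIDIA driver (next step), nomodeset can be removed again the same way.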

I found many different suggestions for Ubuntu partitions. This is what I settled on:

Everything else was pretty straightforward. However, the first time I logged in to Ubuntu, all of the icons were super huge. To fix this:

  • Click the gear and wrench icon
  • Select Software and Updates
  • Select the NVIDIA driver

Now it’s time to get to work on Part 2!


(Xi) #459

Hi everybody,

I bought a 2nd hand motherboard

2 questions:

  1. How do I know whether it supports high end GPU, like 1070, 1080, 1080ti?
  2. Does it support multiple high end GPUs?

Xi Xiao


(Corbin Albert) #461

Very nice, always good to see someone investing in their deep learning progression!!!

If it’s not too late though, I’d consider looking at a CPU with a higher # of max PCIe lanes. The 16 of the 7700K will be absolutely fine for 1 GPU, but if you ever upgrade to a second GPU, you will be bottlenecked by the PCIe lanes of the 7700K, which would then be split 8x/8x.

I just bought all my gear and fell into this same trap. Ended up returning my processor and mobo as a result. I’ve grabbed the 6850K instead, which has 40 PCIe lanes. This way 2 can run fully with no bottleneck.

Just a heads up, in case you decide to upgrade to a dual GPU setup.

Anyways, really hope you have lots of fun with your new machine!!


(Corbin Albert) #462

You can check PCPartPicker for compatibility. It automatically filters out any results that aren’t compatible.

From what I found, looks like all the high end cards are compatible :slight_smile:

As for multi-GPU support, it says on the link you provided:

Supports AMD Quad-GPU CrossFireX™ Technology

Also, PCPartPicker didn’t give me the option to add a second video card when it normally would. Which is to say: no, I don’t believe you can add multiple Nvidia GPUs to that board. And unfortunately, deep learning requires Nvidia, as I’m sure you know :no_mouth:



The loss in performance at 8x lanes vs 16x lanes for a 1080 Ti is pretty small - probably less than ~5%.

I can’t find any benchmarks for the 1080 Ti right now, but you can see almost no loss in performance for a 1080 (a slightly weaker card, so slightly harder to bottleneck) [1][2], and a slightly higher loss in performance for a Titan (a slightly better card, so easier to bottleneck) [3]; the 1080 Ti is in the middle.

So if you have two on a 16-lane CPU, you’re not losing much, as long as you don’t also share those lanes with an NVMe drive.
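For a rough sense of the raw numbers involved (nominal PCIe 3.0 rates, not measurements):

```python
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding gives roughly
# 0.985 GB/s of usable bandwidth per lane, per direction.
per_lane = 8e9 * (128 / 130) / 8 / 1e9   # GB/s
x16 = 16 * per_lane
x8 = 8 * per_lane
print(f"x16 ≈ {x16:.1f} GB/s, x8 ≈ {x8:.1f} GB/s")
```

So x8 still offers nearly 8 GB/s to the card; whether that halving matters depends on how often the workload saturates the bus, which deep learning rarely does.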


(Xi) #464

Thanks Corbin! As long as it supports one high end Nvidia GPU, I am happy already :).

One of my friends said that it makes sense to have one extra GPU (a low-end one is fine) so that when the high-end one hangs due to a calculation or failure, the monitor still responds.
Is this a valid concern? I didn’t see people in this thread talking about it.


(Matthijs) #465

How did you determine that using the 8x/8x split was a bottleneck for using two GPUs? Did you actually run into performance problems? If so, would you mind explaining what sort of neural network you were training at the time?

I’m curious to know this, since (as I’ve pointed out before in this thread), the GPU will spend a relatively long time on doing its calculations versus the time it spends on transferring data to/from the CPU. So “only” 8x would still be fast enough.

But if you have a counter example, I’d like to see if I can reproduce that. (I think the whole “you need lots of PCIe lanes” thing is a bit of a myth, but I don’t mind being proven wrong.)

1 Like

(Matthijs) #466

Note that those tests are not about doing GPU compute on deep neural networks (as far as I can tell), but playing games or just pure bandwidth tests.

For a game you want the GPU to deliver results in less than 16 ms per frame. So the GPU does a relatively short amount of work, and therefore it becomes more important that you can copy data to the GPU quickly.

When doing deep learning, a forward & backward pass may easily take up 100 ms or more, so the GPU cannot use data at the maximum rate that you can transfer it anyway. The bottleneck here is the GPU, not the PCIe bus.

Considering that those tests show that the performance loss is small even for games, it matters even less for deep learning.
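To put illustrative numbers on the transfer-vs-compute argument (the batch shape and bandwidth figure are assumed typical values, not measurements from an actual run):

```python
# Time to copy one batch over PCIe vs. a 100 ms forward+backward pass.
# Example batch: 64 float32 images of shape 3 × 224 × 224.
batch_bytes = 64 * 3 * 224 * 224 * 4
pcie_x8_bps = 7.9e9                      # ≈ usable PCIe 3.0 x8 bandwidth
transfer_ms = batch_bytes / pcie_x8_bps * 1e3
print(f"batch ≈ {batch_bytes / 1e6:.0f} MB, "
      f"transfer ≈ {transfer_ms:.1f} ms vs ~100 ms of compute")
```

With the transfer an order of magnitude shorter than the compute, and overlappable with it via async copies, the bus sits idle most of the time even at x8.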