Making your own server



Two things to consider here:

First, it looks like you are trying to set up version 1 of the course, which uses Theano. Theano is dead. Version 2 of the course uses a different framework, PyTorch. Configuring Windows 10 for this is a bit easier, and some steps can be found here. In short: download CUDA 9.0 (9.0.176, I believe), get cuDNN v7, and install the latest Anaconda and GitHub Desktop. I just set this up today on a laptop similar to yours, and it appears to be working.
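For reference, the conda side of that setup looked roughly like this on my machine. This is just a sketch: it assumes CUDA 9.0 and cuDNN v7 are already installed system-wide, and the environment name `fastai` is my own choice.

```shell
# Create an isolated environment for the course (the env name is arbitrary)
conda create -n fastai python=3.6

# Activate it (older conda versions use "source activate fastai" instead)
conda activate fastai

# Install PyTorch built against CUDA 9.0, plus torchvision, from the pytorch channel
conda install pytorch torchvision cuda90 -c pytorch

# Quick sanity check that PyTorch can see the GPU (should print True)
python -c "import torch; print(torch.cuda.is_available())"
```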

Second, you may find it easier in the end to go with Ubuntu. I have a desktop that is dual-booted, with separate drives for Windows 10 and Ubuntu. Once the OS is installed, the setup closely resembles the Paperspace setup script.


Got it, thanks for your tips. I’m going to try installing Ubuntu.

(Andrea de Luca) #536

I did. I got a working Windows 10 box (i7, GTX 1070) with Keras/TF and fastai/PyTorch in separate conda envs.

What went awry with your attempts?


(Omar Amin) #537

Thanks for the very informative thread. I'm building my own box now; could you please review this part list and let me know if there's anything wrong?

I may add another GPU later; that's why I've chosen the X99 motherboard, which supports up to two GPUs at x16 PCIe lanes each.

I'm on a tight budget. I already have the Titan GPU and am just building the box around it. I noticed the compatibility warning for the CPU and motherboard, but I don't think it applies to this specific CPU and motherboard combination.


I have the 6850 with an X99 MSI motherboard; PCPartPicker gave me the same warning, but I did not have an issue.

I would recommend an NVMe drive over the SSD you have.
I would also recommend a 1080 Ti over the Titan Xp.

The drive recommendation is based on my setup; I think the NVMe helps a lot with file read/write operations. A 1080 Ti should, in theory, be cheaper than a Titan Xp and may perform faster. Refer to this thread.

(Andrea de Luca) #539

I think you won't have any issues, but check the "supported CPUs" list on MSI's website. If you are short on budget, consider a Corsair CX850: cheaper than the one you selected, but still a quality product.

If you want to run two 12 GB cards, you'll need more memory. I recommend 64 GB, but as already mentioned, NVMe drives will help with swap operations in case you prefer to stick with 32 GB.

You may want to consider Xeon CPUs and registered ECC memory. For a slightly higher price, you get greater stability and don't have to worry about data corruption over long timespans, not to mention higher density per module, which gives you more room for future upgrades.
Finally, Xeon CPUs consume far less power than their Core equivalents, despite having the same official TDP.

( #540


To all the members who have configured their own servers,

I'm getting a deep learning desktop, but I'm trying to be cheap and avoid buying a monitor to set it up, since I plan to use SSH from my laptop for all the work.

Does anyone have experience with this sort of setup, or would it just be easier to buy a cheap monitor to manage it? I won't be assembling it myself, since a pre-assembled machine is the cheapest thing I can find with current GPU prices. I have Linux on my laptop, and a spare old Windows laptop lying around that I can try to use to configure the desktop.


(RobG) #541

Yes, just install openssh-server and you can get a terminal remotely.

You will probably need some monitor to set up the machine (and SSH) in the first place, unless you can manage to create a pre-configured image with SSH enabled on the box at the outset. So check the card's ports against any monitor you currently have, so you can get the right adapters.

Similar story for Wi-Fi: if your box will have it, you may need to connect via Ethernet first to set up Wi-Fi.
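For anyone following along, here's a minimal sketch of that first-boot setup. It assumes Ubuntu on the DL box; the username and IP address are placeholders you'd replace with your own.

```shell
# On the DL box (monitor attached just this once): install and enable the SSH server
sudo apt-get update
sudo apt-get install -y openssh-server
sudo systemctl enable --now ssh

# Note the box's LAN address so you can reach it from the laptop
ip -4 addr show

# From the laptop: log in remotely (replace user and 192.168.1.50 with yours)
ssh user@192.168.1.50
```

After that, the monitor can go back in the closet; everything else in the course can be done over the SSH session.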

(Cynosure) #542

Not sure what you mean. If the new machine already has Ubuntu/Linux set up, then you can connect via SSH and wouldn't need a monitor for the DL box most of the time.

I have a similar system with a monitor that I rarely use. In the beginning, I definitely needed it to set things up.

( #543

It comes with a trial version of Windows that I'll have to remove to install Ubuntu. I gave in and got a cheap monitor.

(Matthijs) #544

You can probably use your TV for this (if you have one), using the HDMI output on your GPU.

(pommier) #545

Did you need to buy a specific Dell PSU?

(Christina Young) #546

No, I bought my 875W unit from KDMPOWER on eBay.
It looks like they are on vacation or something; if you aren't sure, send them a message when they reopen on 3/19 and ask about your specific machine. My 875W unit is still going strong! :slight_smile:


An item of interest: the latest in deep learning boxes in the UK. I guess you have similar where you are :slight_smile:

The entry-level box can be yours for around £86,000 ($120,150); no pennies or cents here.
Or you can hire one by the week, month, or year in the cloud.

(Jon Gold) #548

Considering selling my rig. I built it in Oct 2017, but since then I've been doing most of my experiments on work resources, and there's not much point in a great setup gathering dust when someone else could be using it. $1900 / Bay Area only.



I have real estate to host, available power, and a fiber-optic connection on the East Coast of the United States.

The cost of electricity is relatively low.

We are building our own deep learning rigs, and we are mining. We currently have around forty 1080 Ti GPUs, among others. We charge our mining clients around $95 per kW per month, and I am sure we can work out something similar for any of your machines. Or we could build you a machine to spec.

Please DM me for more information if you would like to discuss further.

(Pawel) #550

I have a serious question: why build your own PC when you could use a cloud server?
You have to spend $2000+ for a decent build, plus ~$400/year for electricity (in CA), so the total cost is around $2500 for hardware that might be outdated next year, when you'll wish to upgrade again and spend more money and time on it.

For $2500 you can get 6250 h (about 260 days) of computing power at Paperspace, or 2777 h (about 115 days) on AWS, if you choose the cheapest options; and for bigger tasks you can always hire bigger GPUs.
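Those figures work out if we assume roughly $0.40/h for the cheapest Paperspace option and $0.90/h for the cheapest AWS GPU option; the rates are my assumption, back-calculated from the hour counts above. A quick sanity check:

```shell
build_cost=2500          # dollars for the DIY rig, electricity included
paperspace_cents=40      # assumed ~$0.40/h (cheapest Paperspace option)
aws_cents=90             # assumed ~$0.90/h (cheapest AWS GPU option)

# Hours of cloud time the same money buys (integer arithmetic in cents)
echo "Paperspace: $(( build_cost * 100 / paperspace_cents )) h"   # 6250 h
echo "AWS:        $(( build_cost * 100 / aws_cents )) h"          # 2777 h

# Convert to round-the-clock days
echo "Paperspace: $(( build_cost * 100 / paperspace_cents / 24 )) days"  # 260 days
echo "AWS:        $(( build_cost * 100 / aws_cents / 24 )) days"         # 115 days
```

So the break-even point depends entirely on utilization: the rig only wins if you actually keep the GPU busy for a large fraction of those hours.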

I am seriously thinking about DL, but so far I cannot find any economic reason for building my own server, except that great urge in the back of my head: "MINE, ALL MINE! LOOK AT ME, I DID IT MYSELF!" :wink:

I would appreciate an honest answer.


I would appreciate an honest answer.

From renting, you get nothing in the end. From buying, you can still resell your rig after six months, a year, or eighteen months for, say, 3/4, 2/3, or 1/2 of the original cost, with all the components still within the 24-month warranty we get in the EU. The consumer GPU market has plateaued in the last couple of years; the best pound-for-pound GPU is still the GTX 1070 with 8 GB of RAM, which is, and will remain, more than enough for learning purposes and applied domain usage.

So you want to build a Deep Learning AI computer system?
(Pawel) #552

Thank you for the answer, but if you want to sell it, why buy it at all? Isn't it better to use a dedicated server? In the worst-case scenario you will spend the same amount of money, and most probably much less…

I honestly don't understand what the economic reason for building your own DL PC could be… Since I moved my Paperspace instance to a P4000, it has been working really smoothly.


With $2500 you can build a high-end rig with a 1080 Ti, so we are looking at a P6000 instead of a P4000, which brings you to 2777 h (about 115 days) of computing power for the same $2500. It really depends on your needs; if you're doing it as a profession, for the long run, your investment starts paying off past the 115th day.

Besides, your own personal DL box gives you the flexibility of adding more GPUs and RAM, which is very beneficial since I sometimes run multiple tasks on different GPUs. Plus, judging from experience, moving files between the cloud and your local machine is a daunting process. Not that it's hard, but it's not as convenient as I'd like it to be.