You can probably use your TV for this (if you have one), using the HDMI output on your GPU.
Did you need to buy a specific Dell PSU?
No, I bought my 875W unit from KDMPOWER on eBay: http://stores.ebay.com/KDMPOWER
Looks like they are on vacation or something. If you aren’t sure, send them a message when they reopen on 3/19 and ask about your specific machine. My 875W is still going strong!
An item of interest: the latest in deep learning boxes in the UK. I guess you have something similar where you are.
The entry-level box can be yours for around £86,000 ($120,150); no pennies or cents here.
Or you can hire it by the week, month, or year in the cloud.
Considering selling my rig - I built it in Oct 2017, but since then I’ve been doing most of my experiments on work resources + there’s not too much point having a great setup gathering dust when someone else could be using it.
https://pcpartpicker.com/b/dtQZxr / $1900 / Bay Area only
I have real estate to host, available power, and a fiber optic connection on the East Coast United States.
The cost of electricity here is relatively low.
We are building our own deep learning rigs and we are mining. We currently have around forty 1080 Ti GPUs, among others. We charge our mining clients around $95 per kW per month, and I am sure we can work out something similar for any of your machines. Or we could build you a machine to spec.
Please DM me for more information if you would like to discuss further.
I have a serious question: why build your own PC when you could use a cloud server?
You have to spend $2,000+ for a decent build, plus ~$400/year for electricity (in CA), so the total cost is around $2,500 for hardware that might be outdated next year, leaving you wishing to upgrade again and to spend even more money and time on it.
For $2,500 you can get 6,250 h (260 days) of computing power at Paperspace, or 2,777 h (155 days) on AWS if you choose the cheapest options, and for bigger tasks you can always hire bigger GPUs.
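For what it’s worth, the hourly rates implied by those figures can be checked with a quick sketch (the $2,500 budget and the hour counts are taken from the post as given; the rates are just derived from them):

```python
# Back-of-envelope: implied hourly rates behind the figures quoted above.
# The $2,500 total and the hour counts come from the post; nothing else is assumed.
budget = 2500.0

paperspace_hours = 6250
aws_hours = 2777

paperspace_rate = budget / paperspace_hours
aws_rate = budget / aws_hours

print(f"Paperspace: ${paperspace_rate:.2f}/h, AWS: ${aws_rate:.2f}/h")
# → Paperspace: $0.40/h, AWS: $0.90/h
```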
I am seriously thinking about DL, but so far I cannot find any economic reason for building my own server, except that great urge in the back of my head: “MINE, ALL IS MINE! LOOK AT ME, I DID IT MYSELF!”
I would appreciate an honest answer.
From renting, you get nothing in the end. From buying, you can still resell your rig after six months, a year, or eighteen months for, say, 3/4, 2/3, or 1/2 of the original cost, with all the components still within the 24-month warranty we get in the EU. The consumer GPU market has plateaued; for the last couple of years the best pound-for-pound GPU has been the GTX 1070 with 8 GB RAM, which is and will remain more than enough for learning purposes and applied-domain use.
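To put rough numbers on that, here is a small sketch of the effective ownership cost under those resale fractions (the purchase price is a made-up example figure, not from the thread):

```python
# Effective cost of ownership if you resell, using the fractions above.
# The 2,000 purchase price is a placeholder example, not a real quote.
price = 2000.0
resale_fraction = {6: 3/4, 12: 2/3, 18: 1/2}   # months owned -> fraction recovered

for months, frac in sorted(resale_fraction.items()):
    net_cost = price * (1 - frac)              # what ownership actually cost you
    print(f"{months:2d} months: net cost {net_cost:.0f} ({net_cost/months:.0f}/month)")
```

The point is simply that the longer you keep it, the lower the monthly cost trends, whereas rented hours are gone either way.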
So you want to build a Deep Learning AI computer system?
Thank you for the answer, but if you want to sell it, why buy it at all? Isn’t it better to use a dedicated server? In the worst-case scenario you will spend the same amount of money on it, and most probably much less…
I honestly don’t understand what the economic reason for building your own DL PC could be… Since I moved my Paperspace instance to a P4000 it has been working really smoothly.
With $2,500 you can build a high-end rig with a 1080 Ti, so we are comparing against a P6000 rather than a P4000, which brings you to 2,777 h (155 days) of computing power for $2,500. It really depends on your needs; if you’re doing it as a profession, for the long run, your investment starts paying off past the 155th day.
Besides, your own personal DL box gives you the flexibility of adding more GPUs and RAM, which is very beneficial since I sometimes run multiple tasks on different GPUs. Plus, judging from experience, moving files between the cloud and your local machine is a daunting process. Not that it’s hard, but it’s not as convenient as I like it to be.
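As a concrete illustration of that flexibility, one common way to run separate tasks on separate GPUs is to pin each process to its own device via `CUDA_VISIBLE_DEVICES`. A minimal sketch (the script names are hypothetical):

```python
import os

def gpu_envs(scripts):
    """Build one environment per training script, each pinned to its own GPU
    via CUDA_VISIBLE_DEVICES, so jobs can run in parallel on a multi-GPU box."""
    return {script: dict(os.environ, CUDA_VISIBLE_DEVICES=str(gpu))
            for gpu, script in enumerate(scripts)}

# Hypothetical script names; launch each with e.g.
# subprocess.Popen(["python", s], env=envs[s])
envs = gpu_envs(["train_a.py", "train_b.py"])
print(envs["train_b.py"]["CUDA_VISIBLE_DEVICES"])  # → 1
```

Inside each process, the framework then only sees the one GPU it was assigned.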
You can build it with much, much less.
For example, I built mine for less than 500 bucks, plus the cost of the 1080 Ti.
It is a valuable skill in itself. You have to build it to be functional and efficient, with a good price-per-teraflop ratio, and reasonably fault tolerant.
Moreover, you will have total control over what happens inside.
I built my first PC 20 years ago, so this is nothing new for me…
Actually, knowing how much time it can take, I decided it is better to spend that time on self-learning and to focus on the course, instead of spending hours searching for the best possible price/performance build.
Thank you for all the answers!
I am offering free 1080Ti GPU instances for deep learning.
Sign up at https://dashboard.tensorpad.com/signup
- The instances have 16GB RAM, 4 CPU cores, and one 1080Ti GPU, and you can run multiple instances in parallel.
- Instances run on JupyterLab
- You can access the command line and use it as a dedicated server for training
If you are having any issues launching the free instance, here are the additional instructions: https://www.dropbox.com/s/i5bgpx9g6jfubm7/Guide.pdf?dl=0
Feel free to contact me at firstname.lastname@example.org
Can you share the code for how to set it up? I am not able to do it.
Hi everyone, I am planning to build a server for fast.ai v3 with the following specs:
| Component | Item | Price |
| --- | --- | --- |
| CPU | AMD Threadripper 1900X 3.8GHz 8-Core Processor | $319.89 @ OutletPC |
| CPU Cooler | Noctua NH-U12S TR4-SP3 55.0 CFM CPU Cooler | $69.90 @ Amazon |
| Motherboard | MSI X399 SLI PLUS ATX TR4 Motherboard | $302.93 @ B&H |
| Memory | Corsair Vengeance LPX 32GB (2 x 16GB) DDR4-3200 Memory | $309.49 @ Amazon |
| Storage | Samsung 960 EVO 1TB M.2-2280 Solid State Drive | $298.82 @ Amazon |
| Video Card | MSI GeForce GTX 1080 Ti 11GB GAMING X Video Card | $749.99 @ B&H |
| Case | Phanteks Enthoo Pro M Tempered Glass (Black) ATX Mid Tower Case | $97.98 @ Newegg |
| Power Supply | EVGA SuperNOVA G3 1000W 80+ Gold Certified Fully Modular ATX Power Supply | $135.00 @ Amazon |
| Monitor | Dell S2715H 27.0" 1920x1080 60Hz Monitor | $179.00 @ B&H |

Prices include shipping, taxes, rebates, and discounts.
Total (before mail-in rebates): $2,523.00
I am building this with possible future expansion in mind (an extra GPU or more RAM). Please let me know if you have any suggestions. Thank you!
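On the second-GPU question, a rough power-budget sketch suggests the 1000W PSU leaves headroom. The GPU and CPU numbers are published TDPs (250 W for a 1080 Ti, 180 W for the 1900X); the remainder is a ballpark allowance, not a measured figure:

```python
# Rough power budget for the build above plus a future second GPU.
# TDPs: 1080 Ti = 250 W, Threadripper 1900X = 180 W; the rest is a guess.
draws = {
    "CPU (1900X)": 180,
    "2x GPU (1080 Ti)": 2 * 250,
    "Motherboard/RAM/SSD/fans": 100,   # ballpark allowance, not measured
}
total = sum(draws.values())
psu = 1000
print(f"Estimated load {total} W on a {psu} W PSU ({total / psu:.0%} of capacity)")
```

Staying under ~80% of PSU capacity at full load is a common rule of thumb, so two GPUs should be workable on this supply.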
I would also add 1 or more HDDs for “cold” storage of data.
Thanks Maxim. Yes, with time I plan to add 1 HDD, more RAM and another GPU.
Thanks for the likes. This post is very old and outdated. Typically, you won’t install TF or PyTorch by compiling from source anymore.
You’d rather follow the installation instructions discussed in the recent course threads.
And you will typically want to use the model versions in the course, because they incorporate the latest features. For example, vgg16 didn’t originally include batchnorm.
Any experience with Ubuntu 18? I only see references to Ubuntu 16 above.
Looking to do a new partition install with an Nvidia 2070 GPU on a 256 GB SSD. Very excited!