Let me first say that I don’t know the first thing about building a Deep Learning AI computer system, but what I do know is that we absolutely require NVIDIA GPUs, and the GeForce GTX 1080 8GB is the flavor of the day. Limit 2 per customer.
NVIDIA’s Scalable Link Interface (SLI) can support multiple GPUs; e.g., four to a tower. “SLI is a parallel processing algorithm for computer graphics, meant to increase the available processing power.” - Wikipedia
In order to build a SLI-capable system, you need the following:
A motherboard with at least two free PCIe x16 slots, operating at least in x8 mode (Nvidia does not support SLI on x4 links). Pretty much all LGA 2011, LGA 2011-v3 and LGA 1150 motherboards satisfy this requirement.
Two (or more) identical Nvidia-based cards that support SLI, or a dual-GPU card like the GeForce GTX 690 or Titan Z. Generally, different cards won’t do the trick.
A suitable power supply. Increasing the number of GPUs in a system rapidly increases its power requirements. Take that into account when you choose your PSU.
An SLI bridge. This is generally provided by your motherboard’s manufacturer as a bundled accessory.
The latest Nvidia drivers. If you’re reading this article, we’re pretty sure that you know that you can grab these from Nvidia’s website.
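On the power-supply point above, a rough budget is simple arithmetic. This is only a sketch: the 180 W TDP figure for a GTX 1080 is the published spec, but the CPU/peripheral wattages and the 1.5x headroom factor are my own assumptions, not from this thread.

```python
# Rough PSU sizing for a multi-GPU build.
# Assumptions (mine, not the thread's): GTX 1080 TDP ~180 W,
# CPU ~95 W, board/drives/fans ~100 W, 1.5x headroom for spikes.
GPU_TDP_W = 180
CPU_W = 95
OTHER_W = 100
HEADROOM = 1.5

def recommended_psu_watts(num_gpus: int) -> int:
    """Estimated PSU wattage: steady-state load times a headroom factor."""
    load = num_gpus * GPU_TDP_W + CPU_W + OTHER_W
    return int(load * HEADROOM)

for n in (2, 4):
    print(n, "GPUs ->", recommended_psu_watts(n), "W")  # 2 -> 832 W, 4 -> 1372 W
```

So a two-card SLI rig already wants a PSU in the 850 W class, and a four-GPU tower pushes well past 1300 W.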
In the meantime, hopefully application-specific integrated circuit (ASIC) mining will replace the GPU as the primary means of acquiring cryptocurrency.
As for the CPU, I understand the Intel® Core™ i7-8750H Processor with 6 cores should suffice for initial Deep Learning needs, available for approximately $300 through an online search.
For $425 Intel also sells its Limited Edition Intel® Core™ i7-8086K Processor featuring substantially increased clock rates and a 3MB larger smart cache.
NVIDIA Multi-GPU Technology intelligently scales the performance of your application and dramatically speeds up your production workflow. This delivers significant business impact across industries such as Design and Manufacturing, Media and Entertainment, and Energy Exploration.
Substantial time savings—A multi-GPU system helps address the pressures of delivering a high-quality product to market more quickly by providing ultra-fast processing of your computations, renderings, and other computational and visually intensive projects.
Multiple iterations—The ability to revise your product multiple times in a resource- and time-constrained environment will lead to a better end result. Completing each iteration of your automobile, animated movie, or seismic data processing faster leads to additional refinements.
This cutting-edge V-NAND-based NVMe SSD supports PCI Express® Gen 3 x4 lanes, providing a higher bandwidth and lower latency to process even more massive amounts of data than our previous-generation NVMe SSD.
Building on the PCIe Gen 3.0 x4 lane interface and support for the NVMe 1.2 protocol, the 960 PRO, in combination with our advanced 3rd-generation V-NAND and newly developed Polaris controller, is able to offer sequential read performance of 3,500 MB/s and sequential write speeds of 2,100 MB/s. The 960 PRO achieves random performance of up to 440,000 IOPS for reads and 360,000 IOPS for writes, for industry-leading performance across key metrics.
The 1TB model will set you back $590 on sale, while the 2TB weighs in at $1,165.
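To put those sequential numbers in deep-learning terms, here is a back-of-envelope calculation of how long streaming a training set off the drive would take at the quoted 3,500 MB/s. That figure is the vendor's best-case sequential read; real workloads with random access will be slower, so treat this as a lower bound.

```python
# Best-case time to stream a dataset at the 960 PRO's quoted
# sequential read speed. Vendors quote decimal units, so 1 GB = 1000 MB.
SEQ_READ_MB_S = 3500

def read_time_seconds(dataset_gb: float) -> float:
    """Seconds to sequentially read `dataset_gb` gigabytes."""
    return dataset_gb * 1000 / SEQ_READ_MB_S

print(f"100 GB dataset: ~{read_time_seconds(100):.1f} s")  # ~28.6 s
```

In other words, even a 100 GB dataset can be pulled off the drive in under half a minute, which is why NVMe is attractive for feeding GPUs.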
Thank you very much for some of the interesting links. They are quite helpful for me as I too was planning on getting a computer system for myself in the near future.
By the way, what budget are you aiming at? According to this http://www.trustedreviews.com/reviews/nvidia-geforce-gtx-1080-ti-performance-and-overclocking-page-2 the GTX 1080 Ti has quite a few performance advantages over the 1080, and the extra $150 seems worth spending (again, only once it is restocked). On that note, does anyone know how long it takes to restock the 1080 Ti? It has been out of stock for the past two weeks, I think. Is there any other place where one can get a 1080 Ti?
When working with live audio and video for real-time processing, how best to handle the input to the black box?
Would digital signal processing (DSP) music software take advantage of the GPU? I really miss MAX, and wish I had been able to use it on the NeXT with real-time frequency modulation (FM) synthesis on a card back in the day. I just wrote to them about how fast.ai might use a front-end to help automate the process of finding the best slope for the loss function.
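For anyone curious what FM synthesis actually computes, here is a minimal sketch (not MAX, just NumPy): a carrier sinusoid whose phase is modulated by a second oscillator, the classic y(t) = sin(2πf_c·t + I·sin(2πf_m·t)). The particular frequencies and modulation index are arbitrary choices of mine for illustration.

```python
import numpy as np

def fm_tone(fc=440.0, fm=220.0, index=2.0, sr=44100, dur=1.0):
    """One FM-synthesized tone: carrier fc, modulator fm, modulation
    index `index`, sample rate sr, duration dur seconds."""
    t = np.arange(int(sr * dur)) / sr
    return np.sin(2 * np.pi * fc * t + index * np.sin(2 * np.pi * fm * t))

tone = fm_tone()
print(tone.shape)  # one second of audio at 44.1 kHz
```

This kind of per-sample, embarrassingly parallel math is exactly what maps well onto a GPU, which is why the DSP-on-GPU question is a reasonable one.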
As my plan unfolds for the brain in a box, so far there are a number of custom setups we should strive for:
Thinking of potential builds: for the professional, there is the black box with SSH I/O to crunch numbers 24/7, the type of system that might reverse engineer planarian regeneration after the boss takes the first stab at it; security bots needing real-time I/O; and game bots where, “while in Rome,” I have a chip with which to toy.
“Garbage in, garbage out” they say in computer programming. After teaching 39 generations of fruit flies how to count, the 40th were born already knowing how. This works in people, too, be it a predilection for music or domestic violence. Intelligence, or lack thereof, is inherited. Now how do you program it?
Here’s your storage medium.
QUERY: I know there’s a web page listing all the DNA shorthand codes. I’m wracking my brain trying to locate it again.
The 1950s found Robert G. Heath putting brain implants into human subjects (not a nice guy, by the way), while in the 1960s Dr. José Delgado at Yale made them remote-controlled, his famous front-page New York Times article letting folks know the future (whether they’d realize it or not).
Fast forward to NASA’s 2004 subvocal speech BCI, which I heard Delta Force uses, and which is currently being gauged for its popularity as a commercial product.
In related news are modulated microwave bursts. (My condolences go out to our diplomats overseas currently experiencing “health attacks,” as the assaults are described in the papers.)
We’re at war, sir, despite outward appearances, and terrorist activities are thwarted thanks to AI pattern-spotting software that needs to be improved upon.
All it’d take is taking out NVIDIA’s factories to shut down the US AI lead.
Taking out a few factories would not really change any fundamental flaws in AI strategy. For example, preventing computer imports into Russia during the Cold War actually had such a profound effect on their maths research that it put them in the lead for many years.
It doesn’t really make sense to answer a complex question with simple cliches.
In short, this is not the forum to sell your stuff. If someone wanted an NVIDIA (and they are for the moment the only game in town), there are a hundred websites where they can be found.
This is questionable. The top AI scientists still reside and work in the US, and the vast majority of money thrown at AI comes from US companies. (Please note I am European.)