So you want to build a Deep Learning AI computer system?

Let me first say that I don’t know the first thing about building a Deep Learning AI computer system, but what I do know is that we absolutely require NVIDIA GPUs, and the GeForce GTX 1080 8GB is the flavor of the day. Limit two per customer.

GEFORCE® GTX 1080 TI Founders Edition
NVIDIA: $699

  • 3584 NVIDIA® CUDA® cores and 12 billion transistors
  • Base Clock not listed on the product page (it is 1480 MHz)
  • 1582 MHz Boost Clock

GEFORCE® GTX 1080 Founders Edition
NVIDIA: $549

  • 2560 NVIDIA® CUDA® cores
  • 1607 MHz Base Clock
  • 1733 MHz Boost Clock

Interestingly, the more expensive card has the lower Boost Clock, though it more than makes up for it with 40% more CUDA cores.
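A quick back-of-the-envelope sketch, using the core counts and boost clocks quoted above and the usual 2-FLOPs-per-core-per-cycle rule of thumb for FP32 (my assumption, not an NVIDIA figure), shows why the Ti still comes out well ahead:

```python
# Rough theoretical FP32 throughput: 2 FLOPs per CUDA core per clock cycle.
# Core counts, boost clocks (GHz) and prices are taken from the listings above.
cards = {
    "GTX 1080 Ti": {"cores": 3584, "boost_ghz": 1.582, "price": 699},
    "GTX 1080":    {"cores": 2560, "boost_ghz": 1.733, "price": 549},
}

for name, c in cards.items():
    tflops = 2 * c["cores"] * c["boost_ghz"] / 1000  # GFLOPS -> TFLOPS
    print(f"{name}: ~{tflops:.1f} TFLOPS FP32, ~${c['price'] / tflops:.0f} per TFLOPS")

# GTX 1080 Ti: ~11.3 TFLOPS FP32, ~$62 per TFLOPS
# GTX 1080: ~8.9 TFLOPS FP32, ~$62 per TFLOPS
```

So the extra cores more than offset the lower clock, and the two cards end up costing roughly the same per theoretical TFLOPS.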

NVIDIA’s Scalable Link Interface (SLI) can support multiple GPUs; e.g., four to a tower. “SLI is a parallel processing algorithm for computer graphics, meant to increase the available processing power.” - Wikipedia

Tom’s Hardware wrote in 2015:

In order to build a SLI-capable system, you need the following:

  1. A motherboard with at least two free PCIe x16 slots, operating in at least x8 mode (Nvidia does not support SLI on x4 links). Pretty much all LGA 2011, LGA 2011-v3 and LGA 1150 motherboards satisfy this requirement.
  2. Two (or more) identical Nvidia-based cards that support SLI, or a dual-GPU card like the GeForce GTX 690 or Titan Z. Generally, different cards won’t do the trick.
  3. A suitable power supply. Increasing the number of GPUs in a system rapidly increases its power requirements. Take that into account when you choose your PSU.
  4. An SLI bridge. This is generally provided by your motherboard’s manufacturer as a bundled accessory.
  5. The latest Nvidia drivers. If you’re reading this article, we’re pretty sure that you know that you can grab these from Nvidia’s website.

NOTE: @balnazzar has discovered an alternative multi-GPU configuration.

NVIDIA provides a list of motherboards that are Quadro SLI Ready in a dual GPU configuration.
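For deep learning (as opposed to gaming), the framework addresses each card as its own CUDA device regardless of any bridge. A minimal PyTorch sketch (assuming the NVIDIA driver and a CUDA-enabled PyTorch build are installed) to confirm that every card in the tower is visible:

```python
import torch

# Each physical card shows up as its own CUDA device, SLI bridge or not.
if not torch.cuda.is_available():
    print("No CUDA-capable GPU detected; check the driver and CUDA install.")
else:
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"cuda:{i}: {props.name}, "
              f"{props.total_memory / 1024**3:.1f} GB, "
              f"{props.multi_processor_count} SMs")
```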

In the meantime, hopefully application-specific integrated circuit (ASIC) mining will replace the GPU as the primary means of acquiring cryptocurrency.

As for the CPU, I understand the Intel® Core™ i7-8750H Processor with 6 cores should suffice for initial Deep Learning needs, available for approximately $300 through an online search.
[image: intel-i7-8gen]

For $425, Intel also sells its Limited Edition Intel® Core™ i7-8086K Processor, featuring substantially higher clock rates and a 3 MB larger Smart Cache.
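In a deep learning box, the CPU’s main job is to keep the GPU fed, mostly via data-loading worker processes. A minimal PyTorch sketch of that idea (the dataset is a synthetic placeholder, and sizing the worker pool to the core count minus a couple is just a common rule of thumb):

```python
import os
import torch
from torch.utils.data import DataLoader, TensorDataset

# Synthetic placeholder dataset, just to illustrate the DataLoader settings.
images = torch.randn(1_000, 3, 64, 64)
labels = torch.randint(0, 10, (1_000,))
dataset = TensorDataset(images, labels)

# Leave a couple of logical cores free for the OS and the training loop itself.
num_workers = max(1, (os.cpu_count() or 1) - 2)

loader = DataLoader(dataset, batch_size=64, shuffle=True,
                    num_workers=num_workers, pin_memory=True)

for batch_images, batch_labels in loader:
    pass  # training step would go here
```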


Check the existing threads on here for much richer discussions. :slight_smile:


NVIDIA Multi-GPU Technology intelligently scales the performance of your application and dramatically speeds up your production workflow. This delivers significant business impact across industries such as Design and Manufacturing, Media and Entertainment, and Energy Exploration.

  • Substantial time savings—A multi-GPU system helps address the pressures of delivering a high-quality product to market more quickly by providing ultra-fast processing of your computations, renderings, and other computational and visually intensive projects.
  • Multiple iterations—The ability to revise your product multiple times in a resource- and time-constrained environment will lead to a better end result. Completing each iteration of your automobile, animated movie, or seismic data processing faster leads to additional refinements.

[image: nvidia-multi-gpu-diagram-graphic]


M.2 SSDs are new to me too, but Newegg has the SAMSUNG 960 PRO M.2 512GB NVMe PCI-Express 3.0 x4 Internal Solid State Drive (SSD) MZ-V6P512BW for $295, saying:

  • This cutting-edge V-NAND-based NVMe SSD supports PCI Express® Gen 3 x4 lanes, providing a higher bandwidth and lower latency to process even more massive amounts of data than our previous-generation NVMe SSD.
  • Building on the PCIe Gen 3.0 x4 lane interface and support of the NVMe 1.2 protocol, the 960 PRO, in combination with our advanced 3rd-generation V-NAND and newly developed Polaris controller, is able to offer sequential read performance of 3,500 MB/s and sequential write speeds of 2,100 MB/s. The 960 PRO achieves random performance of up to 440,000 IOPS and 360,000 IOPS for read and write operations respectively, for industry-leading performance across key metrics.

The 1TB model will set you back $590 on sale, while the 2TB weighs in at $1,165.
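At those quoted sequential speeds, the practical question is how long it takes to pull a training set off the drive. A rough sketch (the dataset size and the SATA/HDD figures are my own illustrative assumptions, and real-world throughput with many small files will be lower):

```python
# Quoted/assumed sequential read speeds, in MB/s.
drives = {
    "960 PRO (NVMe)": 3500,   # quoted above
    "typical SATA SSD": 550,  # assumption
    "7200 rpm HDD": 150,      # assumption
}

dataset_gb = 150  # e.g. an ImageNet-scale image dataset (illustrative)

for name, mb_per_s in drives.items():
    minutes = dataset_gb * 1000 / mb_per_s / 60
    print(f"{name}: ~{minutes:.1f} min to read {dataset_gb} GB sequentially")

# 960 PRO (NVMe): ~0.7 min
# typical SATA SSD: ~4.5 min
# 7200 rpm HDD: ~16.7 min
```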


Thank you very much for the interesting links. They are quite helpful, as I too am planning to get a computer system for myself in the near future.

I will just link another thread, Build your deep learning box: wiki thread, which also has a lot of useful stuff.

By the way, what budget are you aiming at? According to this http://www.trustedreviews.com/reviews/nvidia-geforce-gtx-1080-ti-performance-and-overclocking-page-2 the GTX 1080 Ti has quite a performance advantage over the 1080, and the extra $150 seems worth spending (again, only once it is restocked). On that note, does anyone know how long it takes to restock the 1080 Ti? It has been out of stock for the past two weeks. Is there any other place where one can get a 1080 Ti?


Throw enough eyes on the problem, and we’ll get an itemized list of exactly what to order, where to order it from, and who will assemble it for us.

You don’t need SLI for a multi-GPU deep learning system.

I wrote about mine: https://medium.com/@ernststavroblofeld/yet-another-deep-learning-build-yes-but-this-one-wont-break-my-bank-account-8af7c9c1357a
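For what it’s worth, here is a minimal sketch of the data-parallel training PyTorch can do across several cards with no SLI bridge involved (the model and batch are toy placeholders; for serious multi-GPU work DistributedDataParallel is generally preferred over DataParallel):

```python
import torch
import torch.nn as nn

# Toy model and batch, just to show the multi-GPU wrapper.
model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10))

if torch.cuda.device_count() > 1:
    # Splits each input batch across all visible GPUs and gathers the outputs.
    model = nn.DataParallel(model)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

x = torch.randn(64, 512, device=device)
out = model(x)    # forward pass runs across the GPUs in parallel
print(out.shape)  # torch.Size([64, 10])
```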


@balnazzar I posted a plea to pcpartpicker in which I’ve included your link.


When working with live audio and video for real-time processing, what is the best way to handle the input to the black box?

Would digital signal processing (DSP) music software take advantage of the GPU? I really miss MAX, and wish I had been able to use it on the NeXT with real-time frequency modulation (FM) synthesis on a card back in the day. I just wrote them about how fast.ai might use a front end to help automate the process of finding the best slope of the loss curve.
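On the DSP question: the same GPU will happily run audio-style workloads, since frameworks expose batched FFTs and convolutions. A minimal sketch (the signal is synthetic, and this is only meant to show the spectral math landing on the GPU):

```python
import math
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# One second of a synthetic 440 Hz tone sampled at 44.1 kHz.
sample_rate = 44_100
t = torch.arange(sample_rate, device=device) / sample_rate
signal = torch.sin(2 * math.pi * 440 * t)

# Magnitude spectrum computed on the GPU.
spectrum = torch.fft.rfft(signal).abs()
peak_hz = spectrum.argmax().item() * sample_rate / signal.numel()
print(f"peak at ~{peak_hz:.0f} Hz")  # ~440 Hz
```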


Thank you for the interesting insights. There is another quite active topic about this: http://forums.fast.ai/t/making-your-own-server/174/551


Another interesting thread about hardware-related stuff was started by Jeremy. Search for it. Do not buy unless you are well informed.


As my plan for the brain in a box unfolds, there are a number of custom setups we should strive for:

Thinking of potential builds: for the professional, a black box with SSH I/O to crunch numbers 24/7, the type of system that might reverse engineer planarian regeneration after the boss takes the first stab at it; security bots needing real-time I/O; and game bots, where “while in Rome” I have a chip with which to toy.

“Danger, Will Robinson, Danger!”

“A planaria can be cut into 279 pieces, and each piece will regrow into a complete worm.” — bluedoorlabs

“Garbage in, garbage out,” they say in computer programming. After 39 generations of fruit flies were taught how to count, the 40th were born already knowing how. This works in people, too, be it a predilection for music or domestic violence. Intelligence, or lack thereof, is inherited. Now how do you program it?

Here’s your storage medium.

QUERY: I know there’s a web page listing all the DNA shorthand codes. I’m wracking my brain trying to locate it again.

U.S. Introduces New DNA Standard for Ensuring Accuracy of Genetic Tests, The New York Times, 14 May 2015.

The 1950s found Robert G. Heath putting brain implants into human subjects (not a nice guy, by the way), while the 1960s had Dr. José Delgado at Yale making them remote-controlled, his famous front-page New York Times article letting folks know the future (whether they’d realize it or not).

The bible of brain implant technology is Delgado’s 1969 treatise “Physical Control of the Mind: Toward a Psychocivilized Society”, though I understand the later editions are updated.

In 1985, Delgado told CNN about using electromagnetic energy frequencies instead of direct brain stimulation by electrodes.

Fast forward to NASA’s 2004 subvocal speech BCI, which I’ve heard Delta Force uses, and which is currently being gauged for interest as a commercial product.

[image: NASA-subvocal-speech]

In related news are modulated microwave bursts. (My condolences go out to our diplomats overseas currently experiencing “health attacks,” as the assaults are described in the papers.)

Note to self: write tune: “Ode to Lida [machine]”

Anyone know the specifics of this build?

Seriously - this is not a forum for SPAM.

We’re at war, sir, despite outward appearances, and terrorist activities are thwarted thanks to AI pattern-spotting software that needs to be improved upon.

All it would take to shut down the US AI lead is taking out the NVIDIA factories.

You’re conflating too many things.

  1. Being at war does not justify your spam.
  2. Terrorist activities do not justify your spam.
  3. The US arguably no longer has the AI lead.
  4. Taking out a few factories would not really change any fundamental flaws in AI strategy. For example, preventing computer imports into Russia during the Cold War actually had such a profound effect on their maths research that it put them in the lead for many years.

It doesn’t really make sense to answer a complex question with simple cliches.

In short, this is not the forum to sell your stuff. If someone wanted an NVIDIA card (and they are for the moment the only game in town), there are a hundred websites where one can be found.

I agree.

This is questionable. The top AI scientists still reside and work in the US. The vast majority of money thrown at AI comes from US companies. (Please note I am European.)