It's Alive! My Deep Learning Rig for Part 1

I think you misunderstood my tone. You do well to be proud of your rig: it is quite a solid build, and I even said the CPU was fine (in fact, overkill) when you deemed it insufficient. (Try underclocking it to half its operating frequency; use indicator-cpufreq if you are on Linux. I bet you will get the same timings with any dataset.)

All I said is that maybe a 1060 is a bit limited in terms of memory. I am glad to hear you managed to run all your projects with it, but to report my personal experience, I hit the limits of my 1070/8GB many times, so I went for its bigger sibling when I bought my second GPU.

Indeed. I even wrote a blog post describing how I built my rig as cheaply as possible. Ask if you are interested in it; I don’t want to spam your thread.
The point is, I’m not “the big shot with big money”. The only expensive thing in my rig is the 1080 Ti; all the rest was grabbed dirt cheap on eBay.
You decided to invest a bit more in the bulk of your rig, sacrificing a bit in the GPU department, and that’s OK. Like you said, we try to report our diverse POVs.

Respectfully,
A.

Edit: I’d like to add some benchmarks/reviews:

The author (Justin Johnson from Stanford) says about ResNet-200: “Even with a batch size of 16, the 8GB GTX 1080 did not have enough memory to run the model.”
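To see why 8GB fills up so quickly, here is a back-of-envelope sketch of training memory, which is dominated by the activations stored for the backward pass and scales linearly with batch size. The layer counts and sizes below are illustrative assumptions for a very deep CNN on 224x224 inputs, not ResNet-200’s exact architecture:

```python
# Rough GPU memory estimate for training a deep CNN (fp32).
# All stage sizes below are illustrative assumptions, not exact ResNet-200 figures.

BYTES_PER_FLOAT = 4  # fp32

def activation_bytes(batch, channels, height, width):
    """Memory to keep one layer's activations around for the backward pass."""
    return batch * channels * height * width * BYTES_PER_FLOAT

def estimate_training_gb(batch_size):
    # Crude stand-in for a very deep residual network on 224x224 inputs:
    # (channels, H, W, number of layers at that resolution).
    stages = [
        (64, 112, 112, 6),
        (256, 56, 56, 70),
        (512, 28, 28, 70),
        (1024, 14, 14, 70),
        (2048, 7, 7, 10),
    ]
    total = 0
    for channels, h, w, n_layers in stages:
        total += n_layers * activation_bytes(batch_size, channels, h, w)
    # Parameters, gradients, and optimizer state add a roughly constant chunk
    # independent of batch size; assume ~1.5 GB here.
    total += int(1.5 * 1024**3)
    return total / 1024**3

for bs in (8, 16, 32):
    print(f"batch {bs:2d}: ~{estimate_training_gb(bs):.1f} GB")
```

Under these assumptions, batch size 16 already lands in the neighborhood of 8 GB before framework and CUDA overhead, which is consistent with the quote above.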

Tim Dettmers, a well-known (and for a long time, the only) reviewer of GPUs applied to DL, says:

"I have little money: GTX 1060 (6GB)
I have almost no money: GTX 1050 Ti (4GB)
I do Kaggle: GTX 1060 (6GB) for any “normal” competition, or GTX 1080 Ti for deep learning competitions"


It’s the “barely follow the lectures” that comes off as dismissive, which I quoted directly in my reply. Not only that, it’s inaccurate. Other than that, you have my permission to have the last word.

It seems you really took offence at my words, and I sincerely don’t understand why.
I was among the first to compliment you on your build (compliments I strongly stand by), while you ignored mine.

Coming to my words, I accept, although with some reservations, your complaint about inaccuracy (but please appreciate that I reject the one about dismissiveness). Let me rectify:

“A 1060 is a fine card for this course and, provided you are not in a hurry and your dataset and/or NN architecture are not too large, for a lot of diverse deep learning work.”

To be honest, and without hard feelings, I didn’t like your remark about last words either. It suggests you are beginning to take this personally, and like I said, I truly cannot understand why.

That said, I apologize for any offence you may have taken from my posts on this thread.

Hi all, great post Phil. I too am vacillating in the uncertain, legacy-hampered world of PC hardware, trying to build a ripper (Australian term, not AMD) AI grunt box. The motivation is to be ready for Jeremy’s new course starting soon. I wish to be bright-eyed and bushy-tailed with a machine to match; well, at least it’s a plan. I staggered through most of the first course via YouTube (sorry Jeremy, but the 30-second-or-so lip-sync lag was far too distracting). A great snippet of info from Phil was his mention of Jeremy’s ‘PC Part Picker’, which I will certainly get into.
I too became frustrated at the very concept of using rented time from AWS et al. My old tower was unusable with its Gigabyte GTX 295, which graciously, by pure luck, gave up the ghost and provided me with the perfect wifely excuse to put an ‘RTX 2080 Ti’ onto my must-have list. The seed is planted, has germinated, and is growing for a harvest timed to coincide with Jeremy’s first utterance in the new fast.ai course; what a buzz.
I will write up the hardware list and performance when completed, and post accordingly.
Thanks Phil, and thanks to Jeremy for his great contribution to the global AI (and benefiting community) challenge and progress.
Cheers to all,
Peter Kelly (Australia).
