You only have one NVIDIA GPU, so you can only set_device(0).
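If it helps, here's a minimal sketch (assuming PyTorch's torch.cuda API) of selecting that single device and confirming what's visible:

```python
import torch

# With a single NVIDIA card, index 0 is the only valid device.
torch.cuda.set_device(0)                 # select the first (and only) GPU
print(torch.cuda.device_count())         # -> 1
print(torch.cuda.get_device_name(0))     # e.g. the card's model name
```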
I have come across this GPU-based PC configuration, which costs around 100,000 Indian rupees.
Is this good enough for our fastai course and for running Kaggle datasets?
Thanks
If you built it with a custom box, can you successfully make it so the water wouldn't drip on any other components if a junction leaks? Can the water attachments always face down so that it doesn't drip on the connected component? I think most cases I've seen have the water filling from the top, so this is a great idea, as I would be terrified of ruining hardware from a leak.
Due to a memory consumption error (CUDA out of memory) on a model I'm trying, I had to upgrade from a 960 today. I was able to get a 1080, which has double the memory, at "near" MSRP from Best Buy (https://www.bestbuy.com/site/nvidia-founders-edition-geforce-gtx-1080-8gb-gddr5x-pci-express-3-0-graphics-card-black/5330600.p?skuId=5330600). Also, it looks like right now NVIDIA has Titan Xps in stock at an MSRP of $1200, which is around what some places are trying to sell 1080 Tis for.
These links might be helpful for others dealing with GPU price inflation:
- http://www.findgpu.com/gputracker.html?store=nvidia
- http://www.nowinstock.net/computers/videocards/nvidia/gtx1080ti/
- https://www.reddit.com/r/buildapcsales/comments/7uxgei/gpu_stock_thread_quick_guide_for_gpu_hunting_in/
Hope that helps others.
Hi, for what we've done till lesson 10 of part 2, you should be OK with a 1050 Ti if you don't mind training models overnight. You can also get a desktop or a laptop with a 1060 within a budget of INR 100k.
I think older posts on this thread have all the blog posts and insights on config that you'll need.
Check out these links if you're looking for parts that go well together and are buying in India:
http://www.ant-pc.com/product/detail/ant-pc-pharaoh-cl8400-pg/ODM=
http://www.ant-pc.com/product/detail/ant-pc-pharaoh-cl8400-cg/ODc=
Thanks for the reply.
Another question: if I buy a low-end PC for now, can I upgrade/add the latest RAM, GPU, and SSD to the cabinet in the future?
What is the difference in the role of system RAM and GPU memory for training our models?
You won't have a problem upgrading the RAM/SSD, but you'll have to change the power supply to handle a second or upgraded GPU. Maybe even get a bigger UPS (I made the UPS mistake).
The GPU memory is where the data (think of a mini-batch) and the model reside while training with CUDA. The data can be loaded either through your system RAM from a pandas DataFrame, or from the hard disk for lazy loading of images from a folder using fastai's ImageClassifierData. More GPU memory will let you work with larger mini-batches and larger models.
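To make the split concrete, here's a rough PyTorch sketch (illustrative shapes and model, not the fastai pipeline itself): the model's weights and one mini-batch at a time sit in GPU memory, while the full dataset stays in system RAM or on disk.

```python
import torch
import torch.nn as nn

# Model weights are copied once into GPU memory.
device = torch.device("cuda:0")
model = nn.Linear(1000, 10).to(device)

# The full dataset stays in system RAM (or on disk, loaded lazily).
data = torch.randn(10_000, 1000)

# Only one mini-batch at a time is copied into GPU memory, so GPU memory
# caps the batch size and the model size, not the dataset size.
batch = data[:64].to(device)
out = model(batch)
```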
I am planning to go with the 96k PC from the link you posted above. So should I get a bigger UPS and power supply, so that I can do future GPU upgrades without any problems?
Is there anything else I should know before buying this?
Thank you for all the help.
Hi, you can contact the company regarding the power supply. They'll know how much you need and will change it for you. I'd say stick to the default if you're not sure about getting another GPU (prices don't increase linearly).
The writer of this blog post got a 750W for 2x 1080 Ti.
Regarding this company, make sure they give you the serial numbers of all the parts on the bill. That's how you get warranty on the GPU and other parts.
Good luck
I just put together my DL box, thanks to this forum. Here is my blog post sharing my experience.
Congrats!
Thanks, this looks like a great build!
I'm curious if you're fully utilizing the features of your MSI Godlike motherboard? The MSI M5 Gaming model looks good for a single video card; I'm considering the tradeoff between cost and future-proofing.
For those of you comfortable waiting on a backordered product, I see 1080 Tis on B&H at pre-crypto-craze prices (USD 750).
Use a 1000W PS for two GTX 1080 GPUs.
I hate cross-posters too, but since this topic is all about building your own machine, learn from my mistake. I built a box with two GTX 1080s that worked fine if you ran each GPU separately with a 750W Platinum PS. But when I ran with PyTorch's nn.DataParallel to use both at the same time, my machine crashed, and tracking down why was no fun. Now I know better and you do too: use a 1000W PS if you have two GPUs in your box to avoid these problems.
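For reference, this is roughly the kind of setup that triggered the crashes (a minimal sketch, assuming two GPUs visible as devices 0 and 1; the model and shapes here are made up):

```python
import torch
import torch.nn as nn

# Wrapping the model in nn.DataParallel splits each batch across both GPUs,
# so both cards hit full load at the same time -- which is when an undersized
# PSU can give out.
model = nn.Sequential(nn.Linear(1000, 1000), nn.ReLU(), nn.Linear(1000, 10))
model = nn.DataParallel(model, device_ids=[0, 1]).cuda()

x = torch.randn(256, 1000).cuda()   # batch is scattered across GPU 0 and GPU 1
out = model(x)
```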
With a command like "sudo nvidia-smi -pl 200", you can limit the power consumption of the GPU.
(Actually, performance doesn't drop that much with a power limit.)
I'm not sure whether this can actually fix the issue, but I guess it's worth trying.
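If you want to apply a cap like that to each card, here's a small Python sketch (assumptions: two GPUs at indices 0 and 1, and that 200 W is within your card's supported range):

```python
import subprocess

# Apply a 200 W power cap to each GPU (requires root privileges).
# Check your card's supported range with "nvidia-smi -q -d POWER"
# before picking a limit.
for gpu_id in (0, 1):
    subprocess.run(["sudo", "nvidia-smi", "-i", str(gpu_id), "-pl", "200"], check=True)
```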
Not according to ASUS; see this post of mine in a different topic here.
Wow, thanks!!
Would suggest buying components in the US and bringing them to India. It's much cheaper that way. I was able to build my computer rig for 1.1 lac (which would have been 1.6 lac had I purchased in India).
Write up on the components and prices - https://medium.com/@Stormblessed/building-my-own-deep-learning-rig-for-under-1-lac-in-india-4ade685b8c56
The Threadripper PCIe errors in Linux are fixed by adding pcie_aspm=off to the kernel command line in GRUB (append it to GRUB_CMDLINE_LINUX_DEFAULT in /etc/default/grub, then run sudo update-grub). See here: PCIe errors
My Threadripper system has been stable 24/7 for months, and I am completely satisfied. The platform simply offers a lot more for the money (cores, PCIe lanes). The extra CPU cores are not necessary for DL, but they are very useful for other applications (compiling code, simulations, etc.).
With that said, if this is your first build I think you are better off with Intel. As a relatively new platform, Threadripper requires more tweaks and planning:
- Zen v1 is pickier with memory than Intel, so I would only buy RAM from the QVL list
- If installing Ubuntu 16.04 LTS, you should upgrade to a more modern kernel (4.16) as there are many new drivers and tweaks for the zen platform
- The TR4 socket can be a PITA to physically install
Ryzen v2 was just released and supposedly deals with a lot of the memory issues. Threadripper v2 will be released in August, and is expected to have similar stability and ~10% performance improvements.
After getting pretty distracted by a "new" build and looking into watercooling, I've written a post here. Soon I'll be done with the Xeon build and will be able to focus again on some DL work.