I discovered that I don’t have the requisite GPU hardware in my laptop, and I wonder if there are options for starting this course without an NVIDIA GPU?
(I apologize if this is too newbie for even this beginner group. Please advise if there’s a more appropriate place to ask.)
Thanks in advance for any suggestions.
I don’t think any genuine questions are ‘too newbie to ask’ around these forums, so no worries.
Completing the DL course without access to a GPU would not work very well. There are a lot of hands-on exercises that are part of the learning process.
Many people go the machine-in-the-cloud route (that is how I started myself). GCP now gives you $300 to use over a 12-month period, and I think both Kaggle Kernels and Google Colab now let you use a GPU. But spinning up your own machine is probably the preferred option. In that case it’s GCP, AWS, or providers such as Paperspace.
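Whichever environment you try (Colab, a Kaggle kernel, or a cloud VM), a quick sanity check for whether a GPU is actually attached is to look for the NVIDIA driver utility. A minimal sketch using only the standard library (this assumes the image ships `nvidia-smi`, which NVIDIA GPU cloud images typically do):

```python
import shutil
import subprocess

def gpu_visible():
    """Return True if nvidia-smi is on PATH and reports a GPU."""
    if shutil.which("nvidia-smi") is None:
        return False  # no NVIDIA driver tools installed at all
    try:
        # nvidia-smi exits non-zero when no GPU is attached
        return subprocess.run(["nvidia-smi"], capture_output=True).returncode == 0
    except OSError:
        return False

print("GPU visible:", gpu_visible())
```

On a CPU-only laptop this prints `GPU visible: False`, which is exactly the situation that prompts the cloud options above.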
This info might be a bit dated - if anyone would be kind enough to share how they went about completing part 1 recently without having access to a GPU on their local computer, that would be great.
Thanks, @radek !
This is maybe the excuse I need to splurge on a new laptop. Due to upcoming travel, I will need to wait a month or so, but in the meantime I can start shopping around, so if anyone can follow up with ideas as to a minimal setup (i.e. best bang for the buck on the low end) that’ll be helpful. My use case is scientific programming with Python and this course.
Thanks in advance, and feel free to divert me to a more appropriate forum for this sort of information, as I assume “laptop recommendations” may be a bit off topic here.
Actually, you may find this laptop thread useful.
However, do note that I went for a laptop only because that was my only option. You can get much much better options if you go for a ‘Rig’ setup.
GCP could be a good way to go, since they’d give you $300 in credits.
If you’re a student, you can get $150 in credits on AWS Educate.
If these aren’t good options, you could check out the ML for Practitioners course in the meantime.
GCP looks to be the easiest solution to start with. I’ll search the forums to see if there are others who’ve gone this route, investigate potential pitfalls, etc.
Can you provide a link to the ML for Practitioners course you referred to?
Also, GCP might ask you to put down $70 of credit upfront (for security reasons), but it’s still a much better option than no GPU or using Colab (Colab is unreliable and sometimes very slow).
Very helpful. Thanks again, Sanyam!
Jeremy mentions some good options in lesson 1.
I’d go for Paperspace if I were you.
Did you happen to use GCP?
I set up an account on GCP and launched a Jupyter notebook (with GPU) by following this tutorial.
I was able to set it up. I tried running the mnist_cnn example provided in the Keras GitHub repo (link).
This took around 70 sec per epoch (I had set up my instance with 1 GPU (Tesla K80) and 8 vCPUs).
I also tried out the Crestle service. The same example ran much faster - just 9 sec/epoch.
I am unable to figure out why there is such a huge difference. Any thoughts?
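For reference, the timings reported above work out to roughly an 8x gap, which is large but not implausible across GPU generations. A quick back-of-the-envelope check (the numbers are the per-epoch times from my two runs):

```python
# Per-epoch timings from the two runs described above
gcp_k80_sec_per_epoch = 70   # GCP instance, Tesla K80
crestle_sec_per_epoch = 9    # Crestle service

speedup = gcp_k80_sec_per_epoch / crestle_sec_per_epoch
print(f"Crestle was about {speedup:.1f}x faster per epoch")  # about 7.8x
```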
Sorry - I am not sure. I have not used GCP, so I’m not really sure how it works. Also, I think that if you use Colaboratory the GPU is shared. On the other hand, depending on other specs as well, I wouldn’t be too surprised to see an order-of-magnitude difference between a K80 and a V100 (if that is what you used with Crestle).