The link to access the recommended options for GPU servers redirects to the course's website. Where can I find the list?
Hi
@kritirakheja I see you are reading the book Deep Learning for Coders. To be honest, it depends. There are a couple of discussions that have already covered this. For example: Original post
Building local
—
Building local GPU server - #8 by josca42 and Build a Pro Deep Learning Workstation... for Half the Price
Cloud
—
You have a couple of options depending on how much you are willing to spend. I recommend the following:
Colab
Runpod
Vast
Lambda
Hugging Face
I personally use Google Colab: most of the classes use notebooks, it gives an easy interface for switching the runtime between GPU and TPU, and it has an easy way to get more compute credits. Runpod and Vast are good for long-running jobs and mimic the VMs you would get from the major cloud providers. Hugging Face is useful as well: Jobs for quick runs, inference endpoints, and Spaces for practising demos make it a good suite of tooling.
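Whichever provider you pick, it helps to have your notebook confirm whether a GPU is actually attached before training. A minimal stdlib-only sketch (in a real notebook you would normally call `torch.cuda.is_available()` instead; the `nvidia-smi` check here is just an assumption that an NVIDIA driver means a usable GPU):

```python
import shutil

def pick_device() -> str:
    """Return "cuda" if the NVIDIA driver tool is on PATH, else "cpu".

    Hypothetical helper: a rough proxy for GPU availability that runs
    anywhere, with no torch dependency. On Colab/Runpod/Vast instances
    with a GPU attached, nvidia-smi is present and this returns "cuda".
    """
    return "cuda" if shutil.which("nvidia-smi") else "cpu"

print(pick_device())  # "cuda" on a GPU runtime, "cpu" otherwise
```

This is handy on Colab in particular, where it is easy to forget to switch the runtime type and silently train on CPU.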
All the best.
Thank you so much!
