Hi, I just want to share this small remark for beginners: Google Colab is a nice freebie!
In the lesson, you will see that Google Colab and Paperspace Gradient are recommended as two cloud platforms to run your Jupyter Notebook. On both sites you can use a free GPU, but the GPU on Google Colab is much, much stronger!
Here, I trained a convnext_large_in22k model on Google Colab with the data from the Paddy Doctor competition. Each run took ~7 minutes to complete, while on Paperspace Gradient the same model took ~15 minutes.
If you are just starting out, Google Colab may provide more than what you need!
Thanks for the great tip. Is there any way I can find out what kind of GPU Google Colab is using?
I used to use neofetch, but it doesn’t seem to show the GPU anymore.
And would you recommend getting colab pro?
The easiest way is to use torch.cuda.get_device_name() after importing torch. torch is installed by default with fastai, so you can just import fastai (or import timm) and then call it. You can also check the properties of the device with torch.cuda.get_device_properties().
It seems that Colab uses an Nvidia Tesla T4.
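A minimal sketch of that check, guarding for sessions where no GPU was assigned:

```python
import torch

if torch.cuda.is_available():
    # e.g. "Tesla T4" on a free Colab GPU runtime
    print(torch.cuda.get_device_name(0))
    # total memory, compute capability, multiprocessor count, ...
    print(torch.cuda.get_device_properties(0))
else:
    print("No CUDA device visible - check Runtime > Change runtime type")
```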
As for Pro plans in general, it really depends on your budget. With Colab Pro or Paperspace Pro, you get access to fancier GPUs and also a terminal. If anything, I lean more towards Paperspace Pro, because they store your data for you and because of the referral credit you get. You can use the link on the course site, or form a group and refer each other!
Kaggle’s newly upgraded hardware is even better: you can get one P100 or two T4 GPUs.
Apart from the computer vision lessons, you do not even need a GPU, so any platform should be fine for practicing.
For other usage, I recommend Colab because I did not want to use up Kaggle’s 40 hours of GPU time (must save some for competitions). But I have never actually hit 40 hours, so it should be fine haha.