Thanks so much for all the fast answers.
I have read through large parts of the thread on Recommendations on new 2 x RTX 3090 setup (thanks for the tip, matdmiller).
And thanks for your thoughts on cloud vs. local servers, Fahim. Really appreciate it!
I have gone through a lot of the same experiences as you. Essentially, I always ended up spending quite a bit of time on setup with Colab or Paperspace. With Paperspace I could SSH into a virtual machine that automatically connected to an SSD with all my data, but that quickly became somewhat expensive, and then I had trouble getting GPUs when I needed them.
I’ll try to create the same setup as you, Fahim: get an RTX 3090 with 24 GB of VRAM and then connect to the server from my MacBook.
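For the MacBook side, this is roughly the SSH setup I'm picturing, so everyday use is one command (the host alias, user, IP, and port here are just placeholders, not a real configuration):

```shell
# ~/.ssh/config on the MacBook -- "dl-server", "me", and the IP are placeholders
Host dl-server
    HostName 192.168.1.50        # LAN IP (or DNS name) of the GPU server
    User me
    # Forward the server's Jupyter port so notebooks open locally
    LocalForward 8888 localhost:8888
```

With that in place, `ssh dl-server` logs in and tunnels the port, and a Jupyter instance running on the server is reachable in the laptop's browser at http://localhost:8888.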
If there is something you wish you had known before building the server, with respect to components etc., then I would of course love to hear it.
I’ll use the thought process outlined in this article and take it from there: Build a Pro Deep Learning Workstation... for Half the Price