How do I remotely access the GPU in my friend's computer without interrupting him?

Hi All!

My best friend is letting me use his desktop computer for deep learning whenever I need it. I couldn't afford to build a new one, so I added a graphics card to his. I can also add extra RAM if it's needed.

He said he doesn’t do much on it or use it very often, but he hopes there’s a way to set everything up so that it doesn’t interrupt him when he does.

Is there a way to remotely access his computer for deep learning without it interrupting his usage?

Basically, without him noticing?

Would a regular server accomplish this? I’ve been reading through the forum and various articles, but I can’t tell whether or not the suggestions would solve this problem.

I’m using a 2019 MacBook Pro.

His computer has:

  • Windows 10
  • AMD 1920x 12-Core CPU
  • S24 Cooling System
  • 32GB DDR4 RAM
  • 1080 Ti 11GB GPU
  • 250GB SSD
  • 1 TB HDD

Hi,

The thing is, I don't think you can avoid interrupting him if he's using the PC while you're training on the GPU at the same time :frowning:
When I train a network on my PC, I can sometimes barely scroll a web browser, so it's noticeable.
(Okay, I run my GPU at ~97% load most of the time, so yeah - guilty ^^)

Looking at that PC config, it has a server-like Threadripper CPU. If he only uses it lightly, then maybe, just maybe, you can use the GPU plus a few CPU cores - say 4 of them - to feed data to the GPU, leaving the other 8 cores for whatever he does, and it might not be that noticeable. But I still have doubts about it.
And we haven't even talked about SSD and HDD usage yet - heavy disk access is what really makes a PC feel sluggish, and it's even more problematic than the CPU part.
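To make the core-splitting idea concrete, here's a minimal sketch of pinning the training process to 4 cores. This is just an illustration: `os.sched_setaffinity` is Linux-only, the "4 cores" number is an example, and the Windows equivalent mentioned in the comment is a separate trick:

```python
import os

# Sketch: pin this (training) process to at most 4 of the 12 cores,
# leaving the rest free for the PC's owner.
# Note: os.sched_setaffinity exists only on Linux. On Windows 10, a rough
# equivalent is launching with an affinity mask, e.g.
#   start /affinity F python train.py   (F = hex mask for cores 0-3)
if hasattr(os, "sched_setaffinity"):
    all_cores = os.sched_getaffinity(0)   # cores this process may use now
    pinned = set(sorted(all_cores)[:4])   # keep at most 4 of them
    os.sched_setaffinity(0, pinned)
    print("pinned to cores:", sorted(pinned))
```

Data-loader workers (e.g. PyTorch's `DataLoader(num_workers=...)`) then inherit that limit, so feeding the GPU won't spill onto his cores.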

I think there are 2 other options you can try in your case.

    1. Use his PC for DL only when he isn't using it, so not at the same time.
       If it runs all the time like a server, you only need to know when he's on it so you don't interrupt him.
    2. Use an entry-level cloud solution, which can be free,
       and don't use his PC at all.
       Like this:
       https://www.paperspace.com/pricing
       There is a free plan for practicing DL with fastai and Jupyter notebooks,
       and you can definitely access it remotely from your laptop.

Hope it helps ^^

That’s helpful, thanks for sharing your experience @AmorfEvo!

I’m new to everything hardware related, so your help is greatly appreciated. I’m reading through everything I can find - trying to figure this out. Currently, I’m following a guide to create a Jupyter Notebook server.
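From what I can tell, the guide boils down to something like this - hostname, username, and port below are placeholders I'd still need to fill in with his actual details:

```shell
HOST="friend-pc"   # placeholder for his LAN IP or hostname
PORT=8888          # Jupyter's default; any free port works

# 1) On his PC, start the server without popping a browser window
#    in front of him:
#      jupyter notebook --no-browser --port=$PORT
#
# 2) On my MacBook, forward my local port to his over SSH
#    ("bob" is a placeholder username):
#      ssh -N -L $PORT:localhost:$PORT bob@$HOST
#
# 3) Then open http://localhost:8888 in the Mac's browser.
echo "tunnel http://localhost:$PORT -> $HOST:$PORT"
```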

Also, I read several articles about creating a server using Ubuntu. I googled the benefits of Ubuntu, but I still don't understand how it's better than just running a server on Windows 10 - I'm a Mac user.

His computer had an older GPU that we took out. Is there a way to use both? Could he use the older GPU for web browsing and watching Netflix while I’m using the new GPU to train a model?

I can also add more RAM if that would make a difference.

The SSD contains the Windows 10 OS, and the HDD is used only for long-term storage. I planned to transfer all the deep learning data to and from his computer after each use. I was hoping that would help things run more smoothly.

I'm really hoping to find a way to make this work because I can't afford to pay for a cloud GPU long-term, and I've already bought and installed the new GPU.

Thanks again for your help!

Okay, I understand, so you still want your own stuff - that's the spirit :slight_smile:
Plus the GPU is already installed - so no turning back.

Well, I also think Linux is a better platform for DL than Windows, because Win10 doesn't let you use the whole GPU: roughly 10% of GPU memory is reserved by Win10, leaving 90%, and then the second hit is the DWM, which reserves about 10% of that 90%, so ~81% remains :slight_smile:
On Win10 you also can't get full performance out of PyTorch (or other libraries) because of compilation difficulties; some things run 2-3x slower on Win10 compared to Linux.
Also, if you ever want to use Docker with NVIDIA GPU acceleration, you'll find NVIDIA built that for Linux with no Windows support - OK, it's not your problem, but it was mine once ^^.
And the list goes on…
On Linux you can monitor things better in the terminal, like 'watch -n 5 nvidia-smi' or 'watch -n 5 sensors', and you can pipe the outputs; on Windows that's harder or in some cases impossible.
Linux is more like macOS, and Windows is a different beast - but that doesn't mean you can't use Win10 for DL. You can; it just sometimes needs more workarounds and can be less performant, and I wanted you to know that. Since your friend will watch Netflix on Win10 anyway, there's no need to disturb him with Linux if he doesn't want it.
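Here's what those monitoring commands look like in practice, plus one piped variant - the `--query-gpu` flags are real nvidia-smi options, but the 5-second interval and the log file name are just examples:

```shell
# Live view of GPU load/memory/temperature, refreshed every 5 seconds
# (needs the NVIDIA driver installed):
#   watch -n 5 nvidia-smi
# Same idea for CPU/motherboard temperatures via lm-sensors:
#   watch -n 5 sensors
# Because it's all plain text, you can log it too, e.g. one CSV sample
# of GPU utilisation and memory used every 5 seconds:
CMD="nvidia-smi --query-gpu=utilization.gpu,memory.used --format=csv -l 5"
echo "$CMD >> gpu.csv"
```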

So if you have 2 GPUs, of course you can use both as long as the motherboard has 2 PCIe slots, and you can dedicate one of them to DL. My suggestion: put the stronger GPU in the x16 slot, since the other slot is more than likely only x8 - that way DL gets the full PCIe bandwidth, while Netflix and web browsing run fine on the older GPU in the x8 slot :slight_smile:
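And to make sure training only ever touches the DL card, you can hide the other GPU from CUDA before any framework starts - a small sketch (device index 0 for the 1080 Ti is an assumption; check nvidia-smi for the real ordering on that machine):

```python
import os

# Make only GPU 0 (assumed here to be the 1080 Ti in the x16 slot)
# visible to this process; CUDA then never touches the older card.
# This must be set before CUDA is initialised, i.e. before importing torch.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

# Inside PyTorch the visible card then simply appears as "cuda:0":
#   import torch
#   device = torch.device("cuda:0")
#   model.to(device)
print("visible devices:", os.environ["CUDA_VISIBLE_DEVICES"])
```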

It can work, so go for it :slight_smile:

Btw, I didn't want to discourage you - I want you to succeed, and you will :slight_smile:
I just gave you the full picture of what I experienced when I tried a few setups before you - and I also don't have money, so we're in the same shoes in this regard :slight_smile:
I also wanted to do DL, and when I win some competition I'll buy my own stuff ^^

Thanks @AmorfEvo, appreciate it!