Hi,
I used Jarvislabs a couple of times.
I first tried it out at its start last December, then kept checking back again and again, especially after I saw great news in this forum.
So I also checked again this year after the internet speed upgrade news.
Actually it got way more upgrades than that, as you write about the key features in this forum thread; I will write about them in the next post.
But first, internet speed.
I don't compare its speed against my home machine; I like to compare it with other solutions like Paperspace (which is a more reasonable comparison in my opinion).
I uploaded many datasets and downloaded network weights many times, so I have a picture of their speed distribution (I also checked some other solutions in Europe, but more on that maybe another time).
Why am I talking about a speed distribution and not a single speed? Because there is no such thing as a single speed.
The speed will vary depending on when you use the service and where you are - but that's true for all providers around the world: in their busy hours they tend to be slower, it's that simple.
Here is a comparison of Jarvislabs and Paperspace with a 1 GB zip file, tested in parallel at the same time, outside both of their busy hours.
(I decided to run the test around midnight for Europe, so it's 4.5 hours later in India and 6 hours earlier in New York.)
Paperspace web browser, directly from JupyterLab:
avg download speed: 6.5 MB/sec (1 GB down in 2.62 min)
Jarvislabs web browser, directly from JupyterLab:
avg download speed: 7.3 MB/sec (1 GB down in 2.34 min)
I ran the test 3 times; the averages are based on that.
So they are in a similar range, but I also note that if you run this again on another day it can give the opposite order, so we should think only in ranges.
I also note that I would report an avg speed even for a single download, because on Paperspace the speed fluctuates between 5.5-7.3 MB/sec within a single file; it also averages around 6.5 MB/sec, but with a large standard deviation.
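If you want to reproduce this kind of measurement yourself, here is a minimal Python sketch of the idea (the URL is just a placeholder - point it at any ~1 GB test file you host; note that my numbers above came from timing the actual browser/WinSCP transfers, not from this script):

import time
import requests  # pip install requests

# Placeholder URL - swap in a ~1 GB test file you host on the instance
url = "https://example.com/testfile_1gb.zip"

start = time.monotonic()
total_bytes = 0
with requests.get(url, stream=True) as r:
    r.raise_for_status()
    for chunk in r.iter_content(chunk_size=1 << 20):  # read in 1 MB chunks
        total_bytes += len(chunk)
elapsed = time.monotonic() - start

mb = total_bytes / (1024 * 1024)
print(f"{mb:.0f} MB in {elapsed / 60:.2f} min -> avg {mb / elapsed:.2f} MB/sec")

Running it a few times and keeping the average is what gives you the "range" I'm talking about.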
Let’s see another kind of comparison:
Jarvislabs web browser, drag-and-drop into JupyterLab:
avg download speed: 7.3 MB/sec (1 GB down in 2.34 min)
avg upload speed: 2.4 MB/sec (1 GB up in 7.11 min)
Jarvislabs WinSCP copy:
avg download speed: 4.5 MB/sec (1 GB down in 3.79 min)
avg upload speed: 8.5 MB/sec (1 GB up in 2.01 min)
The picture is not that simple 
So both Paperspace and Jarvislabs use JupyterLab, and you can just drag-and-drop files into their windows or right-click for download.
I really like these features, and downloading this way is simple and fast, so I will always use it on both (because I don't need to log in to the server, and I really like to use just the browser for everything if I can).
BUT you should not upload big files with drag-and-drop, because even if it's simple and convenient, it can be really slow. (I still use it for small files like a Jupyter notebook or a small csv.)
The interesting thing is that the WinSCP upload speed is higher than any of the download speeds
(so I can really suggest using it for uploads).
BUT, another interesting thing is that the WinSCP download speed is slower than the browser o.O
(so that's another reason I will still use the JupyterLab right-click for downloads - not just because it's simpler, but because it's also faster ;))
(note: I used WinSCP from Win10, but I also have Manjaro Linux; I use both OSes interchangeably)
For completeness: I live in Hungary, in Europe, but the Paperspace servers are in the USA and the Jarvislabs servers are in India. (I know the question is why I don't use something in Europe if I'm in Europe - the answer is that there aren't many options here, maybe one, Genesis Cloud, but they only offer 1080 Ti or Radeon MI25 cards, and the setup process is not so convenient.)
I know the Paperspace servers are near New York on the East Coast, and another pack is near San Francisco on the West Coast - and both are near Google Cloud datacenters.
So the speed can also depend on which location you get when you use the service.
Someone wrote in the forum that they wanted to use Kaggle datasets - since Kaggle is a Google service, that Google Cloud proximity also sheds light on why downloading Kaggle datasets can be fast there.
If you are in Europe like me, I suggest downloading the dataset to your home machine first and then uploading it to Jarvislabs - it will be faster. (And a lot of the time the original datasets contain tfrecords too, which makes them 2x larger; you don't need those if you use PyTorch or fastai, because those are TensorFlow things, so remove them from the original and upload only what you really need ;))
And do not drag-and-drop it into the JupyterLab window, but use ssh, or similarly WinSCP or FileZilla, because it will be about 3.5x faster that way.
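If you want to script the tfrecord cleanup before uploading, a small Python sketch like this works (the folder path is just an example, and the extension list covers the usual tfrecord variants):

from pathlib import Path

# Example local folder where you extracted the Kaggle dataset
dataset_dir = Path("~/datasets/my_kaggle_dataset").expanduser()

# Delete TensorFlow-only files (tfrecords) before uploading to the instance
removed = 0
for f in dataset_dir.rglob("*"):
    if f.is_file() and f.suffix in {".tfrec", ".tfrecord", ".tfrecords"}:
        f.unlink()
        removed += 1
print(f"removed {removed} tfrecord files, upload the rest with scp / WinSCP / FileZilla")

After that, upload the cleaned folder (or a zip of it) over ssh as I said above.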
When it comes to downloading the results or network weights, just compress them into 1 big file and right-click download from JupyterLab for max speed.
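For example, from a notebook cell on the instance you can pack everything like this (the folder name is just an example):

import shutil

# Pack everything under "results/" into results.zip, then right-click it
# in the JupyterLab file browser and choose Download
shutil.make_archive("results", "zip", root_dir="results")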
Why do I write this in such great detail?
Well, if you are as poor as me and/or a maximalist, you can optimize your time/money to the extreme this way ^^
Maybe it can help somebody else too 