Anyone tried this cheaper new AWS alternative to EC2?

(Ian) #1

(Alex) #2

I have been looking into this.
But CUDA is not supported.

According to the docs, it is designed for light graphics acceleration rather than heavy workloads like deep learning, so on a performance/price basis the G and P instances appear to be cheaper anyway.

(Ian) #3

Thanks Alex! Would be great to find cheaper alternatives to $0.70 an hour, which seems like little but can add up over time :slight_smile:
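To put "can add up over time" in perspective, here is a quick back-of-the-envelope calculation using the hourly rates mentioned in this thread (the 3 hours/day usage figure is just an illustrative assumption):

```python
# Rough monthly cost comparison for the GPU hourly rates mentioned in this thread.
# HOURS_PER_DAY is an illustrative assumption, not a recommendation.
HOURS_PER_DAY = 3
DAYS_PER_MONTH = 30

def monthly_cost(hourly_rate, hours_per_day=HOURS_PER_DAY, days=DAYS_PER_MONTH):
    """Return the monthly bill for a given hourly rate."""
    return hourly_rate * hours_per_day * days

print(round(monthly_cost(0.70), 2))  # on-demand at $0.70/hr
print(round(monthly_cost(0.18), 2))  # spot at $0.18/hr
```

Even at a few hours a day, the gap between on-demand and spot pricing is tens of dollars a month.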

(Alex) #4

You can explore AWS spot instances; right now a p2.xlarge is $0.18 per hour in the us-east region. There are plenty of threads on this forum about how to set it up…
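For anyone who wants to check the going rate before launching, something like this shows recent spot price history (assuming the AWS CLI is installed and credentials are configured; the region and instance type are just the ones mentioned above):

```shell
# Query recent spot prices for p2.xlarge in us-east-1.
# Requires a configured AWS CLI with valid credentials.
aws ec2 describe-spot-price-history \
    --region us-east-1 \
    --instance-types p2.xlarge \
    --product-descriptions "Linux/UNIX" \
    --max-items 5
```

Prices vary by availability zone and over time, so it's worth checking before each session.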

(Travis) #5

I’m new to the course, so haven’t done much on it, but Crestle looks like a great alternative. It looks like they use Kubernetes to manage resources, but you can get a dedicated NVIDIA Tesla K80 GPU for about $0.30 an hour.

It also hosts your notebooks, and gives you a terminal window to install your own stuff. Permanent free storage.

Best part: you can launch a notebook w/o GPU support for just 3 cents an hour. Try things out and play around while it’s cheaper, then restart your notebook with GPU enabled and train your model.

I’m not affiliated, just think that it checks all the boxes that I want.

(marc) #6

I’ve been using Paperspace and they provide a great service. The default image has all the right tools installed. Price-wise it is much better than AWS: you pay $0.65/hour but you get a P5000 (Pascal GPU) that’s roughly 3 times faster than a P2.
Another advantage is that you don’t have to worry about persisting your data or losing your spot instance.

(Max) #7

Have you been using the us east region instances? And does it make a big difference if I’m not based in the US?

(Alex) #8

Yes, I am using us-east-1 instances.
It makes a difference when you are far away from the region: the more data you transfer, the more noticeable the difference.
I’d suggest trying it to see if the latency is acceptable for you.

(Max) #9

Thanks. I’m actually all the way in Singapore, but I’m cautious not to overspend. The latency is the data-transfer latency, yeah? I’ll give it a try; it’s much cheaper anyway.

Have you tried it? I noticed that quite a few members are using it.