Most cost-effective platform for 2020, GCP too expensive

This morning I got an invoice from Google for around $100 and I almost fell off my chair…
I've used regular instances because preemptibles are currently unusable (they shut down so often that most of the time it takes 3 to 5 attempts to get one running) and I can't afford over $100 per month, because I don't earn in US dollars. Google's predicted usage cost for this month is $400 :confused:

My usage last month was 47 hours of GPU time and 500 GB of storage.

What is the most cost-effective (cheap, but not really slow) option here? Are there any cloud providers that are affordable? Or is buying my own GPU the only option?

I am thinking about buying a 2070, because it is around $500.

Here are my last month's costs.

If you buy expanded storage from Google Drive (so you have 500+ GB) plus Colab Pro, that should put you at roughly $30/month or less, which is what I recommend if you're trying to stay on a budget :slight_smile:
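
In case it helps, here's a minimal sketch of how the Drive-plus-Colab setup usually looks once you have the extra storage (the folder path is just a placeholder):

```python
# Inside a Colab notebook cell: mount your (expanded) Google Drive so
# datasets and model checkpoints persist between sessions.
from google.colab import drive

drive.mount('/content/drive')  # prompts for an auth code the first time

# Placeholder path: wherever you keep your data on Drive
DATA_DIR = '/content/drive/My Drive/fastai_data'
```

Keeping the data on Drive is what makes the 500+ GB upgrade worthwhile, since Colab's local disk gets wiped when the session ends.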

Why didn't you try Colab or Colab Pro?

That looks great, $30 is really affordable. I haven't used Colab for a while; can we access a terminal on the machine somehow?

IIRC some people have been able to do it through a minor 'hack'; let me see what I can dig up.

:slight_smile:

You can do things like !mkdir my_folder or something similar.
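
For example, a rough sketch of what that looks like in a Colab cell (the folder and package names are just placeholders):

```python
# In a Colab cell, a leading "!" runs the line in a shell instead of Python.
!mkdir -p my_folder      # create a directory
!ls -lh my_folder        # inspect it
!pip install -q fastai   # install packages the same way

# IPython can also capture shell output back into Python:
files = !ls my_folder    # returns the output lines as a list
print(files)
```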

Most likely due to their SSH requirement, but as my previous post states, it can be done :slight_smile:

(Or as Jonathan says, you can run terminal commands inside of a Colab instance :slight_smile: )

I know, but that isn't convenient; I'd rather buy a GPU if I had to do that.

So the default GPU for Colab is a K80, which is worse than my GPU… I have a 1060 6GB in my PC. There's this really neat comparison tool: https://technical.city/en/video/GeForce-GTX-1060-6-GB-vs-Tesla-K80

You get anything between a K80 and a P100 instance (Pro gets the P100 easily).

But I'll also add that I have yet to find a project I couldn't do with the K80.

I think Colab Pro is only available in the US at the moment.

I've used Colab for the last 2-3 months and I never got a K80: 90% of the time it's a P100 for the first session and a T4 for the second.

Colab looks like a lot of hassle; I'm a developer, I like to have access to a terminal LOL

Thanks very much for your help.
I've also looked at the prices in the course docs, and I will be using my partner's Google account for the free $300 credits and trying to set up a DL box on my PC in the meantime. If I fail at that, then Colab it is.

If it were only for fast.ai I'd be fine, but I'm trying to finish my side project, which would probably take 72+ hours to train and experiment on.

Oooh yeah Colab may not be the best then if it’s 72+ hours straight! Best of luck! :slight_smile:

Not in one run, but I'm doing stuff on part of iNaturalist, which is kinda huge.

That makes more sense :wink: I've thought about the iNaturalist dataset, as it's something I'd like to work with, but with a dataset that large I agree that figuring out the best setup is rough. Full transparency: when I worked with it I just used the local box I built (I didn't have a 2080, I had a 2060), but having it local let me run it for days on end, with only the god-awful electricity bill to worry about!

Mueller's right. Look at this: I just ran my Colab on my Android phone on the fastai v2 lesson 2 notebook, and it shows I have a P100. Remember, Colab Pro says that if you leave your Colab idle too long without doing anything, you may have less chance of getting a P100!
https://colab.research.google.com/drive/1Q9rv2JHPvPVry5V3OtlyUyOOsCSUMwFj
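
If you want to check for yourself which GPU a session was given, here's a quick sketch (assuming the PyTorch runtime that Colab ships by default):

```python
# Run inside the Colab notebook to see which GPU the session got.
import torch

if torch.cuda.is_available():
    # e.g. "Tesla K80", "Tesla T4" or "Tesla P100-PCIE-16GB"
    print(torch.cuda.get_device_name(0))
else:
    print("No GPU attached; check Runtime -> Change runtime type")
```

Running !nvidia-smi in a cell gives the same information plus current memory usage.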

How much would you estimate your electricity bill went up? Anything above $50? And how many epochs did you use? I'm only training on Plants, but it takes a lot of time anyway.

At the time of writing you can get an AWS Spot Instance with an Nvidia T4 for ~$0.16 per hour.

For the $500 that you are looking at spending, you can get 3,125 hours, and that's about 130 days of 24/7 runtime. That spend could last a lot longer if you turn off the instance while not using it.
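
Just to make that arithmetic explicit (spot prices vary by region and over time, so $0.16/hr is only a snapshot), here's a quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope: how long a fixed budget lasts at a given spot price.
budget_usd = 500.0       # roughly the price of an RTX 2070
price_per_hour = 0.16    # snapshot spot price for a T4 instance

hours = budget_usd / price_per_hour
print(f"{hours:.0f} GPU-hours")              # ~3125 hours
print(f"{hours / 24:.0f} days of 24/7 use")  # ~130 days
```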

I think a spot instance might be the way to go, given that we are all waiting for Nvidia to announce the next-gen cards.

Those are huge numbers on that bill. Is your Compute Engine instance online all the time?
Since most of my time in the notebooks is spent working on code, preparing data, etc., I disable the Compute Engine for my VM (VM -> edit, scroll down), and also while uploading data to the VM. I really only enable it when running the model. Since the preemptible VM is really cheap, that saves a lot of money.
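
One way to script that "only pay while training" habit, if you'd rather not click through the console each time (the instance name and zone below are placeholders, and this assumes the gcloud CLI is installed and authenticated):

```python
# Rough sketch: start the VM right before a training run and stop it right
# after, so compute/GPU charges only accrue while the model is running.
import subprocess

INSTANCE = "my-fastai-vm"   # placeholder instance name
ZONE = "us-west1-b"         # placeholder zone

def set_vm_state(action: str) -> None:
    """action is 'start' or 'stop'."""
    subprocess.run(
        ["gcloud", "compute", "instances", action, INSTANCE, f"--zone={ZONE}"],
        check=True,
    )

set_vm_state("start")   # before training
# ... run the model ...
set_vm_state("stop")    # storage is still billed, compute is not
```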