How do they measure the cost of usage in the cloud?

Hi folks,

My question may be a super naive one for many, so I apologize in advance.

I want to know, as the subject says, how do they (Google, AWS, or any others) measure the cost of usage in the cloud? Is it measured only by the time spent to "run" my code, or by the time spent both writing code on their platform and running it?


At least on AWS, you get charged for the time that the instance is on. So if you're writing code in a Jupyter notebook, for example, you'll be charged for that time, because Jupyter is actually an app running on your cloud machine.


Thanks William,

If you are using AWS, can you give me an idea of your average monthly spending for this course?
I know it's hard to say and will vary from person to person. I'm trying to figure out how much it should cost to complete the entire course (Part 1 only).


I'm not on AWS anymore, but I checked my bills from when I was using AWS for fastai, and it was about $100/month. I think you could probably get that lower if you use spot instances, or if you're more careful to turn the machine on only when you need it.


On AWS and Google Cloud you pay per second while the instance is running. When the instance is stopped, you don't pay for compute. On AWS you also pay for data storage (called EBS) on a GB-per-month basis.
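As a rough sketch of how those two charges add up, here's a back-of-envelope estimator. The rates are made-up placeholders, not actual AWS prices (check the AWS pricing pages for real numbers):

```python
# Back-of-envelope cloud cost estimate: compute billed only while the
# instance is running, EBS storage billed per GB-month regardless.
# Rates below are hypothetical placeholders, NOT real AWS prices.

COMPUTE_RATE_PER_HOUR = 0.90   # hypothetical GPU instance rate, $/hour
EBS_RATE_PER_GB_MONTH = 0.10   # hypothetical EBS rate, $/GB-month

def monthly_cost(hours_running: float, ebs_gb: float) -> float:
    """Estimate a month's bill: compute while running, storage always."""
    compute = hours_running * COMPUTE_RATE_PER_HOUR
    storage = ebs_gb * EBS_RATE_PER_GB_MONTH
    return round(compute + storage, 2)

# e.g. ~2 hours/day for 30 days on a 100 GB volume
print(monthly_cost(hours_running=60, ebs_gb=100))  # 64.0
```

Note that `monthly_cost(0, 100)` still returns a nonzero amount: the storage charge accrues even in a month where you never turn the instance on.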

Some EBS storage is included at no charge if you sign up for the AWS Free Tier.

I think I used around $60 of AWS credits for this course. You can get AWS credits for free through AWS Educate (requires a .edu email), or you can buy them through eBay for 85% off.


Thank you @laphi !!