Free GPU credits for Fast.ai Courses

Hi @snarkai, I ran the fast.ai Jupyter notebook:
(base) C:\Users\SonavexNUC3>snark start --pod_type fast.ai --jupyter
Setting up the pod…
Connecting to the pod…
Warning: Permanently added ‘[173.209.172.209]:10118’ (ECDSA) to the list of known hosts.
[I 20:45:03.943 NotebookApp] [jupyter_nbextensions_configurator] enabled 0.4.0
[I 20:45:03.944 NotebookApp] Serving notebooks from local directory: /home/pverma
[I 20:45:03.944 NotebookApp] The Jupyter Notebook is running at:
[I 20:45:03.944 NotebookApp] http://(4455e19bda17 or 127.0.0.1):8888/?token=842d8324e50c77a2345c2780e690de0ba493767d32ff3bfd
[I 20:45:03.944 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
[W 20:45:03.944 NotebookApp] No web browser found: could not locate runnable browser.
[C 20:45:03.945 NotebookApp]

Copy/paste this URL into your browser when you connect for the first time,
to login with a token:
    http://(4455e19bda17 or 127.0.0.1):8888/?token=842d8324e50c77a2345c2780e690de0ba493767d32ff3bfd

But the browser doesn’t show any notebook running at the following link:

    http://4455e19bda17:8888/?token=842d8324e50c77a2345c2780e690de0ba493767d32ff3bfd

Thanks in advance!

Can you try 127.0.0.1:8888 instead of 4455e19bda17:8888? @pverma
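
For example, keeping the port and token from the log above and only swapping the host, the URL would look something like:

    http://127.0.0.1:8888/?token=842d8324e50c77a2345c2780e690de0ba493767d32ff3bfd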

@diskandar You can run $exit in the pod command line to exit the pod. However, it does not stop the pod. All of your programs will still be running on the pod after you exit. To fully stop the pod (and avoid being charged), run $snark stop pod_xxxxx
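
For example, a full shutdown might look something like this (pod_xxxxx stands for whatever pod id snark ls reports for you):

    $exit
    $snark stop pod_xxxxx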

It did the same thing for me; I’m not sure what caused it, probably an outdated Python thing :frowning:
Anyway, the solution is to go to localhost:8888,
then copy and paste the token, and you should be inside the server.
In the output above, your token is this one: 842d8324e50c77a2345c2780e690de0ba493767d32ff3bfd

Hopefully that works!

Thank you! Rooting for your success

There’s a new bug when I try to start it now:

Error: Couldn't successfully schedule pod execution. Please try again

It happens whether I use $snark start or $snark start --pod_type fast.ai --jupyter

Thanks @snarkai and @diskandar, both methods work. Everything working flawlessly so far! :v:

Hello, is this code still available? I tried it on your website but received a message that the code is invalid. Thanks.

It’s invalid :cry:

Hey @snarkai,

I found a small mistake in the CLI help text (snark start --help):

“Weather” should be “Whether”.

Hi @snarkai,

After I ran snark start it logged me in successfully, but afterwards, when I try to run snark start --pod_type fast.ai --jupyter, it shows an error.

Here is the snapshot: [screenshot of the error]

Later on I tried snark ls and snark stop pod_P106 to stop the pod, but nothing works, and I can see in my dashboard that the hours keep on decreasing …

I’m new to this. If I’m doing anything wrong, can anyone please guide me?

Regards,
Sumit

Hi @snarkai. I am getting a “Wrong or invalid promo code” error.

You are first attaching to the pod. Do not attach. Execute these commands in sequence:

snark login
enter username and password
snark start --pod_type fast.ai --jupyter

Added more quota. Pls try again.

We just added more quota to the promo code. Pls try again!

Should be available now. There was a cap on the number of retrievals previously, and we just removed it :slight_smile:

Thanks a lot @sagar_mainkar it worked :slight_smile:

@SKS To stop the pods, you will need to run snark ls and get the list of pod IDs. Then you can stop each of them by running snark stop pod_id. When you run snark start ... you start a new pod on a separate GPU.
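
A minimal sketch of that sequence (the pod ids below are only examples; use whatever snark ls actually prints):

    snark ls
    snark stop pod_P106
    snark stop pod_xxxxx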

Hi @snarkai,

I noticed that the hours for all three instances get decreased while I’m running the P106.

So I’m wondering: do they decrease proportionally (I mean, if I use up all of the GPU hours on the P106, do all of them go to zero together), or does each decrease per instance (I mean, if I use the P106, will it decrease more slowly compared to the 1080)?

I’m sorry, I don’t know exactly what to call the P106, 1070, and 1080, so I’m calling them instances.

Regards,
Sumit

Thanks a lot @davidbun it worked :slight_smile: