Beginner: Setup ✅

If I just pip install fastbook will it automatically install fastai as well?


It worked with pip. Conda seems to have an issue.

Yes it will install fastai as well.
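
For what it’s worth, a quick sanity check after running pip install fastbook (a minimal sketch):

    # fastbook declares fastai as a dependency, so pip installs both packages.
    import fastai
    import fastbook
    print(fastai.__version__)   # fastai came along with fastbook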

2 Likes

Hey people,
How do you keep your work tracked/pushed on GitHub?

If I work on .ipynb files in Google Colab, I can’t update that same single file on GitHub from Google Colab.

On the other hand, it’s much more convenient to work in Google Colab than in PyCharm.

Any ideas?

1 Like

When using colab it’s generally best to use Google Drive for storing your work.
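
For example, a minimal sketch of mounting Drive from a Colab cell so your notebooks persist between sessions (the folder path below is just an illustration):

    # Mount Google Drive inside the Colab runtime.
    from google.colab import drive
    drive.mount('/content/drive')

    # Keep notebooks and data under a Drive folder, e.g. /content/drive/MyDrive/fastai-course/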

Life is much easier if you use paperspace.

3 Likes

Thanks!

I do store it on Google Drive, but I want my work to be tracked on GitHub (and to have some version history for it)…

By using Paperspace (https://www.paperspace.com/), will I be able to run that Colab file but have the file stored on GitHub? In what way would it be better for me than my current setup?

I have a question whose answer is likely both obvious and straightforward. When using Gradient, it says I have a 10 notebook max. Does that mean I can only ever have 10 notebooks stored on my account? I’m guessing it’s a “yes, duh”, but it seems to me to be a big limiting factor. I’d have to have the entire book’s worth of code in one notebook to make sure I had room for experiments. Again, I assume I’m being dense, but looking for feedback.

There is the concept of a “Paperspace notebook” and an individual .ipynb notebook, aka a Jupyter notebook file. Those are two different things, unfortunately. Paperspace has its own naming conventions.

So you can think of a Paperspace notebook as being a sort of ‘work environment’. Inside each individual work environment (i.e. ‘Paperspace notebook’) you can have as many .ipynb files/notebooks as you like.

So it’s workable I think. You can have a single ‘Paperspace notebook’ that will cover all your work on the fastai course. (Paperspace have just confused things with how they name their abstractions.)

4 Likes

For anyone who needs it, here’s an example DataBlock setup for image regression:

    from fastai.vision.all import *  # DataBlock, ImageBlock, RegressionBlock, ColReader, etc.

    dblock = DataBlock(blocks=(ImageBlock, RegressionBlock),
                       get_x=ColReader('path'),        # column of image file paths
                       get_y=ColReader('norm_score'),  # column with the regression target
                       splitter=RandomSplitter(0.2),   # hold out 20% for validation
                       item_tfms=Resize(224),          # pass in item_tfms
                       batch_tfms=setup_aug_tfms([Flip()]))
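
A hedged sketch of how this might be used, assuming a pandas DataFrame df with the 'path' and 'norm_score' columns the ColReaders above expect:

    # Build the DataLoaders from the (hypothetical) DataFrame and look at a batch.
    dls = dblock.dataloaders(df, bs=32)
    dls.show_batch(max_n=4)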

Hi all, a bit late to the party. I see that all of Jeremy’s notebooks are on Kaggle. I’m using JarvisLabs since I have an issue logging in with Paperspace. I started a fastai instance, but it only contains the 2020 notebooks, and I see that the 2022 notebooks are not all in the GitHub repo. Should we just copy-paste code/markdown into the cloud notebook of our choice, or is there a good way to import the Kaggle notebooks somewhere else? Perhaps download the notebook and open it in JarvisLabs?

I might be wrong, but I think the Kaggle (2022) notebooks are not part of the fastbook repo at all. It might be a good idea to just download them from Kaggle and open them directly in JarvisLabs.

2 Likes


You can click the three vertical dots (⋮) button on any Kaggle kernel and download the code.

That downloads it as an .ipynb file, and you can upload it to any JupyterLab environment, which is what you get when using JarvisLabs :blush:.
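
If you prefer doing it programmatically, here is a sketch using the official Kaggle CLI (the kernel slug is an assumed example, and it requires your kaggle.json API credentials to be configured):

    # Pull a Kaggle notebook as an .ipynb file into the current directory.
    import subprocess

    kernel_ref = "jhoward/how-does-a-neural-network-really-work"  # hypothetical example slug
    subprocess.run(["kaggle", "kernels", "pull", kernel_ref, "-p", "."], check=True)
    # The downloaded .ipynb can then be uploaded to JupyterLab on JarvisLabs.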

3 Likes

For those wanting to set up a remote backend on Paperspace Gradient while using your local VS Code editor, please see this guide I’ve created:

https://forums.fast.ai/t/beginner-basics-of-fastai-pytorch-numpy-etc/96285/36?u=n-e-w

2 Likes

Using the Paperspace IDE and the how-does-a-neural-network-really-work.ipynb Kaggle notebook, I noticed the interactive widget flashes up but then disappears when running. I tried installing some extra modules, but this didn’t fix the problem. Then I discovered I can open the notebook in JupyterLab, and the widgets work fine there. I’m including this here in case there is a way to get @interact working in the Paperspace IDE.
(Screenshots: the widget in the Paperspace IDE, and the same widget working in Paperspace JupyterLab.)
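
For reference, the kind of @interact cell involved looks roughly like this (a minimal sketch, not the exact cell from the notebook; it needs ipywidgets and matplotlib installed):

    # A slider-driven plot; in JupyterLab the sliders render and update the figure.
    import numpy as np
    import matplotlib.pyplot as plt
    from ipywidgets import interact

    x = np.linspace(-2, 2, 100)

    @interact(a=1.0, b=1.0, c=1.0)
    def plot_quad(a, b, c):
        plt.plot(x, a*x**2 + b*x + c)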

2 Likes

The Paperspace IDE is… not great. I wish they’d get rid of it. I recommend never touching it!

5 Likes

I’d suggest you not use the Paperspace IDE. It doesn’t support all the behaviours that the normal Jupyter (or JupyterLab) environment allows. I’m actually not sure who uses their IDE / notebook environment, as it seems so inferior to the real thing (as you illustrated in your question).

They’ve been improving it bit by bit, but there isn’t much of a way you can influence which features they choose to build vs not.

5 Likes

For those who are interested, it’s trivially easy to connect your local VS Code instance to a Jupyter notebook backend on Paperspace.

Create a Runtime on Paperspace

From your default project, open the dialog to create a notebook. You will see the following:

  1. Select the “Paperspace + Fast.AI” runtime.
  2. Select your machine type according to the options of your account type. Make it private access.
  3. Choose any other “Advanced Options” as necessary, but you can leave these alone for the purposes of this example.
  4. Click “Start Notebook”. You will be taken to the notebook editor.

Open a Notebook on Paperspace

By default, you will see the Jupyter notebooks from the Course / Book in the left sidebar. Open one.

Note the VS Code icon in the left sidebar. Click it. A popup will appear:

At the bottom of the popup, you will see a link to connect to your Gradient backend. Copy this; you will enter it into VS Code.

Connect your Local VS Code Instance to the Gradient Backend

  1. In VS Code, open a new Jupyter notebook from the command palette: CMD + SHIFT + P, then “Create: New Jupyter Notebook”.

  2. At the bottom of your notebook editor in VS Code, click the server option:

  3. A dialog will pop up at the top of the VS Code editor:

  4. Click “Existing”, then paste in the Gradient server address you copied earlier.

  5. You are now ready to run. Enter some code and run it.

  6. You will be presented with a kernel selection box; choose the remote Gradient kernel you’ve just created (in this instance, 05_pet_breeds.ipynb).

  7. Congratulations! You’ve successfully connected your local VS Code instance to a Gradient remote backend with whatever power you chose. Run what you wish, as you wish. Don’t forget to shut down your Gradient instance when you’ve finished your local work.

This is also just as easily extensible to GCP’s Vertex AI / AI Platform, etc.
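
Once connected, a quick sanity check that cells really run on the remote machine (a sketch; it assumes PyTorch is present, which the fastai runtime includes):

    # Confirm code executes on the Gradient instance rather than locally.
    import platform, torch

    print(platform.node())              # hostname of the remote machine
    print(torch.cuda.is_available())    # True if the instance has a GPU
    if torch.cuda.is_available():
        print(torch.cuda.get_device_name(0))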

8 Likes

Has anyone tried to set up a fastai environment in Azure ML Studio?
I followed the instructions from GitHub - Azure/AzureML-fastai: Example code showing how to run FastAI examples on Azure ML
and got no errors while installing.

But I am unable to import fastbook.
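
If it helps, one thing worth checking in the Azure ML notebook kernel is whether the fastbook package itself is installed; the AzureML-fastai example sets up fastai, but fastbook is a separate package. A hedged sketch:

    # Install fastbook into the active kernel, then run the book's setup step.
    %pip install -Uq fastbook

    import fastbook
    fastbook.setup_book()  # configures the notebook environment used by the book's notebooks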