Platform: Crestle ✅

Yep, it seems the old, unofficial one is no longer working… “kg” is no longer a command. It worked really well in the past, though.

You could also upload the kaggle.json file using the Jupyter notebook upload button, and then move it into the .kaggle folder from a terminal inside Jupyter:
mkdir -p ~/.kaggle
mv kaggle.json ~/.kaggle/
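
If the Kaggle CLI later complains that the file is readable by other users, tightening the permissions should quiet it:

chmod 600 ~/.kaggle/kaggle.json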


Yeah, I don’t know why I thought I wouldn’t have access to move files to that folder… This would be a lot easier.

Oh nice, I haven’t used Jupyter notebooks much in the past, but that definitely works too. I’m glad you got that sorted out @RogerMao, and sorry for the confusion with the quotation marks!

Hi, is there a way to see how much disk space is left?
I ran into a disk-full error when unzipping.

I’ve deleted the unzipped files that caused the disk to fill up. However, it seems the space somehow wasn’t freed?

df -h
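
A couple of common forms, in case it helps:

df -h      # free space on every mounted filesystem, human-readable sizes
df -h ~    # just the filesystem backing your home directory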

Thanks, @hucius. I should have searched for the command first.

[screenshot: df -h output showing the disk still full]

It seems that removing the files isn’t freeing up the space…

You could get a few GBs back with “conda clean -n”.

“conda clean -n” popped an error about the argument, so I used --all instead of -n.
That did give me back a couple of GBs. However, the disk is still full, and I can’t figure out what consumed the space…
I only have “courses” and “lost+found”, and they are just 137M. The “data” folder is just 37GB. With that, I should still have about 30G available.
I’ve saved and deleted quite a few files under the data folder; it just feels like the space wasn’t freed up after the files were deleted.
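
For reference, what ended up working (the -a/--all flag clears the index cache, unused packages, and downloaded tarballs; -y skips the confirmation prompt):

conda clean --all --yes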

Any further suggestions on where I should look for whatever took up the 30G of space?


We will investigate, but as a quick option you could move the .datasets directory to the /tmp directory for now.

So something like -

sudo mkdir -p /tmp    # -p: don't error if /tmp already exists
sudo mv ~/.datasets /tmp

This may take a while though.

You should also run du on just the home directory ~/ to see where the space is being consumed.
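
A depth-limited du keeps the output manageable; something along these lines (standard GNU coreutils flags):

du -h --max-depth=1 ~ | sort -h    # per-directory totals under home, largest last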


I tried du on the home directory, and there were too many files to display. I then tried to write the output to a txt file for analysis, but the disk was too full at that point. Now I’ve cleaned up a bit and can finally generate the file. From some preliminary analysis, it seems there are a few “Trash” directories taking up the space.
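
In case it helps anyone else doing the same, writing the heaviest entries to /tmp avoids touching the already-full home volume (the file name is just an example):

du -ah ~ 2>/dev/null | sort -h | tail -n 30 > /tmp/du_top30.txt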


Looks like Jupyter is not deleting files but moving them to Trash instead. You should be able to delete the contents of Trash through the terminal to clear space.
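
Assuming Jupyter’s send2trash is using the standard FreeDesktop location on Linux, something like this should clear it (double-check the path first):

rm -rf ~/.local/share/Trash/*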

The Crestle team is going to make the default config for new volumes delete instead of moving to trash. I’ll ask the team if they can inject this into already existing volumes. In the meantime, you can add a config option to your Jupyter config to delete instead of moving to trash. You’ll still need to clear your currently existing trash manually.

In ~/.jupyter/jupyter_notebook_config.py add:

c.FileContentsManager.delete_to_trash = False
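
If the config file doesn’t exist yet, you can generate it and append the line from a terminal (a small sketch, assuming the jupyter CLI is on your PATH):

jupyter notebook --generate-config    # writes ~/.jupyter/jupyter_notebook_config.py if it's missing
echo "c.FileContentsManager.delete_to_trash = False" >> ~/.jupyter/jupyter_notebook_config.py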

https://jupyter-notebook.readthedocs.io/en/stable/config.html

Thanks for the feedback!


I am trying Crestle but don’t see the notebooks from classes 3 and 4. It only has up to class 2, and the nbs folder was last updated 21 days ago. Is there any way I can trigger an update for these notebooks, or do I need to manually upload the files into this folder?

Thanks
Amit

https://course-v3.fast.ai/update_crestle.html

Did you go through this? It might help.


How do I install TensorFlow on Crestle? I guess there is no TF version for Python 3.7?

Thanks… It worked for me.

Is it just me, or does anybody else have trouble opening the machine? Unfortunately, it has just been loading indefinitely for the past three days.

A brief Google search brought me to this command: python3 -m pip install --upgrade https://storage.googleapis.com/tensorflow/mac/cpu/tensorflow-0.12.0-py3-none-any.whl. That may not be the right version (I haven’t really researched this), but it’s a step in the right direction.
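
Note that the wheel above is an old macOS CPU build. If I remember right, TensorFlow 1.13 was the first release shipping Python 3.7 wheels, so pinning that might be a cleaner route (the exact version pin is my assumption, not something verified on Crestle):

python3 --version                             # check which Python the kernel uses
python3 -m pip install tensorflow==1.13.1     # assumed first release with Python 3.7 support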

An update from my side: if you have trouble connecting to a notebook after starting your instance, try connecting through a VPN or changing your IP. From my university Wi-Fi it just doesn’t work, while over VPN and at home I don’t have any issues…

How do I do a fastai developer install on Crestle?
Developer install tutorial - confirmed to work on Crestle

With so many changes to the library, the PyPI release is often not up to date…
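
For anyone who wants the short version, the tutorial boils down to roughly this editable install (the [dev] extra is what I recall from the fastai docs, so double-check there):

git clone https://github.com/fastai/fastai
cd fastai
pip install -e ".[dev]"    # editable install, so a git pull picks up library changes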