creates a folder called ‘%userprofile%.kaggle’ and then errors on the second line with the message “/bin/sh: 1: move: not found”
I’m not sure what this is supposed to do, even after researching it a little bit. Is it supposed to create a folder called .kaggle in my user directory or in the dl1 directory with the rest of the course materials? If I knew this, at least I could maybe do it manually.
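For what it’s worth, the cell appears to use Windows commands (`%userprofile%` and `move`), which a Linux shell doesn’t understand, hence the “move: not found” error. A sketch of the equivalent Unix commands, assuming kaggle.json is in your current directory:

```shell
# Unix equivalent of the Windows-style cell (assumption: the original cell
# used `mkdir %userprofile%\.kaggle` and `move`, which only work on Windows).
mkdir -p ~/.kaggle                    # create .kaggle in your home directory
if [ -f kaggle.json ]; then           # move the API token if it's in the cwd
    mv kaggle.json ~/.kaggle/
    chmod 600 ~/.kaggle/kaggle.json   # the kaggle CLI expects restrictive perms
fi
```

So yes: the intent is a .kaggle folder in your home directory (not in dl1), holding kaggle.json.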
I am facing the same issue
I have set the path correctly but am still getting the error /bin/bash: move: command not found.
So I did it manually: I created a .kaggle folder in my user directory and moved the kaggle.json file into it, but I still can’t download the dataset from Kaggle.
Any solution?
Thanks
This error appears. OSError: Could not find kaggle.json. Make sure it’s located in /home/ubuntu/.kaggle. Or use the environment method.
Here is the full message.
Traceback (most recent call last):
  File "/home/ubuntu/anaconda3/bin/kaggle", line 6, in <module>
    from kaggle.cli import main
  File "/home/ubuntu/anaconda3/lib/python3.7/site-packages/kaggle/__init__.py", line 23, in <module>
    api.authenticate()
  File "/home/ubuntu/anaconda3/lib/python3.7/site-packages/kaggle/api/kaggle_api_extended.py", line 149, in authenticate
    self.config_file, self.config_dir))
OSError: Could not find kaggle.json. Make sure it's located in /home/ubuntu/.kaggle. Or use the environment method.
unzip: cannot find or open /home/ubuntu/.fastai/data/planet/train_v2.csv.zip, /home/ubuntu/.fastai/data/planet/train_v2.csv.zip.zip or /home/ubuntu/.fastai/data/planet/train_v2.csv.zip.ZIP.
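The “environment method” mentioned in the error message refers to supplying the credentials through environment variables instead of the json file. A minimal sketch (the values below are placeholders; copy the real ones out of your kaggle.json):

```shell
# The "environment method" from the error message: export the credentials
# found in kaggle.json as environment variables (placeholder values shown).
export KAGGLE_USERNAME=your_kaggle_username
export KAGGLE_KEY=your_api_key
# kaggle commands can now authenticate without ~/.kaggle/kaggle.json
```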
Thank you. Where should the json file be uploaded in Colab? In the same directory as the notebook? I am asking because it cannot find the file.
Also, I am assuming that I need to mount the Google Drive before running these commands. Let me know if this is not the case.
The other thing to note is that !ls ~ does not show anything, so I am not sure whether mkdir will work on that path.
But !ls … seems to point to some sort of root directory on Unix. Not sure whether that should be used instead.
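If it helps: shell commands in a Colab cell run as root, so ~ expands to /root (an assumption based on standard Colab setups), and an empty !ls ~ usually just means there are no non-hidden files there yet. A quick check:

```shell
# Inspect where ~ actually points, and list hidden files too.
echo "$HOME"   # on Colab this is typically /root
ls -a ~        # -a includes dotfiles such as .kaggle
```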
You don’t have to mount your Google Drive. For large datasets, it’s faster to download the dataset into the Colab virtual machine and train there (your data and your fastai program are on the same virtual machine). The downside is that you have to download it again every time you open your notebook.
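A sketch of that download-into-the-VM approach (assumptions: the kaggle CLI is installed and configured, and the competition slug below matches the lesson’s planet competition):

```shell
# Download the competition data straight into the VM's local disk.
DEST=~/.fastai/data/planet
mkdir -p "$DEST"
if command -v kaggle >/dev/null 2>&1; then
    kaggle competitions download -c planet-understanding-the-amazon-from-space -p "$DEST"
else
    echo "kaggle CLI not installed; run: pip install kaggle"
fi
```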
So grateful I came across your instructions above (I’m working in Colab because I had similar issues with Paperspace). I still have issues progressing from the step above: I can see the uploaded file, but when I run the commands that follow, I get these errors.
Basically it looks like the data files are no longer on the site.
I cut and pasted the screenshots into two separate Word documents, but I don’t see a way to attach them.
It’s no longer possible to download the zip file directly using the Kaggle API. Below is a solution, proposed in the following topic, that works around this problem:
Hi all! For those of you who want to download directly on your notebook server: this assumes you are logged in to Kaggle and have accepted the conditions of the competition.
1. Open the Chrome browser on your local machine
2. Install the cookie.txt extension from this link
3. Go to the Kaggle dataset
4. Locate the download button of the dataset you want (see example in image below)
5. Copy the link (right-click on that button)
6. Export your cookies using the newly added plugin
7. Go to your rem…
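The quoted steps boil down to a cookie-authenticated download. A sketch of the final command (assumptions: cookies.txt is the file exported by the extension, and the URL is the link you copied; both are placeholders here):

```shell
# Build the cookie-authenticated download command described in the quoted post.
# cookies.txt and the URL are placeholders you must supply yourself.
URL="https://www.kaggle.com/REPLACE_WITH_COPIED_LINK"
echo wget --load-cookies cookies.txt -O train_v2.csv.zip "$URL"
# drop the leading `echo` to actually run the download
```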