path = untar_data(URLs.PETS)
path.ls()
I am getting this output:
[WindowsPath('C:/Users/rajve/.fastai/data/oxford-iiit-pet/images')] instead of [images, annotations]
What should I do?
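If the earlier download or extraction was interrupted, only part of the archive may be on disk. One hedged thing to try is forcing a re-download (force_download is a standard untar_data argument):

```python
from fastai.vision import *

# Hedged guess: an interrupted download/extract can leave only part of the archive
# on disk; force_download=True re-downloads and re-extracts it.
path = untar_data(URLs.PETS, force_download=True)
path.ls()   # should now list both the images and annotations folders
```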
Hello! I have a problem running a SageMaker instance from a template.
Where should I post to get help on that?
My problem is that the start script seems to hang during the 'update fastai library' step.
I already tried modifying the script as described here:
https://aws.amazon.com/premiumsupport/knowledge-center/sagemaker-lifecycle-script-timeout/
What could be the problem?
This is the link suggested in the course for creating large datasets; please check it out.
Hi bagnica, I hope you are having a wonderful day!
Unfortunately I use Google Colab, not SageMaker, so I can't help you with the specifics.
However, I had a similar problem when trying to deploy an app to a cloud provider.
It turned out that Anaconda wouldn't let me install certain prerequisites for the version of fastai I was using, as it was an old version. The problem was that the cloud provider's service was hanging, so I couldn't see or check what was happening.
Since I have Anaconda on my desktop machine, I was able to run the scripts locally and see the error messages instead of a frozen screen. When the screen froze, I used Ctrl+Z to break out of it.
I then used pip freeze to see which libraries had been installed, and pip install to install the libraries that Anaconda was hanging on.
Hopefully someone who uses Sagemaker can give you some more help.
Cheers mrfabulous1
Hello all!
Just started the course yesterday. I was doing the lesson2-download practice suggested after the first lecture, and scraped 27 images each of mountains, rivers and deserts from Google. After calling learn.recorder.plot()
I get a graph like this:
which is generally decreasing as the learning rate increases, unlike in the lecture, where the loss eventually increases at higher learning rates. Any pointers as to why this is occurring for my model?
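For reference, a minimal sketch of the fastai v1 calls that produce that curve, assuming `data` is the ImageDataBunch built from the scraped images:

```python
from fastai.vision import *

# Sketch assuming `data` is the ImageDataBunch built from the scraped mountain/river/desert images.
learn = cnn_learner(data, models.resnet34, metrics=error_rate)
learn.lr_find()          # mock training run over a range of learning rates
learn.recorder.plot()    # plot loss against learning rate from that run
```

With only ~27 images per class, the loss recorded during a single lr_find pass can be quite noisy, so the curve may well look different from the lecture's.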
Hey,
The solution is to change the recommended script:
replace -> conda install -y fastai -c fastai
with -> nohup /home/ec2-user/anaconda3/bin/conda install -y fastai -c fastai -v &
Running the install in the background with nohup ... & lets the lifecycle script finish instead of timing out while conda resolves the environment.
I can't upload a patch for SageMaker to GitHub. I asked someone to help there and it's documented under: https://github.com/fastai/course-v3/issues/518
My dataset doesn't have a normalized distribution, and I am struggling to get a low validation loss with collaborative filtering. I want to keep using collaborative filtering, so what can I do to lower the loss?
Thanks
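One lever often used in the fastai v1 collab API when the target is skewed is y_range, which squashes predictions into the observed rating range. A minimal sketch, assuming a DataFrame `ratings` with user, item and rating columns (the bounds and hyperparameters below are placeholders):

```python
from fastai.collab import *

# Sketch: `ratings` is assumed to be a DataFrame with user, item and rating columns.
data = CollabDataBunch.from_df(ratings, seed=42, valid_pct=0.2)

# y_range clamps the sigmoid output to the rating scale; replace (0.0, 5.5) with the
# actual min/max of your target. Weight decay (wd) is another common lever.
learn = collab_learner(data, n_factors=40, y_range=(0.0, 5.5), wd=1e-1)
learn.fit_one_cycle(5, 5e-3)
```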
I have some image data in CSV files named train.csv, valid.csv and test.csv. How can I load the training data only from train.csv and the validation set only from valid.csv?
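One way to do this with the fastai v1 data block API, as a sketch: read both files with pandas, mark which rows are validation, and split on that column. It assumes `path` points at the folder containing the CSVs and images, and the column names 'name' and 'label' are placeholders to replace with the actual CSV headers:

```python
import pandas as pd
from fastai.vision import *

# Assumed column names: 'name' holds the image file name, 'label' the class.
train_df = pd.read_csv(path/'train.csv'); train_df['is_valid'] = False
valid_df = pd.read_csv(path/'valid.csv'); valid_df['is_valid'] = True
df = pd.concat([train_df, valid_df], ignore_index=True)

data = (ImageList.from_df(df, path, cols='name')   # image file names, relative to `path`
        .split_from_df(col='is_valid')             # rows marked True become the validation set
        .label_from_df(cols='label')
        .transform(get_transforms(), size=224)
        .databunch())
```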
Hi,
I'm on lesson 2, stuck on the "Cleaning up" part.
I entered:
db = (ImageList.from_folder(path)
.split_none()
.label_from_folder()
.transform(get_transforms(), size=224)
.databunch()
)
and received an error:
NameError Traceback (most recent call last)
in
----> 1 db = (ImageList.from_folder(path)
2 .split_none()
3 .label_from_folder()
4 .transform(get_transforms(), size=224)
5 .databunch()
NameError: name 'ImageList' is not defined
I am not sure how to proceed from here and searched the forums. Any help appreciated, thank you!
There must be a command above that cell that loads the ImageList class via the fastai library.
I imagine it would be this:
from fastai.vision import *
I would run all the cells above and then run this cell again.
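Putting that together with the original cell, something like this should run (assuming `path` was defined earlier in the notebook):

```python
from fastai.vision import *   # brings ImageList, get_transforms, etc. into scope

db = (ImageList.from_folder(path)
      .split_none()
      .label_from_folder()
      .transform(get_transforms(), size=224)
      .databunch())
```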
With resnet50, on applying the learning rate finder I got this graph. From the graph it feels like using the range (1e-04 to 1e-02) would give a better result than (1e-06 to 1e-04), but that is not the case. I am confused about this; can anyone explain? This is from Lesson 1.
I am new to fast.ai and was trying to download the dataset 'https://www.kaggle.com/karthikaditya147/junctions-images/Junctions-test' from Kaggle, but I could not figure out how to use the untar_data function to import the data into the notebook. Besides, that dataset has separate folders for testing and training, so how do I tell my model to use the data in the test folder to measure accuracy and the data in the train folder for training?
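untar_data expects one of the .tgz archives that fastai hosts (the URLs.* constants), so a Kaggle dataset is usually fetched with the Kaggle API instead and then loaded with the data block API. A rough sketch, assuming the Kaggle CLI is installed and configured, that the slug read off the URL above is right, and that the dataset has train and test folders (adjust names to the real layout):

```python
# Download the dataset with the Kaggle CLI (requires kaggle.json credentials).
!kaggle datasets download -d karthikaditya147/junctions-images -p data/junctions --unzip

from fastai.vision import *

path = Path('data/junctions')   # assumed download location, see the -p flag above

# Train on the train folder, carve a validation set out of it, and attach the
# test folder so predictions can be run on it later.
data = (ImageList.from_folder(path/'train')
        .split_by_rand_pct(0.2, seed=42)
        .label_from_folder()
        .add_test(ImageList.from_folder(path/'test'))
        .transform(get_transforms(), size=224)
        .databunch(bs=32)
        .normalize(imagenet_stats))

learn = cnn_learner(data, models.resnet34, metrics=accuracy)
learn.fit_one_cycle(4)
preds, _ = learn.get_preds(ds_type=DatasetType.Test)   # predictions on the test folder
```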
I started the course (v3) and am using Paperspace. I can set up and run the notebook once, but when I try a second time it doesn't work. I deleted the first attempt and tried again, and the same thing happened. Does anybody know what is going wrong?
Here is the error message:
Error downloading files from 's3://ps-notebooks/tej0q9dl7/nreu5cvs' to directory '/var/lib/docker/volumes/j40gtpjv3eref/_data': BatchedDownloadIncomplete: some objects have failed to download.
caused by: failed to perform batch operation on "tej0q9dl7/nreu5cvs/course-v3/nbs/swift/FastaiNotebook_08a_heterogeneous_dictionary/Sources/FastaiNotebook_08a_heterogeneous_dictionary/05_anneal.swift" to "ps-notebooks":
RequestError: send request failed
caused by: Get "https://ps-notebooks.s3.amazonaws.com/tej0q9dl7/nreu5cvs/course-v3/nbs/swift/FastaiNotebook_08a_heterogeneous_dictionary/Sources/FastaiNotebook_08a_heterogeneous_dictionary/05_anneal.swift": dial tcp: lookup ps-notebooks.s3.amazonaws.com on 1.1.1.1:53: server misbehaving
Hi everyone,
I have a question.
I have just started the course, and after lesson 1 I tried to create my own model with the recommended theme (identifying the type of bear). I did it all as described in lesson 2, but got a somewhat strange result from the top_losses call.
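For reference, the view in question comes from the interpretation object; a minimal sketch of the fastai v1 calls, assuming `learn` is the trained bear classifier:

```python
from fastai.vision import *

# Sketch assuming `learn` is the cnn_learner trained on the bear images.
interp = ClassificationInterpretation.from_learner(learn)
interp.plot_top_losses(9, figsize=(12, 9))   # images with the highest loss, i.e. the ones
                                             # the model was most wrong or least sure about
```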
They’re the ones it was least sure about
Ok, got it.
Thank you.
BTW, when I increased the number of photos I got nearly the same error_rate, but now the top_losses results are real failures.
Maybe one of the reasons is that I used 200 photos of each type at the beginning, which is quite a small data set.
Thanks for your response!
Can I use fast.ai to build a physics-informed neural network (PINN)?
I have just started with fast.ai course v3.
As a very first step I need to set up a Colab Jupyter Notebook as described here:
When trying to install the dependencies using
!curl -s https://course.fast.ai/setup/colab | bash
I am getting the following error:
bash: line 1: syntax error near unexpected token `newline'
bash: line 1: `<!DOCTYPE html>'
Can anybody help?
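The `<!DOCTYPE html>` in the message suggests curl piped an HTML page (for example a redirect response) into bash instead of the setup script. A hedged thing to try is asking curl to follow redirects:

```python
# -L makes curl follow redirects, so the actual script (not an HTML page) is piped to bash.
!curl -sL https://course.fast.ai/setup/colab | bash
```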