Oh, I didn’t realize there is a contest for this type of data. I just did this dataset: https://www.kaggle.com/paultimothymooney/chest-xray-pneumonia Have to try the contest one now. //edit: sadly, part 1 of the contest is over, so Kaggle won’t accept new participants
Let me know if I can help.
I was working on the same data and also noticed it is past the deadline. It’s actually very impressive: without understanding X-ray pictures, without even knowing what pneumonia is, we can detect health changes with 98%+ accuracy, probably better than a trained radiologist.
This dataset looks interesting (15 classes):
After I finish setting up my machine, I’m going to play with it.
Crestle is free until mid-November as per this post, if you are looking for GPU resources. Storage is limited to 75GB, but no credit card is needed for now: https://forums.fast.ai/t/platform-crestle/28028/33
@sayko did you have any issues with temperature of your GPU on linux?
Do you have any experience in GPU cooling management on headless machines?
I think there is some issue with fan speed control, but I’m not sure if it is only my problem.
Nope. Maybe @radek ??
I have an ordinary PC box, not a headless one. But take notes on your struggle; it would make a great blog post.
Have not experienced any issues - went for EVGA FTW3 because of its cooling.
Different GPUs will have different heat characteristics; for instance, a ZOTAC mini will almost instantly heat up to ~90°C.
I would not worry too much about fan control, etc. unless you are seeing genuine issues (restarts, etc.)
Issues are the main reason I worry about fan control, and I think the fan doesn’t run at max speed when it should.
maybe this article will help you:
Do you have access to nvidia-smi? It should show you the GPU fan usage.
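For reference, here are a couple of standard queries that work with the stock NVIDIA driver tools; the flags below are part of nvidia-smi itself, nothing fast.ai-specific:

```shell
# Full status table: temperature, fan %, utilization, memory
nvidia-smi

# Just fan speed and temperature, machine-readable, refreshed every second
nvidia-smi --query-gpu=fan.speed,temperature.gpu --format=csv -l 1
```

On a headless box you can leave the second command running in a tmux/screen session while training to watch whether the fan actually ramps up with temperature.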
I’ve already read this blog post (just minutes ago). I’ll try those scripts this evening. Thanks for the help. Wish me luck…
You can get GCP with a regular (debit) card, no credit card required. I got up to watch the lecture and was half-asleep most of the time, eh.
When I run learn.unfreeze(), it unfreezes all the layers from what I understand, but it doesn’t learn from scratch, just adjusts the existing weights? Because when I unfreeze, train, save the weights, load them, and then run the sequence again, I get a different error rate.
This is what I understand too: unfreezing doesn’t ‘reset’ the pretrained weights (for that there is the pretrained=False argument).
When you load the weights and run more epochs, the error decreases? I think it should, unless it overfits.
Yes, it decreases. I think I will experiment more on a dataset that processes faster (I did that on Quick Draw, using Radek’s tutorial; thanks for that, Radek).
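To make the unfreeze behaviour concrete, here is a tiny toy sketch (plain Python, not fastai internals): freezing a layer just stops its updates, and unfreeze() only flips that flag, never resetting the weights, so further training continues from the current values.

```python
# Toy model: one number per "layer", plus a frozen flag per layer.
# This mimics the idea behind learn.unfreeze(), not the real fastai code.
class TinyModel:
    def __init__(self):
        self.layers = {"body": 10, "head": 10}       # pretrained weights
        self.frozen = {"body": True, "head": False}  # body frozen, head trains

    def unfreeze(self):
        # Make every layer trainable; the weights are left untouched.
        for name in self.frozen:
            self.frozen[name] = False

    def train_step(self):
        # Pretend every trainable layer gets a gradient step of -1.
        for name in self.layers:
            if not self.frozen[name]:
                self.layers[name] -= 1

m = TinyModel()
m.train_step()   # only the head moves: body=10, head=9
m.unfreeze()
m.train_step()   # now both move, starting from their current values
print(m.layers)  # {'body': 9, 'head': 8}
```

Note the body ends at 9, not back at 10: unfreezing did not reinitialize it, which is why repeated unfreeze/train/save/load cycles keep drifting the error rate rather than restarting from scratch.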
@Blanche @Michal_w @sayko
It was a good meetup! Thanks for joining today! However, there was a huge echo on my side and I could barely hear you. I hope next week will be better!
Maybe interesting for GCP users, an additional $500 credit: https://forums.fast.ai/t/platform-gcp/27375/140
If someone is training on Colab, there is an easy way to use your own dataset if it is on Google Drive.
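A minimal sketch of that setup: mount Drive inside the notebook, then point your data path at the mounted folder. The two Colab-specific lines only run inside Colab, so they are commented out here, and the `datasets/quickdraw` folder name is a hypothetical example you would replace with your own.

```python
# In a Colab notebook, mount Google Drive first (Colab-only, so commented out):
# from google.colab import drive
# drive.mount('/content/drive')

from pathlib import Path

# After mounting, Drive appears under this path; the folder name below
# is a made-up example -- adjust it to wherever your dataset lives.
DRIVE_ROOT = Path("/content/drive/My Drive")
data_path = DRIVE_ROOT / "datasets" / "quickdraw"

print(data_path)  # /content/drive/My Drive/datasets/quickdraw
```

For large datasets it is usually faster to keep one zip file on Drive and unzip it to Colab's local disk once per session, since reading many small files directly from the Drive mount is slow.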