Lesson 2: further discussion ✅

Hi there. Unfortunately, I did not figure this out. I could not reproduce the particular way that the library calculates its derivatives.

I'm having the same issue with tabular data. Repeatedly re-running the same code after restarting sometimes gets the plot to show up. Not sure what's going on.

Hi everyone, I have been trying this for a while, please check this out …
It is giving me a file-not-found error…


Initially my data is in the d directory, so what I am doing is … in this way …
`path.ls()`
gives me the correct directory structure…
but now check this …

It shows a file-not-found error, and the path has now changed. I have been fighting this problem for a couple of hours and have tried every approach I could think of, like hard-coding the path and passing it to the function.
When I executed the `download_images` cell, the progress bar ran and the folder was eventually created, but there are no photos in it, just an empty folder.
Everyone can see that a folder was created … but it is empty. Please help me; I have tried all possible ways of hard-coding the path, so please let me know what is wrong…

Hi, ImageCleaner is hanging. This is the line I'm executing: `ImageCleaner(ds, idxs, path)`.
I'm using Colab. Can you let me know what I'm doing wrong?

Hi! Is the urls_black.txt file in the bears folder?

Hi @apoorv16,
Unfortunately, ImageCleaner doesn't work on Colab.
See the "ImageCleaner doesn't render" thread for possible workarounds.

HTH.

Butch

Thanks for the info @butchland, will check them out.

Can anyone tell me what I am doing wrong here?

What do I need to give as a positional argument to the ImageCleaner function as shown in the screenshot?
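
For reference, the pattern in the lesson 2 notebook (fastai v1) builds the two positional arguments with `DatasetFormatter` first and then passes them to `ImageCleaner` along with the path. A sketch, assuming a trained `Learner` called `learn` and an image folder `path`:

```python
from fastai.vision import *
from fastai.widgets import *

# Assumes `learn` is a trained Learner and `path` is the image folder.
ds, idxs = DatasetFormatter().from_toplosses(learn)  # dataset + indices of top-loss images
ImageCleaner(ds, idxs, path)  # positional args: dataset, indices, path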

Can someone tell me what this system error means? How do I correct it?

Hey, thanks @kat6123, the problem has been solved… it was the path of the folder…

Question: I have training data in an NPY file whose shape is (814, 440, 440, 1), i.e. 814 images of size 440 × 440 × 1, and the labels are in an (814, 1) NPY file containing ones and zeroes. Is there a way to create a DataBunch from these data? I used to train this with Keras, but I don’t quite know how to do it with the fastai library.

Thank you!
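
One possible approach (an untested sketch, not necessarily the idiomatic fastai way): convert the arrays to tensors, wrap them in `TensorDataset`s, and build a `DataBunch` with `DataBunch.create`. The file names and validation split below are hypothetical:

```python
import numpy as np
import torch
from torch.utils.data import TensorDataset
from fastai.basic_data import DataBunch

x = np.load('images.npy')   # hypothetical file name, shape (814, 440, 440, 1)
y = np.load('labels.npy')   # hypothetical file name, shape (814, 1)

# PyTorch expects channels-first images: (N, C, H, W).
x_t = torch.from_numpy(x).float().permute(0, 3, 1, 2)
y_t = torch.from_numpy(y).long().squeeze(1)

n_valid = 100  # hypothetical validation-set size
train_ds = TensorDataset(x_t[:-n_valid], y_t[:-n_valid])
valid_ds = TensorDataset(x_t[-n_valid:], y_t[-n_valid:])
data = DataBunch.create(train_ds, valid_ds, bs=16)
```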

@jeremy In the lesson 2 video lecture (around 1:10:10) you talk about unbalanced data classes and suggest doing nothing about the imbalance, just trying to train the network on the unbalanced data. But the paper in the resources section (https://arxiv.org/pdf/1710.05381.pdf) says in its conclusion that “the effect of class imbalance on classification performance is detrimental”. Can you clarify this issue?
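
In case it helps anyone who does hit imbalance problems in practice: the usual remedy discussed alongside that advice is to oversample the rare class. A minimal sketch using PyTorch's `WeightedRandomSampler` (the label tensor below is hypothetical):

```python
import torch
from torch.utils.data import WeightedRandomSampler

# Hypothetical labels: 0/1 class ids for each training example.
labels = torch.tensor([0] * 900 + [1] * 100)

# Weight each sample inversely to its class frequency so that
# minority-class examples are drawn more often (oversampling).
class_counts = torch.bincount(labels).float()
weights = (1.0 / class_counts)[labels]
sampler = WeightedRandomSampler(weights, num_samples=len(labels), replacement=True)
# Pass `sampler` to a DataLoader to train on a rebalanced stream.
```

Passing the sampler to a `DataLoader` rebalances each epoch on the fly without duplicating any files on disk.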

I was going through the lesson 2 SGD notebook, and on running the code I get this error.

Code:
```python
def update():
    y_hat = x@a
    loss = mse(y, y_hat)
    #if t % 10 == 0: print(loss)
    loss.backward()
    with torch.no_grad():
        a.sub_(lr * a.grad)
        a.grad.zero_()

for i in range(0,100):
    update()
lr = 1e-1
```

And the following error:

```
RuntimeError                              Traceback (most recent call last)
<ipython-input> in <module>()
      1 for i in range(0,100):
----> 2     update()
      3 lr = 1e-1

2 frames
<ipython-input> in update()
      3     loss = mse(y, y_hat)
      4     #if t % 10 == 0: print(loss)
----> 5     loss.backward()
      6     with torch.no_grad():
      7         a.sub_(lr * a.grad)

/usr/local/lib/python3.6/dist-packages/torch/tensor.py in backward(self, gradient, retain_graph, create_graph)
    105                 products. Defaults to False.
    106         """
--> 107         torch.autograd.backward(self, gradient, retain_graph, create_graph)
    108
    109     def register_hook(self, hook):

/usr/local/lib/python3.6/dist-packages/torch/autograd/__init__.py in backward(tensors, grad_tensors, retain_graph, create_graph, grad_variables)
     91     Variable._execution_engine.run_backward(
     92         tensors, grad_tensors, retain_graph, create_graph,
---> 93         allow_unreachable=True)  # allow_unreachable flag
     94
     95

RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
```

Can anyone help me out?
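
That error usually means the tensor being optimised was created without `requires_grad=True` (the lesson notebook wraps `a` in `nn.Parameter` to the same effect, if I recall correctly). A minimal self-contained sketch, with a hypothetical data setup, that runs without this error:

```python
import torch

# Hypothetical data setup mirroring the lesson 2 SGD notebook.
n = 100
x = torch.ones(n, 2)
x[:, 0].uniform_(-1., 1.)
y = x @ torch.tensor([3., 2.]) + torch.rand(n)

def mse(y, y_hat): return ((y - y_hat) ** 2).mean()

# Key point: `a` must track gradients, otherwise loss.backward()
# raises "element 0 of tensors does not require grad ...".
a = torch.tensor([-1., 1.], requires_grad=True)
lr = 1e-1  # define lr *before* the training loop, not after

def update():
    y_hat = x @ a
    loss = mse(y, y_hat)
    loss.backward()
    with torch.no_grad():
        a.sub_(lr * a.grad)
        a.grad.zero_()

for t in range(100):
    update()
```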

This is a question regarding the train/test/validation split. I notice that accuracy is measured on the validation set, and I was wondering whether the validation set is used to fine-tune the model. If so, wouldn’t that lead to overfitting? Isn’t it better to hold out a piece of the data, never use it while training, and only evaluate on it after the model has been fine-tuned? I might be missing something. Thank you!

Yes, you can hold out a subset of your data and treat it as your test set. You would then evaluate your model on the test set at the very end, and the result can be considered a fairly unbiased judgement of your model. That said, the same care that goes into creating a good validation set applies to creating a test set as well.
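
For concreteness, a minimal sketch (sizes and split fractions hypothetical) of carving the test set off before any tuning happens:

```python
import numpy as np

n = 1000                        # hypothetical dataset size
idxs = np.random.permutation(n)

n_test = int(0.2 * n)
test_idxs = idxs[:n_test]       # locked away until the very end
rest = idxs[n_test:]

n_valid = int(0.2 * len(rest))
valid_idxs = rest[:n_valid]     # used for tuning during training
train_idxs = rest[n_valid:]
```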

@dreambeats Thank you for the response! Now, is it fair to compare the results to the state of the art even if they are not measured on the official test set?

Can you expect them to be exactly the same? Probably not. As long as your test set is a decent representation of the general population of observations, you can take results on it quite seriously. One caveat, however: even if your model generalises pretty well, it’s hard to say how well it does on the data in the official test set, because it’s entirely possible that your model happens to do slightly worse there. This is pretty common in Kaggle competitions: placings on the public and private leaderboards tend to vary (sometimes by quite a bit) for exactly this reason. Just because you did well on one test set doesn’t mean you’ll do equally well on another, but you usually won’t do too much worse.

That makes sense, @dreambeats! Thank you again for the response.

I’m worried about how much of the math I should know. If I complete the SGD notebook and make sure I understand everything in it will I be OK to continue?

Welcome to the forum @nole!

You do not need to worry too much about the maths behind the scenes. In fact, that is the beauty of the FastAI top-down approach. First you develop a general intuition for the concepts and keep moving through the lessons and practice. In the end everything will gel together and you will become curious about the maths behind it. I would say keep moving ahead without worrying too much about the unknown.