Sorry, I should have mentioned it. It’s already set up and running; I can run the Intro notebook with it.
How did Jeremy hide code output?
Folks, don’t run the nbs in parallel; watch the lecture first!
Ensure you have the GPU enabled by going to Runtime → Change Runtime Type and selecting GPU
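A quick way to confirm the switch actually took effect (a hedged sketch assuming PyTorch, which Colab ships with):

```python
# Check whether the notebook can see a CUDA GPU after changing the runtime type.
import torch

print(torch.cuda.is_available())          # True if a CUDA GPU is visible
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # name of the GPU the runtime provides
```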
The ipywidgets FileUpload widget was discussed. Learn more at https://ipywidgets.readthedocs.io/en/latest/examples/Widget%20List.html#File-Upload
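A minimal sketch of that widget, with illustrative parameters (run it in a notebook cell so the button renders):

```python
# Hedged sketch of the ipywidgets FileUpload widget discussed above.
from ipywidgets import FileUpload

uploader = FileUpload(
    accept='.jpg,.png',   # file types to accept ('' means any)
    multiple=False,       # allow only a single file
)
uploader                  # last expression in a cell: displays the widget

# After the user uploads, the file content is available via uploader.value
# (the exact structure of .value differs between ipywidgets 7 and 8).
```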
Also, don’t worry if you aren’t familiar with GPUs, or why you (probably) need a cloud server. I answer this and some related questions in this post:
Edited to add: the suggestions of which services to use are dated (the post is from 2017), but the underlying ideas are the same.
A handy chunk to print outputs from multiple variables in a notebook:
from IPython.core.interactiveshell import InteractiveShell
InteractiveShell.ast_node_interactivity = 'all'
This will print the output of all of the following expressions, rather than only the last one (the default):
10+2 # now, this will also be printed
10+8 # traditionally, only this is printed
You can try running with num_workers=0 and see if it gives you a better error (as it doesn’t use subprocesses).
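A hedged sketch of what that looks like with a plain PyTorch DataLoader (the ToyDataset here is illustrative, not from the course):

```python
# With num_workers=0, data loading runs in the main process, so any exception
# in __getitem__ surfaces with a full, readable traceback.
from torch.utils.data import DataLoader, Dataset

class ToyDataset(Dataset):
    def __init__(self):
        self.items = list(range(8))
    def __len__(self):
        return len(self.items)
    def __getitem__(self, i):
        return self.items[i]

# num_workers=0: no worker subprocesses, which makes debugging much easier
dl = DataLoader(ToyDataset(), batch_size=4, num_workers=0)
for batch in dl:
    print(batch)
```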
@sgugger and/or @rachel just to clarify one thing. For this year’s version of the course, there is no AWS support. Therefore, the recommendation is to use the options listed in the Setup help channel?
Is the “update” a sort of backpropagation?
This so-called “automatic means of error-correcting the weights”: what is the current thinking about what is behind it?
Do we just say “it works”? Or is there a theory or explanation behind the effectiveness of these processes?
Updating the weights is backpropagation plus your optimizer step. You will learn more in future lessons.
There is plenty of theory, and we will come back to it in further lessons.
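As a rough illustration of “backpropagated gradient + optimizer step”, here is a toy single-weight example (the learning rate and data are made up):

```python
# Toy example: fit y = 2*x with one weight and squared-error loss.
# lr (learning rate) is chosen by hand here, purely for illustration.
w = 0.0
lr = 0.05
x, y = 3.0, 6.0                    # one training example; the true weight is 2
for step in range(50):
    pred = w * x                   # forward pass
    grad = 2 * (pred - y) * x      # backpropagation: d(loss)/dw for (pred - y)**2
    w = w - lr * grad              # optimizer (SGD) step: the "update"
print(round(w, 3))                 # converges to 2.0
```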
Hello,
Probably a dumb question on Jupyter Notebook:
How do I get those collapsible buttons next to the markdown headings? I’m using Chrome on macOS and I don’t see them in the notebooks (course v4, 01_intro.ipynb).
Thanks.
Is there a difference between parameters and hyper-parameters?
Google for nbextensions
Hyper-parameters: set manually by us. Parameters: learned automatically during the training process.
This is not a dumb question; it’s because Jeremy is using the Collapsible Headings extension. See here how to install Jupyter nbextensions, then activate Collapsible Headings.
Parameters are learned by the model. Hyper-parameters are set by you to help the model learn effectively.
“Hyperparameters” and “parameters” are often used interchangeably, but there is a difference between them. You call something a “hyperparameter” if it cannot be learned by the estimator directly. “Parameters” is the more general term: when you say “passing the parameters to the model”, it generally means a combination of hyperparameters along with some other parameters that are not directly related to your estimator but are required for your model.
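A toy sketch of the distinction (all names and values are illustrative): the learning rate and step count are chosen by you before training, while the weights are learned from the data:

```python
# Hyperparameters: fixed by the practitioner before training starts.
lr, n_epochs = 0.05, 200

# Parameters: initialized arbitrarily, then learned during training.
w, b = 0.0, 0.0

data = [(1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]   # samples from y = 2x + 1

for _ in range(n_epochs):
    for x, y in data:
        err = (w * x + b) - y      # prediction error for this sample
        w -= lr * 2 * err * x      # parameters are updated by gradient steps
        b -= lr * 2 * err
print(round(w, 2), round(b, 2))    # the learned parameters approach 2 and 1
```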