I’ve successfully set up on both GCP and Paperspace. Paperspace works fine. My GCP instance starts fine, but as I use Jupyter, the interface becomes completely unresponsive except in one regard: the kernel still appears to be running, because I can execute code and its output is still displayed.
I find this behavior very strange.
Even though I can access my Jupyter notebooks on GCP immediately after it boots, running jupyter notebook list on the command line reports that no notebook servers are running.
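For anyone debugging the same symptom: a quick way to cross-check whether the server is reachable at all, independent of the browser UI. This is just a sketch; it assumes the default port 8080 from the fast.ai GCP setup and that curl is available.

```shell
# Ask Jupyter which notebook servers it knows about (prints nothing if none).
command -v jupyter >/dev/null 2>&1 && jupyter notebook list

# Independently probe the Jupyter port with a short timeout, so a hung
# server doesn't block the shell indefinitely.
if curl -s --max-time 5 -o /dev/null http://localhost:8080/tree; then
    result="server responded"
else
    result="server not responding"
fi
echo "$result"
```

If curl times out while the kernel keeps executing code, that points at the Tornado web server side of Jupyter being wedged rather than the kernel itself.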
I can utilize the Jupyter notebook server for a time but then it becomes completely unresponsive.
Any ideas would be helpful about how to solve or further troubleshoot the issue.
@arunoda, see the image for memory and CPU usage. CPU usage varies depending on whether the kernel is actively executing code; at the moment I took the screenshot, it obviously wasn’t.
CPU usage hovers around 5% when the line of code that is causing problems is executed.
Reloading the Jupyter file browser never finishes (it looks like the Jupyter server is either not receiving requests or not sending responses back to the browser). The notebook appears to connect to an IPython kernel, but that is it.
The suspected problem line of code is interp.plot_top_losses(9, figsize=(15,11)).
It may be unresponsive because your internet connection is slow and that line of code pulls images from the cloud server. You could investigate further by monitoring your internet speed.
Hope this helps.
The image appears just fine when the suspect line of code executes, but all basic functionality of the notebook dies. I cannot save or open any notebook; the notebook server is totally unresponsive until restart.
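In case the hang is related to pushing a large inline figure to the browser over a slow link, one generic workaround is to render off-screen and save the figure to disk instead of displaying it inline. This is a plain-matplotlib sketch, not fastai-specific (plot_top_losses ultimately draws a matplotlib figure), and it assumes the backend is set before pyplot is imported:

```python
import matplotlib
matplotlib.use("Agg")  # off-screen backend: nothing is pushed to the browser
import matplotlib.pyplot as plt

# Stand-in for the 3x3 grid that plot_top_losses(9, figsize=(15, 11)) draws.
fig, axes = plt.subplots(3, 3, figsize=(15, 11))
fig.savefig("top_losses.png")  # inspect the file later instead of inline
plt.close(fig)                 # free the figure's memory
```

You can then download top_losses.png (or view it via the file browser once the server recovers) rather than streaming a big inline image through the notebook connection.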
I haven’t narrowed down the cause like you did, but after using fast.ai for a while I can’t save my notebook (which has caused some lost time!) or interact with Jupyter in any other way, while the kernel continues to run.
My workaround, when I see it’s not saving (by looking in the top-left corner of the notebook), is:
1. Save the page in the browser as a backup
2. Restart the instance (I find restarting Jupyter alone wasn’t fixing it for me)
3. Open the notebook in a new tab
4. Copy and paste code from the old tab/backup into the new tab
It’s very painful! I don’t see this complaint in other forums, so I may have to switch away from GCP.
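The “restart the instance” step can also be done from the command line with the gcloud CLI. This is a sketch with a placeholder instance name and zone (substitute your own), falling back to a message if gcloud isn’t installed:

```shell
# Placeholders -- replace with your own instance name and zone.
INSTANCE=my-fastai-instance
ZONE=us-west2-b

if ! command -v gcloud >/dev/null 2>&1; then
    echo "gcloud CLI not found; stop/start the instance from the GCP web console instead"
else
    # Stopping and starting (rather than 'reset') does a clean reboot.
    gcloud compute instances stop "$INSTANCE" --zone "$ZONE"
    gcloud compute instances start "$INSTANCE" --zone "$ZONE"
fi
```

Note that a stop/start cycle can change the instance’s external IP unless you’ve reserved a static one.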
Is there a chance Jupyter Notebook may be more stable than JupyterLab?
I am also having the same issue: localhost:8080/tree becomes unresponsive most of the time (the current notebook keeps running, but I can’t save it), and the issue does not resolve even after rebooting the instance.
I am from India, using the setup described in https://course-v3.fast.ai/start_gcp.html with the us-west2-b region.
The terminal is also a bit slow.
This needs to be fixed somehow! Any suggestions?
I am also having the same issue: localhost:8080/tree becomes unresponsive most of the time (the current notebook keeps running, but I can’t save it), and the issue does not resolve even after rebooting the instance.
I’ve had a lot fewer problems since switching to Jupyter Notebook instead of JupyterLab. It’s still slow to save, especially if your notebook has lots of images, but it does save, the kernel doesn’t freeze, and it’s totally fine to keep working while it saves.
The terminal is also a bit slow.
This is just a fact of being on the other side of the world from the server you’re connecting to. I’ve just gotten used to it (I’m in Australia).
Still having issues! I’m not able to access localhost:8080/tree; it shows “Server error: error”. Meanwhile the current notebook is running fine (but I’m not able to save it).
Here it is:
It has been happening from the very beginning, and it’s very frustrating, as I am not able to open or upload anything, or save the current notebook!
I am on GCP (us-west2-b) using n1-highmem-8 instance.