Is there any way to get some sort of IntelliSense for Python in Kaggle Jupyter notebooks? I was used to this in VS Code, and not having it available in Kaggle feels like a major source of friction while coding.
I tried Google Colab, and while it does have IntelliSense built in, it's really slow. So while it's better than Kaggle, it's still nowhere near the experience of coding in your local dev environment.
Alternatively, is there any way I can interact with the Jupyter notebooks on my local machine but have them execute code on Kaggle/Colab servers?
I am having an issue installing miniai in Colab. This problem did not show up during the first few weeks, but it suddenly appeared and now I can't progress at all. Please save me from here.
I'm unfamiliar with the miniai and torchaudio libraries, so I can't help with specifics, but I searched around and found these two issues/solutions that might be related to your problem:
Hi, experts.
I'm working on course22p2 with Colab. I noticed the project needs to use a command like
nbdev.nbdev_export()
to export the module as part of miniai. However, it always shows this error when running the command from the /content/drive/MyDrive/Colab Notebooks path:
InterpolationMissingOptionError: Bad value substitution: option 'lib_name' in section 'DEFAULT' contains an interpolation key 'repo' which is not a valid option name. Raw value: '%(repo)s'
After searching around, it seems I need to configure the settings.ini file in Google Drive. So I used the nbdev_create_config command to create a new config file and saved it. However, the issue still happened.
Then I noticed I had a hidden config file, which can be checked with get_config(). Its .config_file attribute shows Path('/content/drive/MyDrive/settings.ini'), but I can't find this file in my folder at all. I think that file was applied when running nbdev_export().
Can anyone help with that? How can I apply the new config file when running the export? How can I modify the old one? Thank you!
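For context: the InterpolationMissingOptionError above is raised by Python's standard-library configparser, which the nbdev config machinery uses to read settings.ini. It means the file defines lib_name = %(repo)s but no repo option is visible, so the %(repo)s substitution has nothing to expand. A minimal stdlib-only reproduction (this mirrors the option names in the error message; it is not nbdev's actual code):

```python
import configparser

# A settings.ini that interpolates a 'repo' option that was never defined
broken = configparser.ConfigParser()
broken.read_string("[DEFAULT]\nlib_name = %(repo)s\n")
try:
    broken.get("DEFAULT", "lib_name")
except configparser.InterpolationMissingOptionError as e:
    print("reproduced:", type(e).__name__)

# Defining 'repo' lets the %(repo)s substitution resolve normally
fixed = configparser.ConfigParser()
fixed.read_string("[DEFAULT]\nrepo = miniai\nlib_name = %(repo)s\n")
print(fixed.get("DEFAULT", "lib_name"))  # -> miniai
```

So one thing worth checking is whether the settings.ini that get_config() actually picks up contains a `repo = ...` line alongside `lib_name`.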
If anyone has encountered a similar issue, please share your experience. The system was functioning smoothly, and I was progressing through the stable diffusion section of the course. However, one morning I suddenly encountered this problem; I'm attaching a screenshot for reference. It gets stuck at the miniai.learner part.
Initially, I cloned the repository and performed an editable pip install with !pip install -e . After that, I executed the cell with '#|default_exp init' and then ran the cell shown in the attached screenshot. This process used to work smoothly in the past, but now it's not functioning as expected.
Alternatively, if someone is familiar with the stable diffusion part and can provide guidance on how to install the required libraries and run the models in Colab, I'd greatly appreciate it. If you have installed miniai and run the stable diffusion part, please share your code. It would help me a lot.
Stable diffusion in Colab: if anyone has run the stable diffusion part with a proper miniai installation in Colab, please share your code here.
The miniai module is generated from the earlier notebooks with nbdev.nbdev_export().
The nbdev config file settings.ini sets lib_name = miniai, so we can use the module later in the course.
I was facing a similar error when using nbdev, even with the correct config file.
After spending ten hours over two nights, I finally resolved it on Colab.
TL;DR
You need the same folder structure as the repo: an nbs folder, a miniai folder, setup.py, and settings.ini in your working folder.
Before running nbdev.nbdev_export(), you have to be in the nbs folder, so that get_config() resolves correctly and the module is generated for the rest of the course.
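To illustrate why the working directory matters: nbdev locates settings.ini by searching upward from the current directory. The snippet below is a stdlib-only stand-in for that lookup (an assumption about the behavior, not nbdev's actual implementation), showing that starting inside nbs/ finds the project's settings.ini while an unrelated Drive folder does not:

```python
import tempfile
from pathlib import Path

# Simplified stand-in for nbdev's config lookup: walk from the starting
# directory up through its parents until a settings.ini is found.
def find_settings(start):
    for d in [start, *start.parents]:
        candidate = d / "settings.ini"
        if candidate.exists():
            return candidate
    return None

# Recreate the layout described above: settings.ini at the project root,
# with an nbs/ folder next to it.
root = Path(tempfile.mkdtemp())
(root / "settings.ini").write_text("[DEFAULT]\nrepo = miniai\nlib_name = %(repo)s\n")
(root / "nbs").mkdir()

print(find_settings(root / "nbs"))              # finds the project settings.ini
print(find_settings(Path(tempfile.mkdtemp())))  # unrelated folder: None
```

This is why running the export from an unrelated path like /content/drive/MyDrive/Colab Notebooks picks up the wrong (or no) config.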
I'm missing something about Colab, I think… While I'm reading the notebook and executing code, it randomly times out and says I've been disconnected from the runtime. It then reconnects, but I have to re-run every code cell from the beginning to where I was, because the new instance has lost all context.
Why is this happening to me? Is Kaggle better about this than Colab? I really don't mind not using Google's stuff.
I'm going through the Colab Jupyter notebook at this URL: Google Colab
It covers the first chapter of the book. However, when I reach the "Running your first notebook" section and execute the code cell with the commented "CLICK ME" line (the one that downloads pet images and ends with a call to the fine_tune() method), it takes forever to execute. I'm talking over an hour, possibly more; I quit the execution because it just didn't make sense.
I'm fully connected to the internet with a 600 Mbps connection, so why am I facing this issue?
One more thing, and I don't know if you can help me with this as well, but I'm at the "Deep learning is not just for classification" section, where I must fine-tune a movie-review sentiment model. I'm using the hardware accelerator you provided, but the cell execution is taking forever on the line:
learn.fine_tune(4, 1e-2)
My RAM and disk usage seem to be at acceptable levels too.
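One thing worth ruling out when fine_tune() runs for hours (a guess, not a diagnosis of your specific notebook): the runtime may be executing on CPU rather than a GPU, e.g. if the runtime type reset after a disconnect. A quick check, assuming PyTorch is available as it is in the fastai notebooks:

```python
# Print whether a CUDA GPU is visible to PyTorch; on a CPU-only runtime,
# fine_tune on these models can easily take an hour or more.
try:
    import torch
    print("CUDA available:", torch.cuda.is_available())
    if torch.cuda.is_available():
        print("Device:", torch.cuda.get_device_name(0))
except ImportError:
    print("PyTorch is not installed in this environment")
```

If this prints False in Colab, try switching the runtime via Runtime → Change runtime type → GPU and re-running the notebook from the top.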