Thank you so much for the information. I will check them out. I would like to pay a year's lump sum and be able to opt out if my circumstances change. I hope they make ending the engagement easy. If you have any info on this, please share, and thanks again.
Does anyone have a reliable setup recommendation for cloud GPU offerings?
I'm trying to follow the text transfer learning section of the docs, which requires training a language model on IMDB reviews before the classifier step.
In the last 72+ hours, I've tried Google Colab (free and with purchased compute units), Codespaces, and Kaggle. Here's my issue: each epoch takes ~50 minutes, and I need to run 10 epochs. 1) This seems very slow to me. 2) The Colab and Kaggle notebooks time out.
Open to advice and/or corrections.
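In case it helps anyone hitting the same wall, here is a minimal sanity check to try first (a sketch assuming the IMDB language-model DataLoaders from the docs tutorial, not something from the original post): confirm the runtime actually has a GPU, then train in mixed precision, which is usually much faster than full fp32.
# Sketch only: verify the GPU is visible, then train the IMDB language model in fp16.
import torch
from fastai.text.all import *

print(torch.cuda.is_available())  # should print True on a Colab/Kaggle GPU runtime

path = untar_data(URLs.IMDB)
# Assumed to roughly match the docs tutorial: a language-model DataLoaders over the IMDB texts.
dls_lm = TextDataLoaders.from_folder(path, is_lm=True, valid_pct=0.1)

# .to_fp16() enables mixed-precision training, which typically cuts epoch time substantially on GPU.
learn = language_model_learner(dls_lm, AWD_LSTM, drop_mult=0.3).to_fp16()
learn.fit_one_cycle(1, 2e-2)  # time one epoch before committing to all 10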
Okay, I've found a temporary workaround that won't break my wallet. I found an old laptop in storage with a 2 GB GPU. It's mighty slow, but it's free and I don't have to worry about timeouts. I will stick to Kaggle for the lessons (considering the weekly GPU limit) and use the laptop for long training exercises.
Hello everyone, I am trying to install fastai as shown by Jeremy in his second video at 31 minutes and 54 minutes. I have two questions:
- I believe Windows Terminal is installed, but it always shows as "Windows PowerShell". Is that okay, and are they the same thing?
- I downloaded vim and added it to the PATH, since it was not being recognised earlier. It gives me the same screen as in the video, but when I re-run the terminal, it does not recognise the mamba command.
Has anyone done this setup on their system and would like to share their steps? Thank you.
Hi @mab.fayyaz, thanks for including those timestamps. Even more convenient (so answers are more likely) would be a direct to-the-second link into the video, which you can get by right-clicking the video and choosing "Copy video URL at current time".
It's not clear to me whether you downloaded vim:
- in Windows
- or in Windows Subsystem for Linux (WSL).
I suspect the former, whereas you want the latter.
To the right of PowerShell is a pull-down arrow where you should be able to select Ubuntu…
If you don't see Ubuntu, install it from the Windows Store.
This looks like the latest…
Hi @bencoman, thank you for the explanation; sure thing, that makes sense. I shall include screenshots from now on.
So do I need to install vim via Ubuntu, or does Ubuntu not require vim at all?
Thank you for this. I had the same issue, but with the library platformdirs, so I ran mamba install platformdirs and then mamba install -c fastchan fastai -y, and it worked! Thank you.
A separate "download" is usually not something you do to install software on Linux.
You just install it directly and it also goes out and gets all the dependencies.
e.g. sudo apt install vim
Hi all, apologies in advance if this was asked/resolved earlier; I couldn't find it, but I may have missed it!
I have two questions:
Question/Issue 1: How do you get a localhost endpoint in Paperspace to be exposed externally (preferably without making it a public notebook)?
Here is a screenshot from the Lesson 3 video [timestamp: 21:21]:
Following along on my own Paperspace instance, I used the terminal to get the public IP with curl -4 icanhazip.com, and tried to navigate to that address at port 3000 instead of the localhost version (http://127.0.0.1:3000/ as shown in the screenshot), but it wasn't accessible. Let me know if there's any other method, or if this is simply not possible!
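One possible workaround (just a sketch, not something from the lesson): skip exposing the Paperspace port entirely and let gradio create a temporary public link. This assumes the app can be wrapped as a gradio Interface and that an exported fastai model exists at a hypothetical path model.pkl.
# Hypothetical sketch: serve the model through gradio and use its built-in share tunnel
# instead of exposing port 3000 on the Paperspace machine directly.
import gradio as gr
from fastai.vision.all import load_learner, PILImage

learn = load_learner('model.pkl')  # assumed path to an exported fastai learner

def predict(img):
    pred, _, probs = learn.predict(PILImage.create(img))
    return {str(pred): float(probs.max())}

# share=True asks gradio to create a temporary public URL for the app.
gr.Interface(fn=predict, inputs=gr.Image(), outputs=gr.Label()).launch(share=True)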
Question/Issue 2: mamba installing fastai on personal machine
Following Jeremy's advice of getting a local setup working even if you're a Paperspace user, I tried to get a local environment set up.
My machine: MacBook Pro (13-inch, 2020); Processor (2.3 GHz Quad-Core Intel Core i7); OS (Catalina 10.15.6) => This is NOT one of the newer M1 or M2 Macs, in case this is relevant/problematic…
I did the following steps and ran into the following error:
- Clone the fastsetup repo
- Download and set up mamba: ./setup-conda.sh
- Create a new mamba environment: mamba create -n fastai python=3.10
- Activate the mamba environment: mamba activate fastai
- Install fastai: mamba install -c fastchan fastai
(fastai) adityapalepu@Adityas-MBP fastsetup % mamba install -c fastchan fastai

[mamba ASCII-art banner]

mamba (1.4.2) supported by @QuantStack

GitHub: https://github.com/mamba-org/mamba
Twitter: https://twitter.com/QuantStack
Looking for: ['fastai']
conda-forge/osx-64 Using cache
conda-forge/noarch Using cache
pkgs/main/osx-64 No change
pkgs/r/osx-64 No change
pkgs/main/noarch No change
pkgs/r/noarch No change
fastchan/noarch 88.7kB @ 372.5kB/s 0.2s
fastchan/osx-64 725.3kB @ 390.1kB/s 1.9s
Pinned packages:
- python 3.10.*
warning libmamba Problem type not implemented SOLVER_RULE_STRICT_REPO_PRIORITY
warning libmamba Problem type not implemented SOLVER_RULE_STRICT_REPO_PRIORITY
warning libmamba Problem type not implemented SOLVER_RULE_STRICT_REPO_PRIORITY
*[... repeats many times...]*
warning libmamba Problem type not implemented SOLVER_RULE_STRICT_REPO_PRIORITY
Could not solve for environment specs
The following packages are incompatible
└─ fastai is installable with the potential options
   ├─ fastai [2.1.10|2.6.3|...|2.7.9] would require
   │  └─ pillow >6.0.0 with the potential options
   │     ├─ pillow [10.0.0|9.4.0|9.5.0] would require
   │     │  └─ libzlib >=1.2.13,<1.3.0a0 , which conflicts with any installable versions previously reported;
   │     ├─ pillow [10.0.0|9.4.0|9.5.0] would require
   │     │  └─ python >=3.11,<3.12.0a0 , which conflicts with any installable versions previously reported;
   │     ├─ pillow [9.1.1|9.2.0] would require
   │     │  └─ libzlib >=1.2.12,<1.3.0a0 , which conflicts with any installable versions previously reported;
   │     ├─ pillow 9.1.1 would require
   │     │  └─ libzlib >=1.2.11,<1.3.0a0 with the potential options
   │     │     ├─ libzlib 1.2.11 would require
   │     │     │  └─ zlib 1.2.11 *_1014, which can be installed;
   │     │     ├─ libzlib 1.2.11 would require
   │     │     │  └─ zlib 1.2.11 *_1013, which can be installed;
   │     │     ├─ libzlib 1.2.11 would require
   │     │     │  └─ zlib 1.2.11 *_1012, which can be installed;
   │     │     ├─ libzlib 1.2.11 conflicts with any installable versions previously reported;
   │     │     ├─ libzlib 1.2.12 conflicts with any installable versions previously reported;
   │     │     ├─ libzlib 1.2.13 conflicts with any installable versions previously reported;
   │     │     └─ libzlib 1.2.13 conflicts with any installable versions previously reported;
   │     ├─ pillow [8.2.0|9.1.1] would require
   │     │  └─ python >=3.7,<3.8.0a0 , which can be installed;
   │     ├─ pillow [8.2.0|9.1.1] would require
   │     │  └─ python >=3.9,<3.10.0a0 , which can be installed;
   │     ├─ pillow 8.2.0 would require
   │     │  └─ python >=3.6,<3.7.0a0 , which conflicts with any installable versions previously reported;
   │     ├─ pillow [8.2.0|9.1.1] would require
   │     │  └─ python >=3.8,<3.9.0a0 , which can be installed;
   │     └─ pillow [10.0.0|6.1.0|...|9.5.0] conflicts with any installable versions previously reported;
   ├─ fastai 1.0.61 would require
   │  └─ pillow with the potential options
   │     ├─ pillow [10.0.0|9.4.0|9.5.0], which cannot be installed (as previously explained);
   │     ├─ pillow [10.0.0|9.4.0|9.5.0], which cannot be installed (as previously explained);
   │     ├─ pillow [9.1.1|9.2.0], which cannot be installed (as previously explained);
   │     ├─ pillow 9.1.1, which can be installed (as previously explained);
   │     ├─ pillow [8.2.0|9.1.1], which can be installed (as previously explained);
   │     ├─ pillow [8.2.0|9.1.1], which can be installed (as previously explained);
   │     ├─ pillow 8.2.0, which cannot be installed (as previously explained);
   │     ├─ pillow [8.2.0|9.1.1], which can be installed (as previously explained);
   │     ├─ pillow [10.0.0|6.1.0|...|9.5.0] conflicts with any installable versions previously reported;
   │     ├─ pillow [3.0.0|3.2.0|...|6.0.0] conflicts with any installable versions previously reported;
   │     └─ pillow 3.2.0 would require
   │        └─ freetype 2.5* , which does not exist (perhaps a missing channel);
   └─ fastai [2.7.10|2.7.11] conflicts with any installable versions previously reported.
Any tips/pointers? Thanks!
I just wanted to report that as of today, M1 Macbook Pro seems to work with GPU acceleration out of the box.
Seriously, the setup couldn't be easier.
pip install -r .devcontainer/requirements.txt
is all I needed to do in a pyenv-installed Python 3.10.
Here are the training times from an unmodified 02-saving-a-basic-fastai-model.ipynb.
And here is the same on a Kaggle P100 GPU
I don't know how it will be on later examples, but it seems that at least this one works out of the box as of today.
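For anyone who wants to double-check that the Apple-silicon GPU is actually being picked up, here is a small sketch (assuming a PyTorch build with MPS support):
import torch

# On Apple-silicon Macs, PyTorch exposes the GPU through the "mps" backend.
print(torch.backends.mps.is_available())  # True if the Metal backend can be used right now
print(torch.backends.mps.is_built())      # True if this PyTorch build includes MPS support

device = torch.device('mps' if torch.backends.mps.is_available() else 'cpu')
x = torch.ones(3, device=device)
print(x.device)  # should show "mps" when the GPU is in use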
Hi!
I'm having some problems using a Kaggle Notebook for Lesson 1, and I'm not sure what I'm doing wrong. Below this message you can see the errors that appear when I try to run the first code cells.
I would be grateful if you could help me.
Thank you very much!!
Josep Mencion
Possibly you haven't enabled external internet access.
Thank you very much!! Now it is working. Best.
Josep Mencion Seguranyes
Hey guys, I'm trying to do Lesson 1 but am having some trouble with the setup in Kaggle. I try to run the various code snippets and receive errors for effectively everything. It's as if I'm not importing the requisite libraries, but I'm not sure what exactly is up. Any help would be appreciated.
Hey, following up on my previous post: still having trouble with the setup. When I try to import the fastbook library, I get the following error. I'd appreciate some help!
It looks like you haven't defined urls yet. If you are following Jeremy's notebook "Is it a bird?", urls is defined as a list of image URLs returned by the search_images function:
#NB: `search_images` depends on duckduckgo.com, which doesn't always return correct responses.
# If you get a JSON error, just try running it again (it may take a couple of tries).
urls = search_images('bird photos', max_images=1)
urls[0]
Here is the code for the search_images function:
from duckduckgo_search import ddg_images
from fastcore.all import *

def search_images(term, max_images=30):
    print(f"Searching for '{term}'")
    # ddg_images returns a list of result dicts; itemgot('image') pulls out each image URL
    return L(ddg_images(term, max_results=max_images)).itemgot('image')
The likely problem here is that [ -e /content ] returns false, so it doesn't run pip install -Uqq fastbook because of the && conditional (it will only run the pip install if [ -e /content ] succeeds). Instead, try running just the following:
! pip install -Uqq fastbook
And see if that resolves your issue.
I found this forum post which explains this in more detail.
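For context, the setup cell at the top of the book's notebooks looks roughly like this (paraphrased from memory, so treat it as a sketch rather than the exact cell); the point is that the install only happens when /content exists, i.e. on Colab:
! [ -e /content ] && pip install -Uqq fastbook   # the install is silently skipped when /content does not exist (e.g. on Kaggle)
import fastbook
fastbook.setup_book()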
Hey, thanks for getting back! I'm actually just following along in the Kaggle notebook, so all the code is there (i.e. everything, including urls, is predefined). Not sure why it's not working!
I'm getting the 403 error below when trying to run the cell below it. I used the "Run All" function to ensure all prior cells had been executed, made sure internet was turned on, and ran it a few times even though this isn't a JSON error. I'm new to this and not sure what to do here. Thanks!
#NB: `search_images` depends on duckduckgo.com, which doesn't always return correct responses.
# If you get a JSON error, just try running it again (it may take a couple of tries).
urls = search_images('bird photos', max_images=1)
urls[0]
See message by Devesh2000 on May 23 for the fix
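If the 403 keeps showing up intermittently even after that fix, a small retry wrapper can also help; this is just a sketch, assuming the search_images function defined earlier in the thread:
import time

def search_images_retry(term, max_images=30, tries=3, wait=5):
    # Retry the flaky search endpoint a few times before giving up.
    for attempt in range(tries):
        try:
            return search_images(term, max_images=max_images)
        except Exception as e:
            print(f"attempt {attempt + 1} failed: {e}")
            time.sleep(wait)
    raise RuntimeError(f"search failed after {tries} attempts")

urls = search_images_retry('bird photos', max_images=1)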