I am getting status error as I run the cell:
urls = search_images('bird photos', max_images=1)
urls[0]
Can anyone please help? I'm also confused about whether to use Kaggle or Colab: is the notebook content the same on both?
If I remember correctly, I’ve faced the same issue before. I’m not sure if this will help for your case, but try running this instead:
searches = 'photo of a damaged car', 'photo of a car'
path = Path('car_damage')

for o in searches:
    dest = path/o
    dest.mkdir(exist_ok=True, parents=True)
    try:
        download_images(dest, urls=search_images(f'{o} photo'))
        resize_images(dest, max_size=400, dest=dest)
    except Exception as e:
        print(f"An error occurred: {e}")
In my case, I had to run it multiple times to get it to work, but there might be a more efficient way to solve it…
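Rather than re-running the cell by hand, a small retry wrapper can do the repeats for you. This is just a sketch (the helper name and retry counts are mine, not from the course) around whatever download call is failing:

```python
import time

def download_with_retries(download_fn, retries=3, delay=2):
    """Call download_fn(), retrying after a short pause if it raises."""
    for attempt in range(1, retries + 1):
        try:
            return download_fn()
        except Exception as e:
            print(f"Attempt {attempt} failed: {e}")
            if attempt < retries:
                time.sleep(delay)
    return None

# e.g. wrap the body of the loop above:
# download_with_retries(lambda: download_images(dest, urls=search_images(f'{o} photo')))
```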
As for Kaggle vs Colab: I believe there are some lessons in Kaggle notebooks as you progress through the course, and the textbook is available on Colab. I personally use both platforms. When working through the textbook, I fork the notebook and use my own copy in Colab. The advantage of Kaggle is that you have easy access to the datasets.
Hi, I’m having an issue with the “Is it a bird?” exercise.
The !pip install command is failing… I don't know why or how to fix this. Internet access is enabled.
I ran into the same issue and stumbled upon the notebook from @chizkidd (see his comment below). I tested his notebook and it worked. Apart from his code changes, the only difference was that his environment preference was set to “Pin to original environment”, while my copy of the notebook was set to “Always use latest environment”.
After switching back to “Pin to original environment” the error was gone and I was able to run the whole notebook.
Hope this works for you, too.
I am using python 3.12. I installed fastai using pip in my virtual environment in vscode. while running the statement:
from fastai.vision.all import *
I am getting the following error
ModuleNotFoundError: No module named 'torch._namedtensor_internals'
I am running the notebooks on my personal laptop.
Here's how I set up fastai using pip on Linux:
# Install required packages first
sudo apt install python3-venv python3-full
# Create virtual environment
python3 -m venv ~/fastai_env
# Activate it
source ~/fastai_env/bin/activate
# Upgrade `pip`, `setuptools`, and `wheel`
pip install --upgrade pip setuptools wheel
# Install ipython & Jupyterlab
pip install ipykernel
pip install jupyterlab
# Install Pytorch from its official site installation manual
pip3 install torch torchvision torchaudio
# Install fastai, fastbook, and sentencepiece
pip install fastai
pip install fastbook
pip install sentencepiece
Now, open Jupyter via the terminal (and start doing the fastai course):
# Activate the virtual environment
source ~/fastai_env/bin/activate
# To use Jupyter lab
jupyter lab
# To use Jupyter notebook
jupyter notebook
# To use Jupyter notebook Classic (Jeremy's favorite)
jupyter nbclassic
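After the installs, a quick sanity check from inside the activated venv shows which of the course dependencies import cleanly. A minimal sketch (the package list just mirrors what was installed above):

```python
import importlib

def check_packages(pkgs=("torch", "torchvision", "fastai", "fastbook")):
    """Return a dict mapping package name -> version string, or None if missing."""
    results = {}
    for pkg in pkgs:
        try:
            mod = importlib.import_module(pkg)
            results[pkg] = getattr(mod, "__version__", "installed")
        except ImportError:
            results[pkg] = None
    return results

for name, version in check_packages().items():
    print(f"{name}: {version or 'NOT installed'}")
```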
While setting up my local environment I faced the issue below.
URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)>
The error is due to the missing Certificate Authority (CA) certificates that are used to verify SSL/TLS connections.
To fix it, just run the below command from the terminal:
/Applications/Python\ 3.12/Install\ Certificates.command
This worked, thanks legend!!!
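For anyone not on macOS (that `Install Certificates.command` script only ships with the python.org installer), pointing Python at the `certifi` CA bundle usually addresses the same error. A hedged sketch, assuming `certifi` is installed (`pip install certifi`):

```python
import os
import certifi

# Point Python's ssl machinery (and requests-based libraries) at certifi's CA bundle;
# newly created SSL contexts will pick this up when verifying certificates.
os.environ["SSL_CERT_FILE"] = certifi.where()
os.environ["REQUESTS_CA_BUNDLE"] = certifi.where()

print("Using CA bundle:", certifi.where())
```

Run this near the top of the notebook, before any code that opens HTTPS connections.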
Finally got through Chapter 2. To run notebooks for the first two chapters, I tried Paperspace, JarvisLabs.ai, Kaggle, and Colab. In the end, only Colab worked perfectly. The others all had issues running widgets or running code in the notebook. I spent hours researching workarounds, (re)installing dependencies, upgrading/downgrading jupyter, etc. Those obstacles stalled my learning process quite a lot. My goal is just to learn the material, not learn the tech stack (yet), so I went with the platform that just worked for fastai.
I have an existing google account, so here were my steps:
- Go to https://colab.research.google.com/
- The “Open Notebook” prompt appears.
- Click on “Github” option.
- Search for “fastai/fastbook”.
- Choose a chapter to import
(Side note: I used the tips here to use the modified code for duckduckgo after spending a couple of hours trying to get Azure image search to work – Lesson 2: Question on How to Get a Bing Image Search API key - #5 by murilogustineli. The end of chapter 2 suggested helping folks one step behind you, so hopefully this helps).
Thanks, I was facing the same problem and your post helped me.
Hi, I’m taking the course in 2025. Whenever I try to install the dependencies, I always get this error on Kaggle Notebook:
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
bigframes 2.8.0 requires google-cloud-bigquery-storage<3.0.0,>=2.30.0, which is not installed.
gensim 4.3.3 requires numpy<2.0,>=1.18.5, but you have numpy 2.2.6 which is incompatible.
gensim 4.3.3 requires scipy<1.14.0,>=1.7.0, but you have scipy 1.15.3 which is incompatible.
mkl-umath 0.1.1 requires numpy<1.27.0,>=1.26.4, but you have numpy 2.2.6 which is incompatible.
mkl-random 1.2.4 requires numpy<1.27.0,>=1.26.4, but you have numpy 2.2.6 which is incompatible.
mkl-fft 1.3.8 requires numpy<1.27.0,>=1.26.4, but you have numpy 2.2.6 which is incompatible.
numba 0.60.0 requires numpy<2.1,>=1.22, but you have numpy 2.2.6 which is incompatible.
datasets 3.6.0 requires fsspec[http]<=2025.3.0,>=2023.1.0, but you have fsspec 2025.5.1 which is incompatible.
ydata-profiling 4.16.1 requires numpy<2.2,>=1.16.0, but you have numpy 2.2.6 which is incompatible.
onnx 1.18.0 requires protobuf>=4.25.1, but you have protobuf 3.20.3 which is incompatible.
google-colab 1.0.0 requires google-auth==2.38.0, but you have google-auth 2.40.3 which is incompatible.
google-colab 1.0.0 requires notebook==6.5.7, but you have notebook 6.5.4 which is incompatible.
google-colab 1.0.0 requires pandas==2.2.2, but you have pandas 2.2.3 which is incompatible.
google-colab 1.0.0 requires requests==2.32.3, but you have requests 2.32.4 which is incompatible.
google-colab 1.0.0 requires tornado==6.4.2, but you have tornado 6.5.1 which is incompatible.
dopamine-rl 4.1.2 requires gymnasium>=1.0.0, but you have gymnasium 0.29.0 which is incompatible.
pandas-gbq 0.29.1 requires google-api-core<3.0.0,>=2.10.2, but you have google-api-core 1.34.1 which is incompatible.
imbalanced-learn 0.13.0 requires scikit-learn<2,>=1.3.2, but you have scikit-learn 1.2.2 which is incompatible.
plotnine 0.14.5 requires matplotlib>=3.8.0, but you have matplotlib 3.7.2 which is incompatible.
tensorflow 2.18.0 requires numpy<2.1.0,>=1.26.0, but you have numpy 2.2.6 which is incompatible.
bigframes 2.8.0 requires google-cloud-bigquery[bqstorage,pandas]>=3.31.0, but you have google-cloud-bigquery 3.25.0 which is incompatible.
bigframes 2.8.0 requires rich<14,>=12.4.4, but you have rich 14.0.0 which is incompatible.
mlxtend 0.23.4 requires scikit-learn>=1.3.1, but you have scikit-learn 1.2.2 which is incompatible.
I have tried changing 'duckduckgo_search>=6.2' to 'ddgs>=6.2' to see if that changes things, but the error still persists. I also tried prompting different LLMs for help, but I'm still stuck. I need help.
Someone gave this solution last year and it just worked for me; I was able to run the first notebook:
!pip install -Uqq fastai ddgs --use-deprecated=legacy-resolver
as well as changing the library name below for duckduckgo_search
from ddgs import DDGS #DuckDuckGo has changed the api so we need to update
I've been troubleshooting compatibility issues trying to get the lesson 1 notebook to run, and this is the latest hurdle. I tried installing the named library, and then the error changed to “Path is never defined”.
Hi, for lesson 2, when I run the search_images_ddg function I get a “connection timed out” error, even after trying multiple times:
TimeoutError                              Traceback (most recent call last)
/usr/lib/python3.11/urllib/request.py in do_open(self, http_class, req, **http_conn_args)
   1347         try:
-> 1348             h.request(req.get_method(), req.selector, req.data, headers,
   1349                       encode_chunked=req.has_header('Transfer-encoding'))

17 frames

TimeoutError: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

URLError                                  Traceback (most recent call last)
/usr/lib/python3.11/urllib/request.py in do_open(self, http_class, req, **http_conn_args)
   1349                       encode_chunked=req.has_header('Transfer-encoding'))
   1350         except OSError as err: # timeout error
-> 1351             raise URLError(err)
   1352         r = h.getresponse()
   1353     except:

URLError: <urlopen error [Errno 110] Connection timed out>
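Before retrying, it's worth checking whether the notebook environment can reach DuckDuckGo at all (on Kaggle, for instance, internet access has to be enabled in the sidebar; Errno 110 often means outbound traffic is being dropped). A quick connectivity probe, just a sketch with an arbitrary timeout:

```python
import socket

def can_reach(host, port=443, timeout=5):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print("duckduckgo.com reachable:", can_reach("duckduckgo.com"))
```

If this prints False, the problem is the environment's network settings, not the search code.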
I made the changes, but executing the next part still throws errors about modules needing to be recompiled. Not sure if I missed a step somewhere.
Hi,
I’m starting this course in 2025, new to the AI world.
In the middle of reading the first chapter of the book, it says follow the instructions to get connected to a GPU server, but I can’t find the instructions anywhere.
I registered to Kaggle and enabled internet.
Are there any further steps that I need to follow?
The book says you should have two folders after the setup: one is a full version of the notebooks and the other is a stripped version.
Please let me know if there are steps left to finish the setup, or if it's okay to just continue with Kaggle (or what setup Kaggle needs).
TIA!
Yeah I have the same question: “We maintain a list of our recommended options on the book’s website, so go there now and follow the instructions to get connected to a GPU deep learning server.” I can’t find this anywhere!