Lesson 2: how to ensure fastai is installed on Hugging Face?

I’m trying to follow Lesson 2, deploying the application and the model to Hugging Face. I’ve committed and pushed my changes, and Hugging Face is giving me the following error:

```
Traceback (most recent call last):
  File "app.py", line 1, in <module>
    from fastai.vision.all import *
ModuleNotFoundError: No module named 'fastai'
```

How do I install the required Python modules on Hugging Face?

Hey Thomas, :wave:

You need to tell Hugging Face which requirements your code needs to run.

You do this by providing a requirements.txt file in the root of your project directory. The following links will give you a little more information about how to structure a Python project, which you may find helpful.
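
For this lesson’s app, the file can be as small as the packages app.py actually imports. A minimal sketch (the package names below are just the ones mentioned in this thread; add whatever your own script imports):

```text
fastai
gradio
```

Leaving the versions unpinned lets pip pick compatible releases; you can pin exact versions later once the Space builds.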

If you’re on Google Colab, you can generate a requirements.txt file by typing the following in the Colab terminal (icon at the bottom left of the window) after running all the cells in your notebook.

```
pip freeze > requirements.txt
```

You can then add the requirements.txt file to your project directory and push it to Hugging Face.

I hope this moves you forward. Please let me know if you have any questions/issues. :slightly_smiling_face:


Thanks, that worked and I was able to deploy my app to Hugging Face. Just wondering, though: pip freeze produced a file with over 200 lines in it, presumably including all the Python modules in my OS installation. Is there a way to generate a requirements.txt containing only the modules my Python script is actually using?

I’m glad I was able to help you! :slightly_smiling_face:

If you’re using pip locally, you should look at using a virtual environment. This gives you a way of installing specific packages and their versions for a given project you’re working on, so you can avoid conflicting versions, etc.

Here’s a pretty comprehensive overview of what a virtual environment is.

Here’s a guide to setting up a virtual environment on your machine using virtualenv.

There are a number of different ways to create and manage virtual environments. I know Jeremy recommends anaconda/mamba.

Here’s an answer on stack overflow that compares different virtualenv managers.

Edit

I forgot to mention that when you install a package, e.g. Gradio, you also install that package’s dependencies, and their dependencies in turn; those indirect ones are called transitive dependencies. Below is part of the output of running pip install gradio:

```
Installing collected packages: pydub, ffmpy, websockets, uc-micro-py, semantic-version, python-multipart, orjson, multidict, h11, frozenlist, async-timeout, aiofiles, yarl, uvicorn, starlette, mdit-py-plugins, linkify-it-py, huggingface-hub, httpcore, aiosignal, httpx, fastapi, aiohttp, gradio-client, gradio
```

This is one of the reasons your requirements.txt file is so large even if you use a virtual environment. :slightly_smiling_face:
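
You can see this for yourself with Python’s standard library: `importlib.metadata` lists every installed distribution, which is exactly what pip freeze pins, one line per distribution (a sketch; the count will differ in every environment):

```python
# Each installed distribution becomes one pinned line in `pip freeze`,
# whether you asked for it directly or it arrived as a transitive dependency.
from importlib.metadata import distributions

names = sorted({d.metadata["Name"] for d in distributions() if d.metadata["Name"]})
print(f"pip freeze would emit roughly {len(names)} pinned lines")
print(names[:5])  # a small sample of what ends up in requirements.txt
```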

The Pip Chill library can perhaps do this: pip-chill · PyPI
It only lists packages that are not dependencies of other packages.

That said, it’s still a good idea to create a virtual environment as @Grey-Shirt said, and Pip Chill can be just as useful inside a virtual environment.
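
pip-chill’s core idea, listing only the packages nothing else depends on, can be sketched with the standard library (a rough approximation; pip-chill itself handles more edge cases, and package names here aren’t fully normalised per PEP 503):

```python
# Approximate pip-chill: keep only distributions that no other
# installed distribution declares as a requirement.
import re
from importlib.metadata import distributions

installed = {}    # name -> version
required = set()  # names appearing in some other package's Requires-Dist
for dist in distributions():
    name = dist.metadata["Name"]
    if not name:
        continue
    installed[name.lower()] = dist.version
    for req in dist.requires or []:
        # strip version specifiers, extras and environment markers
        dep = re.split(r"[\s<>=!~\[;(]", req, maxsplit=1)[0]
        required.add(dep.lower())

top_level = {n: v for n, v in installed.items() if n not in required}
for name, version in sorted(top_level.items()):
    print(f"{name}=={version}")
```

The printed list is much closer to what you would hand-write in a requirements.txt than the full `pip freeze` output.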

Hi! I’m wondering if you could also help me out.

I was previously getting the same error message as Thomas, then I ran `pip freeze > requirements.txt` and pushed to Hugging Face. I then got an error saying:


```
→ RUN --mount=target=requirements.txt,source=requirements.txt pip install --no-cache-dir -r requirements.txt
Defaulting to user installation because normal site-packages is not writeable
Processing /home/conda/feedstock_root/build_artifacts/aiofiles_1677173365185/work
ERROR: Could not install packages due to an OSError: [Errno 2] No such file or directory: '/home/conda/feedstock_root/build_artifacts/aiofiles_1677173365185/work'

→ ERROR: process "/bin/sh -c pip install --no-cache-dir -r requirements.txt" did not complete successfully: exit code: 1
```


Running `pip list --format=freeze > requirements.txt` instead solved that problem, but now I’m getting:


```
ERROR: Could not find a version that satisfies the requirement conda==23.3.1 (from versions: 3.0.6, 3.5.0, 3.7.0, 3.17.0, 4.0.0, 4.0.1, 4.0.2, 4.0.3, 4.0.4, 4.0.5, 4.0.7, 4.0.8, 4.0.9, 4.1.2, 4.1.6, 4.2.6, 4.2.7, 4.3.13, 4.3.16)
ERROR: No matching distribution found for conda==23.3.1

[notice] A new release of pip available: 22.3.1 → 23.1.2
[notice] To update, run: python -m pip install --upgrade pip

→ ERROR: process "/bin/sh -c pip install --no-cache-dir -r requirements.txt" did not complete successfully: exit code: 1
```


I’d really appreciate any thoughts you might have! In case it’s helpful, I’m using mamba. Not sure why I’m getting the notice to upgrade pip; I already tried upgrading it and pushing app.py again.
conda==23.3.1 is indeed in my requirements.txt file, but I’m not sure why it’s an issue.

My guess is that Hugging Face is expecting your dependencies from pip, not conda.

As I understand it, when you export your dependencies from conda/mamba, you put them in an environment.yml file. See the mamba docs.
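
For reference, such an export looks something like this (an illustrative sketch, not a file Hugging Face will read; the environment name, versions and packages are examples):

```yaml
name: lesson2            # illustrative environment name
channels:
  - conda-forge
dependencies:
  - python=3.10
  - pip
  - pip:                 # pip-installed packages get their own subsection
      - fastai
      - gradio
```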

Hugging Face is expecting a requirements.txt file, so the simplest thing to do would be to create a pip virtual environment, install all the dependencies into it, then export them to a requirements.txt file and push that to Hugging Face.
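
That workflow can be scripted, here driven from Python so it runs the same from a notebook cell (a sketch; paths assume Linux/macOS, the venv’s tools live in `.venv\Scripts` on Windows, and the install line is left commented because it needs a network connection):

```python
# Create an isolated venv and export only *its* packages to requirements.txt.
import subprocess
import sys
from pathlib import Path

venv_dir = Path(".venv")
subprocess.run([sys.executable, "-m", "venv", str(venv_dir)], check=True)

pip = venv_dir / "bin" / "pip"  # .venv\Scripts\pip.exe on Windows
# Install only what app.py actually imports (example names from this thread):
# subprocess.run([str(pip), "install", "fastai", "gradio"], check=True)

frozen = subprocess.run([str(pip), "freeze"], capture_output=True, text=True, check=True)
Path("requirements.txt").write_text(frozen.stdout)
print(Path("requirements.txt").read_text() or "(empty venv: nothing installed yet)")
```

Because the freeze runs inside the fresh venv, the exported file contains only what you installed there, not the 200-odd packages from the system environment.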

Hopefully that will get you up and running! :slightly_smiling_face:

As I understand it, you can use pip inside a conda env, but you can’t install conda packages inside a pip env, as they are two different package managers.

However, using pip to install packages as part of a conda env is discouraged:

> Only after conda has been used to install as many packages as possible should pip be used to install any remaining software.


Working! Thank you so much. I realized I hadn’t moved off of my “base” environment in mamba, so creating a new mamba environment, then a pip venv inside of that, then installing everything and freezing it to a requirements.txt solved my issue.

Can you give more details of what you did here? I am getting the same error. I am currently in a Jupyter notebook exporting the .py script, and I’m now running into this issue after uploading the requirements.txt to Hugging Face.