Trouble Installing Fastbook Locally

Hello!

So I’m having trouble installing Fastbook locally on my machine. I’ve tried searching the forums to see if someone has had a similar error to mine, but it seems not.

I’m simply running:
pip install fastbook

I have also tried pip install -Uqq fastbook and it produces the same error. Judging by the logs below, do I need to install a Rust compiler?

I’m using a Mac with an M1 chip, Python 3.10.4, and macOS Monterey 12.3.1.

Any help would be appreciated!

The following is my output log:

Collecting fastbook
  Using cached fastbook-0.0.26-py3-none-any.whl (719 kB)
Collecting pandas
  Using cached pandas-1.4.2-cp310-cp310-macosx_11_0_arm64.whl (10.1 MB)
Collecting graphviz
  Using cached graphviz-0.20-py3-none-any.whl (46 kB)
Collecting sentencepiece
  Using cached sentencepiece-0.1.96-cp310-cp310-macosx_11_0_arm64.whl
Collecting fastai>=2.6
  Using cached fastai-2.6.3-py3-none-any.whl (197 kB)
Collecting datasets
  Using cached datasets-2.2.2-py3-none-any.whl (346 kB)
Collecting packaging
  Using cached packaging-21.3-py3-none-any.whl (40 kB)
Collecting transformers
  Using cached transformers-4.19.2-py3-none-any.whl (4.2 MB)
Collecting requests
  Using cached requests-2.27.1-py2.py3-none-any.whl (63 kB)
Requirement already satisfied: pip in /Users/[REDACTED]/PycharmProjects/DDGImageScraper/venv/lib/python3.10/site-packages (from fastbook) (21.3.1)
Collecting torch<1.12,>=1.7.0
  Using cached torch-1.11.0-cp310-none-macosx_11_0_arm64.whl (43.1 MB)
Collecting torchvision>=0.8.2
  Using cached torchvision-0.12.0-cp310-cp310-macosx_11_0_arm64.whl (1.2 MB)
Collecting scikit-learn
  Using cached scikit_learn-1.1.1-cp310-cp310-macosx_12_0_arm64.whl (7.7 MB)
Collecting spacy<4
  Using cached spacy-3.3.1-cp310-cp310-macosx_11_0_arm64.whl (6.3 MB)
Collecting fastcore<1.5,>=1.3.27
  Using cached fastcore-1.4.4-py3-none-any.whl (60 kB)
Collecting fastdownload<2,>=0.0.5
  Using cached fastdownload-0.0.6-py3-none-any.whl (12 kB)
Collecting fastprogress>=0.2.4
  Using cached fastprogress-1.0.2-py3-none-any.whl (12 kB)
Collecting scipy
  Using cached scipy-1.8.1-cp310-cp310-macosx_12_0_arm64.whl (28.7 MB)
Collecting matplotlib
  Using cached matplotlib-3.5.2-cp310-cp310-macosx_11_0_arm64.whl (7.2 MB)
Collecting pillow>6.0.0
  Using cached Pillow-9.1.1-cp310-cp310-macosx_11_0_arm64.whl (2.8 MB)
Collecting pyyaml
  Using cached PyYAML-6.0-cp310-cp310-macosx_11_0_arm64.whl (173 kB)
Collecting responses<0.19
  Using cached responses-0.18.0-py3-none-any.whl (38 kB)
Collecting dill<0.3.5
  Using cached dill-0.3.4-py2.py3-none-any.whl (86 kB)
Collecting tqdm>=4.62.1
  Using cached tqdm-4.64.0-py2.py3-none-any.whl (78 kB)
Collecting fsspec[http]>=2021.05.0
  Using cached fsspec-2022.5.0-py3-none-any.whl (140 kB)
Collecting pyarrow>=6.0.0
  Using cached pyarrow-8.0.0-cp310-cp310-macosx_11_0_arm64.whl (16.2 MB)
Collecting aiohttp
  Using cached aiohttp-3.8.1-cp310-cp310-macosx_11_0_arm64.whl (552 kB)
Collecting huggingface-hub<1.0.0,>=0.1.0
  Using cached huggingface_hub-0.7.0-py3-none-any.whl (86 kB)
Collecting numpy>=1.17
  Using cached numpy-1.22.4-cp310-cp310-macosx_11_0_arm64.whl (12.8 MB)
Collecting xxhash
  Using cached xxhash-3.0.0-cp310-cp310-macosx_11_0_arm64.whl (30 kB)
Collecting multiprocess
  Using cached multiprocess-0.70.13-py310-none-any.whl (133 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.5.18.1-py3-none-any.whl (155 kB)
Collecting pyparsing!=3.0.5,>=2.0.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting pytz>=2020.1
  Using cached pytz-2022.1-py2.py3-none-any.whl (503 kB)
Collecting python-dateutil>=2.8.1
  Using cached python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB)
Collecting tokenizers!=0.11.3,<0.13,>=0.11.1
  Using cached tokenizers-0.12.1.tar.gz (220 kB)
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Collecting regex!=2019.12.17
  Using cached regex-2022.6.2-cp310-cp310-macosx_11_0_arm64.whl (281 kB)
Collecting filelock
  Using cached filelock-3.7.1-py3-none-any.whl (10 kB)
Collecting typing-extensions>=3.7.4.3
  Using cached typing_extensions-4.2.0-py3-none-any.whl (24 kB)
Collecting six>=1.5
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting pathy>=0.3.5
  Using cached pathy-0.6.1-py3-none-any.whl (42 kB)
Collecting typer<0.5.0,>=0.3.0
  Using cached typer-0.4.1-py3-none-any.whl (27 kB)
Collecting spacy-loggers<2.0.0,>=1.0.0
  Using cached spacy_loggers-1.0.2-py3-none-any.whl (7.2 kB)
Requirement already satisfied: setuptools in /Users/[REDACTED]/PycharmProjects/DDGImageScraper/venv/lib/python3.10/site-packages (from spacy<4->fastai>=2.6->fastbook) (60.2.0)
Collecting wasabi<1.1.0,>=0.9.1
  Using cached wasabi-0.9.1-py3-none-any.whl (26 kB)
Collecting langcodes<4.0.0,>=3.2.0
  Using cached langcodes-3.3.0-py3-none-any.whl (181 kB)
Collecting pydantic!=1.8,!=1.8.1,<1.9.0,>=1.7.4
  Using cached pydantic-1.8.2-py3-none-any.whl (126 kB)
Collecting spacy-legacy<3.1.0,>=3.0.9
  Using cached spacy_legacy-3.0.9-py2.py3-none-any.whl (20 kB)
Collecting blis<0.8.0,>=0.4.0
  Using cached blis-0.7.7-cp310-cp310-macosx_11_0_arm64.whl (1.1 MB)
Collecting cymem<2.1.0,>=2.0.2
  Using cached cymem-2.0.6-cp310-cp310-macosx_11_0_arm64.whl (30 kB)
Collecting thinc<8.1.0,>=8.0.14
  Using cached thinc-8.0.17-cp310-cp310-macosx_11_0_arm64.whl (584 kB)
Collecting srsly<3.0.0,>=2.4.3
  Using cached srsly-2.4.3-cp310-cp310-macosx_11_0_arm64.whl (457 kB)
Collecting murmurhash<1.1.0,>=0.28.0
  Using cached murmurhash-1.0.7-cp310-cp310-macosx_11_0_arm64.whl (19 kB)
Collecting catalogue<2.1.0,>=2.0.6
  Using cached catalogue-2.0.7-py3-none-any.whl (17 kB)
Collecting preshed<3.1.0,>=3.0.2
  Using cached preshed-3.0.6-cp310-cp310-macosx_11_0_arm64.whl (101 kB)
Collecting jinja2
  Using cached Jinja2-3.1.2-py3-none-any.whl (133 kB)
Collecting async-timeout<5.0,>=4.0.0a3
  Using cached async_timeout-4.0.2-py3-none-any.whl (5.8 kB)
Collecting aiosignal>=1.1.2
  Using cached aiosignal-1.2.0-py3-none-any.whl (8.2 kB)
Collecting yarl<2.0,>=1.0
  Using cached yarl-1.7.2-cp310-cp310-macosx_11_0_arm64.whl (118 kB)
Collecting multidict<7.0,>=4.5
  Using cached multidict-6.0.2-cp310-cp310-macosx_11_0_arm64.whl (29 kB)
Collecting frozenlist>=1.1.1
  Using cached frozenlist-1.3.0-cp310-cp310-macosx_11_0_arm64.whl (34 kB)
Collecting attrs>=17.3.0
  Using cached attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Collecting cycler>=0.10
  Using cached cycler-0.11.0-py3-none-any.whl (6.4 kB)
Collecting kiwisolver>=1.0.1
  Using cached kiwisolver-1.4.2-cp310-cp310-macosx_11_0_arm64.whl (63 kB)
Collecting fonttools>=4.22.0
  Using cached fonttools-4.33.3-py3-none-any.whl (930 kB)
Collecting multiprocess
  Using cached multiprocess-0.70.12.2-py39-none-any.whl (128 kB)
Collecting joblib>=1.0.0
  Using cached joblib-1.1.0-py2.py3-none-any.whl (306 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting smart-open<6.0.0,>=5.0.0
  Using cached smart_open-5.2.1-py3-none-any.whl (58 kB)
Collecting click<9.0.0,>=7.1.1
  Using cached click-8.1.3-py3-none-any.whl (96 kB)
Collecting MarkupSafe>=2.0
  Using cached MarkupSafe-2.1.1-cp310-cp310-macosx_10_9_universal2.whl (17 kB)
Building wheels for collected packages: tokenizers
  Building wheel for tokenizers (pyproject.toml): started
  Building wheel for tokenizers (pyproject.toml): finished with status 'error'
Failed to build tokenizers

  ERROR: Command errored out with exit status 1:
   command: /Users/[REDACTED]/PycharmProjects/DDGImageScraper/venv/bin/python /Users/[REDACTED]/PycharmProjects/DDGImageScraper/venv/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py build_wheel /var/folders/6_/v4ftgpss5hg7cn0bdj3dr02h0000gn/T/tmphaz_a2wx
       cwd: /private/var/folders/6_/v4ftgpss5hg7cn0bdj3dr02h0000gn/T/pip-install-aoe_4xqg/tokenizers_5536a17e45db481dbc345808907fa754
  Complete output (18 lines):
  running bdist_wheel
  running build
  running build_py
  warning: build_py: byte-compiling is disabled, skipping.
  
  running build_ext
  running build_rust
  error: can't find Rust compiler
  
  If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
  
  To update pip, run:
  
      pip install --upgrade pip
  
  and then retry package installation.
  
  If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
  ----------------------------------------
  ERROR: Failed building wheel for tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
WARNING: You are using pip version 21.3.1; however, version 22.1.2 is available.
You should consider upgrading via the '/Users/[REDACTED]/PycharmProjects/DDGImageScraper/venv/bin/python -m pip install --upgrade pip' command.

You might want to try installing from a prebuilt wheel? I have no experience with these newfangled M1 chips all the cool kids are using these days; I still have my 2015 MBP with a creaky old Intel i7 :smiley:

Not a Mac user, but maybe if you have mamba already installed, try
mamba install -c fastchan fastbook sentencepiece
if you have conda already installed, try
conda install -c fastchan fastbook sentencepiece

For those seeking the solutions, scroll down to the bottom of this comment.

@wyquek Your solution worked!

However, I ended up going with the Rust solution because I don’t really like Conda and don’t see any point in using it over the pip installer that comes with Python. If there is one, please do enlighten me!

@mike.moloch (-‸ლ); I don’t know why I glossed over that. So I did upgrade pip and then tried to install fastbook again. It produced the same error message.

I searched online for what wheels are, and I gather they are prebuilt packages. Later versions of pip (which I assume most people have) already install packages from a wheel when one is available, so the install command I was running was already using wheels where it could.
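As a side note on how those wheel files work: a wheel’s filename encodes its compatibility tags (per the wheel spec, PEP 427), which is how pip decides whether a prebuilt package matches your machine. Here’s a rough sketch in Python; `parse_wheel_filename` is a hypothetical helper for illustration, not part of pip’s API:

```python
# A wheel filename follows name-version-pythontag-abitag-platformtag.whl
# (PEP 427; this sketch ignores the optional build tag). Comparing the
# platform tag against your machine is why pip found an arm64 wheel for
# e.g. pandas, but only a source tarball for tokenizers, which then had
# to be compiled with Rust.

def parse_wheel_filename(filename: str) -> dict:
    """Split a wheel filename into its compatibility tags."""
    stem = filename.removesuffix(".whl")
    name, version, python_tag, abi_tag, platform_tag = stem.split("-")[:5]
    return {
        "name": name,
        "version": version,
        "python": python_tag,
        "abi": abi_tag,
        "platform": platform_tag,
    }

tags = parse_wheel_filename("pandas-1.4.2-cp310-cp310-macosx_11_0_arm64.whl")
print(tags["platform"])  # macosx_11_0_arm64: prebuilt for Apple Silicon
```

A pure-Python package like tqdm instead ships a wheel tagged `py2.py3-none-any`, meaning it works on any platform without compilation.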

I then found this command for downloading wheels to a specific location:
python -m pip download --only-binary :all: --dest . --no-cache <package_name>

However, this also failed, and this time I got a different error relating to the sentencepiece package.

I got most of my information relating to wheels from here: How to install, download and build Python wheels - ActiveState.

In the end, I installed the Rust compiler, and the package then installed flawlessly.

Also, hey, you have a 2015 MBP! Just before the dark years of the MacBook from 2016-2018. But yes, these M1 chips are bogglingly good, even the basic one I have.

Solution 1:
As @wyquek stated, and if you have Conda or Mamba installed, try one of the following commands:
mamba install -c fastchan fastbook sentencepiece
or
conda install -c fastchan fastbook sentencepiece

Solution 2:
Install the Rust compiler. To do so, follow the guide at this link, which pip also recommends: https://rustup.rs.

Other ways to install Rust are here, if need be: Install Rust - Rust Programming Language.

After installing Rust, simply rerun pip install fastbook and it should now work.
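For reference, Solution 2 boils down to something like the following shell sketch. The curl one-liner is the installer documented at https://rustup.rs; exact prompts and paths may differ on your machine:

```shell
# Install the Rust toolchain via rustup (official installer script)
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

# Pick up cargo/rustc in the current shell without restarting it
source "$HOME/.cargo/env"
rustc --version

# Retry the install; pip can now build tokenizers from source
pip install fastbook
```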


Interesting… I’m surprised that it still needed the Rust compiler after you had a pre-compiled wheel. My hunch is that it moved down the line, tried to build another wheel, and couldn’t (naturally) find the Rust compiler, so it complained again.

Good to know it’s working for you, and thanks for posting your findings!

P.S. I really like the 2015 MBP (dodged a bullet; lucked out and got this refurb from Apple in 2016 instead of the then-current 2016 model!! :smiley:). I’ve been waiting for the M2 MBA and it’s “almost perfect”… I would’ve appreciated a bigger screen, but 13.6" isn’t bad. Hopefully by the time I have mine, PyTorch will be able to use some of the GPU capabilities on the M-series processors.

What do you mean by “moved down the line”? Is it that pip ignored that the wheel was already built? I’m thinking that perhaps the fastbook package that pip downloads isn’t prebuilt.

Oh yes, you did luck out with that refurb :smile:. Before my M1 MBA, I was on a potato 2012 iMac and the biggest thing I miss from it was the large screen. I learned one thing: never take screen real estate for granted heh. Moving from 24 inches to 13 inches was oof; but it can still be managed. At least the M2 MBA has a slightly larger screen which should help.

I’m also waiting for the official release of PyTorch 1.12 so I can start doing some ML work locally. I don’t really like Gradient; granted, its free GPU is useful, and I’ll probably use it for more demanding stuff. I could always use Colab/Kaggle, though they run slightly different versions of Jupyter Notebooks.

Yes, I meant that the second time around the Rust compiler error wasn’t coming from the wheel you downloaded; it was because of something else that needed to be built (possibly fastbook or some other package). I can’t say definitively, but the output would probably mention which package it was complaining about. It would indeed be weird if it tried to rebuild the wheel you specifically downloaded.

Speaking of Macs, I have a 2011 MBA and the thing is still going; it’s rather useful, tbh, for browsing etc. I’m going to miss the 15" screen, but I think it’s time for a faster machine. I’ve come to value the lightness of my MBA, so I’ll try to get used to an MBA with some local GPU capability, but mainly as a jumping-off point to Paperspace/Jarvislabs or even my own janky 1070ti setup at home :smiley:
