Fastai on Apple M1

I’ve set up a conda environment on my MacBook Pro with a 24-core M1 Max. I’m able to run the first notebook from the book without any issues. The only change I’ve made is this:
```python
import os
os.environ["OMP_NUM_THREADS"] = "1"
```

When creating the conda environment, use `conda install -c fastchan fastai fastbook jupyterlab sentencepiece`.

`sentencepiece` is an important dependency, but conda doesn’t install it by default for some reason. If you don’t install it, you’ll end up getting a confusing dependency error.
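For reference, the full setup might look something like the following. The environment name `fastai` and the Python version are my own choices here; the channel and package list are the ones from the command above.

```shell
# Create and activate a fresh conda environment (name is arbitrary)
conda create -n fastai python=3.9
conda activate fastai

# Install fastai and friends from the fastchan channel; note that
# sentencepiece must be listed explicitly, as described above
conda install -c fastchan fastai fastbook jupyterlab sentencepiece
```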

It’s training the model in reasonable time. It’s not as fast as a 3080/3090, but it’s decent enough that I feel I can run this whole thing locally on my Mac.

It is in fact faster than running it on a free Paperspace GPU: Paperspace took 1:37 for the same training.

Here’s a video of GPU memory and cores being triggered when training starts:

I don’t actually understand how or why this works with `OMP_NUM_THREADS = 1`; it’s genuinely confusing. The last time I tried this setup, about a year ago, this exact setting only made it possible to train the model on the CPU, and on a single thread at that, which took forever. PyTorch has since officially started supporting M1 Macs, so maybe that’s what fixed it. If I don’t use this setting, I end up getting an error like this, and it seems to get stuck in an infinite loop (sorry, they’ll only let me upload one image). I might share more later.
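As a sanity check that PyTorch is actually using the Apple GPU rather than falling back to the CPU, you can query the Metal (MPS) backend directly. This is a minimal sketch assuming PyTorch 1.12 or later, where `torch.backends.mps` was introduced; `pick_device` is just a helper name I made up for illustration.

```python
import torch

def pick_device():
    # Prefer Apple's Metal (MPS) backend when it is present and usable;
    # otherwise fall back to the CPU. getattr guards against older
    # PyTorch builds that predate torch.backends.mps entirely.
    mps = getattr(torch.backends, "mps", None)
    if mps is not None and torch.backends.mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

device = pick_device()
print(f"Training will run on: {device}")
```

On an M1 Mac with a recent PyTorch this should print `mps`; on other machines (or older builds) it falls back to `cpu`.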

Not sure how it works, but it’s working, so I’ll take it :slight_smile:
