Best practices for setting up fastai on a MacBook Pro with the M1 Max chip

Hi,

I recently got the MacBook Pro with the M1 Max chip :star_struck:. I’d like to install fastai locally for quick prototyping. What is the best practice for setting up fastai?

Should I install fastai directly on macOS, or use a virtual platform like Docker? Any ideas, hints, or links from past experience getting fastai configured on a MacBook Pro with an Apple chip would be much appreciated.

Thanks in advance

Kind regards,
Bilal


Hi, I just installed fastai on a MacBook with the M1 Pro by running:

mamba create -n fastai
mamba activate fastai
mamba install -c fastchan fastai

Now in Python I see that torch.backends.mps.is_available() returns True, and I can move models to a device with .to('cpu') or .to('mps').
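As a quick sanity check, here is a minimal sketch of the device selection described above (the model and variable names are my own, not from the thread). It falls back to CPU on machines where the Metal backend is unavailable:

```python
import torch

# Pick the Metal (MPS) backend when available, otherwise fall back to CPU.
# On non-Apple hardware, is_available() simply returns False.
device = torch.device("mps") if torch.backends.mps.is_available() else torch.device("cpu")

# Move a small model and a batch of data to the chosen device.
model = torch.nn.Linear(10, 2).to(device)
x = torch.randn(4, 10, device=device)
out = model(x)
print(out.shape)  # torch.Size([4, 2])
```

The same `device` object can then be passed around your training code, so the script runs unchanged on both Apple-silicon and CPU-only machines.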

Let me know if you end up using the M1 GPU in your training.


Hi Kasiannenko,

It worked for me as you said, so no additional configuration was needed. I tried running a model using mps, and it trains faster than on the cpu. Cool.
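If you want to quantify that speedup yourself, here is a rough micro-benchmark sketch (my own code, not from the thread; it uses matrix multiplies as a stand-in for training work). Note that MPS kernels run asynchronously, so the timing guards with torch.mps.synchronize() before reading the clock; I'm assuming a recent PyTorch that ships the torch.mps module:

```python
import time
import torch

def time_matmul(device, n=1024, iters=5):
    """Rough wall-clock timing of repeated matrix multiplies on a device."""
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    _ = a @ b  # warm-up, so one-time kernel setup isn't counted
    if device.type == "mps":
        torch.mps.synchronize()  # wait for queued MPS work before timing
    start = time.perf_counter()
    for _ in range(iters):
        _ = a @ b
    if device.type == "mps":
        torch.mps.synchronize()
    return time.perf_counter() - start

cpu_time = time_matmul(torch.device("cpu"))
print(f"cpu: {cpu_time:.3f}s")
if torch.backends.mps.is_available():
    print(f"mps: {time_matmul(torch.device('mps')):.3f}s")
```

Keep in mind this only measures raw matmul throughput; real training speedups depend on model size, batch size, and how much of the workload the MPS backend actually supports.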

Thanks.

For those using pip, pip install fastai also works.


I’ve spent A LOT of time trying to get my M1 Mac working with the fast.ai course using mps…

Had different breakages at Lessons 1, 2, and 4, sadly.

Mostly version issues around protobuf, pytorch, python, etc.

My solution for the time being is to accept that MacBooks aren’t currently the ideal machines for the fast.ai course (and tbh probably DL in general) → I’m currently using VSCode with a remote environment in Paperspace. It seems to work pretty well now that I’m not dealing with setup issues, and I’m hoping I can focus on learning vs. installing dependencies haha.

For anyone else who is interested, it’s super easy - just go here: Remote Jupyter Kernel | Paperspace
