Fast.ai on Apple M1

Hi forum folks, I have been using paperspace.io for all my tinkering with fast.ai.

I have recently been enticed by Apple’s new M1 MacBook Air and am debating trading in my current MacBook Pro (2019, 6-core Intel Core i7) for it. Has anyone else made this switch and been using the new Apple chip for basic ML/DL tinkering?

I have an M1 MacBook Air which I use mostly for frontend dev and for any DL work in TF, since TF is optimised for the M1. Feel free to ask me any questions about it; I have been using this device for over a month now.
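
If it helps, here is the quick sanity check I run after setting it up; a minimal sketch assuming the tensorflow-macos and tensorflow-metal pip packages (adjust for your own install), with random data just to confirm the Metal GPU is being picked up.

```python
# Sanity check that the Apple-silicon TensorFlow build sees the M1 GPU.
# Assumes: pip install tensorflow-macos tensorflow-metal
import numpy as np
import tensorflow as tf

print(tf.__version__)
print(tf.config.list_physical_devices('GPU'))  # should list the Metal device

# Tiny throwaway model on random data, just to watch the GPU get used.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu', input_shape=(32,)),
    tf.keras.layers.Dense(10),
])
model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))

x = np.random.rand(1024, 32).astype('float32')
y = np.random.randint(0, 10, size=(1024,))
model.fit(x, y, epochs=1, batch_size=64)
```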

Note that at this point almost no ML software uses the features of the M1, except for that version of TF @Diganta mentioned. But even that only uses the GPU at the moment, not the built-in NPU (Apple’s Neural Engine).

Until the ML software you’re using is optimized to use all of the power of the M1, these computers offer no advantages over renting compute in the cloud. (Unless all you’re doing is inference, but that’s another story.)


True, though the M1-optimized TF is ridiculously fast and efficient. Check this report for the details.


I was just able to train the whole model from lesson 1 on my MBP with an M1 Max (24 GPU cores).
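
For anyone curious what that involves, here is a minimal sketch of the lesson-1 pet classifier pushed onto Apple’s Metal (MPS) backend; it assumes a recent fastai plus a PyTorch build with MPS support, so treat it as a rough starting point rather than the exact notebook.

```python
# Rough sketch of the lesson-1 pet classifier on the M1 GPU (MPS backend).
# Assumes a recent fastai and a PyTorch build with torch.backends.mps available.
from fastai.vision.all import *
import torch

# Use the Metal GPU when the build supports it, otherwise fall back to CPU.
device = torch.device('mps') if torch.backends.mps.is_available() else torch.device('cpu')

path = untar_data(URLs.PETS) / 'images'

def is_cat(x):
    # In the Oxford-IIIT Pets dataset, cat breeds have capitalised filenames.
    return x[0].isupper()

dls = ImageDataLoaders.from_name_func(
    path, get_image_files(path), valid_pct=0.2, seed=42,
    label_func=is_cat, item_tfms=Resize(224), device=device)

learn = vision_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(1)
```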


😮 whoa…!

You can use PyTorch on the M1, but I have hit the occasional bug which then becomes a blocker. Mostly I’d recommend using an NVIDIA GPU if you want to get things done.
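
For anyone who wants to try it anyway, here is a minimal sketch of selecting the MPS backend in plain PyTorch (1.12 or newer); the tiny model and data are just placeholders. If an op turns out to be unimplemented on MPS, exporting PYTORCH_ENABLE_MPS_FALLBACK=1 before launching Python lets it fall back to CPU, which keeps things running at the cost of speed.

```python
# Minimal sketch: plain PyTorch on the M1 GPU via the MPS backend (PyTorch 1.12+).
# The model and data below are placeholders, not anything from the course.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Pick the Metal GPU if the build supports it, otherwise stay on CPU.
device = torch.device('mps') if torch.backends.mps.is_available() else torch.device('cpu')

model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10)).to(device)
opt = torch.optim.SGD(model.parameters(), lr=1e-2)

x = torch.randn(64, 784, device=device)          # dummy batch
y = torch.randint(0, 10, (64,), device=device)   # dummy labels

loss = F.cross_entropy(model(x), y)
loss.backward()
opt.step()
print(f"device={device}, loss={loss.item():.4f}")
```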
