Lesson 2 official topic

A fastai Learner's model attribute is a plain PyTorch model, so you can save it any way you want.
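Since the model is just an nn.Module, the standard PyTorch saving patterns apply. A minimal sketch, using a tiny stand-in module in place of learn.model (which is assumed, not shown, in the post above):

```python
import torch
from torch import nn

# Stand-in for learn.model, which is a plain nn.Module.
model = nn.Linear(4, 2)

# Save only the weights (the usual recommendation)...
torch.save(model.state_dict(), "model_weights.pth")

# ...then restore them into a fresh module with the same architecture.
restored = nn.Linear(4, 2)
restored.load_state_dict(torch.load("model_weights.pth"))
```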

Jeremy spoke about this in the last iteration of the lesson. He recommends hosting the model on a server and accessing it via an API from the mobile device. The idea is that the server has more processing power, and this solution is feasible wherever internet connectivity is not an issue.
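As a rough sketch of that server-side setup, using only the standard library (the predict function here is a hypothetical stand-in for a real model call such as learn.predict, and the port is made up):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(image_bytes: bytes) -> dict:
    # Hypothetical stand-in: a real server would decode the image
    # and run the model here, e.g. learn.predict(img).
    return {"label": "cat", "confidence": 0.98}

class PredictHandler(BaseHTTPRequestHandler):
    # The mobile app POSTs an image and gets JSON predictions back.
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        payload = json.dumps(predict(body)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

# To serve (blocks forever):
# HTTPServer(("0.0.0.0", 8000), PredictHandler).serve_forever()
```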

Yes, you can save the model weights separately.

We use Streamlit in production to evaluate models. Combined with things like Altair, Matplotlib or Bokeh, it becomes quite powerful. I also used it to make an app to preview huge DICOM images with a sliding-window approach, so it is definitely very flexible. At first glance, Gradio seems to have similar functionality, but is maybe more straightforward to use?
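For comparison, a minimal Gradio interface would look roughly like the sketch below. The classify function is a hypothetical stand-in for a real model call, and the Gradio lines are commented out so the snippet does not require gradio to be installed:

```python
def classify(image):
    # Hypothetical stand-in for e.g. learn.predict(image):
    # returns a label -> probability mapping that gr.Label can render.
    return {"cat": 0.9, "dog": 0.1}

# import gradio as gr
# gr.Interface(fn=classify, inputs=gr.Image(), outputs=gr.Label()).launch()
```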

3 posts were merged into an existing topic: Help: Basics of fastai, PyTorch, numpy, etc :white_check_mark:

2 posts were merged into an existing topic: Help: SGD and Neural Net foundations :white_check_mark:

By default it will be .pth (that is what Learner.save writes; Learner.export writes export.pkl), but you can specify the filename. See the Learner.export docs here: Learner, Metrics, and Basic Callbacks | fastai

The flags look different than those used in nbdev. I don’t remember seeing the pipe operator. Is that something new in nbdev?

I think Jeremy may be using a new version of nbdev which is under development right now…
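For reference, the pipe is part of the new directive syntax in that in-development nbdev version: the old comment flags become "#|"-prefixed directives at the top of a cell. A sketch from memory, so double-check against the nbdev docs:

```python
# Old nbdev flags (comment style):
#   #export
#   #hide

# New nbdev directives use the "#|" prefix on the first lines of a cell:
#| default_exp core
#| export
#| hide
```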

For setting up with fastsetup, how much memory is required? And can it also be used for a CPU-only installation?

The Lex Fridman podcast with Travis Oliphant, creator of NumPy and Conda, is really interesting regarding the topic of Conda (or Mamba) vs Pip. It opened my eyes to some of the issues with using Pip for libraries that rely on non-Python dependencies.

Conda/Mamba is a sensible default for data science tooling in Python. Favour it over Pip if you want to keep your sanity imo.
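Concretely, a typical setup follows the pattern below (channel and package names follow fastai's install docs; the environment name is arbitrary):

```shell
# Create and activate an isolated environment (mamba is a drop-in,
# faster replacement for conda).
mamba create -n fastai python=3.10
conda activate fastai

# Install fastai from its recommended channel.
mamba install -c fastchan fastai

# Pip still has its place inside the env, for pure-Python extras.
pip install nbdev
```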

For the code we have seen so far, a CPU-based setup is sufficient. This is because we are using a small model (plus transfer learning) with a small image size (224x224), so we don't actually need an accelerator.

A handy fastai function for checking the installed versions. It creates a string you can share when debugging or reporting issues.

from fastai.test_utils import show_install
show_install()

=== Software === 
python        : 3.7.11
fastai        : 2.5.4
fastcore      : 1.3.27
fastprogress  : 0.2.7
torch         : 1.9.1
nvidia driver : 471.41
torch cuda    : 11.1 / is available
torch cudnn   : 8005 / is enabled

=== Hardware === 
nvidia gpus   : 1
torch devices : 1
  - gpu0      : NVIDIA GeForce RTX 3090

=== Environment === 
platform      : Linux-5.10.43.3-microsoft-standard-WSL2-x86_64-with-debian-bullseye-sid
distro        : #1 SMP Wed Jun 16 23:47:55 UTC 2021
conda env     : fastaidev
python        : /home/laplace/miniconda3/envs/fastaidev/bin/python
sys.path      : /mnt/d/projects/blog/wip/
/home/laplace/miniconda3/envs/fastaidev/lib/python37.zip
/home/laplace/miniconda3/envs/fastaidev/lib/python3.7
/home/laplace/miniconda3/envs/fastaidev/lib/python3.7/lib-dynload
/home/laplace/miniconda3/envs/fastaidev/lib/python3.7/site-packages
/mnt/d/projects/fastcore
/mnt/d/projects/fastai
/home/laplace/miniconda3/envs/fastaidev/lib/python3.7/site-packages/IPython/extensions
/home/laplace/.ipython
One issue with GitHub Pages is that you now need to make your repo public to use it for free. Are there any good alternatives for static website hosting that you have tried out?

I also started with Jekyll and GitHub Pages for my personal website, but eventually migrated to a custom Go solution hosted on DigitalOcean. I didn’t mind keeping the source open; however, I found it a bit inconvenient and somewhat limited for my purposes. Still, it's a great way to start! You can rewrite it later, especially if you have a domain name that you can easily redirect to another platform.

Another great lesson! Thanks for the major shout-out Jeremy, was not expecting it :sweat_smile:

5 posts were merged into an existing topic: Help: Creating a dataset, and using Gradio / Spaces :white_check_mark:

Hi all, really enjoyable lesson again today. All the Gradio/Streamlit chat is inspiring me to try and plan a project with fast.ai. So, on to my main question:

Has anyone done any pose estimation with fast.ai before?

Tim

5 posts were merged into an existing topic: Help: Using Colab or Kaggle :white_check_mark:

There is an example of head pose estimation in the fastai docs (see the Points section of Computer vision | fastai).
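To flesh that out, a keypoint-regression setup patterned on the fastai head-pose tutorial looks roughly like the sketch below. Here get_ctr is a hypothetical function returning the (x, y) target point for each image, and the code is commented out since it needs fastai and a dataset on disk:

```python
# from fastai.vision.all import *
#
# dblock = DataBlock(
#     blocks=(ImageBlock, PointBlock),  # PointBlock targets (x, y) coords
#     get_items=get_image_files,
#     get_y=get_ctr,                    # hypothetical: returns the keypoint
#     splitter=RandomSplitter(),
#     batch_tfms=aug_transforms(size=(120, 160)),
# )
# dls = dblock.dataloaders(path_to_images)
# learn = vision_learner(dls, resnet18, y_range=(-1, 1))
# learn.fine_tune(3)
```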