Have you followed the steps in the Fastbook Production Notebook?
The notebook’s first cell just installs fastbook and runs its setup (the rest of the attached file is truncated):

```python
#hide
! [ -e /content ] && pip install -Uqq fastbook
import fastbook
fastbook.setup_book()
```
There are a couple of threads that might point you in the right direction:
How to Deploy Fast.ai Models? (Voilà, Binder and Heroku)
Medium article:
code:
I hope this helps.
Following fastbook 02_production.ipynb, I wanted to clarify something:
You need the export.pkl file in your GitHub repo, correct? Otherwise, when Binder builds and deploys your app, there won’t be any weights to load for inference.
But GitHub frowns upon large files and will even reject them if they are too big (mine is).
So how do we get our inference model loaded?
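One workaround (my own sketch, not something from the fastbook notebook) is to keep export.pkl out of the repo entirely and have the app download it at startup, e.g. from a GitHub release asset or another file host. A minimal sketch, where MODEL_URL is a hypothetical direct-download link:

```python
# Minimal sketch, assuming export.pkl is hosted at a direct-download URL.
# MODEL_URL is a placeholder, not a real location.
from pathlib import Path
from urllib.request import urlretrieve

from fastai.vision.all import load_learner

MODEL_URL = "https://example.com/export.pkl"  # hypothetical hosting location
MODEL_PATH = Path("export.pkl")

# Fetch the weights on first run instead of committing them to the repo.
if not MODEL_PATH.exists():
    urlretrieve(MODEL_URL, MODEL_PATH)

# Standard fastai inference loading, as in 02_production.ipynb.
learn = load_learner(MODEL_PATH)
```

GitHub release assets are convenient here because they can be much larger than files committed to the repo itself, and the download only happens once per container start.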
https://forums.fast.ai/search?q=voila
Alternatively, you can deploy your model to the web, iOS, and Android using the SeeMe.ai quick guide:
https://course.fast.ai/deployment_seeme_ai
No setup is required, and models are converted so they can also run locally/offline on mobile devices.
Disclaimer: I’m the creator of SeeMe.ai, so I’m obviously biased, but I’m here to help.