Deployment to Sagemaker broken

Hello everyone,

The deployment to Amazon SageMaker seems to be broken.
When creating a SageMaker instance from the CloudFormation stack, the environment comes with package versions that are incompatible with the current sagemaker package,
e.g. requests=2.22.0, while sagemaker needs <=2.20.0.
I downgraded the necessary packages, but now, when trying to build the model as explained in the documentation, the deployment fails because load_learner cannot be found, even though fastai is installed.

algo-1-k8580_1  | Successfully built serve
algo-1-k8580_1  | Installing collected packages: serve
algo-1-k8580_1  | Successfully installed serve-1.0.0
algo-1-k8580_1  | You are using pip version 18.1, however version 19.2.1 is available.
algo-1-k8580_1  | You should consider upgrading via the 'pip install --upgrade pip' command.
algo-1-k8580_1  | [2019-07-25 07:47:45 +0000] [28] [ERROR] Error handling request /ping
algo-1-k8580_1  | Traceback (most recent call last):
algo-1-k8580_1  |   File "/usr/local/lib/python3.6/dist-packages/sagemaker_containers/_functions.py", line 85, in wrapper
algo-1-k8580_1  |     return fn(*args, **kwargs)
algo-1-k8580_1  |   File "/usr/local/lib/python3.6/dist-packages/serve.py", line 14, in model_fn
algo-1-k8580_1  |     learn = load_learner(model_dir, fname='resnet50.pkl')
algo-1-k8580_1  | NameError: name 'load_learner' is not defined

I can’t figure out how to solve it. Any ideas?

Hi @faib
Can you make sure that load_learner is also imported in your serve.py?

from fastai.basic_train import load_learner
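To see why the explicit import matters, here is a minimal, fastai-free sketch: Python resolves the names inside a function body only when the function is called, so a `model_fn` that uses `load_learner` loads fine but fails with a `NameError` the moment SageMaker first hits `/ping` and invokes it. (The `model_fn` and path below mirror the traceback above; the body is illustrative.)

```python
# Sketch: names in a function body are resolved at call time, so a
# missing import in serve.py only surfaces when model_fn is invoked.
def model_fn(model_dir):
    # load_learner was never imported in this file
    learn = load_learner(model_dir, fname='resnet50.pkl')
    return learn

try:
    model_fn('/opt/ml/model')
except NameError as e:
    print(e)  # name 'load_learner' is not defined

# The fix is the import at the top of serve.py:
# from fastai.basic_train import load_learner
```

This is also why the container starts up cleanly and only logs the error when handling the `/ping` request.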

We built a quick way to deploy to SageMaker without needing to set up the endpoint, model, and so on in AWS SageMaker yourself. It's just one command call.

You can find the example notebook and readme at https://github.com/bentoml/BentoML/tree/master/examples/deploy-with-sagemaker

After you've trained your model, you can deploy it to SageMaker with three additional cells of code.

First, spec out your model service:

%%writefile my_fastai_service.py

from bentoml import api, env, artifacts, BentoService
from bentoml.artifact import FastaiModelArtifact
from bentoml.handlers import DataframeHandler

# conda_pip_dependencies is a keyword argument, not a function call
@env(conda_pip_dependencies=['fastai'])
@artifacts([FastaiModelArtifact('learner')])
class MyFastaiService(BentoService):
    @api(DataframeHandler)
    def predict(self, df):
        return self.artifacts.learner.predict(df)

Second, save it to a local directory:

from my_fastai_service import MyFastaiService

svc = MyFastaiService.pack(learner=learner)
saved_path = svc.save('/tmp/bentoml')

In the last cell, run the deploy command:

!bentoml deploy {saved_path} --platform=aws-sagemaker --region=us-west-1

And that’s it!

You can find notebooks working with fastai at https://github.com/bentoml/gallery/tree/master/fast-ai
I posted two examples, one with Pet classification (lesson 1) and one with Tabular CSV (lesson 4).

Let me know how it goes

Cheers

Bo

Hey @yubozhao!
Thank you for your answer! I tried using

from fastai.basic_train import load_learner

However, even with the explicit import, SageMaker doesn't work. Looking at CloudWatch, it gives me a similar error:

Traceback (most recent call last):
File "/usr/local/lib/python3.6/dist-packages/sagemaker_containers/_modules.py", line 246, in import_module
module = importlib.import_module(name)
File "/usr/lib/python3.6/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "&lt;frozen importlib._bootstrap&gt;", line 994, in _gcd_import
File "&lt;frozen importlib._bootstrap&gt;", line 971, in _find_and_load
File "&lt;frozen importlib._bootstrap&gt;", line 955, in _find_and_load_unlocked
File "&lt;frozen importlib._bootstrap&gt;", line 665, in _load_unlocked
File "&lt;frozen importlib._bootstrap_external&gt;", line 678, in exec_module
File "&lt;frozen importlib._bootstrap&gt;", line 219, in _call_with_frames_removed
File "/usr/local/lib/python3.6/dist-packages/serve.py", line 12, in &lt;module&gt;
from fastai.basic_train import load_learner

I will have to try out BentoML, thanks for pointing me in that direction.

I’m having the same issues now as well. From what I’ve found so far, there might be a version conflict with the fastai dependencies in the default container provided by SageMaker. The solution I found is to include a requirements.txt file in the same path as the entry point file. Did you find a different solution, or were you able to get this working?
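For reference, a minimal sketch of that workaround: SageMaker's script-mode containers pip-install a requirements.txt found alongside the entry point script when the container starts, so placing one next to serve.py lets you pin compatible versions. (The exact pins below are illustrative, not verified working versions.)

```
# requirements.txt, in the same source_dir as serve.py (the entry point)
fastai
requests<=2.20.0
```

The container installs these before importing serve.py, which is what gives the pinned versions a chance to take effect inside the endpoint rather than only in the notebook environment.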

I’m having the same problem now, too.
I did find people saying that !pip install --upgrade git+https://github.com/fastai/fastai.git worked for them, but it didn’t for me.
If anyone has solved this, any guidance would be really appreciated.
Thanks in advance