Platform: Amazon SageMaker - AWS

Anyone having problems importing fastbook? I’m getting stuck when using the fastai2 kernel.

#hide
!pip install -Uqq fastbook 
import fastbook 
fastbook.setup_book()

This yields the following error:

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input-7-2b820b2b946f> in <module>
  1 #hide
  2 get_ipython().system('pip install -Uqq fastbook')
----> 3 import fastbook
  4 fastbook.setup_book()

ModuleNotFoundError: No module named 'fastbook'
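
In case it helps with debugging: a common cause of this pattern is that pip installs into a different environment than the one the notebook kernel is actually running. A quick check from inside the notebook (just a sketch):

# Confirm which interpreter the kernel is using, and install into that same environment
import sys
print(sys.executable)

!{sys.executable} -m pip install -Uqq fastbook

import fastbook
fastbook.setup_book()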

I get this exact same error when I try to build everything with the script provided by @FraPochetti.
Still, thanks a lot for the template. I tried adding a few extra lines of code, such as conda update conda --all, but it didn't get rid of the error message. I ended up just using one of the initially provided templates. Thank you all for your help :slight_smile:

@_Nils Could you point to the template you used? Thanks.

@FraPochetti I also get the 5 minute timeout error when launching the cloudformation stack using the template you shared. Can you confirm this template still works?

@ganesh.bhat were you ever able to get this to work? Did adding ‘nohup’ to the pip commands fix the issue?

@matt.mcclean can you confirm this setup process still works?
I can report that the CloudFormation stack deploys with no issues, and I am able to access the SageMaker notebook instance/Jupyter. But the notebook instance it creates does not work with the fastai notebooks.

The very first notebook: https://github.com/fastai/fastbook/blob/master/01_intro.ipynb

cannot be completed without quite a few errors, all of which suggest graphviz is not configured correctly during the setup process in the cfn script.
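
A quick way to confirm that diagnosis from inside the notebook (just a sketch; it only checks that the Python bindings and the system dot binary are both visible to the kernel):

# graphviz needs two pieces: the python-graphviz package and the dot executable on PATH
import shutil

try:
    import graphviz
    print('python-graphviz version:', graphviz.__version__)
except ImportError:
    print('python-graphviz is not installed in this kernel')

print('dot binary:', shutil.which('dot') or 'not found on PATH')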

Sorry guys, I might have time to look into this next week.
Right now it is a bit hectic.

You can create a stack using the template from https://course.fast.ai/start_sagemaker, changing the two references from ‘course-v4’ to ‘fastbook’:

pip install -r /home/ec2-user/SageMaker/fastbook/requirements.txt

DefaultCodeRepository: https://github.com/fastai/fastbook

Notebook 01 runs without errors.

This fixes it. Thanks @AlisonDavey.

For anyone else who finds this thread with similar problems: I will submit a PR with the fix.

PR to fix: https://github.com/fastai/course20/pull/28

I selected the Frankfurt, Germany link since I live in Northern Europe, and got this response when trying to create the stack.

CREATE_FAILED The requested resource notebook-instance/ml.p2.xlarge is not available in this region (Service: AmazonSageMaker; Status Code: 400; Error Code: ResourceLimitExceeded; Request ID: 12ba24b0-cf28-4e37-aec0-e04a199ef168)

Hmm, that instance type is definitely supposed to be available in the Frankfurt region. Check the pricing page under the “On-Demand Notebook Instances” tab with the Frankfurt region selected.

You might need to request a quota increase.

Yes, AWS is insane.

I tried Ireland instead, but ran into having to request a quota increase. It took hours to get a response, which only repeated my request back to me. By that time I was up and running with Paperspace, so I cancelled AWS. Not going back to AWS anytime soon.

Has anyone seen this error while trying to deploy the model?
sagemaker_containers._errors.ClientError: name 'load_learner' is not defined

The error is coming from:
File "/usr/local/lib/python3.6/dist-packages/inference.py", line 15, in model_fn
learn = load_learner(model_dir, fname='export.pkl')
The file inference.py defines four functions: model_fn, input_fn, predict_fn and output_fn

and it imports:

import logging, requests, os, io, glob, time
from fastai.vision import *

I followed the instructions in this link https://github.com/fastai/course-v3/blob/master/docs/deployment_amzn_sagemaker.md

Could it be that the instance where the model is deployed does not have the right version of fastai?

This is my deploy instruction (from the link above):
predictor = model.deploy(initial_instance_count=1, instance_type='ml.t2.medium')
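
For what it's worth, as far as I can tell the from fastai.vision import * line only brings in load_learner on fastai v1; if the serving container ends up with fastai v2 installed, that name disappears, which would match the error above. A version-tolerant model_fn might look like this (a sketch, assuming the learner was exported as export.pkl into model_dir):

# inference.py sketch: load the exported learner under either fastai v1 or v2
import os

def model_fn(model_dir):
    try:
        # fastai v2: load_learner takes the full path to the exported pickle
        from fastai.learner import load_learner
        return load_learner(os.path.join(model_dir, 'export.pkl'))
    except ImportError:
        # fastai v1: load_learner takes the directory plus the file name
        from fastai.basic_train import load_learner
        return load_learner(model_dir, fname='export.pkl')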

@sujatha - The same question seems to be addressed in the previous version of the course category.

If you are looking for a low-cost way to deploy your fastai models into production, AWS Lambda just announced support for Container images to package your code and dependencies such as PyTorch & fastai libraries and your exported fastai model. I have setup an example project using the SAM CLI here: https://github.com/mattmcclean/fastai-container-sam-app.

Would love to hear your feedback!
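
For orientation, a handler inside such a container might look roughly like this (a sketch, not taken from that repo; the model path and event format are assumptions):

# Lambda container handler sketch for a fastai v2 image classifier
import base64, json
from fastai.vision.all import load_learner, PILImage

# Loaded once per container and reused across invocations
learn = load_learner('/opt/ml/model/export.pkl')  # hypothetical path baked into the image

def handler(event, context):
    img = PILImage.create(base64.b64decode(event['body']))  # assumes a base64-encoded image body
    pred, pred_idx, probs = learn.predict(img)
    return {
        'statusCode': 200,
        'body': json.dumps({'class': str(pred), 'confidence': float(probs[pred_idx])}),
    }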

This is great. I will try it out. How can this be used when the training data keeps changing on a daily or regular basis, as in a recommendation engine or a forecasting system?

Hi, I’m Sal and am trying to start the course. I have been following the Sagemaker instructions but AWS has told me that ml.p3.2xlarge is not available in the US-West, which seems odd to me. Is there any way to use the Notebook with other available instances?
Thanks

I was eventually able to get this working by using the template found at: https://course.fast.ai/start_sagemaker
and then adding the nohup prefix to the pip install commands.

Thanks @AlisonDavey and @ganesh.bhat

You do need to use CloudFormation: choose create-stack and enter the template manually in the designer, or upload it.
Perhaps someone could add this to the official one-click launch buttons; it would save some confusion for people not experienced with CloudFormation.
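
If you prefer scripting the launch instead of using the console designer, something like this should work (a sketch with boto3; the file name, stack name and region are placeholders):

# create_stack.py: launch the notebook stack from the template pasted below
import boto3

cfn = boto3.client('cloudformation', region_name='us-west-2')  # pick your region

with open('fastai-sagemaker.yaml') as f:  # the template below, saved locally
    template_body = f.read()

cfn.create_stack(
    StackName='fastai-v4-notebook',
    TemplateBody=template_body,
    Capabilities=['CAPABILITY_IAM'],  # the template creates an IAM role
    Parameters=[{'ParameterKey': 'InstanceType', 'ParameterValue': 'ml.p2.xlarge'}],
)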

For reference here is my working version:

AWSTemplateFormatVersion: 2010-09-09
Parameters:
  InstanceType:
    Type: String
    Default: ml.p2.xlarge
    AllowedValues:
      - ml.p3.2xlarge
      - ml.p2.xlarge
    Description: Enter the SageMaker Notebook instance type
  VolumeSize:
    Type: Number
    Default: 50
    Description: Enter the size of the EBS volume attached to the notebook instance
    MaxValue: 17592
    MinValue: 5
Resources:
  Fastai2SagemakerNotebookfastaiv4NotebookRoleA75B4C74:
    Type: AWS::IAM::Role
    DeletionPolicy: Delete
    Properties:
      AssumeRolePolicyDocument:
        Statement:
          - Action: sts:AssumeRole
            Effect: Allow
            Principal:
              Service: sagemaker.amazonaws.com
        Version: "2012-10-17"
      ManagedPolicyArns:
        - Fn::Join:
            - ""
            - - "arn:"
              - Ref: AWS::Partition
              - :iam::aws:policy/AmazonSageMakerFullAccess
    Metadata:
      aws:cdk:path: CdkFastaiv2SagemakerNbStack/Fastai2SagemakerNotebook/fastai-v4NotebookRole/Resource
  Fastai2SagemakerNotebookfastaiv4LifecycleConfigD72E2247:
    Type: AWS::SageMaker::NotebookInstanceLifecycleConfig
    DeletionPolicy: Delete
    Properties:
      NotebookInstanceLifecycleConfigName: fastai-v4LifecycleConfig
      OnCreate:
        - Content:
            Fn::Base64: >-
              #!/bin/bash

              set -e


              echo "Starting on Create script"


              sudo -i -u ec2-user bash <<EOF

              touch /home/ec2-user/SageMaker/.create-notebook

              EOF


              cat > /home/ec2-user/SageMaker/.fastai-install.sh <<\EOF

              #!/bin/bash

              set -e

              echo "Creating dirs and symlinks"

              mkdir -p /home/ec2-user/SageMaker/.cache

              mkdir -p /home/ec2-user/SageMaker/.fastai

              [ ! -L "/home/ec2-user/.cache" ] && ln -s /home/ec2-user/SageMaker/.cache /home/ec2-user/.cache

              [ ! -L "/home/ec2-user/.fastai" ] && ln -s /home/ec2-user/SageMaker/.fastai /home/ec2-user/.fastai


              echo "Updating conda"

              conda update -n base -c defaults conda -y

              conda update --all -y

              echo "Starting conda create command for fastai env"

              conda create -mqyp /home/ec2-user/SageMaker/.env/fastai python=3.6

              echo "Activate fastai conda env"

              conda init bash

              source ~/.bashrc

              conda activate /home/ec2-user/SageMaker/.env/fastai

              echo "Install ipython kernel and widgets"

              conda install ipywidgets ipykernel -y

              echo "Installing fastai lib"

              nohup pip install -r /home/ec2-user/SageMaker/fastbook/requirements.txt

              nohup pip install fastbook sagemaker

              echo "Installing Jupyter kernel for fastai"

              python -m ipykernel install --name 'fastai' --user

              echo "Finished installing fastai conda env"

              echo "Install Jupyter nbextensions"

              conda activate JupyterSystemEnv

              nohup pip install jupyter_contrib_nbextensions

              jupyter contrib nbextensions install --user

              echo "Restarting jupyter notebook server"

              pkill -f jupyter-notebook

              rm /home/ec2-user/SageMaker/.create-notebook

              echo "Exiting install script"

              EOF


              chown ec2-user:ec2-user /home/ec2-user/SageMaker/.fastai-install.sh

              chmod 755 /home/ec2-user/SageMaker/.fastai-install.sh


              sudo -i -u ec2-user bash <<EOF

              nohup /home/ec2-user/SageMaker/.fastai-install.sh &

              EOF


              echo "Finishing on Create script"
      OnStart:
        - Content:
            Fn::Base64: >-
              #!/bin/bash


              set -e


              echo "Starting on Start script"


              sudo -i -u ec2-user bash << EOF

              if [[ -f /home/ec2-user/SageMaker/.create-notebook ]]; then
                  echo "Skipping as currently installing conda env"
              else
                  # create symlinks to EBS volume
                  echo "Creating symlinks"
                  ln -s /home/ec2-user/SageMaker/.fastai /home/ec2-user/.fastai
                  echo "Updating conda"
                  conda update -n base -c defaults conda -y
                  echo "Activate fastai conda env"
                  conda init bash
                  source ~/.bashrc
                  conda activate /home/ec2-user/SageMaker/.env/fastai
                  echo "Updating fastai packages"
                  nohup pip install fastai fastcore sagemaker --upgrade
                  echo "Installing Jupyter kernel"
                  python -m ipykernel install --name 'fastai' --user
                  echo "Install Jupyter nbextensions"
                  conda activate JupyterSystemEnv
                  nohup pip install jupyter_contrib_nbextensions
                  jupyter contrib nbextensions install --user
                  echo "Restarting jupyter notebook server"
                  pkill -f jupyter-notebook
                  echo "Finished setting up Jupyter kernel"
              fi

              EOF


              echo "Finishing on Start script"
    Metadata:
      aws:cdk:path: CdkFastaiv2SagemakerNbStack/Fastai2SagemakerNotebook/fastai-v4LifecycleConfig
  Fastai2SagemakerNotebookfastaiv4NotebookInstance7C46E7E0:
    Type: AWS::SageMaker::NotebookInstance
    DeletionPolicy: Retain
    Properties:
      InstanceType:
        Ref: InstanceType
      RoleArn:
        Fn::GetAtt:
          - Fastai2SagemakerNotebookfastaiv4NotebookRoleA75B4C74
          - Arn
      DefaultCodeRepository: https://github.com/fastai/fastbook
      LifecycleConfigName: fastai-v4LifecycleConfig
      NotebookInstanceName: fastai-v4
      VolumeSizeInGB:
        Ref: VolumeSize
    Metadata:
      aws:cdk:path: CdkFastaiv2SagemakerNbStack/Fastai2SagemakerNotebook/fastai-v4NotebookInstance
  CDKMetadata:
    Type: AWS::CDK::Metadata
    Properties:
      Modules: aws-cdk=1.60.0,@aws-cdk/aws-iam=1.60.0,@aws-cdk/aws-sagemaker=1.60.0,@aws-cdk/cloud-assembly-schema=1.60.0,@aws-cdk/core=1.60.0,@aws-cdk/cx-api=1.60.0,@aws-cdk/region-info=1.60.0,jsii-runtime=node.js/v14.8.0
    Condition: CDKMetadataAvailable
Conditions:
  CDKMetadataAvailable:
    Fn::Or:
      - Fn::Or:
          - Fn::Equals:
              - Ref: AWS::Region
              - ap-east-1
          - Fn::Equals:
              - Ref: AWS::Region
              - ap-northeast-1
          - Fn::Equals:
              - Ref: AWS::Region
              - ap-northeast-2
          - Fn::Equals:
              - Ref: AWS::Region
              - ap-south-1
          - Fn::Equals:
              - Ref: AWS::Region
              - ap-southeast-1
          - Fn::Equals:
              - Ref: AWS::Region
              - ap-southeast-2
          - Fn::Equals:
              - Ref: AWS::Region
              - ca-central-1
          - Fn::Equals:
              - Ref: AWS::Region
              - cn-north-1
          - Fn::Equals:
              - Ref: AWS::Region
              - cn-northwest-1
          - Fn::Equals:
              - Ref: AWS::Region
              - eu-central-1
      - Fn::Or:
          - Fn::Equals:
              - Ref: AWS::Region
              - eu-north-1
          - Fn::Equals:
              - Ref: AWS::Region
              - eu-west-1
          - Fn::Equals:
              - Ref: AWS::Region
              - eu-west-2
          - Fn::Equals:
              - Ref: AWS::Region
              - eu-west-3
          - Fn::Equals:
              - Ref: AWS::Region
              - me-south-1
          - Fn::Equals:
              - Ref: AWS::Region
              - sa-east-1
          - Fn::Equals:
              - Ref: AWS::Region
              - us-east-1
          - Fn::Equals:
              - Ref: AWS::Region
              - us-east-2
          - Fn::Equals:
              - Ref: AWS::Region
              - us-west-1
          - Fn::Equals:
              - Ref: AWS::Region
              - us-west-2

Here’s a guide on how to deploy a fastai v2 model to a SageMaker endpoint using TorchServe (PyTorch >= 1.6). Largely based on great prior work by @matt.mcclean.

Feel free to use this as a template for deploying your own models. I suffered through a lot of issues getting this working so hopefully I can save you some of the pain.
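
If it helps anyone getting started, the first step is usually pulling the plain PyTorch module out of the exported learner so torch-model-archiver can package it; roughly like this (a sketch, assuming a fastai v2 image model exported with learn.export() and a 224x224 input size):

# Trace the underlying PyTorch model from a fastai learner for TorchServe packaging
import torch
from fastai.vision.all import load_learner

learn = load_learner('export.pkl')
model = learn.model.eval().cpu()

example = torch.randn(1, 3, 224, 224)  # dummy input; match your item transforms
traced = torch.jit.trace(model, example)
traced.save('model.pt')                # this .pt file is what torch-model-archiver wraps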
