Putting the Model Into Production: Web Apps

(WG) #43

So with version 2 now … how do we deploy with docker, given that the type parameter is now deprecated?

(Ben Mainye) #44

Just posted in Now.

(Ben Mainye) #45

Thank you! Will try once more.

(Christian Werner) #46

Already posted in another thread but here goes:

I used Dokku to deploy to a cloud instance… Works great

Blog post is here:

(Nikhil Utane) #47

Snap is used to install heroku CLI locally.
sudo snap install --classic heroku

(M) #48

Ah, thank you. I see. I already have the CLI installed. I get as far as step E under Deploying to Heroku, but when I try to enter the command for release I get a weird error that “container:release” is not a heroku command. Any ideas?

(Nikhil Utane) #49

What’s the exact command you’re using?
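
For reference, the usual container-based deploy sequence looks roughly like this (the app name is a placeholder; on older Heroku CLI versions the container-registry plugin had to be installed separately, and a missing plugin is one known cause of the “container:release is not a heroku command” error):

    # only needed on older CLI versions; without it, container:* commands
    # are unrecognized
    heroku plugins:install @heroku-cli/plugin-container-registry

    heroku container:login                       # authenticate Docker with Heroku's registry
    heroku container:push web -a <your-app>      # build and push the image
    heroku container:release web -a <your-app>   # release the pushed image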

(Arunoda Susiripala) #50

Use your now.json like this:

    {
        "features": {
            "cloud": "v1"
        },
        "version": 1,
        "type": "docker"
    }

(WG) #52

Reading the v2 docs and the forums, it looks like the ultimate plan is to eventually end support for docker on zeit.

This is something folks here are probably going to be interested in.

(Arunoda Susiripala) #53

Eventually yes.
But only if we can support almost all of these use cases.
Currently we can’t deploy these kinds of apps with v2.
Once we can do that with v2, then we’ll look into deprecating v1.

(Jeremy Howard (Admin)) #54

That’s the opposite of what you told me earlier. We need to be able to rely on platforms that we support to maintain compatibility, or at least provide consistent and accurate information. :frowning:

(Arunoda Susiripala) #55

Oh! You’ve got that wrong in the note above.
See this: https://twitter.com/rauchg/status/1060581580303810561
(This is our CEO)

I also talked to him about this matter.

We are going to support docker-based deployments on the v2 API.
So, once we do that, we can deprecate v1.

Even then, the changes to now.json will be pretty simple.
Other than that, there’s nothing we need to change.

(Jeremy Howard (Admin)) #56

OK that’s encouraging. I saw also in the HN discussion there was a lot of concern about v2 (including thoughtful comments from @simonw ). Hopefully your CEO is realizing that people like the current product and continuing to support it is important!

(WG) #57

That’s not what I’m hearing. This is a quote from Matheus Fernandes (a Zeit employee) from a day ago:

yes, ultimately we do plan to remove Docker support. We thought a lot about containers for the past 3 years, analyzed thousands of use cases and spoke with thousands of customers, and we concluded that functions/lambdas are superior to containers for the vast majority of workloads.

This is not how you do upgrades. If anything, given the prevalence of dockerized apps on Zeit … docker deployments should have been a first-class priority and an out-of-the-box option for v2. Instead, it’s relegated to an afterthought … and that, along with the conflicting statements coming out of the company, makes me highly skeptical going forward.

Until there is some clear communication on what the docker roadmap is for ZEIT … I’m recommending that folks avoid it. If there are other docker-friendly/docker-first vendors out there that folks have had good experiences with, I’d love to hear about it.

(Arunoda Susiripala) #58

Here’s a quote from the CEO: https://twitter.com/rauchg/status/1060581580303810561

Our v2 API is based on a serverless model, which helps a lot with scaling, pricing, etc.
But we are not going to remove docker support until we have solid support for it in the v2 API.

Anyway, I think we should not argue about this here :smiley:

Just talking as a student, I think we should add more types of production deployment solutions.
And not rely 100% on ZEIT.


Is zeit free for our size of apps?

(WG) #60

I’m bookmarking this :slight_smile:

Yah, agree that this is not the place to argue about it and I really appreciate all your help to folks here on using Zeit.

Speaking of serverless approaches, is there a write-up in the docs (or elsewhere) that demonstrates how you would deploy a lambda that utilizes a fastai/pytorch model to make a prediction and return the results as JSON?

I’m also interested in whether there is a way to persist uploaded images in a serverless environment. For example, let’s say I want folks to upload an image, return a prediction, save the image, and then allow the end user to let the app know whether it got the prediction right or not.

(Arunoda Susiripala) #61

Actually I tried to do that.
It sometimes works.

The problem is Lambda has a limitation of 250MB. (This is on AWS, but other cloud services do the same.)
When we add the model and all the deps, it’s pretty hard to keep it under 250MB (and 50MB zipped).

Here’s our serverless python example: https://github.com/zeit/now-examples/tree/master/python
Give it a try.

But with our current example app, we download the model when the server loads. This is problematic for serverless, as it loads a new server for each and every request.
It’ll add some latency.
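
The pattern being discussed can be sketched roughly like this. The `load_model` and `predict` helpers below are hypothetical stand-ins for fastai/PyTorch calls (e.g. `load_learner` and `learner.predict`); the key point is that the model is loaded at module scope, so warm invocations on the same container reuse it and only cold starts pay the loading cost:

```python
import json

def load_model():
    """Hypothetical stand-in for loading a fastai/PyTorch learner.

    In a real deployment this would be something like
    load_learner('export.pkl'), executed once at module import time.
    """
    classes = ['cat', 'dog']
    def predict(image_bytes):
        # stand-in for learner.predict(...); returns (label, confidence)
        return classes[len(image_bytes) % len(classes)], 0.87
    return predict

# module scope: runs once per container, not once per request
predict = load_model()

def handler(event, context):
    """AWS Lambda entry point: reads raw bytes from event['body'],
    runs the (stubbed) prediction, and returns the result as JSON."""
    image_bytes = (event.get('body') or '').encode()
    label, confidence = predict(image_bytes)
    return {
        'statusCode': 200,
        'headers': {'Content-Type': 'application/json'},
        'body': json.dumps({'prediction': label, 'confidence': confidence}),
    }
```

This is only a sketch of the shape of such a handler; the 250MB unzipped / 50MB zipped package limits mentioned above are what make bundling the real model and its dependencies difficult.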

(WG) #62

Yah, that is going to be an issue unless you guys can create something like an S3 bucket where we can upload our models and use them from there (that would save 82MB right there).
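
Something like the bucket idea can already be approximated on AWS by fetching the model into Lambda’s writable `/tmp` directory at cold start and caching it for warm invocations. A minimal sketch, where the `download` callable is a hypothetical stand-in for an object-storage fetch (e.g. a boto3 `download_file` call):

```python
import os

def ensure_model(download, local_path='/tmp/export.pkl'):
    """Fetch the model once per container; warm invocations reuse the cache.

    `download(local_path)` is a hypothetical callable that pulls the model
    from object storage (e.g. S3 via boto3) into Lambda's writable /tmp.
    Because /tmp survives between warm invocations on the same container,
    only cold starts pay the download cost.
    """
    if not os.path.exists(local_path):
        download(local_path)
    return local_path
```

This keeps the model out of the deployment package entirely, which is one way around the 250MB package limit discussed above, at the cost of download latency on cold starts.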