Deployment Platform:

Hi everyone,

I’m the creator of an AI marketplace (currently in private beta).

We focus on letting you train, use, and share AI models with your friends and the rest of the world. We have been fans of this community since its early days and are happy to contribute to it.

Once you have trained your model, you can easily deploy it and use it on the web, on mobile, through our Python SDK, or via the API. After testing, you can share it with others so they can use your model too.

We have just created a fastai v2 quick guide that helps you get up and running. We would love for you to try it out and hear your feedback!

Our iOS app has been approved for beta testing! If you have completed the quick guide, feel free to get in touch so you can also start using your model on mobile.

All the best and stay safe.



Hi @zerotosingularity!

Thank you for creating the platform and sharing the starter material.

I’ve created a starter “not hot dog” app and I’m trying to upload my model. I’m not behind any firewall/VPN, I’m on Ubuntu, and my model is ~140 MB.

The upload takes forever to finish; on my connection it should ideally take 2–5 minutes.

Could you please help?


Thank you for testing. I’m looking into it!


Can you check again? I see a 135 MB model, and your can_inference flag is set to true (I just saw it change a minute ago).

While you try it out, I’ll look into it further…

Thanks, yes, it’s working now. Thank you! :slight_smile:


Glad to hear it’s working! Happy testing!

I’m here if you need me!


I am having problems after uploading the picture. How can I check whether the .pkl was uploaded correctly?


Confidence: 0
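In case it helps others debug this: one generic way to verify a .pkl reached the server intact is to compare the local file’s size and checksum against what the service reports (assuming the platform exposes those values; the thread doesn’t confirm a specific endpoint for it). A minimal local-side sketch in plain Python:

```python
import hashlib
import os
import tempfile

def file_fingerprint(path, chunk_size=1 << 20):
    """Return (size_in_bytes, md5_hexdigest) for a local file.

    Reading in 1 MB chunks keeps memory use flat even for a ~140 MB model.
    """
    md5 = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            md5.update(chunk)
    return os.path.getsize(path), md5.hexdigest()

# Demo with a small stand-in for export.pkl
with tempfile.NamedTemporaryFile(suffix=".pkl", delete=False) as tmp:
    tmp.write(b"fake model bytes")

size, digest = file_fingerprint(tmp.name)
print(size, digest)
os.remove(tmp.name)
```

If the local and remote size or hash disagree, the upload was truncated and should be retried.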

Let me have a look…

I see you have three models that have been uploaded. Are you using fastai v1 or v2? This post is in the v1 forum, but your models are tagged with v2.

Moving to private messages; once solved, we will report back here.

Small update [SOLVED]:

Added support for Ranger and EfficientNet-PyTorch last night, which fixes the problem. fp16 inference still needs some investigating…
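A note on why fp16 inference needs care: half precision carries only about 11 bits of mantissa, so values round visibly compared to fp32. A tiny, generic illustration using Python’s built-in struct half-float format (this is just a demo of the numeric behavior, not the platform’s code):

```python
import struct

def to_fp16(x):
    """Round-trip a float through IEEE 754 half precision ('e' format)."""
    return struct.unpack("<e", struct.pack("<e", x))[0]

print(to_fp16(0.5))  # 0.5 is exactly representable in half precision
print(to_fp16(0.1))  # 0.1 is not: the round-trip lands slightly off
```

This is why a model trained with mixed precision is typically converted back to fp32 before CPU inference.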


Sharing the question here, @zerotosingularity. I was a little confused during my first try as well. I’ve clarified the approach in the thread, but could you please add some details/clarifications to the starter notebook?



Thank you, @init_27!

I have updated the forum posts as well as the documentation on GitHub; I hope it’s clearer now.

Our iOS app has been approved for beta testing!

If you have completed the quick guide, feel free to get in touch so you can also start using your model on mobile.


I am getting the same output. Does it not work with fastai v2? I am using it for a multi-label classification problem.


@zerotosingularity can you help @dipam7 ?


@ttsantos already fixed it yesterday! :slight_smile:

I got a direct message but did not see the message here for some reason.

The issue was deploying a multi-label image classifier, which is not yet supported. I added a workaround so he can use the predictions, but confidence is not working at the moment.

Thanks for the heads up!!
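For context on why confidence is awkward for multi-label models: each label gets an independent sigmoid probability rather than one softmax score, so there is no single confidence value to report. A generic thresholding sketch in plain Python (illustrative only, not the platform’s implementation; the label names are made up):

```python
import math

def sigmoid(x):
    """Independent per-label probability, as used in multi-label heads."""
    return 1.0 / (1.0 + math.exp(-x))

def multilabel_predict(logits, labels, threshold=0.5):
    """Return every label whose probability clears the threshold.

    Unlike single-label softmax, several labels (or none) can fire at once.
    """
    probs = [sigmoid(z) for z in logits]
    return [(label, p) for label, p in zip(labels, probs) if p >= threshold]

# Hypothetical raw scores for three labels
preds = multilabel_predict([2.0, -1.5, 0.3], ["cat", "dog", "outdoor"])
print(preds)  # "cat" and "outdoor" clear the 0.5 threshold; "dog" does not
```

A deployment UI built around a single confidence number has to pick a convention here (e.g. the max probability, or one value per returned label), which is presumably why it needed special handling.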


Just released a new version of the iOS app for testing.

From the release notes:

This release contains two major changes:

  1. Public model feed

These models will be available to anyone who installs the app. Your models will show up in this feed if you mark them as ‘public’.

Our first model recognizes all the UN nation flags, and there are more in the pipeline. That said, we can’t wait to show your models here too!

  2. Follow-up actions

Once a prediction is made, you can use the outcome to search Google or Wikipedia. Later on, you will be able to add custom follow-up actions, such as showing your product page for the recognized product.


Let me know if you want to try it out, or if you have created a model you want to share publicly.

It’s been a while since I’ve updated this thread, but our iOS and Android apps are now live and free to use:



For iOS, we automatically convert your model to Core ML, so it runs locally on the device. Have a look at the updated quick guides to get you going:


Just added support for the fastai v2.0.13 release:

The fastest way to deploy your model, including on mobile :slight_smile:
