ONNX to TensorFlow to TensorFlow.js Notebook

I am working on a new tutorial that involves running models locally in the browser, and I settled on TensorFlow.js. The tutorial involves exporting from Fastai/PyTorch → ONNX → TensorFlow → TensorFlow.js. I made a Kaggle Notebook for converting ONNX models to TFJS and thought I’d share it here in case anyone wants a quick way to export models.

Kaggle Notebook: ONNX-to-TF-to-TFJS
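For reference, the conversion steps follow this general shape. This is a minimal sketch, not the notebook's exact commands: the file and directory names are placeholders, and package versions/flags may differ from what the notebook pins, so check the notebook for the tested versions.

```shell
# Install the conversion tools (unpinned here; the notebook uses specific versions)
pip install onnx onnx-tf tensorflowjs

# (The Fastai/PyTorch model is assumed to already be exported to ONNX,
#  e.g. via torch.onnx.export — covered in the notebook.)

# 1. ONNX -> TensorFlow SavedModel, using the onnx-tf CLI
onnx-tf convert -i model.onnx -o model_tf

# 2. TensorFlow SavedModel -> TensorFlow.js graph model
tensorflowjs_converter \
    --input_format=tf_saved_model \
    --output_format=tfjs_graph_model \
    model_tf model_tfjs
```

The resulting `model_tfjs` directory (a `model.json` plus binary weight shards) is what you host and load from the browser with TFJS.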

Note: I also tested using onnxruntime-web, but I found CPU inference only used one thread (despite supposedly being multi-threaded), and the WebGL backend (i.e., GPU inference) currently has limited operator support (and was not that fast when it did work). It’s relatively new, so I’ll probably try it again once it’s developed more.


Great post! Are you considering hosting the resulting model online, for example on GitHub Pages? There is a topic on the forum about fastpages blogs, if you want to try it. Trying out the JS model live online would be an amazing final step for the tutorial.

Oh, the notebook linked above is not the future tutorial I referenced. It’s just that the model conversion steps seemed standalone enough to make the notebook available as a separate resource.

The full tutorial will cover training an image classifier on HaGRID (HAnd Gesture Recognition Image Dataset) with Fastai, exporting it to TFJS using the steps in the notebook, and creating a plugin to perform inference with the model in WebGL builds for the Unity game engine. We’ll then wrap up the tutorial by hosting the project for free with GitHub Pages, like the demo below.

  • Live Unity WebGL Demo
    • Note: the predicted class shows as an invalid index while the model is still downloading to your computer.

If you are interested, I already made a tutorial covering object detection with IceVision on this dataset, which I talked about in another post.

And yes, my blog uses fastpages, although I have been curious about switching to Ghost. It’s tough to let go of free hosting via GitHub Pages, though.