Run model in the browser with fastai 2

I’m trying to run a model with onnx in the web browser

I found a working solution for fastai 1: How to run a fastai model in the browser

  • Replaced cnn_learner with custom cnn_learner
    • Solves the “TypeError: unrecognized operator ‘Shape’” error

I’m trying to migrate the code to fastai 2.

  • Replaced cnn_learner with custom cnn_learner
    • Flatten() -> nn.Flatten()
    • dls.add_tfms -> dls.after_batch

I can export the model and load it with onnxjs.

But running the model with await session.run([warmupTensor]); results in “invalid inputs detected; op: BatchNormalization_50”.

Any ideas how I can fix it?
Or is there any other way to use a model directly in a browser?
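
One possible cause: onnx.js’s input check for BatchNormalization appears to expect a 4-D, image-shaped tensor, while fastai’s head applies BatchNorm1d to the 2-D output of Flatten — which would also explain why simply disabling the check makes the model run. If that is the case, a workaround is to give the batchnorm a [N, C, 1, 1] view so the exported node receives 4-D input. A sketch in plain PyTorch (BatchNormFlat is a hypothetical name, not a fastai or torch API):

```python
import torch
import torch.nn as nn

class BatchNormFlat(nn.Module):
    """Batchnorm over flat [N, C] features, computed on a 4-D view so the
    exported ONNX BatchNormalization node sees image-shaped input."""
    def __init__(self, nf):
        super().__init__()
        self.bn = nn.BatchNorm2d(nf)  # same per-channel stats as BatchNorm1d

    def forward(self, x):
        # [N, C] -> [N, C, 1, 1] -> batchnorm -> [N, C]
        return self.bn(x[:, :, None, None]).flatten(1)

bn = BatchNormFlat(8).eval()
print(bn(torch.randn(4, 8)).shape)  # torch.Size([4, 8])
```

Swapping this in for the BatchNorm1d in the custom head keeps the maths identical (the stats are still per channel) while changing only the exported node’s input rank.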

Thanks a lot!


It works if the check is disabled…

```js
// disable the check in the minified onnx.js source:
// if (!(t = e.op).checkInputs(h)) throw new Error("invalid inputs detected; op: " + …
t = e.op; // works
```