Streaming inference

How can I integrate a fastai model into a streaming data pipeline running on Apache Spark or Apache Flink, i.e. when the model needs the last couple of hours of the stream (per key) as its input?

Should the fastai model be converted to ONNX?


Not fastai-native yet, but the talk Machine Learning Inference in Flink with ONNX (YouTube) may be useful for some initial thoughts on how this could work.
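
To make the keyed-window part concrete, here is a minimal sketch of the same idea using Spark Structured Streaming plus onnxruntime (the question mentions both engines; Flink would follow the same pattern with its DataStream API and windowing). Everything here is an assumption for illustration: the Kafka topic and schema, the window sizes, the fixed input length, the ONNX input name "input", and the already-exported model.onnx file.

```python
# Sketch: per-key sliding windows feeding an ONNX model, assuming a model that
# takes a fixed-length 1-D float vector per key. Requires the spark-sql-kafka
# connector package on the classpath.
import numpy as np
import onnxruntime as ort
import pandas as pd
from pyspark.sql import SparkSession, functions as F

SEQ_LEN = 120  # fixed input length the (assumed) model expects

spark = SparkSession.builder.appName("streaming-onnx").getOrCreate()

# Raw event stream: one (key, value, timestamp) record per Kafka message.
# Topic name and the string-encoded double payload are assumptions.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "sensor-events")
    .load()
    .selectExpr(
        "CAST(key AS STRING) AS key",
        "CAST(CAST(value AS STRING) AS DOUBLE) AS value",
        "timestamp",
    )
)

# Collect the last two hours of values per key in a sliding window; the engine,
# not the model, maintains the per-key history.
windowed = (
    events.withWatermark("timestamp", "2 hours")
    .groupBy("key", F.window("timestamp", "2 hours", "10 minutes"))
    .agg(F.collect_list("value").alias("values"))
)

_session = None  # created lazily, once per executor process

@F.pandas_udf("double")
def score(values: pd.Series) -> pd.Series:
    global _session
    if _session is None:
        _session = ort.InferenceSession("model.onnx")
    def run(v):
        x = np.asarray(v, dtype=np.float32)[-SEQ_LEN:]  # truncate long windows
        x = np.pad(x, (SEQ_LEN - len(x), 0))            # left-pad short ones
        out = _session.run(None, {"input": x[None, :]})[0]
        return float(out.ravel()[0])
    return values.apply(run)

query = (
    windowed.select("key", "window", score("values").alias("prediction"))
    .writeStream.outputMode("update").format("console").start()
)
```

The two design points worth copying regardless of engine: the windowing layer accumulates the last couple of hours per key so the model only ever sees a fixed-size input, and the onnxruntime session is instantiated lazily on the workers rather than shipped from the driver.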

GitHub - muellerzr/fastinference (a collection of inference modules for fastai2) contains some useful examples of how to perform the ONNX export.
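
For the export step itself, a minimal sketch along those lines might look like this, assuming an image Learner previously saved with learn.export() to export.pkl; the file names, dummy-input size, and opset version are illustrative:

```python
# Sketch: export a trained fastai Learner's underlying PyTorch model to ONNX.
import torch
from fastai.vision.all import load_learner

learn = load_learner("export.pkl")
model = learn.model.eval().cpu()

# torch.onnx.export traces the module with a dummy input of the right shape.
dummy = torch.randn(1, 3, 224, 224)  # one 224x224 RGB image (assumed size)
torch.onnx.export(
    model, dummy, "model.onnx",
    input_names=["input"], output_names=["output"],
    dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
    opset_version=13,
)
```

Note that only learn.model ends up in the graph: fastai's item/batch transforms (resizing, normalization, decoding of predictions) are not captured and have to be replicated wherever the ONNX file is consumed.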

Another example is Cross-platform inference using fast.ai models (tape software), a blog post covering the ONNX export of fastai models.
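
On the consuming side, loading and scoring the exported file with onnxruntime is then framework-agnostic. A minimal sketch, assuming the input name "input" and ImageNet normalization stats (both must match what the fastai dataloaders actually used):

```python
# Sketch: run the exported model with onnxruntime, replicating fastai's
# preprocessing by hand.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx")

# Normalize exactly as the training dataloaders did; ImageNet stats are an
# assumption here.
mean = np.array([0.485, 0.456, 0.406], dtype=np.float32).reshape(3, 1, 1)
std = np.array([0.229, 0.224, 0.225], dtype=np.float32).reshape(3, 1, 1)
img = np.random.rand(3, 224, 224).astype(np.float32)  # stand-in for a real image
x = (img - mean) / std

logits = session.run(None, {"input": x[None, ...]})[0]
pred = int(np.argmax(logits, axis=1)[0])
```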
