My name is Sugianto. I only found out about this course 2 weeks ago and am on my way to course #3 in DL.
My question is: since we are using a different library (fastai, built on PyTorch) rather than the production-ready ones out there (TensorFlow), how does anyone apply what they have learnt here and do it in TensorFlow?
I am aware that the fastai library is not production-ready yet, and once I have completed DL part 1 I would like to deploy something in production at work.
What would be the best way to apply the learnings here in TF or any other library?
Yes, can anyone describe how they have moved (or plan to move) from a fastai/PyTorch kernel to a production-ready service (e.g. a real-time web service) that tests an image or piece of text against a model? The advice may be to redo it in TensorFlow, which is fine, but it would be good to know the approach to take and learn from others' forays (a rough sketch of one possible approach follows below).
I feel it is critical to have this info, given the ‘practical’ keyword in the course. Perhaps it’s a Part 2 challenge.
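To make the question concrete, here is a minimal sketch of one way to serve a trained PyTorch model as a real-time web service using Flask, assuming the model has already been exported with `torch.save(model, 'model.pt')`. The file name, the class labels, and the 224x224 input size are placeholders for illustration, not anything prescribed by the course or the fastai library, and fastai-specific export details are not covered here.

```python
# Minimal sketch: serve a saved PyTorch image classifier over HTTP with Flask.
# Assumes the full model object was exported via torch.save(model, 'model.pt')
# and that `classes` matches the label order used during training.
import io

import torch
from flask import Flask, jsonify, request
from PIL import Image
from torchvision import transforms

app = Flask(__name__)

model = torch.load('model.pt', map_location='cpu')  # hypothetical exported model file
model.eval()

classes = ['cat', 'dog']  # placeholder labels; replace with your own

# Standard ImageNet-style preprocessing; adjust to match your training pipeline.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

@app.route('/predict', methods=['POST'])
def predict():
    # Expect raw image bytes in the request body.
    img = Image.open(io.BytesIO(request.data)).convert('RGB')
    batch = preprocess(img).unsqueeze(0)  # shape: (1, 3, 224, 224)
    with torch.no_grad():
        logits = model(batch)
        probs = torch.softmax(logits, dim=1)[0]
    idx = int(probs.argmax())
    return jsonify({'class': classes[idx], 'confidence': float(probs[idx])})

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
```

You could test it with something like `curl -X POST --data-binary @cat.jpg http://localhost:5000/predict`. This is only one possible approach; rewriting the model in TensorFlow and serving it there is another, as the posts above suggest.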