
I have a custom TensorFlow model. How can I store that model in Firebase Storage and write a Firebase Function that uses it to make predictions?
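To make the goal concrete, here is a minimal sketch of the kind of Function I have in mind. It assumes the model has been converted to TensorFlow.js format and that its files are bundled locally with the function source under a "model/" folder (loading from Firebase Storage is exactly the part I have not solved); the path and input shape are placeholders, not my real setup.

```js
const functions = require('firebase-functions');
const tf = require('@tensorflow/tfjs-node');

let model; // cache the loaded model across invocations of the same instance

exports.sendPrediction = functions.https.onRequest(async (req, res) => {
  if (!model) {
    // Placeholder: model files assumed to be bundled with the function source
    model = await tf.loadLayersModel('file://model/model.json');
  }
  const input = tf.tensor2d([req.body.features]); // placeholder input format
  const output = model.predict(input);
  res.json({ prediction: Array.from(await output.data()) });
});
```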

I am aware of this architecture, which can solve my issue.

But the problem with this architecture is that it relies on ML Engine in GCP, so to use it I would have to upgrade my Firebase Spark plan to the Blaze plan in order to access GCP services.

I don't want to upgrade right now for two reasons: (a) I am still developing and not yet sure which cloud platform I will choose in the end, and (b) for my application, this architecture would be very costly because I have streaming data.

Firebase Functions run in Node.js, so I tried to install the TensorFlow JavaScript library for Node. When I deployed the Firebase Function with "tfjs-node": "^1.5.1" as a dependency (the relevant part of my package.json is shown after the error), I got the following error:

⚠  functions[sendPrediction(us-central1)]: Deployment error.
Build failed: {"error": {"canonicalCode": "INVALID_ARGUMENT", "errorMessage": "`npm_install` had stderr output:\nnpm ERR! code E404\nnpm ERR! 404 Not Found: tfjs-node@^1.5.1\n\nnpm ERR! A complete log of this run can be found in:\nnpm ERR!     /builder/home/.npm/_logs/2020-01-11T01_18_50_880Z-debug.log\n\nerror: `npm_install` returned code: 1", "errorType": "InternalError", "errorId": "D17329C9"}}
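For reference, this is the relevant part of my functions/package.json; the version numbers other than tfjs-node are illustrative:

```json
{
  "dependencies": {
    "firebase-functions": "^3.3.0",
    "firebase-admin": "^8.9.0",
    "tfjs-node": "^1.5.1"
  }
}
```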
