TensorFlow Model Deployment in GCP without TensorFlow Serving

Submitted by 大憨熊 on 2020-01-02 09:55:13

Question


Machine learning model: TensorFlow based (version 1.9), Python 3.6

Data input: from BigQuery

Data output: to BigQuery

Production prediction frequency: monthly

I have developed a TensorFlow-based machine learning model. I have trained it locally and want to deploy it on Google Cloud Platform for predictions.

The model reads input data from Google BigQuery, and the output predictions have to be written back to Google BigQuery. There are some data preparation scripts that have to run before the model prediction runs. Currently I cannot use BigQuery ML in production because it is still in beta. Additionally, since this is batch prediction, I don't think TensorFlow Serving is a good choice.

Deployment strategies I have considered:

  1. Use Google ML Engine for prediction: This approach creates output part files on GCS, which have to be combined and written to Google BigQuery. So with this approach I have to spin up a VM just to execute the data preparation script and the script that writes the ML Engine output to Google BigQuery. That adds up to a 24x7 VM cost just for running two scripts a month.
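For what it's worth, ML Engine batch prediction writes its output as newline-delimited JSON shards (named like `prediction.results-00000-of-000NN`), so "combining" the part files is plain concatenation that a short script can do before a single BigQuery load. A minimal local sketch (file names and record fields here are illustrative, not from the question):

```python
import glob
import json
import os
import tempfile

def merge_prediction_parts(input_dir, output_path):
    """Concatenate batch-prediction output shards into one NDJSON file."""
    parts = sorted(glob.glob(os.path.join(input_dir, "prediction.results-*")))
    with open(output_path, "w") as out:
        for part in parts:
            with open(part) as f:
                for line in f:
                    if line.strip():  # skip any blank lines between shards
                        out.write(line if line.endswith("\n") else line + "\n")
    return output_path

# Local demo with fake shards that follow the real naming pattern.
tmp = tempfile.mkdtemp()
shards = [[{"id": 1, "score": 0.9}], [{"id": 2, "score": 0.3}]]
for i, rows in enumerate(shards):
    name = "prediction.results-%05d-of-%05d" % (i, len(shards))
    with open(os.path.join(tmp, name), "w") as f:
        for r in rows:
            f.write(json.dumps(r) + "\n")

merged = merge_prediction_parts(tmp, os.path.join(tmp, "merged.json"))
print(sum(1 for _ in open(merged)))  # → 2
```

The merged NDJSON file can then be loaded into BigQuery in one load job, which avoids per-shard loads but still leaves the question of where this script runs.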

  2. Use Dataflow for the data preparation script along with Google ML Engine: Dataflow uses Python 2.7, while the model is developed with TensorFlow 1.9 and Python 3.6, so this approach cannot be used.

  3. Google App Engine: With this approach, a complete web application has to be developed in order to serve predictions. Since the predictions are batch, this approach is not suitable. Additionally, Flask/Django would have to be integrated with the code.

  4. Google Compute Engine: With this approach, the VM would run 24x7 just to execute monthly predictions and two scripts. This would cause a lot of cost overhead.

I would like to know the best deployment approach for TensorFlow models that have pre- and post-processing scripts.


Answer 1:


Regarding option 2, Dataflow can read from BigQuery and store the prepared data back in BigQuery at the end of the job.
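The preparation step in question is ordinary per-row logic, so it is worth noting how little code the transform itself needs. A minimal sketch with hypothetical column names (`amount`, `category` are placeholders, not from the question); inside a Beam/Dataflow pipeline the same function would run as e.g. `beam.Map(prepare_row)`:

```python
def prepare_row(row):
    """Hypothetical per-row preparation: fill nulls and rescale a feature."""
    out = dict(row)
    out["amount"] = float(out["amount"] or 0) / 100.0        # cents -> units
    out["category"] = (out.get("category") or "unknown").lower()
    return out

rows = [
    {"id": 1, "amount": "250", "category": "Retail"},
    {"id": 2, "amount": None, "category": None},
]
print([prepare_row(r) for r in rows])
# → [{'id': 1, 'amount': 2.5, 'category': 'retail'},
#    {'id': 2, 'amount': 0.0, 'category': 'unknown'}]
```

Keeping the transform as a pure function like this means it can run unchanged in a Dataflow pipeline or in a standalone script, whichever the Python-version constraints allow.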

Then you can have TensorFlow use BigQueryReader to read data from BigQuery.

Another option is Datalab, a notebook environment in which you can prepare your data and then use it for your prediction.




Answer 2:


I've also not found this process flow easy or intuitive. There are two recent updates that might help your project:

  • BigQuery ML now lets you import TensorFlow models (link). There are some limitations, but this may eliminate some of the back-and-forth data movement between BigQuery and Cloud Storage or other environments.
  • Cloud Dataflow supports Python 3 in alpha (check the Apache Beam roadmap - link).


Source: https://stackoverflow.com/questions/52621627/tensorflow-model-deployment-in-gcp-without-tensorflow-serving
