How to make environment variables reach Dataflow workers in the Python SDK

风流意气都作罢 · Submitted 2020-01-01 09:44:02

Question


I am writing a custom sink with the Python SDK that stores data to AWS S3. Connecting to S3 requires credentials (an access key and secret key), but hard-coding them is bad for security. I would like to pass these values to the Dataflow workers as environment variables. How can I do that?


Answer 1:


Generally, for transmitting information to workers that you don't want to hard-code, you should use PipelineOptions - please see Creating Custom Options. Then, when constructing the pipeline, extract the parameters from your PipelineOptions object and pass them into your transform (e.g. into your DoFn or a sink).

However, for something as sensitive as a credential, passing it as a command-line argument might not be a great idea. I would recommend a more secure approach: put the credential into a file on GCS, and pass the name of the file as a PipelineOption. Then programmatically read the file from GCS whenever you need the credential, using GcsIO.



Source: https://stackoverflow.com/questions/40285589/how-to-make-the-environment-variables-reach-dataflow-workers-as-environment-vari
