How to read blob (pickle) files from GCS in a Google Cloud DataFlow job?

感动是毒 2020-12-21 14:42

I am trying to run a Dataflow pipeline remotely that uses a pickle file. Locally, I can load the file with the code below.

with open(known_args.file_path, 'rb') as f:
    data = pickle.load(f)

However, this does not work when the pipeline runs remotely, because the file then lives at a Google Cloud Storage gs:// path that open() cannot read.
2 Answers
  •  萌比男神i  2020-12-21 15:15

    open() is the standard Python library function, and it does not understand Google Cloud Storage paths. Use the Beam FileSystems API instead, which is aware of GCS and of the other filesystems supported by Beam.
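    A minimal sketch of that approach, assuming known_args.file_path holds a gs:// URL (the helper name load_pickle is just illustrative):

    import pickle

    from apache_beam.io.filesystems import FileSystems

    def load_pickle(path):
        # FileSystems.open() resolves gs:// (and local) paths and returns a
        # readable binary file-like object, which pickle.load() can consume.
        with FileSystems.open(path) as f:
            return pickle.load(f)

    data = load_pickle(known_args.file_path)

    Because FileSystems also handles local paths, the same call works both when you test the pipeline on your machine and when it runs on Dataflow workers.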
