Google BigQuery export table to multiple files in Google Cloud Storage and sometimes one single file

Posted by 泄露秘密 on 2021-02-10 20:25:46

Question


I am using the BigQuery Python client library to export data from BigQuery tables into GCS in CSV format.

I have given a wildcard pattern in the destination URI, assuming some tables can be more than 1 GB.

Sometimes, even though the table is only a few MB, the export creates multiple files, and sometimes it creates just one file.

Is there a logic behind this?

My export workflow is the following:

from google.cloud import bigquery

project = bq_project
dataset_id = bq_dataset_id
table_id = bq_table_id
bucket_name = bq_bucket_name
workflow_name = workflow_nm
csv_file_nm = workflow_nm + "/" + csv_file_prefix_in_gcs + '*'

client = bigquery.Client()
destination_uri = "gs://{}/{}".format(bucket_name, csv_file_nm)
dataset_ref = client.dataset(dataset_id, project=project)
table_ref = dataset_ref.table(table_id)
destination_table = client.get_table(dataset_ref.table(table_id))
configuration = bigquery.job.ExtractJobConfig()
configuration.destination_format = 'CSV'
# csv_file_nm = workflow_nm + "/" + csv_file_prefix_in_gcs   (variant without the wildcard)
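For completeness, here is a minimal sketch (not part of the original snippet) of how an extract job configured this way is typically submitted with the same client library; the location value is an assumption and should match the dataset's location, and the variables are the placeholders from the snippet above.

# Sketch only: submit the extract job built above and wait for it to finish.
extract_job = client.extract_table(
    table_ref,
    destination_uri,
    job_config=configuration,
    location="US",  # assumption: set this to the dataset's actual location
)
extract_job.result()  # blocks until the export completes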

Answer 1:


I think this is the intended behaviour of the export. The BigQuery export documentation specifies the following:

When you export data to multiple files, the size of the files will vary.

This corresponds to the behavior you are seeing in your exports.
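As a rough illustration of the difference: with a wildcard URI, BigQuery is free to shard the export into one or more files of varying size, even for small tables; with a single non-wildcard URI you get exactly one file, but only while the exported data stays within the 1 GB per-file limit. The bucket, dataset, and table names below are placeholders, not values from the question.

from google.cloud import bigquery

client = bigquery.Client()
table_ref = client.dataset("my_dataset", project="my_project").table("my_table")

job_config = bigquery.job.ExtractJobConfig()
job_config.destination_format = "CSV"

# Wildcard URI: BigQuery decides how many files to write, and their sizes vary.
sharded_uri = "gs://my_bucket/exports/my_table_*.csv"

# Single URI (no wildcard): exactly one output file, limited to 1 GB of data.
single_uri = "gs://my_bucket/exports/my_table.csv"

client.extract_table(table_ref, single_uri, job_config=job_config).result()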



Source: https://stackoverflow.com/questions/58447205/google-bigquery-export-table-to-multiple-files-in-google-cloud-storage-and-somet
