List all the tables in a dataset in bigquery using bq CLI and store them to google cloud storage

Submitted on 2020-08-02 05:34:28

Question


I have around 108 tables in a dataset. I am trying to extract all those tables using the following bash script:

# get list of tables
tables=$(bq ls "$project:$dataset" | awk '{print $1}' | tail -n +3)

# extract into storage
for table in $tables
do
    bq extract --destination_format "NEWLINE_DELIMITED_JSON" --compression "GZIP" "$project:$dataset.$table" "gs://$bucket/$dataset/$table.json.gz" 
done

But it seems that bq ls only shows around 50 tables at a time, so I cannot extract all of them to Cloud Storage.

Is there any way I can access all 108 tables using the bq ls command?


Answer 1:


By default, bq ls displays 100 rows when listing tables. You can change this with the --max_results (or -n) command-line flag.

You can also set the default values for bq in $HOME/.bigqueryrc.

Adding flags to .bigqueryrc
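For example, a .bigqueryrc that raises the listing cap for every bq ls invocation might look like this (a sketch; the project ID and the value 1000 are illustrative, and the [ls] section follows the file's per-command-section convention):

```
# Global flags, one per line.
--project_id=my-project

# Flags applied only to the `ls` command.
[ls]
--max_results=1000
```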




Answer 2:


I tried with the CLI, and this command worked for me:

bq ls --max_results 1000 'project_id:dataset'

Note: set --max_results to a number at least as large as your table count.
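Once the listing cap is raised, the question's parsing pipeline itself is sound. As a self-contained illustration (using simulated bq ls output, since this sketch does not call BigQuery), the first-column extraction and header skipping work like this:

```shell
#!/usr/bin/env bash
# Simulated `bq ls` output: a column-header line, a separator line,
# then one row per table. Used so the sketch runs without BigQuery.
bq_output="  tableId   Type
 --------- -------
  table_a   TABLE
  table_b   TABLE"

# Same parsing as the question's script: keep the first column and
# skip the two header lines. The result can feed the `bq extract` loop.
tables=$(printf '%s\n' "$bq_output" | awk '{print $1}' | tail -n +3)
printf '%s\n' "$tables"
```

Against a real dataset, replace the simulated output with `bq ls --max_results 1000 "$project:$dataset"`.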



Source: https://stackoverflow.com/questions/54208253/list-all-the-tables-in-a-dataset-in-bigquery-using-bq-cli-and-store-them-to-goog
