Batch with Google BigQuery and Python


Question


What's the most efficient way to perform a batch insert in Python using the Google BigQuery API? I was trying to stream rows into a table (code along the lines of the sketch after the list below) on a large dataset (1,000,000+ rows), but it is taking a long time to insert them. Is there a more efficient way to insert a large dataset in Python?

  • The table already exists and already contains data.
  • I have a list of 1,000,000 data points I want to insert.
  • I'd like to do it in Python, because I'll reuse the code many times.
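For reference, since the original code was not included in the question, a streaming insert of the kind described typically looks like this minimal sketch using the google-cloud-bigquery client; the table ID and row contents are placeholders:

    from google.cloud import bigquery

    client = bigquery.Client()
    table_id = "my-project.my_dataset.my_table"  # placeholder table ID

    # Streaming insert: each call is one HTTP request for a batch of rows.
    # For 1,000,000+ rows this means many round trips, which is why it is slow.
    rows = [{"name": "point-1", "value": 1.0}]  # placeholder rows
    errors = client.insert_rows_json(table_id, rows)
    if errors:
        print("Insert errors:", errors)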

Answer 1:


I don't think streaming (the insertAll API) makes sense in your case. You should try a load job instead; see the Python code example in the documentation.
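As a rough illustration of the load-job route (not the exact snippet from the documentation), the google-cloud-bigquery client can ingest a newline-delimited JSON file in a single job; the file name and table ID below are placeholders:

    from google.cloud import bigquery

    client = bigquery.Client()
    table_id = "my-project.my_dataset.my_table"  # placeholder table ID

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    )

    # One load job ingests the whole file server-side,
    # instead of streaming rows request by request.
    with open("datapoints.ndjson", "rb") as source_file:  # placeholder file
        load_job = client.load_table_from_file(
            source_file, table_id, job_config=job_config
        )

    load_job.result()  # block until the job completes
    print(f"Loaded {client.get_table(table_id).num_rows} rows.")

A load job is billed and throttled differently from streaming inserts and handles millions of rows in one operation, which is the usual reason it is recommended for bulk backfills like this one.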



Source: https://stackoverflow.com/questions/39085925/batch-with-google-bigquery-and-python
