I am trying to run a query on a 12 GB CSV file loaded into Google BigQuery, but I can't run any query on the dataset. I am not sure whether the dataset loaded correctly. It shows a
job.errors contains the detailed errors for the job.
This doesn't appear to be documented anywhere, but you can see it in the source code: https://googlecloudplatform.github.io/google-cloud-python/0.20.0/_modules/google/cloud/bigquery/job.html (Ctrl+F for _AsyncJob).
So your wait_for_job code could look like this:
import time

def wait_for_job(job):
    """Poll the job until it finishes, raising if it ended in an error."""
    while True:
        job.reload()  # refresh the job state from the BigQuery API
        if job.state == 'DONE':
            if job.error_result:
                raise RuntimeError(job.errors)
            return
        time.sleep(1)
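If it's the load job itself that's failing, the same pattern applies. Here's a minimal sketch, assuming the older 0.x google-cloud-bigquery client that the linked docs describe; the dataset, table, job, and bucket names are placeholders:

from google.cloud import bigquery

client = bigquery.Client()

# Placeholder names -- substitute your own dataset, table, and GCS path.
table = client.dataset('my_dataset').table('my_table')
table.schema = [  # required if the destination table doesn't exist yet
    bigquery.SchemaField('col1', 'STRING'),
    bigquery.SchemaField('col2', 'INTEGER'),
]

# Job names must be unique per project.
job = client.load_table_from_storage(
    'my-load-job', table, 'gs://my-bucket/my-file.csv')
job.source_format = 'CSV'
job.begin()        # start the asynchronous load job

wait_for_job(job)  # raises RuntimeError(job.errors) if the load failed

If wait_for_job raises, the exception message contains the per-row parsing errors, which should tell you whether the 12 GB CSV was loaded correctly.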