google-bigquery

How to get rid of __key__ columns in BigQuery table for every 'Record' Type field?

流过昼夜 submitted on 2020-06-17 14:50:34
Question: For every 'Record'-type field in my Firestore export table, BigQuery automatically adds a '__key__' column. I do not want these added for each 'Record'-type field. How can I get rid of the extra columns that BigQuery adds automatically? (I want to get rid of the columns highlighted in yellow in my BigQuery table schema below.)

Answer 1: This is intended behavior, citing the BigQuery GCP documentation: Each document in Firestore has a unique key that contains information such as the …
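
Since the export schema itself cannot be changed, a common workaround is to query around the generated columns. Below is a minimal sketch using the Python client with hypothetical project/dataset/table names; note that EXCEPT only removes the top-level __key__, so keys nested inside record fields would still have to be dropped by listing fields explicitly:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Hypothetical names. SELECT * EXCEPT drops the top-level __key__ column
    # without modifying the underlying Firestore export table.
    query = """
    SELECT * EXCEPT (__key__)
    FROM `my-project.my_dataset.my_firestore_export`
    """
    for row in client.query(query).result():
        print(dict(row))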

BigQuery - Export CSV certain columns

旧街凉风 submitted on 2020-06-17 06:13:12
Question: I want to export a table or a view from BigQuery, but I do not need to export everything: I only need certain columns. How can I configure which ones to export? My current code is something like this:

    BigQuery bigQuery = BigQueryOptions.getDefaultInstance().getService();
    Field fieldToExport = Field.of("column to export", LegacySQLTypeName.STRING);
    Table table = bigQuery.getTable("mybigqueryid", "mytable", /* here it only accepts tableOptions, not fields */);
    String format = "csv"; …
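
One workaround, since BigQuery extract jobs always export whole tables: materialize just the wanted columns into a scratch table with a query, then extract that table to Cloud Storage. A rough sketch with the Python client (project, dataset, table, column, and bucket names are all hypothetical):

    from google.cloud import bigquery

    client = bigquery.Client()

    # Step 1: materialize only the wanted column into a scratch table.
    dest = bigquery.TableReference.from_string("my-project.my_dataset.export_tmp")
    job_config = bigquery.QueryJobConfig(
        destination=dest, write_disposition="WRITE_TRUNCATE"
    )
    client.query(
        "SELECT column_to_export FROM `my-project.my_dataset.mytable`",
        job_config=job_config,
    ).result()

    # Step 2: export the narrowed table to Cloud Storage as CSV.
    client.extract_table(dest, "gs://my-bucket/export-*.csv").result()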

Bigquery error: 400 No matching signature for operator BETWEEN for argument types: DATE, TIMESTAMP, TIMESTAMP

梦想与她 submitted on 2020-06-16 04:59:19
Question: I have deployed my web app on Google Cloud and it queries BigQuery; when I query the data I get the error "400 No matching signature for operator BETWEEN for argument types: DATE, STRING, STRING. Supported signature: (ANY) BETWEEN (ANY) AND (ANY) at [2:38]". Here is my SQL:

    """SELECT Record_Start_Time, Generator_Power
    FROM Furnace.FurnaceData
    WHERE Record_Start_Time BETWEEN TIMESTAMP("2018-01-21") AND TIMESTAMP("2018-07-21")
    ORDER BY Record_Start_Time
    LIMIT 100""".format(request.form['start'], request.form[ …
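
The error means the column and the BETWEEN bounds have mismatched types. One possible fix, sketched with the Python client under the assumption that Record_Start_Time is a DATE column (as the DATE in the error message suggests); query parameters also avoid splicing request.form values into the SQL with .format():

    from google.cloud import bigquery

    client = bigquery.Client()

    query = """
    SELECT Record_Start_Time, Generator_Power
    FROM Furnace.FurnaceData
    WHERE Record_Start_Time BETWEEN @start AND @end
    ORDER BY Record_Start_Time
    LIMIT 100
    """
    # DATE parameters match a DATE column, so BETWEEN sees consistent types.
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start", "DATE", "2018-01-21"),
            bigquery.ScalarQueryParameter("end", "DATE", "2018-07-21"),
        ]
    )
    rows = client.query(query, job_config=job_config).result()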

In Google BigQuery API, what is the default timeout for a query response?

。_饼干妹妹 submitted on 2020-06-16 03:41:30
Question: In the Google BigQuery API, what is the default timeout for a query response? In other words, how long does it wait by default until the response returns without results for an incomplete job?

Answer 1: The documentation for timeoutMs in jobs.query says: [Optional] How long to wait for the query to complete, in milliseconds, before the request times out and returns. Note that this is only a timeout for the request, not the query. If the query takes longer to run than the timeout value, the call returns without any results and with the 'jobComplete' flag set to false. …
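
The documented default for timeoutMs is 10 seconds, after which jobs.query returns with jobComplete set to false while the job keeps running. With the Python client you normally sidestep this by polling until the job is done; a small sketch:

    from google.cloud import bigquery

    client = bigquery.Client()
    job = client.query(
        "SELECT COUNT(*) FROM `bigquery-public-data.samples.shakespeare`"
    )

    # result() polls until the job finishes, so the server-side timeoutMs
    # never truncates the wait; timeout here is a client-side cap in seconds.
    rows = list(job.result(timeout=60))
    print(rows[0][0])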

Exporting BigQuery table from one project to another

与世无争的帅哥 submitted on 2020-06-16 02:20:07
Question: I'm trying to copy a BigQuery table (Table1) stored in one Google Cloud project (Project1) to another Google Cloud project (Project2). The table is on the order of TBs. What's the best way to do this so that I don't have to export the table locally? Should I export the table from Project1 to Google Cloud Storage and then load it into Project2, or is there a better way?

Answer 1: Use the bq command-line tool to copy a table from one project to another. You can have a look at the following sample command …
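
The same server-side copy is also available from the client libraries, so nothing touches the local machine; a minimal sketch with the Python client, using hypothetical dataset IDs (the CLI equivalent is roughly: bq cp Project1:my_dataset.Table1 Project2:my_dataset.Table1):

    from google.cloud import bigquery

    client = bigquery.Client()

    # Server-side copy: the caller needs read access on the source table
    # and write access on the destination dataset. Dataset IDs are hypothetical.
    job = client.copy_table(
        "project1.my_dataset.Table1",
        "project2.my_dataset.Table1",
    )
    job.result()  # block until the copy job finishes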

Can BigQuery be fast enough for real-time on-site requests

假如想象 submitted on 2020-06-16 01:35:32
Question: I'm looking into the possibility of using BigQuery and its API for on-site queries that depend on the content our visitors are viewing, so response time is crucial. I have loaded a very simply structured dataset of 10k rows (4 columns), and a very simple query takes between 1 and 2 seconds. My question is hopefully pretty simple to answer: will I ever be able to get a <1 sec response time from the BQ API by optimising the data in some way, or not? Thanks a lot in advance!
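
As a starting point it helps to measure where the time goes; below is a quick latency check with the Python client (the table name is hypothetical). Per-query scheduling overhead usually dominates on a 10k-row table, which is why even trivial queries tend to land around a second:

    import time

    from google.cloud import bigquery

    client = bigquery.Client()

    # Time the full round trip: job submission, scheduling, execution, fetch.
    start = time.monotonic()
    rows = list(
        client.query("SELECT * FROM `my_dataset.small_table` LIMIT 10").result()
    )
    print(f"round trip: {time.monotonic() - start:.2f}s, {len(rows)} rows")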

Using Python to Query GCP Stackdriver logs

让人想犯罪 __ submitted on 2020-06-15 05:59:22
Question: I am using Python 3 to query Stackdriver for GCP logs. Unfortunately, the log entries that hold the important data come back to me as "NoneType" instead of as a "dict" or a "str". The resulting "entry.payload" is of type "None", and "entry.payload_pb" has the data I want, but it is garbled. Is there a way to get Stackdriver to return this data in a clean format, or is there a way I can parse it? If not, is there a way I should query this data that is better than what I am doing and yields clean …
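
One possible way to parse the protobuf payloads, assuming the "garbled" entries are Cloud Audit Log records (the usual case when entry.payload is None): register the AuditLog proto type, then expand the Any payload into a dict. The filter below is hypothetical:

    from google.cloud import logging as gcp_logging
    from google.protobuf.json_format import MessageToDict

    # Importing the proto module registers AuditLog in the descriptor pool,
    # which lets MessageToDict expand the Any held in entry.payload_pb.
    from google.cloud.audit import audit_log_pb2  # noqa: F401

    client = gcp_logging.Client()
    log_filter = 'logName:"cloudaudit.googleapis.com"'  # hypothetical filter
    for entry in client.list_entries(filter_=log_filter, page_size=10):
        payload_pb = getattr(entry, "payload_pb", None)
        if payload_pb is not None:
            print(MessageToDict(payload_pb))  # decoded protobuf as a dict
        elif entry.payload is not None:
            print(entry.payload)  # already a dict or str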