google-bigquery

Migrating a Non-Partitioned Streaming Table to a Partitioned Table in BigQuery

Submitted on 2020-01-17 01:29:05
Question: I have a legacy unpartitioned BigQuery table that streams logs from various sources (let's say table BigOldA). The aim is to transfer it to a new day-partitioned table (let's say PartByDay), which is done with the help of the following link: https://cloud.google.com/bigquery/docs/creating-column-partitions#creating_a_partitioned_table_from_a_query_result bq query --allow_large_results --replace=true --destination_table <project>:<data-set>.<PartByDay> --time_partitioning_field REQUEST
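The migration in the linked guide can also be scripted. The sketch below simply assembles the same bq invocation quoted above in Python so it can be handed to subprocess; the project, dataset, and SQL are placeholders, and the partition field name is taken from the question's (possibly truncated) command.

```python
def build_migration_cmd(project, dataset, dest_table, partition_field, sql):
    # Mirrors the bq flags quoted in the question; every name passed in
    # below is a placeholder, not the asker's real value.
    return [
        "bq", "query",
        "--allow_large_results",
        "--replace=true",
        "--destination_table", f"{project}:{dataset}.{dest_table}",
        "--time_partitioning_field", partition_field,
        sql,
    ]

cmd = build_migration_cmd("my-project", "my_dataset", "PartByDay",
                          "REQUEST", "SELECT * FROM my_dataset.BigOldA")
print(" ".join(cmd))  # pass the list to subprocess.run(...) to execute
```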

Join uneven arrays from many columns and avoid duplicates in BigQuery

Submitted on 2020-01-17 00:36:01
Question: I asked a similar question here that I thought abstracted my problem sufficiently, but unfortunately it did not. I have a table of nested arrays; the first column is an int. I can join two arrays without duplication (as answered in my previous question), but I'm unsure how to do it with more than two. Here is the table (in standard SQL): WITH a AS ( SELECT 1 AS col1, ARRAY[1, 2] AS col2, ARRAY[1, 2, 3] AS col3, ARRAY[1, 2, 3, 4] AS col4 UNION ALL SELECT 2 AS col1, ARRAY[1, 2, 2] AS col2, ARRAY
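The deduplication the asker wants can be stated independently of SQL: flatten any number of arrays and keep only first occurrences. A minimal Python sketch of that logic (in BigQuery one would typically get the same effect with SELECT DISTINCT over UNNESTed arrays):

```python
def merge_unique(*arrays):
    # Flatten several arrays into one, keeping only the first occurrence
    # of each value -- the effect wanted when joining the unnested columns.
    seen = set()
    out = []
    for arr in arrays:
        for v in arr:
            if v not in seen:
                seen.add(v)
                out.append(v)
    return out

# Row 1 from the question's sample table:
print(merge_unique([1, 2], [1, 2, 3], [1, 2, 3, 4]))  # -> [1, 2, 3, 4]
```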

Wait for formula to be completed before executing script

Submitted on 2020-01-16 18:48:29
Question: I have this scenario: a sheet "BQ" is connected with BigQuery; a second sheet "F" contains a formula which reads data from "BQ"; an Apps Script triggers the BigQuery data reload. Here is the script sample: var spreadsheet = SpreadsheetApp.getActive().getSheetByName(sheetName); SpreadsheetApp.enableAllDataSourcesExecution(); spreadsheet.getRange("A1").getCell(1,1).getDataSourceTables()[0].refreshData(); This Apps Script performs these actions: trigger the BigQuery data reload, as mentioned; "wait"

Why is the Geostats Table empty when I use Google Ads Transfer (BigQuery Data Transfer Service)?

Submitted on 2020-01-16 16:32:49
Question: I am trying to get the below criteria for advertisement spending data to compute ROAS: 'AccountDescriptiveName', 'Date', 'CampaignName', 'CampaignId', 'AdNetworkType2', 'AccountTimeZone', 'Impressions', 'Clicks', 'Cost', 'AccountCurrencyCode', 'Conversions', 'CountryCriteriaId'. The CountryCriteriaId is only available in the GEO_PERFORMANCE_REPORT, which is represented by the GeoStats table in the Google Ads Transfer Service. I have found that some tables, like the CampaignStats table, are not empty. However, the

Issues loading CSV into BigQuery table

Submitted on 2020-01-16 14:31:30
Question: I'm trying to create a BigQuery table using a pretty simple CSV file I have stored in GCS. I keep getting the same error over and over again: Could not parse '1/1/2008' as datetime for field XXX. I've checked that the CSV file isn't corrupted, and I've managed to upload everything into one column, so the file is readable by BigQuery. I've added the word NULL to any empty fields, thinking consecutive delimiters may be causing the issues, but I am still facing the same issue. I know data, I
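The error above is consistent with a format mismatch: BigQuery's CSV loader expects DATETIME values like 'YYYY-MM-DD HH:MM:SS', so a US-style 'M/D/YYYY' string fails to parse. A minimal preprocessing sketch that rewrites such values before loading (field and file names would be the reader's own; alternatively, one can load the column as STRING and convert in SQL afterwards):

```python
from datetime import datetime

def fix_date(us_date):
    # 'M/D/YYYY' (e.g. '1/1/2008') -> 'YYYY-MM-DD HH:MM:SS', the layout
    # BigQuery's CSV loader accepts for a DATETIME column.
    return datetime.strptime(us_date, "%m/%d/%Y").strftime("%Y-%m-%d %H:%M:%S")

print(fix_date("1/1/2008"))  # -> 2008-01-01 00:00:00
```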

Joining 3 tables in Google BigQuery

Submitted on 2020-01-16 08:23:27
Question: The example below stops at the first JOIN with an error message: Encountered " "JOIN" "JOIN "" at line 13, column 4. Was expecting: ")". Am I missing something obvious with multiple joins in BigQuery? SELECT type.CourseType AS CourseType, SUM(joined.assign.StudentCount) AS StudentN FROM ( SELECT assign.StateCourseCode, assign.StateCourseName, assign.MatchType, assign.Term, assign.StudentCount FROM [Assignment.AssignmentExtract5] AS assign JOIN SELECT wgt.Term, wgt.Weight FROM [Crosswalk
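The parser stops because, in BigQuery (as in most SQL dialects), a derived table after JOIN must be wrapped in parentheses and given an alias; a bare "JOIN SELECT ..." is not valid syntax. A small SQLite stand-in with simplified, hypothetical versions of the question's tables shows the required shape:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE assign(StateCourseCode TEXT, Term TEXT, StudentCount INT);
CREATE TABLE wgt(Term TEXT, Weight REAL);
INSERT INTO assign VALUES ('C1', 'Fall', 10), ('C1', 'Spring', 20);
INSERT INTO wgt VALUES ('Fall', 1.0), ('Spring', 0.5);
""")
# The subquery after JOIN is parenthesized and aliased; omitting either
# is what triggers the 'Was expecting: ")"' parser error.
rows = con.execute("""
SELECT a.StateCourseCode, SUM(a.StudentCount * w.Weight)
FROM assign AS a
JOIN (SELECT Term, Weight FROM wgt) AS w
  ON a.Term = w.Term
GROUP BY a.StateCourseCode
""").fetchall()
print(rows)  # -> [('C1', 20.0)]
```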

Uploading a CSV file to BigQuery using Google Apps Script fails

Submitted on 2020-01-16 05:32:25
Question: I wrote a Google Apps Script to automatically upload a CSV file to BigQuery. It was a small file (5 MB), and it worked. Now I'm trying to upload a 150 MB CSV file with the very same script, and I always get a "server error". It is supposed to work up to 1 GB, isn't it? I appreciate your help! Albert
Answer 1: BigQuery has a 1 GB limit, but the Google Apps Script UrlFetch POST size can be at most 10 MB. You cannot POST a 100 MB file using Apps Script UrlFetch. Check the quota limits here: https://docs
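Following the answer's reasoning, the failure point is the UrlFetch payload cap, not BigQuery. A trivial pre-flight sketch of that check (the ~10 MB figure comes from the answer above; the file sizes are the ones in the question):

```python
URLFETCH_QUOTA_BYTES = 10 * 1024 * 1024  # UrlFetch POST cap cited in the answer

def fits_urlfetch_quota(payload_bytes):
    # Returns True only if the payload is small enough to POST via
    # Apps Script UrlFetch; larger files need another upload path.
    return payload_bytes <= URLFETCH_QUOTA_BYTES

print(fits_urlfetch_quota(5 * 1024 * 1024))    # the 5 MB file: True
print(fits_urlfetch_quota(150 * 1024 * 1024))  # the 150 MB file: False
```

A common workaround for files over the cap is to stage them in Google Cloud Storage and load into BigQuery from there, rather than POSTing the bytes directly.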

“Unexpected. Please try again” error when copying BigQuery table

Submitted on 2020-01-16 05:00:06
Question: I'm trying to copy an existing table, and the job fails with the error "Unexpected. Please try again" with no other information. I get the same result whether doing it in the web console or through the bq command-line tool. Here is the job id: arcx-prod:job_XVgg_YqvXZRTFucTJhhh8D-cWU8. A few potentially related pieces of information: copying succeeds for other tables in the same project; exporting data from this table also fails with the same error.
Answer 1: This is a known issue in the latest

Slow BigQuery response

Submitted on 2020-01-16 00:54:47
Question: I've got a relatively small dataset to which I'm writing IP log analytics and then running queries against it. I'm updating BigQuery once per hour with updated stats. I have 110,000 rows with 37 MB of data. The query below takes anywhere from 7 seconds to 50+ seconds to run: SELECT SUM(1) as views FROM [statistics.statsLogNSI] WHERE lastedit > DATE_ADD(CURRENT_TIMESTAMP(), -7, "DAY") My more complex query examples are below: SELECT SUM(1) as views FROM [statistics.statsLogNSI] WHERE NOT combination
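As a side note on the query itself, SUM(1) is a roundabout way of writing COUNT(*); the two always agree on the number of matching rows. A small SQLite stand-in with made-up rows (not the asker's data) illustrates the equivalence, though rewriting it will not by itself fix the latency:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE statsLog(lastedit TEXT)")
con.executemany("INSERT INTO statsLog VALUES (?)",
                [("2020-01-10",), ("2020-01-14",), ("2019-12-01",)])
# SUM(1) adds 1 per matching row, which is exactly what COUNT(*) reports.
sum1 = con.execute(
    "SELECT SUM(1) FROM statsLog WHERE lastedit > '2020-01-09'").fetchone()[0]
count = con.execute(
    "SELECT COUNT(*) FROM statsLog WHERE lastedit > '2020-01-09'").fetchone()[0]
print(sum1, count)  # -> 2 2
```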

Replacing apostrophe or quotes in bq SQL

Submitted on 2020-01-15 20:11:36
Question: I'm new to using bq, the BigQuery command-line utility. I have a more complex SQL clause that unfortunately has both apostrophe and quote characters within the SQL statement. Since both characters are in the SQL statement, I'm looking for some replacement for them; otherwise one or the other will be interpreted as an "end of query" delimiter. Below is the query that I am trying to run, which works great in the BigQuery HTML interface, but not so great using the bq command-line utility. Suggestions
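One general approach is to let a script do the shell quoting: Python's shlex.quote wraps a string containing both apostrophes and double quotes into a single safe shell token, so bq receives the statement verbatim. The query text below is a made-up stand-in, not the asker's actual statement:

```python
import shlex

sql = '''SELECT name FROM [my.dataset.table] WHERE note = "it's done"'''
# shlex.quote yields one shell token, so the statement survives intact
# even though it mixes ' and " characters.
cmd = "bq query " + shlex.quote(sql)
print(cmd)
```

Another common route is to write the statement to a file and substitute its contents into the command, which sidesteps inline quoting entirely.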