google-bigquery

Bigquery Custom Schedule Cron Syntax Not Accepted

Submitted by 久未见 on 2020-12-12 05:50:25
Question: I am trying to schedule a query to run intraday in the BigQuery UI. According to Google's documentation this option uses cron syntax. I have verified my syntax with crontab guru, but no matter what expression I enter, the scheduler doesn't seem to accept it. Is this a known bug? Below is the cron expression I'm using to run every 6 hours: 0 */6 * * *

Answer 1: From the official documentation: when selecting Custom, a cron-like time specification is expected, for example "every 3 hours".
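In other words, the Custom option does not take raw crontab expressions like 0 */6 * * *; it expects the "every ..." schedule strings used by BigQuery scheduled queries and data transfers. A few illustrative examples of that format (a sketch, not an exhaustive list):

```text
every 6 hours
every 15 minutes
every day 02:30
every monday 09:00
```

So the 6-hour schedule from the question would be written as "every 6 hours" rather than as a cron expression.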

How do I convert a BigQuery row to JSON using the C# API?

Submitted by 烈酒焚心 on 2020-12-11 04:42:33
Question: I am pulling some data from a BigQuery table using the code below in C#:

BigQueryClient client = BigQueryClient.Create("<Project Name>");
BigQueryTable table = client.GetTable("<Database>", "Students");
string sql = $"select * FROM {table} where Marks='50'";
BigQueryResults results = client.ExecuteQuery(sql);
foreach (BigQueryRow row in results.GetRows())
{
}

I want to be able to either read the entire results variable into JSON or get the JSON out of each row. Of course, I could …
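One way to avoid serializing rows by hand on the C# side is to have BigQuery build the JSON itself with TO_JSON_STRING and read back a single string column. A minimal sketch of that query, reusing the hypothetical <Database>.Students table from the question:

```sql
-- Each result row is a single JSON object containing all columns of t.
SELECT TO_JSON_STRING(t) AS json_row
FROM `<Database>.Students` AS t
WHERE Marks = '50';
```

Inside the foreach loop, row["json_row"] would then already be the JSON text for that row, so no per-field conversion is needed in C#.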

Execute a BigQuery query in Cloud Build step

Submitted by 怎甘沉沦 on 2020-12-10 08:45:11
Question: I'm using Cloud Build with the gcloud builder. I override the entrypoint to be bq so I can run some BigQuery SQL in my build step. Previously, I had the SQL embedded directly in the YAML config for Cloud Build. This works fine:

steps:
- name: gcr.io/cloud-builders/gcloud
  entrypoint: 'bq'
  args: ['query', '--use_legacy_sql=false', 'SELECT 1']

Now I'd like to refactor the SQL out of the YAML and into a file instead. According to here, you can cat the file or pipe it to bq. This works on the …
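A sketch of what the file-based variant could look like, assuming a query.sql file sitting in the build context (the file name and path are illustrative, not from the original post). Switching the entrypoint to bash lets the step redirect the file into bq:

```yaml
steps:
  - name: gcr.io/cloud-builders/gcloud
    entrypoint: 'bash'
    args: ['-c', 'bq query --use_legacy_sql=false < query.sql']
```

Here bq reads the query from stdin, so the SQL stays in its own file and out of the YAML.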

Is it possible to limit a Google service account to specific BigQuery datasets within a project?

Submitted by 久未见 on 2020-12-07 01:02:41
Question: I've set up a service account using the GCP UI for a specific project, Project X. Within Project X there are 3 datasets: Dataset 1, Dataset 2 and Dataset 3. If I assign the BigQuery Admin role on Project X, all three datasets inherit the permissions granted to the service account at the project level. Is there any way to modify the permissions for the service account so that it only has access to specified datasets? e.g. allow …
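One common pattern is to remove the project-level grant and instead grant a role on the individual datasets the service account should see. A hedged sketch using BigQuery's GRANT DDL, with hypothetical project, dataset, and service-account names:

```sql
-- Dataset-level grant: the service account can read dataset_1 only,
-- provided it no longer holds a broader role at the project level.
GRANT `roles/bigquery.dataViewer`
ON SCHEMA `my-project.dataset_1`
TO "serviceAccount:my-sa@my-project.iam.gserviceaccount.com";
```

The same dataset-level permissions can also be set through the dataset's sharing settings in the console or with the bq CLI rather than DDL.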

Bigquery: Append to a nested record

Submitted by *爱你&永不变心* on 2020-12-06 11:08:48
Question: I'm currently checking out BigQuery, and I want to know if it's possible to add new data to a nested table. For example, if I have a table like this:

[
  { "name": "name", "type": "STRING" },
  {
    "name": "phone", "type": "RECORD", "mode": "REPEATED",
    "fields": [
      { "name": "number", "type": "STRING" },
      { "name": "type", "type": "STRING" }
    ]
  }
]

And then I insert a phone number for the contact John Doe:

INSERT INTO socialdata.phones_examples (name, phone) VALUES ("Jonh Doe", [("555555", "Home")]);
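For a contact that is already in the table, appending to the repeated field is usually done with an UPDATE that concatenates a new STRUCT onto the existing array, rather than another INSERT. A minimal sketch against the question's socialdata.phones_examples table (the new number and type are just example values):

```sql
-- Append one more entry to the existing repeated 'phone' field.
UPDATE socialdata.phones_examples
SET phone = ARRAY_CONCAT(phone, [STRUCT('666666' AS number, 'Work' AS type)])
WHERE name = 'Jonh Doe';
```

One thing to watch: ARRAY_CONCAT returns NULL if the existing array is NULL, so rows that have no phone entries yet need a guard before concatenating.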

Copy table structure alone in Bigquery

Submitted by 我们两清 on 2020-12-02 08:11:44
Question: In Google BigQuery, is there a way to clone a table (copy the structure alone) without its data? bq cp doesn't seem to have an option to copy the structure without data, and CREATE TABLE AS SELECT (CTAS) with a filter such as "1=2" does create the table without data, but it doesn't copy the partitioning/clustering properties.

Answer 1: If you want to clone the structure of a table along with its partitioning/clustering properties, without needing to know what exactly those partitioning/clustering properties are …
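The answer above is cut off, but one approach that keeps partitioning and clustering without having to inspect them first is to copy the whole table and then empty it. A hedged sketch with placeholder dataset and table names, not necessarily the answerer's exact method:

```sql
-- Copy the table: schema, partitioning, clustering (and, for now, the data).
CREATE TABLE mydataset.new_table
COPY mydataset.source_table;

-- Then drop the rows, leaving only the structure.
TRUNCATE TABLE mydataset.new_table;
```

The same copy-then-empty idea also works with bq cp followed by a TRUNCATE TABLE statement.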