snowflake-cloud-data-platform

How to run Snowflake side-effect functions like SYSTEM$GENERATE_SCIM_ACCESS_TOKEN within a procedure with owner's rights?

两盒软妹~` submitted on 2020-07-16 07:59:07

Question: Basically I want to set up SCIM integration in Snowflake. For that I have to use this command to get the token that will be passed to Azure AD: call system$generate_scim_access_token('<value>'); This command can only be run as ACCOUNTADMIN, and running it as ACCOUNTADMIN I am able to get the token. But in the future I will not have ACCOUNTADMIN rights, so what I did was create a procedure as ACCOUNTADMIN that executes as owner, so that whenever any other role which is having…
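
A minimal sketch of the approach described above, run through the Python connector: an owner's-rights JavaScript procedure wraps the SYSTEM$ call so a non-admin role can invoke it. The procedure name, role name, and connection parameters are placeholders, and whether Snowflake actually permits SYSTEM$GENERATE_SCIM_ACCESS_TOKEN inside an owner's-rights procedure should be verified against the current documentation.

import snowflake.connector

# Placeholder credentials; the ACCOUNTADMIN role owns the wrapper procedure.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    role="ACCOUNTADMIN",
)

create_proc_sql = """
CREATE OR REPLACE PROCEDURE get_scim_token(integration_name STRING)
  RETURNS STRING
  LANGUAGE JAVASCRIPT
  EXECUTE AS OWNER   -- runs with the rights of the owning (ACCOUNTADMIN) role
AS
$$
  var stmt = snowflake.createStatement({
    sqlText: "SELECT SYSTEM$GENERATE_SCIM_ACCESS_TOKEN(:1)",
    binds: [INTEGRATION_NAME]
  });
  var rs = stmt.execute();
  rs.next();
  return rs.getColumnValue(1);
$$
"""

cur = conn.cursor()
cur.execute(create_proc_sql)
# Let a hypothetical non-admin role call the wrapper instead of the SYSTEM$ function.
cur.execute("GRANT USAGE ON PROCEDURE get_scim_token(STRING) TO ROLE scim_provisioner")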

Snowflake Python Pandas Connector - Unknown error using fetch_pandas_all

痴心易碎 submitted on 2020-07-10 07:30:20

Question: I am trying to connect to Snowflake using the Python pandas connector. I use the Anaconda distribution on Windows, but I uninstalled the existing connector and pyarrow and reinstalled them using the instructions on this page: https://docs.snowflake.com/en/user-guide/python-connector-pandas.html I have the following versions:
pandas 1.0.4 py37h47e9c7a_0
pip 20.1.1 py37_1
pyarrow 0.17.1 pypi_0 pypi
python 3.7.7 h81c818b_4
snowflake-connector-python 2.2.7 pypi_0 pypi
When running step 2 of this document:…
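
For context, a minimal sketch of the fetch_pandas_all() path with placeholder credentials. If the "unknown error" surfaces on the fetch call, one commonly reported cause at the time was a pyarrow build the connector was not compiled against; installing the pandas extra (pip install "snowflake-connector-python[pandas]") is meant to pull a compatible pyarrow.

import snowflake.connector

# Placeholder connection parameters.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="my_wh", database="my_db", schema="public",
)

cur = conn.cursor()
cur.execute("SELECT CURRENT_VERSION()")   # any query that returns a result set
df = cur.fetch_pandas_all()               # Arrow-based fetch into a pandas DataFrame
print(df.head())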

How do I set a number of rows and count the conditional results?

萝らか妹 submitted on 2020-07-09 13:39:05

Question: I'm trying to count the number of conditions within a set of 100 results. Example: of the last 100 actions, how many were for 'X'? I'm not quite sure where to start. I'm fairly new to SQL, and I've tried inner joins, subqueries, etc., but I just can't seem to figure it out. I feel it's something fairly simple. Thank you! Answer 1: To do this, you simply need to sum up a CASE expression that checks for the value. However, if you want to do this for only 100 rows, you will need to perform…
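
A sketch of that pattern via the Python connector: limit to the most recent 100 rows in a subquery, then sum a CASE expression over them. The table and column names (actions, action_type, action_ts) are assumptions.

import snowflake.connector

sql = """
SELECT SUM(CASE WHEN action_type = 'X' THEN 1 ELSE 0 END) AS x_count
FROM (
    SELECT action_type
    FROM actions
    ORDER BY action_ts DESC   -- newest actions first
    LIMIT 100                 -- restrict to the last 100 rows
) AS last_100
"""

conn = snowflake.connector.connect(account="my_account", user="my_user",
                                    password="my_password")  # placeholders
x_count = conn.cursor().execute(sql).fetchone()[0]
print(x_count)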

Significance of Constraints in Snowflake

喜欢而已 submitted on 2020-07-08 13:22:55

Question: Snowflake allows UNIQUE, PRIMARY KEY, FOREIGN KEY and NOT NULL constraints, but I read that it enforces only the NOT NULL constraint. Then what is the purpose of the other keys, and under what circumstances do we have to define them? I appreciate any examples. Thank you, Prashanth. Answer 1: They express intent, helping people understand your data models. Data modeling tools can use them to generate diagrams. You can also programmatically access them to validate data integrity yourself. Source: https:/
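
To make the enforcement behavior concrete, here is a small sketch with invented table and column names: NOT NULL is enforced, while the PRIMARY KEY and UNIQUE declarations are kept as metadata only, so the duplicate insert below succeeds.

import snowflake.connector

ddl = """
CREATE OR REPLACE TABLE customers (
    customer_id INT NOT NULL PRIMARY KEY,  -- NOT NULL enforced; PK is metadata only
    email       STRING UNIQUE              -- duplicates are NOT rejected
)
"""

conn = snowflake.connector.connect(account="my_account", user="my_user",
                                    password="my_password")  # placeholders
cur = conn.cursor()
cur.execute(ddl)

# Both rows load even though they share an email, because UNIQUE is informational.
cur.execute("INSERT INTO customers VALUES (1, 'a@example.com'), (2, 'a@example.com')")

# This one would fail: NOT NULL is the constraint Snowflake actually enforces.
# cur.execute("INSERT INTO customers VALUES (NULL, 'b@example.com')")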

Adding current date in create script for Snowflake

流过昼夜 submitted on 2020-06-29 05:24:50

Question: I have a requirement where I have to create tables dynamically with the date/datetime of creation in the table name. Wondering if this is possible in Snowflake? E.g. I would need something like this: CREATE TABLE someNewTable_YYYYMMDD Thank you for your responses; Best, AB Answer 1: You can achieve this using SQL variables and the IDENTIFIER keyword. Here's an example that adds the current date into the table name: SET table_name=(SELECT 'someNewTable_' || TO_VARCHAR(CURRENT_DATE(),…
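
A sketch of that SQL-variable + IDENTIFIER pattern, run through the Python connector; the 'YYYYMMDD' format string and the column list are assumptions, since the answer above is cut off.

import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="my_user",
                                    password="my_password")  # placeholders
cur = conn.cursor()

# Build the table name in a session variable, e.g. someNewTable_20200629.
cur.execute(
    "SET table_name = (SELECT 'someNewTable_' || TO_VARCHAR(CURRENT_DATE(), 'YYYYMMDD'))"
)
# IDENTIFIER() lets the variable stand in wherever an object name is expected.
cur.execute("CREATE TABLE IDENTIFIER($table_name) (id INT, payload STRING)")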

Using scheduled tasks in Snowflake to clone DBs with dynamic names

徘徊边缘 submitted on 2020-06-29 03:48:14

Question: I want to use the Snowflake task scheduler to clone one or all of the DBs with a dynamic clone-DB name, something like below. Is it possible to do this without creating a stored procedure? As I have multiple DBs under my account, I would prefer to clone all of them in one task: create database xx_date clone xx I appreciate your response. Thanks, Answer 1: Is it possible to do it without creating a Stored Procedure? The CREATE TASK statement syntax only allows a single SQL statement to be specified, and the…
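
A sketch of the usual workaround implied by the answer: a JavaScript procedure builds the dated clone name, and a task calls that procedure on a schedule. The warehouse, schedule, and database names are placeholders.

import snowflake.connector

create_proc = """
CREATE OR REPLACE PROCEDURE clone_db_with_date(src STRING)
  RETURNS STRING
  LANGUAGE JAVASCRIPT
AS
$$
  // e.g. XX_20200629
  var name = SRC + "_" + new Date().toISOString().slice(0, 10).replace(/-/g, "");
  snowflake.execute({sqlText: "CREATE DATABASE " + name + " CLONE " + SRC});
  return name;
$$
"""

create_task = """
CREATE OR REPLACE TASK nightly_clone_xx
  WAREHOUSE = my_wh                      -- placeholder warehouse
  SCHEDULE  = 'USING CRON 0 2 * * * UTC' -- placeholder schedule
AS
  CALL clone_db_with_date('XX')
"""

conn = snowflake.connector.connect(account="my_account", user="my_user",
                                    password="my_password")  # placeholders
cur = conn.cursor()
cur.execute(create_proc)
cur.execute(create_task)
cur.execute("ALTER TASK nightly_clone_xx RESUME")  # tasks are created suspended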

SQL: Set conditional value based on varying criteria in another table

扶醉桌前 submitted on 2020-06-29 03:22:32

Question: A bit new to SQL - the db is Snowflake, which I believe is ANSI. Main table shown below. Combinations of the same Issue/UPC/Warehouse/Date are possible, since a new record is added whenever a new issue is reported. Other columns exist, but should not affect this question. The exclude column is what I'm trying to figure out - it should be 'Y' if the desired combination of Issue/UPC/Warehouse and Date is in the Exclusion table, shown below. The tricky part is the LEVEL column, defining if a UPC…
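
The excerpt is cut off before the LEVEL rules and the sample tables, so only the basic shape of the problem is recoverable. As a heavily hedged sketch, the general pattern is a LEFT JOIN against the exclusion table and a CASE on whether a match was found; every table and column name below is an assumption.

import snowflake.connector

sql = """
SELECT m.*,
       CASE WHEN e.issue IS NOT NULL THEN 'Y' ELSE 'N' END AS exclude
FROM main_table m
LEFT JOIN exclusion_table e
       ON  e.issue     = m.issue
       AND e.upc       = m.upc
       AND e.warehouse = m.warehouse
       AND e.excl_date = m.report_date
"""

conn = snowflake.connector.connect(account="my_account", user="my_user",
                                    password="my_password")  # placeholders
df = conn.cursor().execute(sql).fetch_pandas_all()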

How to create a Spark DataFrame from a pandas DataFrame using Snowflake and Python?

为君一笑 submitted on 2020-06-17 13:17:07

Question: I have a SQL statement stored in a variable in Python, and we use the Snowflake database. First I converted it to a pandas DataFrame using the SQL, but I need to convert it to a Spark DataFrame and then store it with createOrReplaceTempView. I tried: import pandas as pd import sf_connectivity (we have code for establishing a connection with the Snowflake database) emp = 'Select * From Employee' snowflake_connection = sf_connectivity.collector() (it is a method to establish the Snowflake connection) pd_df = pd.read…
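
A sketch of the pandas-to-Spark hand-off, assuming a working Snowflake connection (placeholder credentials stand in for the poster's sf_connectivity helper) and a local SparkSession; the Employee table comes from the question, everything else is assumed.

import pandas as pd
import snowflake.connector
from pyspark.sql import SparkSession

# Placeholder credentials instead of the poster's internal sf_connectivity module.
conn = snowflake.connector.connect(account="my_account", user="my_user",
                                    password="my_password")

emp = "SELECT * FROM Employee"
pd_df = pd.read_sql(emp, conn)                 # pandas DataFrame from Snowflake

spark = SparkSession.builder.appName("sf_to_spark").getOrCreate()
spark_df = spark.createDataFrame(pd_df)        # pandas -> Spark DataFrame
spark_df.createOrReplaceTempView("employee")   # register for Spark SQL

spark.sql("SELECT COUNT(*) FROM employee").show()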