snowflake-data-warehouse

unable to connect to snowflake

Submitted by 扶醉桌前 on 2019-12-24 13:44:58
Question: I am trying to connect to Snowflake from Python. It's a very simple, straightforward task, but unfortunately I'm unable to succeed. The same piece of code works on other machines (my friends tested it). I'm not sure which dependencies I'm missing. I've tried hard to debug this, and even tried pipenv (assuming a conflicting Python path), but no luck. I'd appreciate any help resolving this. Summarizing the steps I have taken below:
sudo -H pip install pipenv  # installed pipenv
mkdir -p test_vn
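When the same script works on other machines, the first thing worth checking is which interpreter is actually running and whether it can see the connector at all. A minimal diagnostic sketch, using only the standard library (the snowflake package may or may not be installed on the broken machine):

```python
import sys
import importlib.util

# pipenv and the system Python can point at different
# site-packages directories; print the interpreter in use.
print("interpreter:", sys.executable)
print("version:", sys.version.split()[0])

# find_spec locates the top-level package without fully importing it;
# None means this interpreter cannot see snowflake-connector-python.
spec = importlib.util.find_spec("snowflake")
print("snowflake package visible:", spec is not None)
```

If the package is invisible here but pip reports it as installed, pip and the interpreter are almost certainly different environments; installing with `python -m pip install snowflake-connector-python` from the same interpreter usually resolves that.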

getting started with liquibase on snowflake

Submitted by 我们两清 on 2019-12-24 08:54:59
Question: I am trying to get started with Liquibase on Snowflake. I think I am almost there with the liquibase.properties file:
driver: net.snowflake.client.jdbc.SnowflakeDriver
classpath: ./liquibase-snowflake-1.0.jar
url: jdbc:snowflake://XXXXXX.us-east-1.snowflakecomputing.com
username: YYYYYYYYY
password: ZZZZZZZZZZ
changeLogFile: mySnowflakeChangeLog.xml
Unfortunately, Liquibase complains about not having a "current database" when trying to create the tables databasechangelog and/or
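The "current database" complaint usually means the JDBC session has no database or schema context. One likely fix (a sketch, assuming the standard Snowflake JDBC connection parameters `db`, `schema`, and `warehouse`; MY_DB, PUBLIC, and MY_WH are placeholders, not values from the question) is to append that context to the URL:

```
url: jdbc:snowflake://XXXXXX.us-east-1.snowflakecomputing.com/?db=MY_DB&schema=PUBLIC&warehouse=MY_WH
```

The user in `username` also needs a default role (or one granted) with usage on that database and warehouse, or the session still has no working context.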

Lambda error: no module found: cryptography.hazmat.bindings._constant_time

Submitted by 旧巷老猫 on 2019-12-23 13:16:20
Question: I created a Lambda function which uploads data to Snowflake. I installed all the requirements into a folder and zipped it along with my main Python file. When it runs in AWS it shows the error: no module found: cryptography.hazmat.bindings._constant_time. But I do have this module at the specified path, so I don't know why the error arises. Here is the code: def main(event, context): import snowflake.connector cnx = snowflake.connector.connect( user='xxx', password='yyyyy',
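This error usually has one of two causes: the cryptography C extension in the zip was built for the wrong platform (packaged on macOS or Windows rather than Linux), or the zip nests everything under a top-level folder so Lambda cannot import it. The platform side is typically addressed by installing Linux wheels (for example `pip install --platform manylinux2014_x86_64 --only-binary=:all: -t package cryptography`) or by building inside an Amazon Linux container. The layout side can be sketched as below; build_package is a hypothetical helper, assuming dependencies are already installed into src_dir:

```python
import os
import zipfile

def build_package(src_dir: str, zip_path: str) -> None:
    """Zip src_dir so that its contents sit at the archive root.

    Lambda imports from the root of the zip, so an archive whose
    entries are nested under a top-level folder breaks imports.
    """
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(src_dir):
            for name in files:
                full = os.path.join(root, name)
                # arcname relative to src_dir keeps packages importable
                zf.write(full, os.path.relpath(full, src_dir))
```

Unzipping the uploaded artifact and checking that `cryptography/hazmat/bindings/` contains `.so` files (not `.pyd` or macOS `.dylib`-linked builds) is a quick way to confirm which of the two causes applies.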

Split a large json file into multiple smaller files

Submitted by 若如初见. on 2019-12-20 23:25:21
Question: I have a large JSON file, about 5 million records and about 32 GB in size, that I need to load into our Snowflake data warehouse. I need to break this file up into chunks of about 200k records (about 1.25 GB) per file. I'd like to do this in either Node.js or Python for deployment to an AWS Lambda function; unfortunately I haven't coded in either yet. I have C# and a lot of SQL experience, and learning both Node and Python is on my to-do list, so why not dive right in, right
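If each record sits on its own line (NDJSON; an assumption, since the question doesn't say whether the file is one big array or line-delimited), the file can be split as a stream without ever holding 32 GB in memory. A Python sketch; split_ndjson and the chunk naming scheme are made up for illustration:

```python
def split_ndjson(src_path: str, chunk_size: int = 200_000,
                 prefix: str = "chunk") -> list:
    """Stream src_path (one JSON record per line) into numbered chunk
    files of at most chunk_size records each; return the paths written."""
    paths = []
    out = None
    count = 0
    part = 0
    with open(src_path, "r", encoding="utf-8") as src:
        for line in src:
            if not line.strip():      # skip blank lines
                continue
            if out is None:           # lazily open the next chunk file
                part += 1
                path = f"{prefix}_{part:04d}.json"
                paths.append(path)
                out = open(path, "w", encoding="utf-8")
            out.write(line)
            count += 1
            if count >= chunk_size:   # chunk full: close and reset
                out.close()
                out = None
                count = 0
    if out is not None:
        out.close()
    return paths
```

If the file is instead a single top-level JSON array, a streaming parser such as ijson would be needed, since `json.load` would try to read the whole file into memory at once.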

Encryption of data in transit on the Snowflake platform

Submitted by 一个人想着一个人 on 2019-12-20 03:54:08
Question: Is data encrypted while in transit on the Snowflake platform? It's clear from Snowflake's end-to-end encryption that data at rest is encrypted, but what about data on the move? For example, when data is being transferred from remote Snowflake disk (long-term storage) to the local cache (SSDs on compute nodes), does the data remain encrypted during that transfer? Another example would be when adding result sets to the Snowflake results cache (available to all virtual warehouses): is the data

How to use the percent key word in the snowflake query

Submitted by 不打扰是莪最后的温柔 on 2019-12-13 22:38:39
Question: SELECT NBR, Customers, status FROM ( SELECT NBR, Customers, Code AS status FROM CCC AS CS INNER JOIN AAA AS AC ON CCC.B2 = ACT.B1 AND CSS.B2 = ACT.B1 ) AS rst WHERE status IN ('A', 'T') ORDER BY NBR LIMIT 100 PERCENT
Answer 1: I saw your other post. Not 100% sure what you are trying to do, but you might want to consider using a window function like RATIO_TO_REPORT. See the example below. Here is a link to the window functions: https://docs.snowflake.net/manuals/sql-reference/functions-analytic.html
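Snowflake supports LIMIT <n> but not the SQL Server-style LIMIT ... PERCENT. A sketch of one way to emulate a percentage cutoff with window functions (table and column names follow the question, with the join aliases made consistent; 100 percent would simply return every row, so a 10 percent cutoff is shown for illustration):

```sql
SELECT NBR, Customers, status
FROM (
    SELECT NBR, Customers, Code AS status,
           ROW_NUMBER() OVER (ORDER BY NBR) AS rn,
           COUNT(*)    OVER ()              AS total
    FROM CCC AS CS
    INNER JOIN AAA AS AC ON CS.B2 = AC.B1
) AS rst
WHERE status IN ('A', 'T')
  AND rn <= CEIL(total * 0.10)   -- emulates "LIMIT 10 PERCENT"
ORDER BY NBR;
```

The window functions live in the subquery because Snowflake does not allow them directly in a WHERE clause; note that here the percentage is taken over all joined rows, before the status filter.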

How to execute a stored procedure on an ODBC Snowflake Destination?

Submitted by 徘徊边缘 on 2019-12-13 03:58:20
Question: I'm building a new package to move data from an AWS SQL Server instance to a Snowflake ODBC destination. If I find rows that were updated, I must change them in Snowflake as well. In the Common section I found only 'OLE DB Command' for executing a procedure to update the differing rows. The problem is I need something like an "ODBC Command" to execute a procedure that updates the differing rows between SQL Server and Snowflake. Answer 1: OK, I did it. So if you need to UPDATE rows on an ODBC destination in SSIS you have only one way to do that: you

How to insert into a snowflake variant field using a DAO?

Submitted by ▼魔方 西西 on 2019-12-13 03:30:14
Question: I have the following code: @RegisterMapper(MyEntity.ResultMapper.class) @UseStringTemplate3StatementLocator public interface MyDao { @Transaction(TransactionIsolationLevel.SERIALIZABLE) @SqlBatch("INSERT INTO mySchema.myTable (" + " id, entity_type, entity_id, flags " + " ) VALUES " + "(" + " :stepId , :entityType , :entityId, parse_json(:flags) " + ")") @BatchChunkSize(500) Object create( @BindBean List<MyEntity> entities ); } As you can see, I am bulk inserting a list of entities into my

Move Tables & Revoke All Privileges

Submitted by 安稳与你 on 2019-12-11 17:39:45
Question: We need users to move their tables from their personal schemas (user_db.username) to the managed schema (userdb.groupname), which provides a predefined set of permissions for select access. In moving a table, we need to accomplish the following: move the table out of the old schema; remove the old select grants; apply the new grants from the managed schema. I've reviewed the ALTER TABLE .. RENAME TO .. documentation, and while that appears to enable moving the table, it would retain the old
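The three steps above map onto plain SQL roughly as in the sketch below (my_table and the role names are placeholders; in a Snowflake managed access schema, grants are controlled centrally by the schema owner, so the final GRANT may instead already be covered by future grants defined on the schema):

```sql
-- 1. Move the table out of the old schema (RENAME TO also moves across schemas)
ALTER TABLE user_db.username.my_table RENAME TO userdb.groupname.my_table;

-- 2. Strip the old select grants; RENAME TO carries existing grants along
REVOKE ALL PRIVILEGES ON TABLE userdb.groupname.my_table FROM ROLE old_reader_role;

-- 3. Apply the managed schema's predefined select access
GRANT SELECT ON TABLE userdb.groupname.my_table TO ROLE group_reader_role;
```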

ImportError: cannot import name dump_publickey

Submitted by 懵懂的女人 on 2019-12-11 10:34:38
Question: I successfully ran pip install --upgrade snowflake-connector-python, but I'm unable to print the Snowflake version, and I don't know what the issue could be. I followed the steps in the link below. https://docs.snowflake.net/manuals/user-guide/python-connector-install.html#step-2-verify-your-installation import snowflake.connector # Gets the version ctx = snowflake.connector.connect( user='<your_user_name>', password='<your_password>', account='<your_account_name>' ) cs = ctx.cursor() try: cs