external-tables

Snowflake External Table failed to cast variant value NULL to DATETIME/TIMESTAMP_NTZ type

Submitted by 有些话、适合烂在心里 on 2021-02-11 15:45:49
Question: I created an external table with a column of type datetime (TIMESTAMP_NTZ), and the external stage has a CSV file with a null value in that column. Selecting from the external table gives "Failed to cast variant value "null" to TIMESTAMP_NTZ". CREATE OR REPLACE EXTERNAL TABLE ext_table_datetime ( col1 datetime as (value:c1::datetime) ) with location = 's3://bucket_name' file_format = file_format_1 auto_refresh = true; I also have the file format defined as follows, which works for other column
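
One way to avoid the cast failure, sketched below under the assumption that the stage and `file_format_1` are as shown in the question, is to cast through `TRY_TO_TIMESTAMP_NTZ` (which returns NULL instead of erroring on unparseable values), or to declare the literal text "null" as SQL NULL via the file format's `NULL_IF` option:

```sql
-- Sketch: tolerate unparseable values by returning NULL instead of failing.
CREATE OR REPLACE EXTERNAL TABLE ext_table_datetime (
  col1 TIMESTAMP_NTZ AS (TRY_TO_TIMESTAMP_NTZ(value:c1::string))
)
WITH LOCATION = 's3://bucket_name'
FILE_FORMAT = file_format_1
AUTO_REFRESH = TRUE;

-- Alternatively, teach the file format that the literal text "null"
-- means SQL NULL (assumes file_format_1 is a CSV format you can alter):
ALTER FILE FORMAT file_format_1 SET NULL_IF = ('NULL', 'null', '');
```

Either approach keeps the column typed as TIMESTAMP_NTZ while letting empty or "null" fields come through as SQL NULL.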

Import a CSV file into an Oracle External Table

Submitted by 北战南征 on 2021-01-29 11:06:09
Question: I have seen various similar questions, but none of the solutions work for me. I have been given a CSV file produced on a mainframe that I need to load into Oracle. I decided to map it in an Oracle external table and then use that to insert it into Oracle. This is my CSV: CONTRACT_NUMBER,PRODUCTCODE,TRANSACTION_NUMBER,EFFECTIVE_DATE,AMENDMENT,TERM,ACTIVE,AGENT_NUMBER,PREMIUM,ICRATE,RCRATE,IC_ALLOW,RC_ALLOW,SPRATE,TRANSACTION_CODE,TRANSACTION_DATE,AGENT_CATEGORY
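
A minimal external-table sketch for a comma-separated file with that header; the directory object `data_dir`, the file name, and the column types are assumptions (only a few columns from the header are shown):

```sql
-- Sketch: map the mainframe CSV onto an Oracle external table.
-- data_dir, contracts.csv, and the column types are assumptions.
CREATE TABLE contracts_ext (
  contract_number    VARCHAR2(20),
  productcode        VARCHAR2(10),
  transaction_number NUMBER,
  effective_date     DATE
  -- ... remaining columns from the CSV header ...
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1                          -- skip the header row
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
    (contract_number, productcode, transaction_number,
     effective_date CHAR(10) DATE_FORMAT DATE MASK "YYYY-MM-DD")
  )
  LOCATION ('contracts.csv')
)
REJECT LIMIT UNLIMITED;
```

The explicit DATE mask (here assumed to be YYYY-MM-DD) avoids depending on the session's NLS_DATE_FORMAT; adjust it to whatever format the mainframe actually emits.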

Hive table re-created before every data load

Submitted by 前提是你 on 2021-01-28 14:30:56
Question: I saw an application dropping an external table, creating it again, then loading the data and running the MSCK command on every data load. What is the benefit of dropping and recreating it every time? Answer 1: There is no benefit in dropping and recreating an EXTERNAL table, because dropping the table leaves the data intact. There may be a benefit in dropping and re-creating a MANAGED table, because that drops the data as well. One possible scenario if you are running on S3: Dropping files early before the
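
The key property can be demonstrated directly: dropping an EXTERNAL table removes only metadata, so the drop/recreate cycle buys nothing beyond re-registering partitions, which MSCK REPAIR alone already does. A sketch with illustrative table and path names:

```sql
-- Dropping an EXTERNAL table removes only metadata; the files under
-- LOCATION survive, so the drop/recreate cycle adds nothing over MSCK.
DROP TABLE IF EXISTS logs;                -- data in /data/logs is untouched
CREATE EXTERNAL TABLE logs (msg STRING)
  PARTITIONED BY (dt STRING)
  LOCATION '/data/logs';
MSCK REPAIR TABLE logs;                   -- re-registers existing partitions

-- For a MANAGED table, the same DROP would delete the data files too.
```

In practice, running only `MSCK REPAIR TABLE logs;` after new partition directories arrive achieves the same result without touching the table definition.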

External table truncates trailing whitespace while reading from file

Submitted by 房东的猫 on 2020-01-16 05:35:34
Question: I'm trying to load file contents into an external table, but trailing spaces are truncated in the process. 'CREATE TABLE ' || rec.ext_table_name || ' (ROW_DATA VARCHAR2(4000)) ORGANIZATION EXTERNAL ' || '(TYPE ORACLE_LOADER DEFAULT DIRECTORY ' || rec.dir_name || ' ACCESS ' || 'PARAMETERS (RECORDS ' || 'DELIMITED by NEWLINE NOBADFILE NODISCARDFILE ' || 'FIELDS REJECT ROWS WITH ALL NULL FIELDS (ROW_DATA POSITION(1:4000) char)) LOCATION (' || l_quote || 'temp.txt' || l_quote || ')) REJECT LIMIT
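
The usual cause is that ORACLE_LOADER trims trailing blanks from CHAR fields by default; adding NOTRIM to the field definition preserves them. A sketch of the fixed definition (the directory object `my_dir` is hypothetical, the rest follows the question's dynamic SQL):

```sql
-- Sketch: NOTRIM keeps the trailing spaces that default trimming removes.
CREATE TABLE ext_rows (row_data VARCHAR2(4000))
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY my_dir        -- hypothetical directory object
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE NOBADFILE NODISCARDFILE
    FIELDS REJECT ROWS WITH ALL NULL FIELDS
    (row_data POSITION(1:4000) CHAR NOTRIM)
  )
  LOCATION ('temp.txt')
)
REJECT LIMIT UNLIMITED;
```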

Insert data from 2 Hive external tables into a new external table with an additional column

Submitted by 本小妞迷上赌 on 2020-01-07 06:36:46
Question: I have 2 external Hive tables as follows, populated from Oracle using Sqoop. create external table transaction_usa ( tran_id int, acct_id int, tran_date string, amount double, description string, branch_code string, tran_state string, tran_city string, speendby string, tran_zip int ) row format delimited stored as textfile location '/user/stg/bank_stg/tran_usa'; create external table transaction_canada ( tran_id int, acct_id int, tran_date string, amount double,
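
One way to combine the two sources, sketched below: create the target table with the extra column, then UNION ALL the sources, tagging each side with a literal for the new column. The target location and the `source_country` column name are assumptions:

```sql
-- Sketch: combine both source tables into a new external table, tagging
-- each row with its country of origin as the additional column.
CREATE EXTERNAL TABLE transaction_all (
  tran_id INT, acct_id INT, tran_date STRING, amount DOUBLE,
  description STRING, branch_code STRING, tran_state STRING,
  tran_city STRING, speendby STRING, tran_zip INT,
  source_country STRING                   -- the extra column
)
ROW FORMAT DELIMITED STORED AS TEXTFILE
LOCATION '/user/stg/bank_stg/tran_all';   -- hypothetical path

INSERT INTO TABLE transaction_all
SELECT * FROM (
  SELECT t.*, 'USA'    AS source_country FROM transaction_usa    t
  UNION ALL
  SELECT t.*, 'CANADA' AS source_country FROM transaction_canada t
) u;
```

Wrapping the UNION ALL in a subquery keeps the INSERT compatible with older Hive versions that do not accept a bare UNION in the SELECT clause.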

Oracle external tables

Submitted by 只愿长相守 on 2020-01-03 02:53:29
Question: I'm struggling with an Oracle external table, although I researched the Oracle forums. Still no success. Let's suppose I have a simple table:
DESCRIBE PRODUCTS
Name   Null      Type
-----  --------  ------------
ID     NOT NULL  NUMBER
NAME             VARCHAR2(30)
VALUE            NUMBER(5,2)
DEP              VARCHAR2(30)
COUNT            NUMBER(3)
Then
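
An external table matching the PRODUCTS layout might be sketched as follows; the directory object, file name, and delimiter are assumptions. Note that COUNT is a reserved word in Oracle, so the column name needs quoting:

```sql
-- Sketch: external staging table mirroring PRODUCTS.
-- data_dir, products.csv, and the comma delimiter are assumptions.
CREATE TABLE products_ext (
  id      NUMBER,
  name    VARCHAR2(30),
  value   NUMBER(5,2),
  dep     VARCHAR2(30),
  "COUNT" NUMBER(3)               -- COUNT is reserved; quote it
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('products.csv')
)
REJECT LIMIT UNLIMITED;
```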

How to truncate a partitioned external table in hive?

Submitted by 随声附和 on 2020-01-01 06:44:08
Question: I'm planning to truncate a Hive external table which has one partition, so I used the following command: hive> truncate table abc; But it throws an error: Cannot truncate non-managed table abc. Can anyone please advise? Answer 1: Make your table MANAGED first: ALTER TABLE abc SET TBLPROPERTIES('EXTERNAL'='FALSE'); Then truncate: truncate table abc; And finally you can make it external again: ALTER TABLE abc SET
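
The answer's three-step workaround, sketched end to end (the closing ALTER restores the EXTERNAL property, as the answer's "make it external again" step describes):

```sql
-- Sketch: flip the table to managed, truncate it, then flip it back.
ALTER TABLE abc SET TBLPROPERTIES('EXTERNAL'='FALSE');
TRUNCATE TABLE abc;
ALTER TABLE abc SET TBLPROPERTIES('EXTERNAL'='TRUE');
```

Note that while the table is managed, its data files are subject to deletion; the truncate in step two removes them, which is exactly the intended effect here.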

Hive from JSON Error

Submitted by 99封情书 on 2019-12-25 09:40:10
Question: I can't load this JSON into a Hive table; either all the data comes back NULL or the table can't be selected at all. I just need the fields to match my DDL, and where a field contains nested structure I want to keep it as a string instead of trying to parse it. The only near-success was with hive-hcatalog-core-1.1.0-cdh5.10.0.jar. Since some records are blank, I can query with LIMIT, but when I remove the limit I get this kind of error: org.apache.hadoop.hive.serde2.SerDeException: java.io
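
One commonly used alternative to the hcatalog SerDe mentioned above is the third-party OpenX JSON SerDe, which can skip malformed or blank records instead of failing the whole query. A sketch, with illustrative table/column names and a hypothetical jar version:

```sql
-- Sketch using the OpenX JSON SerDe (an alternative to the hcatalog
-- SerDe the question mentions); ignore.malformed.json skips bad rows.
-- ADD JAR json-serde-1.3.8-jar-with-dependencies.jar;  -- hypothetical version
CREATE EXTERNAL TABLE events_json (
  id      STRING,
  payload STRING        -- nested JSON object kept as raw text
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
WITH SERDEPROPERTIES ('ignore.malformed.json' = 'true')
LOCATION '/data/events';
```

Declaring the nested field as STRING matches the asker's goal of leaving inner structure unparsed; it can still be processed later with `get_json_object` or a LATERAL VIEW.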

Read Excel from DB2

Submitted by 拜拜、爱过 on 2019-12-25 05:23:35
Question: I have to import some Excel data on a regular basis. According to the DB2 documentation, one can directly access OLE DB data sources via an external function, but I'm unable to set it up properly. I got the Microsoft Access Database Engine 2010 plus the fix pack and installed it on the database server. I placed the Excel file in a local directory on the database server ( C:\Temp\test.xls ). The Excel file has a worksheet called TEST1 with two columns, ABC and DEF, followed by some numeric data: ABC | DEF
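
A sketch of the DB2 side, using a LANGUAGE OLEDB table function; the provider string, rowset name, and return types are assumptions based on the file described, and the ACE OLE DB provider must be installed on the server:

```sql
-- Sketch: DB2 OLE DB table function over the Excel sheet.
-- Provider, connection string, and INTEGER types are assumptions.
CREATE FUNCTION excel_test1 ()
RETURNS TABLE (abc INTEGER, def INTEGER)
LANGUAGE OLEDB
EXTERNAL NAME '!TEST1$!Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Temp\test.xls;Extended Properties="Excel 8.0"';
```

It would then be queried as a table function, e.g. `SELECT * FROM TABLE(excel_test1()) AS t;`. The EXTERNAL NAME format here is `!rowset!connectstring`, where the rowset for an Excel sheet is the sheet name followed by `$`.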

Removing the EOL delimiter when inserting into an external table (Oracle)

Submitted by 别来无恙 on 2019-12-25 04:20:00
Question: I included NOTRIM for the rowdata column in the external table, as suggested by Alex (this is a continuation of this question). But now the end-of-line character (CR-LF) is also appended to the rowdata column. I don't want to use substr() or translate(), since the file size is around 1 GB. My external table creation process: 'CREATE TABLE ' || rec.ext_table_name || ' (ROW_DATA VARCHAR2(4000)) ORGANIZATION EXTERNAL ' || '(TYPE ORACLE_LOADER
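
One way to keep NOTRIM while dropping the CR, sketched below: delimit records by the full CR-LF pair so the carriage return is consumed as part of the record delimiter rather than landing in the column. The directory object `my_dir` is hypothetical; the rest mirrors the question's dynamic SQL:

```sql
-- Sketch: delimit records by the CR-LF pair so the CR is treated as
-- part of the delimiter instead of being kept by NOTRIM.
CREATE TABLE ext_rows (row_data VARCHAR2(4000))
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY my_dir        -- hypothetical directory object
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY '\r\n' NOBADFILE NODISCARDFILE
    FIELDS REJECT ROWS WITH ALL NULL FIELDS
    (row_data POSITION(1:4000) CHAR NOTRIM)
  )
  LOCATION ('temp.txt')
)
REJECT LIMIT UNLIMITED;
```

This only helps if every record in the file actually ends with CR-LF; a file with mixed line endings would need cleanup before loading.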