sql-loader

Insert rows with batch id using sqlldr

痴心易碎 submitted on 2019-12-06 16:45:47
I am able to insert rows into a table using sqlldr with no problems. I would like to tag all rows of a file with a unique number so that I can treat them as one batch. I tried "my_db_seq.nextval" as the batch id, but it is not serving my purpose, since NEXTVAL gives every row a different number. So please advise on how to create a unique batch id for the entire set of rows of a file while loading with sqlldr.

Answer: Wrap your call to the sequence in a function like this:

    create or replace function get_batch_id return integer is
      x exception; -- ORA-08002: sequence %s.CURRVAL is not yet defined in this session
      pragma exception_init (x, -8002);
    begin
      return my_db…
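The excerpt cuts the function off. A plausible completion, assuming the sequence is called my_db_seq, returns CURRVAL and falls back to NEXTVAL only on the first call, so every row loaded in the same sqlldr session gets the same value:

    create or replace function get_batch_id return integer is
      x exception; -- ORA-08002: sequence %s.CURRVAL is not yet defined in this session
      pragma exception_init (x, -8002);
    begin
      -- once CURRVAL exists it stays constant for the rest of the session
      return my_db_seq.currval;
    exception
      when x then
        -- first call in this session: advance the sequence once, then reuse it
        return my_db_seq.nextval;
    end;
    /

It would then be referenced from the control file as, for example, BATCH_ID "get_batch_id()".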

SQL Loader with utf8

谁说我不能喝 submitted on 2019-12-05 08:30:29
I am getting the following error while loading Japanese data using SQL*Loader. My database is UTF8 (NLS parameters) and my OS supports UTF8.

    Record 5: Rejected - Error on table ACTIVITY_FACT, column METADATA.
    ORA-12899: value too large for column METADATA (actual: 2624, maximum: 3500)

My control file:

    load data
    characterset UTF8
    infile '../tab_files/activity_fact.csv' "STR ';'"
    APPEND into table activity_fact
    fields terminated by ',' optionally enclosed by '~'
    TRAILING NULLCOLS
    (metadata CHAR(3500))

My table:

    create table activity_fact ( metadata varchar2(3500 char) )

Why is SQL*Loader throwing the…
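A likely factor here is byte versus character length accounting: in the UTF8 character set, Japanese characters occupy three or more bytes each, so a value that is well under 3500 characters can still overrun a 3500-byte limit. A hedged control-file sketch that sizes the SQL*Loader field buffer in bytes (the 14000 figure is a generous upper bound chosen for illustration, not taken from the question):

    load data
    characterset UTF8
    infile '../tab_files/activity_fact.csv' "STR ';'"
    APPEND into table activity_fact
    fields terminated by ',' optionally enclosed by '~'
    TRAILING NULLCOLS
    (
      -- field buffer sized in bytes so multibyte characters do not hit the
      -- default byte-based field limit; the column itself is VARCHAR2(3500 CHAR)
      metadata CHAR(14000)
    )

If the SQL*Loader release in use supports it, a LENGTH SEMANTICS CHAR clause after CHARACTERSET is the cleaner fix; check the Utilities guide for the version at hand.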

SQL*Loader stuck after loading 4.2 billion records

岁酱吖の submitted on 2019-12-05 06:15:05
We are stuck with a problem in SQL*Loader. We are trying to load a data file with around 4.6 billion rows (nearly 340 GB) into 2 Oracle tables, on the basis of a WHEN condition, using SQL*Loader. But after loading 4.2 billion records the SQL*Loader process completes without throwing any errors, even though the rest of the records are still to be loaded. There are no discarded or bad records either. Is there any limit on the number of records SQL*Loader can load? I could not find any such thing documented anywhere. Please let me know if anyone has any clue about this issue. Thanks!!

Answer: The value…
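It may be relevant that 4.2 billion sits right around 2^32 (4,294,967,296), which suggests a 32-bit record counter being exhausted somewhere in the tool chain rather than a documented limit; that is an inference, not a documented fact. A pragmatic workaround is to split the physical file at the operating-system level and run one sqlldr invocation per chunk (a sketch; file names and credentials are placeholders):

    # split the input into chunks of 500 million lines each (Unix split)
    split -l 500000000 big_file.dat chunk_

    # load each chunk with the same control file; an APPEND load keeps adding rows
    for f in chunk_*; do
      sqlldr userid=user/password control=load.ctl data="$f" log="$f.log"
    done

The data= command-line parameter overrides the INFILE named in the control file, so the same control file can be reused for every chunk.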

Oracle sqlldr timestamp format headache

こ雲淡風輕ζ submitted on 2019-12-05 02:13:14
I'm struggling to get sqlldr to import a csv data file into my table, specifically with the field that is a timestamp. The data in my csv file is in this format:

    16-NOV-09 01.57.48.001000 PM

I've tried all manner of combinations in my control file and am going around in circles. I can't find anything online - not even the Oracle reference page that details what all the date/timestamp format strings are. Does anyone know where this reference page is, or what format string I should be using in my control file for this timestamp format? For reference, this is what I've most recently tried:

    load…
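The datetime format elements are documented in the SQL Language Reference under "Format Models". For a value like 16-NOV-09 01.57.48.001000 PM, a column definition along these lines should work; the table and column names below are placeholders, not taken from the question:

    load data
    infile 'mydata.csv'
    append into table my_table
    fields terminated by ','
    (
      -- RR handles the two-digit year, FF6 the six fractional-second digits,
      -- and AM accepts either the AM or PM meridian indicator
      event_ts TIMESTAMP "DD-MON-RR HH.MI.SS.FF6 AM"
    )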

Oracle sqlldr TRAILING NULLCOLS required, but why?

倾然丶 夕夏残阳落幕 submitted on 2019-12-05 00:14:13
I have an abstruse sqlldr problem that's bothering me. My control file looks something like this:

    load data
    infile 'txgen.dat'
    into table TRANSACTION_NEW
    fields terminated by "," optionally enclosed by '"'
    TRAILING NULLCOLS
    ( A, B, C, D, ID "ID_SEQ.NEXTVAL" )

The data is something like this:

    a,b,c,
    a,b,,d
    a,b,,
    a,b,c,d

If I don't put TRAILING NULLCOLS in, I get the "column not found before end of logical record" error. But although some of the columns are null, the commas are all there, so I don't see a reason for sqlldr to misinterpret the input file and not get to the end, where it generates…
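The control file lists five columns but each record carries only four delimited fields, and because ID is given a plain SQL string rather than being declared as an expression, SQL*Loader still expects a fifth field in the data for it; TRAILING NULLCOLS is what makes that missing trailing field acceptable. A sketch of the alternative, assuming the SQL*Loader release supports the EXPRESSION keyword: declare ID as an expression so no input field is consumed for it at all:

    load data
    infile 'txgen.dat'
    into table TRANSACTION_NEW
    fields terminated by "," optionally enclosed by '"'
    (
      A,
      B,
      C,
      D,
      ID EXPRESSION "ID_SEQ.NEXTVAL"  -- no data field is expected for ID
    )

With only A through D mapped to the four fields present in every record, TRAILING NULLCOLS should no longer be required for this data.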

load multiple csv into one table by SQLLDR

陌路散爱 submitted on 2019-12-04 09:19:54
I am using SQL*Loader to load multiple csv files into one table. The approach I found is very easy, something like:

    LOAD DATA
    INFILE '/path/file1.csv'
    INFILE '/path/file2.csv'
    INFILE '/path/file3.csv'
    INFILE '/path/file4.csv'
    APPEND INTO TABLE TBL_DATA_FILE
    EVALUATE CHECK_CONSTRAINTS
    REENABLE DISABLED_CONSTRAINTS
    EXCEPTIONS EXCEPTION_TABLE
    FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    ( COL0, COL1, COL2, COL3, COL4 )

But I don't want to repeat INFILE over and over, because if I have more than 1000 files I would have to write INFILE 1000 times in the control file. So my question is: is…
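One practical way around hand-writing the INFILE list is to generate the control file with a small script before calling sqlldr (a sketch; paths, credentials and the column list are taken from the question or are placeholders):

    # build a control file that lists every CSV found in the directory
    {
      echo "LOAD DATA"
      for f in /path/*.csv; do
        echo "INFILE '$f'"
      done
      echo "APPEND INTO TABLE TBL_DATA_FILE"
      echo "EVALUATE CHECK_CONSTRAINTS"
      echo "REENABLE DISABLED_CONSTRAINTS"
      echo "EXCEPTIONS EXCEPTION_TABLE"
      echo "FIELDS TERMINATED BY \",\" OPTIONALLY ENCLOSED BY '\"'"
      echo "TRAILING NULLCOLS"
      echo "( COL0, COL1, COL2, COL3, COL4 )"
    } > load_all.ctl

    sqlldr userid=user/password control=load_all.ctl log=load_all.log

The control file itself stays the same shape as the four-file version; only the INFILE lines are produced mechanically.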

Is it possible for Oracle sqlldr to accept a TNS entry as an instance qualifier in Oracle 10 and 11?

六月ゝ 毕业季﹏ submitted on 2019-12-03 12:40:24
Is it possible to use a fully qualified TNS entry with the sqlldr bundled with Oracle 10/11? For example, in SQL*Plus:

    sqlplus user/password@(description=(address=(host=localhost)(protocol=tcp)(port=1521))(connect_data=(sid=orcl))) @script.sql

But using sqlldr (SQL*Loader) there appear to be issues with using the TNS entry directly. Specifically:

    sqlldr user/password@(description=(address=(host=localhost)(protocol=tcp)(port=1521))(connect_data=(sid=orcl))) bad='bad_file.txt' control='control.ctl' data='data.txt' log='log.txt' direct='true'

Here is the error message produced:

    LRM-00116: syntax…
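LRM-00116 comes from Oracle's command-line parameter parser, which, like the shell, trips over the unquoted parentheses. A commonly reported workaround is to quote the descriptor for both layers, or to avoid the parentheses entirely; a sketch, assuming a Unix shell (quoting rules differ on Windows):

    # Option 1: single quotes for the shell, double quotes for Oracle's parser
    sqlldr userid='user/password@"(description=(address=(host=localhost)(protocol=tcp)(port=1521))(connect_data=(sid=orcl)))"' \
           control=control.ctl data=data.txt bad=bad_file.txt log=log.txt direct=true

    # Option 2: sidestep the parentheses with an EZConnect string or a
    # tnsnames.ora alias (assumes a service named orcl), e.g.
    #   sqlldr userid=user/password@//localhost:1521/orcl control=control.ctl ...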

Is there any way to use a strftime-like function for dates before 1900 in Python?

痞子三分冷 submitted on 2019-12-03 09:34:49
I didn't realize this, but apparently Python's strftime function doesn't support dates before 1900:

    >>> from datetime import datetime
    >>> d = datetime(1899, 1, 1)
    >>> d.strftime('%Y-%m-%d')
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    ValueError: year=1899 is before 1900; the datetime strftime() methods require year >= 1900

I'm sure I could hack together something myself to do this, but I figure the strftime function is there for a reason (and there also is a reason…
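One simple workaround on interpreters with this restriction is to build the string from the datetime's own fields rather than going through strftime (a minimal sketch; datetime.isoformat() also works for any year, and newer Python 3 releases have relaxed the 1900 limit):

    from datetime import datetime

    def iso_format(d):
        # Assemble the string from the datetime's attributes; this sidesteps
        # the year >= 1900 check that old strftime implementations enforce.
        return "%04d-%02d-%02d %02d:%02d:%02d" % (
            d.year, d.month, d.day, d.hour, d.minute, d.second)

    print(iso_format(datetime(1899, 1, 1)))  # -> 1899-01-01 00:00:00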

Oracle SQL-Loader handling efficiently internal Double Quotes in values

放肆的年华 submitted on 2019-12-02 07:26:32
I have an Oracle SQL*Loader challenge and am looking for an efficient and simple solution. My source files are pipe (|) delimited, with values enclosed in double quotes ("). The problem seems to be that some of the values contain internal double quotes, e.g.:

    ..."|"a":"b"|"...

This causes my records to be rejected with:

    no terminator found after TERMINATED and ENCLOSED field

There are various solutions on the web but none seems to fit: [1] I have tried to replace all internal double quotes by quoting the quotes, but it seems that when applying this function on…
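One workaround, when the enclosing quotes always sit directly against the pipe delimiters, is to stop declaring the enclosure at all and strip the outer quotes in a SQL expression, which leaves the internal quotes untouched (a sketch; table and column names are placeholders, and it assumes the delimiter never appears inside a value):

    load data
    infile 'source.dat'
    append into table my_table
    fields terminated by '|'
    trailing nullcols
    (
      -- TRIM removes only leading/trailing quote characters; CHR(34) is "
      col_a CHAR(4000) "TRIM(BOTH CHR(34) FROM :col_a)",
      col_b CHAR(4000) "TRIM(BOTH CHR(34) FROM :col_b)"
    )

Using CHR(34) instead of a literal double quote avoids having to escape quotes inside the control file's SQL string.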

How to use to_number and nullif in sql-loader?

僤鯓⒐⒋嵵緔 submitted on 2019-12-02 00:29:28
I've had a similar problem with dates (a combination of to_date and nullif) here: "How to use decode in sql-loader?", and it was solved nicely. My problem is that a numeric field in my CSV file can have these formats: 999,999,999.99, or just a dot '.' for null values. This works:

    MINQUANTITY "TO_NUMBER(:MINQUANTITY, '9999999999D999999', 'NLS_NUMERIC_CHARACTERS='',.''')"

or

    MINQUANTITY NULLIF MINQUANTITY = '.'

But it does not work when I try to combine both:

    MINQUANTITY "TO_NUMBER(:MINQUANTITY, '9999999999D999999', 'NLS_NUMERIC_CHARACTERS='',.''')" NULLIF :MINQUANTITY= '.'

Here is the…
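In a SQL*Loader column specification the NULLIF clause belongs before the SQL string, not after it, so a combination along these lines should at least parse (a sketch, not necessarily the accepted fix for this exact question; it is worth testing whether the expression sees NULL or the raw '.' once the NULLIF fires):

    MINQUANTITY NULLIF MINQUANTITY = '.'
                "TO_NUMBER(:MINQUANTITY, '9999999999D999999', 'NLS_NUMERIC_CHARACTERS='',.''')"

An alternative that avoids the clause-ordering question entirely is to fold the dot handling into the expression itself, since TO_NUMBER of NULL simply returns NULL:

    MINQUANTITY "TO_NUMBER(DECODE(:MINQUANTITY, '.', NULL, :MINQUANTITY), '9999999999D999999', 'NLS_NUMERIC_CHARACTERS='',.''')"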