sql-loader

sqlldr corrupts my primary key after the first commit

瘦欲@ — Submitted on 2019-12-11 17:08:30

Question: SQL*Loader is corrupting my primary key index after the first commit in my .ctl file. After the first commit, no matter what I set the ROWS value to in my control file, I get:

ORA-39776: fatal Direct Path API error loading table PE_OWNER.CLINICAL_CODE
ORA-01502: index 'PE_OWNER.CODE_PK' or partition of such index is in unusable state
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.

I'm using Oracle Database and client 11.1.0.6.0. I know the issue is not due to duplicate rows…
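A direct path load that aborts mid-run leaves the index in the UNUSABLE state reported by ORA-01502, and subsequent loads then fail until it is rebuilt. As a hedged sketch (object names taken from the error message above; depending on privileges you may need USER_INDEXES or ALL_INDEXES instead of DBA_INDEXES), the state can be checked and repaired like this:

```sql
-- Check whether the failed direct path load left the index UNUSABLE.
SELECT index_name, status
  FROM dba_indexes
 WHERE owner = 'PE_OWNER'
   AND index_name = 'CODE_PK';

-- Rebuild it before retrying the load.
ALTER INDEX pe_owner.code_pk REBUILD;
```

This repairs the symptom only; the underlying cause of the aborted direct path load still has to be found before the rebuild will stick.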

Multiple rows in single field not getting loaded | SQL Loader | Oracle

让人想犯罪 __ — Submitted on 2019-12-11 12:07:57

Question: I need to load a CSV file into an Oracle table. The problem I am facing is that the DESCRIPTION field contains multiple lines. I am using the double quote (") as the enclosure string and calling sqlldr from a KSH script. I am hitting two problems: a row whose DESCRIPTION spans multiple lines does not get loaded, because the record terminates at the first newline and the values of the remaining fields/columns are not visible to the loader; and the error: second enclosure string not present (obviously the closing " is…
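One way to handle embedded newlines in a quoted field is to tell SQL*Loader what the real end-of-record marker is, using the "str" clause on INFILE, so that a bare newline inside the DESCRIPTION no longer terminates the record. This is a hedged sketch: the file name, table and column names, and the assumed record terminator (a pipe followed by a linefeed, written in hex as x'7C0A') are all illustrative and depend on how the source file is actually produced:

```sql
LOAD DATA
-- "str" sets the record terminator; x'7C0A' is '|' + LF (assumed marker).
INFILE 'data.csv' "str x'7C0A'"
INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
  id,
  description CHAR(4000)   -- raise CHAR length above the 255 default
)
```

This only works if the producer of the file can emit (or already emits) a distinct end-of-record marker; without one, the file has to be pre-processed before loading.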

Oracle sqlldr: column not allowed here

笑着哭i — Submitted on 2019-12-11 10:35:56

Question: Can anyone spot the error in this attempted data load? The '\\N' is there because this is an import of an OUTFILE dump from MySQL, which writes \N for NULL fields. The DECODE is to catch cases where the field might be an empty string or might contain \N. Using Oracle 10g on Linux.

load data
infile objects.txt
discardfile objects.dsc
truncate
into table objects
fields terminated by x'1F' optionally enclosed by '"'
(ID INTEGER EXTERNAL NULLIF (ID='\\N'),
 TITLE CHAR(128) NULLIF (TITLE='\\N'),
 PRIORITY…
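The question's DECODE clause is cut off above, but "column not allowed here" in a control file is commonly caused by a misplaced SQL expression. A SQL expression applied to a field must be a double-quoted string, placed after the datatype, that refers to the field through its bind variable (:FIELD). As a hedged sketch of that placement (the DECODE shown is a reconstruction, not the question's exact clause):

```sql
LOAD DATA
INFILE objects.txt
DISCARDFILE objects.dsc
TRUNCATE
INTO TABLE objects
FIELDS TERMINATED BY x'1F' OPTIONALLY ENCLOSED BY '"'
(
  ID    INTEGER EXTERNAL NULLIF (ID = '\\N'),
  -- SQL expression goes after the datatype, quoted, using :TITLE:
  TITLE CHAR(128) "decode(:TITLE, '\\N', NULL, :TITLE)"
)
```

Putting the expression before the datatype, or referencing the field without the colon, are typical ways to trigger this error.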

sqlldr - load completion not reflected

孤者浪人 — Submitted on 2019-12-11 10:32:31

Question: I have a bash script (load_data.sh) that invokes the sqlldr command to load data from a .csv file into a table (data_import). On a recent invocation, I noticed that even though the command completed, the table didn't contain the data from the .csv file. I say this because the subsequent step in the bash script (process_data.sh) tried to run a stored procedure that threw the error ORA-01403: no data found. I learned that the commit happens right after the file load. So, I…

Error “stopped because I can't continue” in SQLLoader - DIRECT mode

断了今生、忘了曾经 — Submitted on 2019-12-11 10:18:18

Question: When trying to load a large text file into the Oracle DB using SQL*Loader, we get the following errors:

SQL*Loader-926: OCI-Error; uldlfca:OCIDirPathColArrayLoadStream for table <myTabele>
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
SQL*Loader-925: Error in uldlgs: OCIStmtExecute (ptc_hp)

This only happens in DIRECT mode; when we use the conventional path method, everything is fine (but a lot slower). So I assume it can't be a problem with the data or the…

UNIX Shell Script Solution for formatting a pipe-delimited, segmented file

杀马特。学长 韩版系。学妹 — Submitted on 2019-12-11 09:08:56

Question: The input file has up to 34 different record types within the same line. The file is pipe-delimited, and each record type is separated by '~' (except for the originating record type). Not all 34 record types are contained on each line, and I do not need all of them. All record types will be sent in a specified order, but not all record types will always be sent. The first record type is mandatory and will always be sent. Of the 34 types, only 7 are mandatory. Each record…

Load data from multiple data files into multiple tables using single control file

邮差的信 — Submitted on 2019-12-11 07:03:11

Question: I have 3 data files and 3 tables. Is there any way to load the data from the data files into their respective tables using only a single control file with parameters? I know that we can use multiple data files to insert data into a single table, but I want multiple data files to insert data into multiple tables.

Answer 1: I have not done it, but you can specify multiple files, as long as they share the same layout, by using multiple INFILE clauses. Then, as long as there is some identifier in the record,…
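The approach the answer describes can be sketched as a single control file with several INFILE clauses and one INTO TABLE ... WHEN clause per target table, routed by a record-type field. This is a hedged, illustrative sketch only: file, table, and column names are invented, and it assumes every record carries a one-character type discriminator in its first field. Note the POSITION(1) on each discriminator field, which resets delimited-field scanning to the start of the record for each INTO TABLE clause:

```sql
LOAD DATA
INFILE 'file1.dat'
INFILE 'file2.dat'
INFILE 'file3.dat'
APPEND
INTO TABLE table_a WHEN rec_type = 'A'
FIELDS TERMINATED BY ',' TRAILING NULLCOLS
(rec_type FILLER POSITION(1), col1, col2)
INTO TABLE table_b WHEN rec_type = 'B'
FIELDS TERMINATED BY ',' TRAILING NULLCOLS
(rec_type FILLER POSITION(1), col1, col2)
INTO TABLE table_c WHEN rec_type = 'C'
FIELDS TERMINATED BY ',' TRAILING NULLCOLS
(rec_type FILLER POSITION(1), col1, col2)
```

If the three files do not share a layout or carry no discriminator, this pattern does not apply and three separate control files (or an external-table approach) would be needed.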

SQL*Loader - How can I ignore certain rows with a specific character

ε祈祈猫儿з — Submitted on 2019-12-11 05:37:47

Question: I have a CSV file in the following format:

"fd!","sdf","dsfds","dsfd"
"fd!","asdf","dsfds","dsfd"
"fd","sdf","rdsfds","dsfd"
"fdd!","sdf","dsfds","fdsfd"
"fd!","sdf","dsfds","dsfd"
"fd","sdf","tdsfds","dsfd"
"fd!","sdf","dsfds","dsfd"

Is it possible to exclude any row where the first column has an exclamation mark at the end of the string? That is, it should only load the following rows:

"fd","sdf","rdsfds","dsfd"
"fd","sdf","tdsfds","dsfd"

Thanks

Answer 1: According to the Loading Records…
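SQL*Loader's WHEN clause only supports = and <> comparisons on a field or a byte position, not wildcard matching, so it cannot express "ends with !" for values of varying length. A hedged sketch of the closest native filter (table and column names illustrative): testing the raw-record position where the marker sits for the common three-character values, i.e. position 4 counting the opening quote:

```sql
LOAD DATA
INFILE 'data.csv'
INTO TABLE my_table
-- Skips records with '!' at byte 4 of the raw record ("fd!",...).
-- A longer value like "fdd!" puts the '!' at byte 5 and slips through,
-- so variable-length markers need a pre-filter (e.g. grep -v '!",' )
-- or a post-load DELETE instead.
WHEN (4:4) <> '!'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(col1, col2, col3, col4)
```

For this data set, where "fdd!" must also be excluded, filtering the file before the load (or deleting the offending rows afterwards) is the reliable route.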

46 Control Files for SQL Loader

送分小仙女□ — Submitted on 2019-12-11 00:48:13

Question: I have to load 46 tables with data using SQL*Loader for Oracle. All the data files are CSVs, and the column order in each CSV matches the column order in its table. I need to create a control file for each table. What is the best way for me to mass-produce these files?

Answer 1: I know this is an old question, but it is still relevant. For future searchers, here is a procedure I added to our utility package that generates a skeleton control file for a table. Pass it the table name and it…
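The answer's actual procedure is cut off above, so the following is an illustrative reimplementation of the same idea, not the author's code: query USER_TAB_COLUMNS for a table and print a skeleton control file, one run (or loop iteration) per table, with the output spooled to a .ctl file. The table name and the generated options are assumptions to adjust per load:

```sql
SET SERVEROUTPUT ON
DECLARE
  p_table VARCHAR2(30) := 'MY_TABLE';  -- assumed table name
BEGIN
  dbms_output.put_line('LOAD DATA');
  dbms_output.put_line('INFILE ''' || LOWER(p_table) || '.csv''');
  dbms_output.put_line('APPEND INTO TABLE ' || p_table);
  dbms_output.put_line('FIELDS TERMINATED BY '','' OPTIONALLY ENCLOSED BY ''"''');
  dbms_output.put_line('TRAILING NULLCOLS');
  dbms_output.put_line('(');
  -- Emit columns in table order; the CSVs match that order per the question.
  FOR c IN (SELECT column_name, column_id,
                   MAX(column_id) OVER () AS last_id
              FROM user_tab_columns
             WHERE table_name = p_table
             ORDER BY column_id) LOOP
    dbms_output.put_line('  ' || c.column_name ||
      CASE WHEN c.column_id < c.last_id THEN ',' END);
  END LOOP;
  dbms_output.put_line(')');
END;
/
```

A real generator would also map DATE and TIMESTAMP columns to explicit format masks, which this skeleton deliberately omits.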

Loading large amounts of data to an Oracle SQL Database

六月ゝ 毕业季﹏ — Submitted on 2019-12-10 17:57:32

Question: I was wondering if anyone had experience with what I am about to embark on. I have several CSV files, each around a GB in size, that I need to load into an Oracle database. While most of my work after loading will be read-only, I will need to load updates from time to time. Basically, I just need a good tool for loading several rows of data at a time into my DB. Here is what I have found so far: I could use SQL*Loader to do a lot of the work; I could use Bulk-Insert…