I am using SQL*Loader to load multiple CSV files into one table. The process I found is very easy:
LOAD
  DATA 
  INFILE '/path/file1.csv'
  INFILE '/path/file2.csv'
  INFILE '/path/file3.csv'
  INFILE '/path/file4.csv'
  APPEND INTO TABLE TBL_DATA_FILE
    EVALUATE CHECK_CONSTRAINTS
    REENABLE DISABLED_CONSTRAINTS
    EXCEPTIONS EXCEPTION_TABLE
  FIELDS TERMINATED BY "," 
  OPTIONALLY ENCLOSED BY '"'
  TRAILING NULLCOLS
  ( 
    COL0,
    COL1,
    COL2,
    COL3,
    COL4
  )
But I don't want to repeat INFILE for every file, because with more than 1000 files I would have to write INFILE 1000 times in the control file.
So my question is: is there any other way (a loop, or a wildcard like *.csv) to load multiple files without multiple INFILE clauses?
Thanks, Bithun
Solution 1: Concatenate the 1000 files into one big file, which is then loaded by SQL*Loader. On Unix, I'd use something like
cd path
cat file*.csv > all_files.csv
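One caveat with a plain `cat`: if each CSV carries a header row, the headers get interleaved into the data. A hedged sketch that keeps only the first file's header (the file names and two-column layout are illustrative, not from the question; the demo setup lines just create sample input):

```shell
#!/bin/bash
set -e
# Demo setup: two small CSV files, each with a header row.
printf 'COL0,COL1\na,1\nb,2\n' > file1.csv
printf 'COL0,COL1\nc,3\n'      > file2.csv

# Concatenate, keeping the header from the first file only.
first=1
for f in file*.csv; do
    if [ "$first" -eq 1 ]; then
        cat "$f" > all_files.csv          # first file: keep its header
        first=0
    else
        tail -n +2 "$f" >> all_files.csv  # later files: skip line 1 (the header)
    fi
done
```

Alternatively, keep all the headers and tell SQL*Loader to skip them per load with `OPTIONS (skip=1)`, as one of the control files below does.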
Solution 2: Use external tables and load the data using a PL/SQL procedure:
CREATE OR REPLACE PROCEDURE myload AS
BEGIN
  FOR i IN 1 .. 1000 LOOP
    -- Repoint the external table at the next file (1.csv, 2.csv, ...),
    -- then copy its rows into the target table.
    EXECUTE IMMEDIATE 'ALTER TABLE xtable LOCATION ('''||to_char(i,'FM9999')||'.csv'')';
    INSERT INTO mytable SELECT * FROM xtable;
  END LOOP;
END;
/
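Solution 2 presupposes an external table over the Oracle directory that holds the CSVs. A minimal sketch of that setup, assuming two VARCHAR2 columns and a directory path; the directory name `csv_dir` and the column names are illustrative, not from the question:

```sql
-- Hypothetical setup for the xtable used in the loop above.
CREATE OR REPLACE DIRECTORY csv_dir AS '/path';

CREATE TABLE xtable (
  col0 VARCHAR2(100),
  col1 VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('1.csv')  -- the loop rewrites this per iteration
);
```

The loop then only swaps the LOCATION clause between inserts, so the access parameters are parsed once.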
You can use wildcards (? for a single character, * for any number of characters) like this:
infile 'file?.csv'
;)
Loop over the files from the shell:
#!/bin/bash
# Load each CSV through the same control file by pointing a
# fixed-name symlink at the current file.
for csvFile in file*.csv
do
    ln -s "$csvFile" tmpFile.csv
    sqlldr control=file_pointing_at_tmpFile.ctl   # credentials supplied however you normally do
    rm tmpFile.csv
done
OPTIONS (skip=1)
LOAD DATA
INFILE '/export/home/applmgr1/chalam/Upload/*.csv'
REPLACE INTO TABLE XX_TEST_FTP_UP
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(FULL_NAME,EMPLOYEE_NUMBER)
Will this pick up all the CSV files and load their data, or not?
Source: https://stackoverflow.com/questions/17405071/load-multiple-csv-into-one-table-by-sqlldr