bulkinsert

Bulk insert, SQL Server 2000, Unix line breaks

Submitted by 怎甘沉沦 on 2019-11-27 11:07:22
Question: I am trying to insert a .csv file into a database with Unix line breaks. The command I am running is:

    BULK INSERT table_name FROM 'C:\file.csv'
    WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' )

If I convert the file to Windows format the load works, but I don't want to take this extra step if it can be avoided. Any ideas?

Answer 1: I felt compelled to contribute, as I was having the same issue and I need to read 2 UNIX files from SAP at least a couple of times a day. Therefore, instead of using …
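On SQL Server 2000, one widely used workaround (a sketch only; the truncated answer above may differ) is to build the statement dynamically, so the row terminator can be a literal line-feed character supplied via CHAR(10):

    -- Minimal sketch for SQL Server 2000: pass a bare LF as the row
    -- terminator through dynamic SQL (table and path are from the question):
    DECLARE @cmd VARCHAR(1000)
    SET @cmd = 'BULK INSERT table_name FROM ''C:\file.csv'' '
             + 'WITH ( FIELDTERMINATOR = '','', ROWTERMINATOR = ''' + CHAR(10) + ''' )'
    EXEC (@cmd)

On newer versions of SQL Server, the hexadecimal form ROWTERMINATOR = '0x0a' is commonly used to achieve the same thing without dynamic SQL.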

Accelerate bulk insert using Django's ORM?

Submitted by 血红的双手。 on 2019-11-27 10:29:46
I'm planning to upload a billion records taken from ~750 files (each ~250 MB) to a db using Django's ORM. Currently each file takes ~20 min to process, and I was wondering if there's any way to accelerate this process. I've taken the following measures:

- Use @transaction.commit_manually and commit once every 5000 records
- Set DEBUG=False so that Django won't accumulate all the SQL commands in memory
- The loop that runs over records in a single file is completely contained in a single function (minimize stack changes)
- Refrained from hitting the db for queries (used a local hash of objects already in …
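Beyond these measures, the biggest single win is usually batching, so that many rows travel in one INSERT statement rather than one statement per record (this is what Django 1.4+'s bulk_create emits under the hood). A hedged illustration at the SQL level, with a hypothetical table and columns:

    -- One statement and one round trip for many rows, instead of one
    -- INSERT per record (app_record and its columns are hypothetical):
    INSERT INTO app_record (node, ts, val)
    VALUES (1, 100, 0.5),
           (2, 101, 0.7),
           (3, 102, 0.9);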

Cannot bulk load. The file "c:\data.txt" does not exist

Submitted by 六月ゝ 毕业季﹏ on 2019-11-27 09:40:15
I'm having a problem reading data from a text file into MS SQL. I created a text file in my C:\ drive called data.txt, but for some reason SQL Server cannot find the file. I get the error "Cannot bulk load. The file "c:\data.txt" does not exist." Any ideas? The data file (yes, I know the data looks crappy, but in the real world that's how it comes from clients):

    01-04 10.338,18 0,00 597.877,06- 5 0,7500 62,278-
    06-04 91.773,00 9.949,83 679.700,23- 1 0,7500 14,160-
    07-04 60.648,40 149.239,36 591.109,27- 1 0,7500 12,314-
    08-04 220.173,70 213.804,37 597.478,60- 1 0,7500 12,447-
    09-04 986.071,39 0,00 1…
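A common cause worth checking first: BULK INSERT resolves the path on the machine running SQL Server, not on the client issuing the query, and the SQL Server service account needs read permission on the file. A hedged sketch using a UNC path (the share and table names are hypothetical):

    -- If the file lives on your workstation rather than the server,
    -- share the folder and point BULK INSERT at the UNC path:
    BULK INSERT dbo.ClientData
    FROM '\\MyWorkstation\import\data.txt'
    WITH ( ROWTERMINATOR = '\n' )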

Bulk insert using stored procedure

Submitted by 不羁岁月 on 2019-11-27 08:49:42
I have a query which is working fine:

    BULK INSERT ZIPCodes
    FROM 'e:\5-digit Commercial.csv'
    WITH ( FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' )

but now I want to create a stored procedure for it. I have written the code below:

    create proc dbo.InsertZipCode
        @filepath varchar(500) = 'e:\5-digit Commercial.csv'
    as
    begin
        BULK INSERT ZIPCodes
        FROM @filepath
        WITH ( FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' )
    end

but it shows the error:

    Msg 102, Level 15, State 1, Procedure InsertZipCode, Line 6
    Incorrect syntax near '@filepath'.
    Msg 319, Level …
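The error occurs because BULK INSERT accepts only a string literal for the file path, not a variable. The usual fix is to build the statement as dynamic SQL (a sketch; note that concatenating a caller-supplied path is open to SQL injection, so validate @filepath first):

    create proc dbo.InsertZipCode
        @filepath varchar(500) = 'e:\5-digit Commercial.csv'
    as
    begin
        declare @sql nvarchar(4000)
        set @sql = 'BULK INSERT ZIPCodes FROM ''' + @filepath + ''' '
                 + 'WITH ( FIRSTROW = 2, FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'' )'
        exec (@sql)
    end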

MySQL bulk load command line tool

Submitted by 半世苍凉 on 2019-11-27 08:30:27
Does MySQL have a bulk-load command line tool like bcp for SQL Server and sqlldr for Oracle? I know there's a SQL command LOAD DATA INFILE or similar, but I sometimes need to bulk load a file that is on a different box from the MySQL database.

mysqlimport takes the same connection parameters as the mysql command line shell. Make sure to use the -L flag to use a file on the local file system; otherwise it will (strangely) assume the file is on the server. There is also an analogous variant of the LOAD DATA INFILE command, i.e. LOAD DATA LOCAL INFILE, according to which the file will be loaded from …
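For reference, a sketch of the LOCAL variant mentioned above, which reads the file from the client machine rather than from the server's file system (the table name and delimiters are hypothetical):

    LOAD DATA LOCAL INFILE '/home/me/data.csv'
    INTO TABLE my_table
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n';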

PHP PDO: batch insert of multiple rows with placeholders

Submitted by 匆匆过客 on 2019-11-27 08:03:27
I am looking to do multiple inserts using PHP PDO. The closest answer I have found is how-to-insert-an-array-into-a-single-mysql-prepared-statement; however, the example given there uses ?? instead of real placeholders. I have looked at the examples for placeholders on the PHP doc site (php.net pdo.prepared-statements):

    $stmt = $dbh->prepare("INSERT INTO REGISTRY (name, value) VALUES (:name, :value)");
    $stmt->bindParam(':name', $name);
    $stmt->bindParam(':value', $value);

Now let's say I wanted to achieve the above but with an array:

    $valuesToInsert = array(
        0 => array('name' => 'Robert…
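A hedged sketch of the statement such a batch typically prepares: one INSERT with a numbered placeholder pair per row, which the PHP side then binds in a loop (the placeholder names are hypothetical):

    -- For a 3-row batch, PDO would prepare this and bind
    -- :name0/:value0 .. :name2/:value2 from the array:
    INSERT INTO REGISTRY (name, value)
    VALUES (:name0, :value0),
           (:name1, :value1),
           (:name2, :value2);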

Use binary COPY table FROM with psycopg2

Submitted by 心不动则不痛 on 2019-11-27 06:41:47
I have tens of millions of rows to transfer from multidimensional array files into a PostgreSQL database. My tools are Python and psycopg2. The most efficient way to bulk insert data is using copy_from. However, my data are mostly 32-bit floating point numbers (real or float4), so I'd rather not convert from real → text → real. Here is an example database DDL:

    CREATE TABLE num_data (
      id serial PRIMARY KEY NOT NULL,
      node integer NOT NULL,
      ts smallint NOT NULL,
      val1 real,
      val2 double precision
    );

Here is where I'm at with Python using strings (text):

    # Just one row of data
    num_row = [23253, …
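For context, the server-side statement that avoids the text round trip is PostgreSQL's binary COPY; psycopg2 can feed it a binary buffer via copy_expert (a sketch; the column list matches the DDL above):

    -- Binary COPY reads PostgreSQL's binary wire format directly,
    -- so float4/float8 values never pass through a text representation:
    COPY num_data (node, ts, val1, val2) FROM STDIN WITH (FORMAT binary);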

Bulk Insertion in MySQL from XML Files

Submitted by 假装没事ソ on 2019-11-27 06:31:03
Question: How can we load data into MySQL tables from XML files? Is there any way to read data from XML files and write it to a MySQL database? I have a bulk of data in XML files. Thanks in advance for any help.

Answer 1: Try the LOAD XML function (MySQL 6.0). Here's the sample code from the reference manual, using an XML document person.xml containing:

    <?xml version="1.0"?>
    <list>
      <person person_id="1" fname="Pekka" lname="Nousiainen"/>
      <person person_id="2" fname="Jonas" lname="Oreland"/>
      <person person_id="3">…
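The corresponding LOAD XML statement from the manual's example looks like this (it assumes a person table whose columns match the XML attributes already exists):

    LOAD XML LOCAL INFILE 'person.xml'
    INTO TABLE person
    ROWS IDENTIFIED BY '<person>';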

Performance of bcp/BULK INSERT vs. Table-Valued Parameters

Submitted by 纵然是瞬间 on 2019-11-27 06:11:08
I'm about to rewrite some rather old code that uses SQL Server's BULK INSERT command, because the schema has changed, and it occurred to me that maybe I should switch to a stored procedure with a TVP instead, but I'm wondering what effect that might have on performance. Some background information that might help explain why I'm asking: the data actually comes in via a web service. The web service writes a text file to a shared folder on the database server, which in turn performs a BULK INSERT. This process was originally implemented on SQL Server 2000, and at …
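For comparison, a minimal sketch of the TVP route (requires SQL Server 2008 or later; the type, procedure, and column names are hypothetical). The client streams rows to the procedure in memory instead of staging a text file on the server:

    -- Table type describing one incoming row:
    CREATE TYPE dbo.ImportRow AS TABLE (
        node int NOT NULL,
        val  decimal(18,2) NOT NULL
    );
    GO
    -- Procedure the client calls with a table-valued parameter:
    CREATE PROC dbo.ImportRows
        @rows dbo.ImportRow READONLY
    AS
        INSERT INTO dbo.TargetTable (node, val)
        SELECT node, val FROM @rows;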

C# Bulk Insert SQLBulkCopy - Update if Exists [duplicate]

Submitted by 梦想的初衷 on 2019-11-27 05:33:59
Question: This question already has answers here (closed 7 years ago). Possible duplicate: Any way to SQLBulkCopy "insert or update if exists"? I am using SQLBulkCopy to insert bulk records. How can I perform an update (rather than an insert) on records that already exist? Is this possible with SQLBulkCopy? This is my code for SQLBulkCopy:

    using (var bulkCopy = new SqlBulkCopy(
        ConfigurationManager.ConnectionStrings["ConnectionString"].ConnectionString,
        SqlBulkCopyOptions.KeepNulls & SqlBulkCopyOptions…
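SqlBulkCopy itself can only insert. The usual pattern for "update if exists" is to bulk copy into a staging table and then upsert into the target with MERGE (SQL Server 2008+; all names below are hypothetical):

    -- After SqlBulkCopy has filled dbo.Staging, reconcile it with the target:
    MERGE dbo.Target AS t
    USING dbo.Staging AS s
        ON t.Id = s.Id
    WHEN MATCHED THEN
        UPDATE SET t.Value = s.Value
    WHEN NOT MATCHED THEN
        INSERT (Id, Value) VALUES (s.Id, s.Value);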