bulkinsert

SQL won't insert null values with BULK INSERT

自闭症网瘾萝莉.ら submitted on 2019-12-01 01:22:44
Question: I have a CSV file and each line looks similar to this: EASTTEXAS,NULL,BELLVILLE AREA,NULL,BELLVILLE AREA,RGP,NULL,NULL,0,NULL,NULL,NULL,1,1,PM,PM Settings,NULL,NULL. I couldn't find any examples of how NULL values are supposed to be handled when doing a BULK INSERT, so I assumed it would be OK. When I try to run the BULK INSERT, it gives me this error: Msg 4864, Level 16, State 1, Line 28 Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1,
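A note on the standard behavior here: BULK INSERT's KEEPNULLS option only keeps genuinely empty fields as NULL; a literal NULL string in the CSV is read as text and fails type conversion on non-character columns. A common workaround is to stage into an all-VARCHAR table and convert with NULLIF. A minimal sketch, with the staging table, column names, and file path invented for illustration:

-- Stage the raw CSV into an all-VARCHAR table so the literal string
-- 'NULL' loads without conversion errors (names are hypothetical).
CREATE TABLE dbo.AreaStaging (
    region VARCHAR(50),
    area_name VARCHAR(100)
    -- ... one VARCHAR column per CSV field ...
);

BULK INSERT dbo.AreaStaging
FROM 'C:\data\areas.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', KEEPNULLS);

-- Turn the string 'NULL' into a real NULL while copying to the typed table.
INSERT INTO dbo.Area (region, area_name)
SELECT NULLIF(region, 'NULL'), NULLIF(area_name, 'NULL')
FROM dbo.AreaStaging;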

Why is Bulk Import faster than a bunch of INSERTs?

百般思念 submitted on 2019-11-30 21:27:02
I'm writing my graduate thesis on methods of importing data from a file into a SQL Server table. I have created my own program and am now comparing it with some standard methods such as bcp, BULK INSERT, and INSERT ... SELECT * FROM OPENROWSET(BULK...). My program reads lines from the file, parses them, and imports them one by one using ordinary INSERTs. For testing I generated a file with 1 million lines of 4 columns each. Now I have the situation that my program takes 160 seconds while the standard methods take 5-10 seconds. So the question is: why are BULK operations faster than 1 million INSERTs? Do they use special
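Much of the gap is per-statement overhead: each ordinary INSERT is its own round trip, parse, and transaction-log record, while bulk operations stream rows in large batches under a single command and, with the right options, qualify for minimal logging. A hedged sketch of the contrast, with the table and file path invented for illustration:

-- Row-by-row: one parsed, individually logged statement per line
-- (this is effectively what a client-side INSERT loop does).
INSERT INTO dbo.TestRows (c1, c2, c3, c4) VALUES (1, 2, 3, 4);
-- ... repeated 1,000,000 times ...

-- Set-based: one command streams the whole file; with TABLOCK on an
-- empty heap it can also qualify for minimally logged inserts.
BULK INSERT dbo.TestRows
FROM 'C:\data\test_1m.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK, BATCHSIZE = 100000);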

Improve insert performance massively

坚强是说给别人听的谎言 submitted on 2019-11-30 19:38:09
In my application I need to massively improve insert performance. Example: a file with about 21K records takes over 100 min to insert. There are reasons it can take some time, say 20 min or so, but over 100 min is just too long. Data is inserted into 3 tables (many-to-many). IDs are generated from a sequence, but I have already googled and set hibernate.id.new_generator_mappings = true and the allocationSize + sequence increment to 1000. Also, the amount of data is not anything extraordinary at all; the file is 90 MB. I have verified with VisualVM that most of the time is spent in the JDBC driver
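One detail worth double-checking in this configuration: Hibernate's allocationSize and the database sequence's increment must agree, or IDs will collide or be wasted. A minimal sketch of the database side, with the sequence name invented for illustration (the excerpt does not say which database is in use, so this is generic SQL sequence syntax):

-- The sequence must advance by exactly the block size Hibernate
-- reserves per round trip (allocationSize = 1000 in the question),
-- so ~21K inserted rows cost only ~21 sequence calls.
ALTER SEQUENCE invoice_id_seq INCREMENT BY 1000;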

JPA/Hibernate improve batch insert performance

折月煮酒 submitted on 2019-11-30 19:23:34
Question: I have a data model with a ONE-TO-MANY relationship between one entity and 11 other entities. These 12 entities together represent one data packet. The problem I am having is the number of inserts that occur on the 'many' side of these relationships. Some of them can have as many as 100 individual values, so saving one whole data packet in the database requires up to 500 inserts. I am using MySQL 5.5 with InnoDB tables. Now, from testing the database I see that it can easily
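For MySQL in particular, a large win is usually letting the Connector/J driver rewrite JDBC batches into multi-row inserts (rewriteBatchedStatements=true, together with hibernate.jdbc.batch_size), so that the ~500 child-row inserts per packet collapse into a few statements. What the driver sends in that mode looks roughly like this (table and column names invented for illustration):

-- Without rewriting: one statement and one round trip per child row.
INSERT INTO packet_values (packet_id, value) VALUES (1, 10);
INSERT INTO packet_values (packet_id, value) VALUES (1, 11);
-- ...

-- With rewriteBatchedStatements=true, each JDBC batch becomes one
-- multi-row statement, cutting round trips and parse overhead:
INSERT INTO packet_values (packet_id, value)
VALUES (1, 10), (1, 11), (1, 12), (1, 13);

One caveat: Hibernate silently disables JDBC batching for entities with IDENTITY-generated keys, which is a common trap on MySQL.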

Bulk insert rowterminator issue

安稳与你 submitted on 2019-11-30 18:39:17
I have this CSV named test.csv with the content below:
1,"test user",,,4075619900,example@example.com,"Aldelo for Restaurants","this is my deal",,"location4"
2,"joe johnson",,"32 bit",445555519,antle@gmail.com,"Restaurant Pro Express","smoe one is watching u",,"some location"
Here is my SQL file to do the BULK INSERT:
USE somedb
GO
CREATE TABLE CSVTemp (id INT, name VARCHAR(255), department VARCHAR(255), architecture VARCHAR(255), phone VARCHAR(255), email VARCHAR(255), download VARCHAR(255), comments TEXT, company VARCHAR(255), location VARCHAR(255))
GO
BULK INSERT CSVTemp FROM 'c:\test\test
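Two things commonly go wrong with exactly this kind of file, noted here as general BULK INSERT behavior rather than the thread's accepted answer: a file saved with Unix (LF-only) line endings needs ROWTERMINATOR = '0x0a', because '\n' is commonly treated as '\r\n' on import; and plain BULK INSERT does not strip the double quotes around fields (that needs a format file or, on SQL Server 2017+, the FIELDQUOTE option). A hedged sketch of the terminator fix, reusing the question's table:

BULK INSERT CSVTemp
FROM 'c:\test\test.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a');

-- Note: quoted values such as "test user" load with the quotes intact;
-- removing them requires a format file or SQL Server 2017's FIELDQUOTE.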

Fastest way to insert 30 thousand rows in a temp table on SQL Server with C#

喜你入骨 submitted on 2019-11-30 17:40:22
I am trying to find out how I can improve insert performance into a temporary table in SQL Server using C#. Some people are saying that I should use SqlBulkCopy; however, I must be doing something wrong, as it seems to work much slower than simply building an SQL insert string instead. My code to create the table using SqlBulkCopy is below:
public void MakeTable(string tableName, List<string> ids, SqlConnection connection)
{
    SqlCommand cmd = new SqlCommand("CREATE TABLE ##" + tableName + " (ID int)", connection);
    cmd.ExecuteNonQuery();
    DataTable localTempTable = new DataTable(tableName);
    DataColumn
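As an aside, and as a different technique from the question's SqlBulkCopy rather than a fix to it: a table-valued parameter sends the whole ID list to the server as a single parameter, landing in the temp table with one set-based INSERT. A sketch of the server-side half, with the type and procedure names invented for illustration (it assumes the global temp table was created beforehand, as in the question's code):

-- One-time setup: a table type matching the ID list.
CREATE TYPE dbo.IdList AS TABLE (ID INT NOT NULL);
GO
-- The C# side passes a DataTable as a parameter of type dbo.IdList;
-- 30K rows then arrive in one round trip and one INSERT.
CREATE PROCEDURE dbo.FillTempIds @ids dbo.IdList READONLY
AS
BEGIN
    INSERT INTO ##TempIds (ID)
    SELECT ID FROM @ids;
END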

“Column is too long” error with BULK INSERT

▼魔方 西西 submitted on 2019-11-30 17:09:13
I am trying to run the following command to bulk insert data from a CSV file:
BULK INSERT TestDB.dbo.patent FROM 'C:\1patents.csv' WITH (FIRSTROW = 1, FIELDTERMINATOR = '^', ROWTERMINATOR = '\n');
The error I am getting is this:
Msg 4866, Level 16, State 1, Line 1 The bulk load failed. The column is too long in the data file for row 1, column 6. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1 The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg
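A "column is too long" error on row 1 very often means the row terminator never matched, so the loader read far past the end of the first line. Two hedged diagnostics worth trying: name the file's actual line endings explicitly in hex, and use ERRORFILE to capture the rejected bytes for inspection (the log path is illustrative):

BULK INSERT TestDB.dbo.patent
FROM 'C:\1patents.csv'
WITH (FIRSTROW = 1,
      FIELDTERMINATOR = '^',
      ROWTERMINATOR = '0x0d0a',          -- Windows endings; use '0x0a' for Unix
      ERRORFILE = 'C:\bulk_errors.log'); -- rejected rows land here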

MySQL Bulk Update

一曲冷凌霜 submitted on 2019-11-30 14:10:49
Question: I have to execute ~6k UPDATE queries on a table through SQL (no Hibernate/JDBC). The queries look like
UPDATE A SET some_id = 'value1' WHERE id = 'value2';
It takes too long to execute all of them. Is there a way to improve the performance?
Answer 1: Create a temp table (containing just the value1 and value2 values) and populate it in bulk (i.e., you can potentially do this with a single insert statement). Then do an update using a join between your existing table and the temp table.
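A minimal sketch of Answer 1 in MySQL syntax; the temp table name and value columns are invented for illustration, and the real column types should match table A:

-- Load all ~6k (value1, value2) pairs in one multi-row insert.
CREATE TEMPORARY TABLE tmp_updates (
    new_some_id VARCHAR(64),
    id          VARCHAR(64),
    PRIMARY KEY (id)
);
INSERT INTO tmp_updates (new_some_id, id)
VALUES ('value1a', 'value2a'), ('value1b', 'value2b') /* , ... */;

-- One joined UPDATE replaces ~6k single-row statements.
UPDATE A
JOIN tmp_updates t ON A.id = t.id
SET A.some_id = t.new_some_id;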

Azure DocumentDB bulk insert using stored procedure

无人久伴 submitted on 2019-11-30 14:02:43
Hi, I am using 16 collections to insert around 3-4 million JSON objects, ranging from 5-10 KB per object. I am using a stored procedure to insert these documents. I have 22 capacity units.
function bulkImport(docs) {
    var collection = getContext().getCollection();
    var collectionLink = collection.getSelfLink();
    // The count of imported docs, also used as current doc index.
    var count = 0;
    // Validate input.
    if (!docs) throw new Error("The array is undefined or null.");
    var docsLength = docs.length;
    if (docsLength == 0) {
        getContext().getResponse().setBody(0);
    }
    // Call the CRUD API to create a document.

SQL Server insert performance

我的梦境 submitted on 2019-11-30 13:00:06
Question: I have an insert query that gets generated like this:
INSERT INTO InvoiceDetail (LegacyId,InvoiceId,DetailTypeId,Fee,FeeTax,Investigatorid,SalespersonId,CreateDate,CreatedById,IsChargeBack,Expense,RepoAgentId,PayeeName,ExpensePaymentId,AdjustDetailId)
VALUES (1,1,2,1500.0000,0.0000,163,1002,'11/30/2001 12:00:00 AM',1116,0,550.0000,850,NULL,@ExpensePay1,NULL);
DECLARE @InvDetail1 INT;
SET @InvDetail1 = (SELECT @@IDENTITY);
This query is generated for only 110K rows. It takes 30 minutes for all
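Two general SQL Server points about this pattern, offered as standard advice rather than the thread's answer: SELECT @@IDENTITY adds a statement per row and can return the wrong value if a trigger inserts into another identity table, and 110K auto-committed statements force a transaction-log flush each. SCOPE_IDENTITY() plus explicit batched transactions addresses both; a hedged sketch with the column list shortened:

DECLARE @InvDetail1 INT;

-- Batch many inserts per transaction so the log flushes once per batch.
BEGIN TRANSACTION;

INSERT INTO InvoiceDetail (LegacyId, InvoiceId, DetailTypeId, Fee, FeeTax)
VALUES (1, 1, 2, 1500.0000, 0.0000);
SET @InvDetail1 = SCOPE_IDENTITY();  -- trigger-safe, same-scope identity

-- ... several hundred more inserts ...

COMMIT TRANSACTION;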