
Delete All / Bulk Insert

不问归期 submitted on 2019-12-04 05:17:25
First off, let me say I am running SQL Server 2005, so I don't have access to MERGE. I have a table with ~150k rows that I update daily from a text file. As rows fall out of the text file I need to delete them from the database, and if they change or are new I need to update or insert accordingly. After some testing I've found that, performance-wise, it is dramatically faster to do a full delete and then bulk insert from the text file than to read through the file line by line doing updates/inserts. However, I recently came across some posts discussing mimicking the MERGE functionality
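The usual MERGE-free pattern those posts describe is a staging table: bulk-load the file into staging, then delete, update, and insert against the target in three set-based statements. Below is a minimal sketch of that three-step flow; SQLite stands in for SQL Server 2005, and the table and column names are invented for illustration.

```python
import sqlite3

# Staging-table alternative to full delete-and-reload:
# bulk-load the daily file into "staging", then reconcile "target".
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE target  (id INTEGER PRIMARY KEY, val TEXT);
    CREATE TABLE staging (id INTEGER PRIMARY KEY, val TEXT);
""")
conn.executemany("INSERT INTO target VALUES (?, ?)",
                 [(1, "old"), (2, "keep"), (3, "gone")])
# Pretend the daily text file was bulk-loaded into staging:
conn.executemany("INSERT INTO staging VALUES (?, ?)",
                 [(1, "new"), (2, "keep"), (4, "added")])

# 1) delete rows that fell out of the file
conn.execute("DELETE FROM target WHERE id NOT IN (SELECT id FROM staging)")
# 2) update rows still present in the file
conn.execute("""UPDATE target
                SET val = (SELECT val FROM staging s WHERE s.id = target.id)
                WHERE id IN (SELECT id FROM staging)""")
# 3) insert rows that are new in the file
conn.execute("""INSERT INTO target
                SELECT * FROM staging
                WHERE id NOT IN (SELECT id FROM target)""")

rows = sorted(conn.execute("SELECT * FROM target").fetchall())
print(rows)  # [(1, 'new'), (2, 'keep'), (4, 'added')]
```

All three statements are set-based, so this keeps most of the speed of delete-and-reload while touching only the rows that actually changed.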

Bulk insert strategy from c# to SQL Server

寵の児 submitted on 2019-12-04 05:06:06
In our current project, customers send collections of complex/nested messages to our system, at roughly 1,000-2,000 messages per second. These complex objects contain transaction data (to be added) as well as master data (which is added if not found). But instead of passing the ids of the master data, the customer passes the 'name' column. The system checks whether master data exists for these names. If found, it uses the ids from the database; otherwise it creates the master data first and then uses those ids. Once the master data ids are resolved, the system inserts the transactional
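At that message rate, the name-to-id lookup is the hot path, so the common shape is a get-or-create helper backed by an in-memory cache. The sketch below is a hypothetical single-threaded version (SQLite in place of SQL Server; table and column names invented); a real system at 2,000 msg/s would also need to handle concurrent inserts of the same name, e.g. by catching the unique-constraint violation and re-reading.

```python
import sqlite3

# Resolve a master-data "name" to its id: cache first, then the
# database, creating the row only when the name is genuinely new.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE master (id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
cache = {}

def resolve_master_id(name):
    if name in cache:
        return cache[name]
    row = conn.execute("SELECT id FROM master WHERE name = ?", (name,)).fetchone()
    if row is None:
        cur = conn.execute("INSERT INTO master (name) VALUES (?)", (name,))
        cache[name] = cur.lastrowid
    else:
        cache[name] = row[0]
    return cache[name]

ids = [resolve_master_id(n) for n in ["alpha", "beta", "alpha"]]
print(ids)  # [1, 2, 1]
```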

NHibernate bulk insert or update

拈花ヽ惹草 submitted on 2019-12-04 04:26:48
Hi, I'm working on a project where we need to process several XML files once a day and populate a database with the information contained in those files. Each file is roughly 1 MB and contains about 1000 records; we usually need to process between 12 and 25 of these files. I've seen some information regarding bulk inserts using NHibernate, but our problem is somewhat trickier, since the XML files contain new records mixed with updated records. In the XML there is a flag that tells us whether a specific record is new or an update to an existing record, but not what information has changed. The XML
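Since the flag tells you new-vs-update but not which fields changed, the simplest approach is to dispatch on the flag and write the full record either way. The element and attribute names below are assumptions (the asker's schema isn't shown), and plain SQLite stands in for NHibernate:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Dispatch each record on its flag: INSERT when new, full-row UPDATE
# otherwise (we don't know which fields changed, so write them all).
xml = """<records>
  <record id="1" flag="new"    name="first"/>
  <record id="2" flag="new"    name="second"/>
  <record id="1" flag="update" name="first-revised"/>
</records>"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, name TEXT)")

for rec in ET.fromstring(xml):
    rid, name = int(rec.get("id")), rec.get("name")
    if rec.get("flag") == "new":
        conn.execute("INSERT INTO records VALUES (?, ?)", (rid, name))
    else:
        conn.execute("UPDATE records SET name = ? WHERE id = ?", (name, rid))

rows = sorted(conn.execute("SELECT * FROM records").fetchall())
print(rows)  # [(1, 'first-revised'), (2, 'second')]
```

In NHibernate terms the same split maps to Save versus Update (or a stateless session for bulk work), batched per file.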

Issue with bulk insert

余生长醉 submitted on 2019-12-04 03:41:32
I am trying to insert the data from this link into my SQL Server: https://www.ian.com/affiliatecenter/include/V2/CityCoordinatesList.zip I created the table CREATE TABLE [dbo].[tblCityCoordinatesList]( [RegionID] [int] NOT NULL, [RegionName] [nvarchar](255) NULL, [Coordinates] [nvarchar](4000) NULL ) ON [PRIMARY] And I am running the following script to do the bulk insert BULK INSERT tblCityCoordinatesList FROM 'C:\data\CityCoordinatesList.txt' WITH ( FIRSTROW = 2, MAXERRORS = 0, FIELDTERMINATOR = '|', ROWTERMINATOR = '\n' ) But the bulk insert fails with the following error: Cannot obtain the

Efficient way to bulk insert into Dbase (.dbf) files

百般思念 submitted on 2019-12-04 03:31:56
Question: I'm currently using OleDbCommand.ExecuteNonQuery (called repeatedly) to insert as many as 350,000 rows into dBASE files (*.dbf) at a time from a source DataTable. I'm reusing an OleDbCommand object and OleDbParameters to set the values to be inserted each time the insert statement is called. Inserting 350,000 rows currently takes my program about 45 minutes. Is there a more efficient way to do this? Does something similar to the BULK INSERT option used in SQL Server exist for dBASE (*.dbf)
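There is no true BULK INSERT for dBASE that I'm aware of, but the generic lever for "one round trip per row is slow" is batching: send all rows in one call inside a single transaction. The sketch below shows the shape of that change with SQLite's executemany standing in for the OleDb setup; the table and data are invented.

```python
import sqlite3

# Batch the rows with executemany inside one transaction instead of
# issuing one INSERT (and one implicit commit) per row.
rows = [(i, f"name-{i}") for i in range(10000)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")

with conn:  # one transaction for the whole batch
    conn.executemany("INSERT INTO t VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)  # 10000
```

With OleDb over dBASE the analogous moves are wrapping the loop in one OleDbTransaction and committing once, or bypassing OleDb and writing the .dbf records directly, since the per-statement overhead is what dominates the 45 minutes.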

Extract a full timestamp (date included) from a Select query; Oracle

别等时光非礼了梦想. submitted on 2019-12-04 02:29:47
Question: So I am trying to insert multiple rows of data from one table into another. I have done this; however, I am having an issue with some of my columns, specifically my date columns. When the query returns data, it is missing the time component of the date which is normally present. If this doesn't make sense, hopefully the following helps. My original query SELECT 'insert into dante2 (subcar, batch_id, silicon, temperature, sulphur, manganese, phosphorus, start_pour, end_pour,
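An Oracle DATE column always stores the time of day; what usually drops it is the session's display format (NLS_DATE_FORMAT), so selecting TO_CHAR(col, 'YYYY-MM-DD HH24:MI:SS') brings the time back. The same value-vs-rendering distinction, illustrated in Python:

```python
from datetime import datetime

# The time component lives in the value; only the display format hides it.
# In Oracle, the analogous fix is TO_CHAR(col, 'YYYY-MM-DD HH24:MI:SS')
# instead of relying on the default session date format.
ts = datetime(2019, 12, 4, 2, 29, 47)
date_only = ts.strftime("%d-%b-%y")          # '04-Dec-19' -- time invisible
full = ts.strftime("%Y-%m-%d %H:%M:%S")      # '2019-12-04 02:29:47'
print(date_only)
print(full)
```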

How to use SQLAlchemy to dump an SQL file from query expressions to bulk-insert into a DBMS?

蓝咒 submitted on 2019-12-04 00:54:50
Please bear with me as I explain the problem and how I tried to solve it; my question on how to improve it is at the end. I have a 100,000-line CSV file from an offline batch job, and I needed to insert it into the database as its proper models. Ordinarily, if this were a fairly straightforward load, it could be trivially done by just munging the CSV file to fit a schema; but I had to do some external processing that requires querying, and it's just much more convenient to use SQLAlchemy to generate the data I want. The data I want here is 3 models that represent 3 pre-existing tables in the
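The overall flow being asked about is: build rows in Python, render them as an INSERT script, then hand the whole script to the DBMS in one shot. A stdlib-only sketch of that flow (sqlite3 standing in for SQLAlchemy and the target DBMS; names and data invented):

```python
import os
import sqlite3
import tempfile

# Render processed rows as a .sql script, then bulk-execute the script.
rows = [(1, "alpha"), (2, "beta")]

stmts = ["CREATE TABLE items (id INTEGER, name TEXT);"]
for rid, name in rows:
    stmts.append(f"INSERT INTO items VALUES ({rid}, '{name}');")

path = os.path.join(tempfile.mkdtemp(), "dump.sql")
with open(path, "w") as f:
    f.write("\n".join(stmts))

conn = sqlite3.connect(":memory:")
with open(path) as f:
    conn.executescript(f.read())

n = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
print(n)  # 2
```

Note the string formatting here only works because the values are trusted literals; real code should escape or parameterize values (SQLAlchemy's literal-binds compilation does this when dumping statements).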

Bulk insert questions

泪湿孤枕 submitted on 2019-12-03 20:59:43
I have a CSV file on the client side, and I want to develop a C# application to bulk insert the data into a database table with minimal logging. I am unsure whether I should use ADO.NET on the client side to call stored procedures on the database server. What kind of code needs to be developed on the client side, and what kind of code needs to be implemented on the server side in the form of stored procedures? I did not find any samples via Google. What are some ready-to-use samples? :-) EDIT: Some more information: I have a lot of data on the client side and I want to import it into the database,

“Column is too long” error with BULK INSERT

拜拜、爱过 submitted on 2019-12-03 19:44:44
Question: I am trying to run the following command to bulk insert data from a CSV file: BULK INSERT TestDB.dbo.patent FROM 'C:\1patents.csv' WITH (FIRSTROW = 1, FIELDTERMINATOR = '^', ROWTERMINATOR = '\n'); The error I am getting is this: Msg 4866, Level 16, State 1, Line 1 The bulk load failed. The column is too long in the data file for row 1, column 6. Verify that the field terminator and row terminator are specified correctly. Msg 7399, Level 16, State 1, Line 1 The OLE DB provider "BULK" for
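Msg 4866 usually means the declared terminators don't match the file (e.g. the rows actually end in '\r\n', or a line has a different number of '^'-separated fields than the table expects). A quick way to check is to count fields per line before blaming the data; the sample lines below are made up, and 6 matches the column number in the error:

```python
# Diagnostic for "column is too long": find lines whose field count
# doesn't match the expected column count for the '^' delimiter.
lines = [
    "a^b^c^d^e^f",        # well-formed: 6 fields
    "a^b^c^d^e^f^extra",  # too many fields
    "a^b^c",              # too few fields
]

expected = 6
bad = [(i + 1, line.count("^") + 1)
       for i, line in enumerate(lines)
       if line.count("^") + 1 != expected]
print(bad)  # [(2, 7), (3, 3)] -> (line number, field count)
```

If every line counts correctly, the next suspect is the row terminator: try ROWTERMINATOR = '\r\n', or open the file in a hex editor to see what actually ends each line.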

How to bulk insert a CSV file into SQLite C#

倾然丶 夕夏残阳落幕 submitted on 2019-12-03 18:05:41
Question: I have seen similar questions (1, 2), but none of them discuss how to insert CSV files into SQLite. About the only thing I could think of doing is to use a CSVDataAdapter to fill an SQLiteDataSet, then use the SQLiteDataSet to update the tables in the database. The only DataAdapter for CSV files I found is not actually available: CSVDataAdapter CSVda = new CSVDataAdapter(@"c:\MyFile.csv"); CSVda.HasHeaderRow = true; DataSet ds = new DataSet(); // <-- Use an SQLiteDataSet instead CSVda.Fill
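No DataAdapter is needed for this: parse the CSV and hand all the rows to a parameterized batch insert in one transaction. Here is the pattern in Python's stdlib (the same shape works in C# with a CSV reader plus one SQLiteCommand reused inside a transaction); the file content is inlined via io.StringIO, and with a real file you would use open(r"c:\MyFile.csv", newline="") instead.

```python
import csv
import io
import sqlite3

# Bulk-load a CSV into SQLite: csv.reader + executemany in one transaction.
data = io.StringIO("id,name\n1,alpha\n2,beta\n")

reader = csv.reader(data)
header = next(reader)  # skip the header row (HasHeaderRow = true)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")
with conn:  # single transaction for the whole file
    conn.executemany("INSERT INTO t VALUES (?, ?)", reader)

rows = conn.execute("SELECT * FROM t ORDER BY id").fetchall()
print(rows)  # [(1, 'alpha'), (2, 'beta')]
```

The single transaction matters: committing per row is what makes naive SQLite loaders slow.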