bulkinsert

SQL Server Bulk Insert

北慕城南 submitted on 2019-12-01 19:49:57
I want to import a one-column text file into one of my SQL tables. The file is just a list of swear words. I've written the following T-SQL to do this:

    BULK INSERT SwearWords FROM 'c:\swears.txt'
    WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' )

However, it errors with "unexpected end of file". The table I'm importing into is just an identity field followed by an nvarchar field that I want to insert the text into. It works fine if I add "1," to the beginning of every line in the text file; I assume this is because SQL Server is looking for two fields. Is there any way around this? Thanks. You need to use …
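The answer above is cut off. A common workaround, since the table's identity column is not present in the file, is to bulk insert through a view (or a format file) that exposes only the text column. A minimal sketch, with the view and column names assumed rather than taken from the post:

    -- The view hides the identity column, so the one-column file matches.
    CREATE VIEW dbo.SwearWordsImport AS
        SELECT Word FROM dbo.SwearWords;
    GO
    BULK INSERT dbo.SwearWordsImport FROM 'c:\swears.txt'
    WITH ( ROWTERMINATOR = '\n' );  -- no FIELDTERMINATOR needed for one column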

How to read millions of rows from the text file and insert into table quickly

China☆狼群 submitted on 2019-12-01 19:38:56
I have gone through the "Insert 2 million rows into SQL Server quickly" link and found that I can do this by using bulk insert. So I am trying to create the DataTable (code below), but as this is a huge file (more than 300K rows) I am getting an OutOfMemoryException in my code:

    string line;
    DataTable data = new DataTable();
    string[] columns = null;
    bool isInserted = false;

    using (TextReader tr = new StreamReader(_fileName, Encoding.Default))
    {
        if (columns == null)
        {
            line = tr.ReadLine();
            columns = line.Split(',');
        }
        for (int iColCount = 0; iColCount < columns.Count(); iColCount++)
        {
            data …
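Rather than building one giant DataTable, a batched SqlBulkCopy keeps memory flat: fill a small table, flush it every few thousand rows, and clear it. A minimal sketch, assuming a comma-delimited file with a header row; the connection string, table name, and file path are placeholders:

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    class BulkLoader
    {
        static void Main()
        {
            const int batchSize = 10000;
            using (var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
            using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "dbo.MyTable" })
            using (var reader = new StreamReader("data.csv"))
            {
                conn.Open();
                // Header row names the columns; destination must match by order.
                var table = new DataTable();
                foreach (string col in reader.ReadLine().Split(','))
                    table.Columns.Add(col);

                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    table.Rows.Add(line.Split(','));
                    if (table.Rows.Count >= batchSize)
                    {
                        bulk.WriteToServer(table);  // flush the current batch
                        table.Clear();              // release the rows just sent
                    }
                }
                if (table.Rows.Count > 0)
                    bulk.WriteToServer(table);      // flush the remainder
            }
        }
    }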

Efficient way to bulk insert into Dbase (.dbf) files

五迷三道 submitted on 2019-12-01 18:59:49
I'm currently using OleDbCommand.ExecuteNonQuery (called repeatedly) to insert as many as 350,000 rows into dBase files (*.dbf) at a time from a source DataTable. I'm reusing an OleDbCommand object and OleDbParameters to set the values to be inserted each time the insert statement is called. Inserting 350,000 rows currently takes my program about 45 minutes. Is there a more efficient way to do this? Does something similar to the Bulk Insert option used in SQL Server exist for dBase (*.dbf) files? Fixed the problem by changing the OleDB driver from Jet to vfpoledb. This cut the total time from …
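For reference, the driver swap the poster describes amounts to changing the connection string's provider to the Visual FoxPro OLE DB provider. A minimal sketch; the folder path, table, and columns are assumptions:

    using System.Data.OleDb;

    class DbfInsert
    {
        static void Main()
        {
            // Jet:    "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\\data;Extended Properties=dBASE IV;"
            // FoxPro: point VFPOLEDB at the folder holding the .dbf files.
            using (var conn = new OleDbConnection(@"Provider=VFPOLEDB.1;Data Source=C:\data;"))
            using (var cmd = new OleDbCommand("INSERT INTO mytable (col1, col2) VALUES (?, ?)", conn))
            {
                conn.Open();
                cmd.Parameters.Add("@p1", OleDbType.VarChar);
                cmd.Parameters.Add("@p2", OleDbType.Integer);
                // Reuse the command, setting parameter values per row, as the poster does.
                cmd.Parameters[0].Value = "example";
                cmd.Parameters[1].Value = 42;
                cmd.ExecuteNonQuery();
            }
        }
    }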

Minimally Logged Insert Into

强颜欢笑 submitted on 2019-12-01 18:07:31
I have an INSERT statement that is eating a hell of a lot of log space, so much so that the hard drive is actually filling up before the statement completes. The thing is, I really don't need this to be logged, as it is only an intermediate data-upload step. For argument's sake, let's say I have:

Table A: initial upload table (populated using bcp, so no logging problems)
Table B: populated using INSERT INTO B from A

Is there a way that I can copy between A and B without anything being written to the log? P.S. I'm using SQL Server 2008 with the simple recovery model. From Louis Davidson, Microsoft …
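The answer is truncated above. In SQL Server 2008 under the simple recovery model, an INSERT ... SELECT into a heap with a TABLOCK hint is eligible for minimal logging. A minimal sketch, with the column lists omitted for brevity:

    -- Minimal logging requires simple (or bulk-logged) recovery, a heap or
    -- empty target, and the TABLOCK hint.
    INSERT INTO B WITH (TABLOCK)
    SELECT * FROM A;
    -- If B has a clustered index and already holds rows, trace flag 610
    -- (SQL Server 2008) may extend minimal logging to that case as well.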

Entity Framework Bulk Insert Throws KeyNotFoundException error

心不动则不痛 submitted on 2019-12-01 15:36:09
I am using EF6 and, due to the low speed of the AddRange() method, I need to use BulkInsert. So I added the NuGet package of BulkInsert for EF6 via here. The first thing I received after adding the DLLs was this warning: "Found conflicts between different versions of the same dependent assembly. Please set the "AutoGenerateBindingRedirects" property to true in the project file." I made a List of all my Contact entities, named contactsToInsert, that need to be added (my contacts have a foreign key in another table, too). When I tried to run the following code I received a KeyNotFoundException that claims …
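The binding-redirect warning itself can be addressed exactly as the message says, by setting the property in the project file. A sketch for a classic-style .csproj; the surrounding project content is assumed:

    <PropertyGroup>
      <AutoGenerateBindingRedirects>true</AutoGenerateBindingRedirects>
    </PropertyGroup>

With the EntityFramework.BulkInsert package, the insert itself is typically an extension-method call of the form context.BulkInsert(contactsToInsert).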

What is the fastest way to insert values from datagridview to mysql database

人盡茶涼 submitted on 2019-12-01 14:27:34
I have a VB.NET system and I want to insert 10,000 or more records from a DataGridView into a MySQL database. But it takes 8 minutes for 10,000 records when I tried this:

    For i As Integer = 0 To DataGridView1.Rows.Count - 1
        Dim queryInsert As String = "INSERT INTO tbl_shipdetails (ship_date, item_type, item_code, imei1, imei2)" & _
            "VALUES('" & DataGridView1.Rows(i).Cells(1).Value & "','" & DataGridView1.Rows(i).Cells(2).Value & "','" & _
            DataGridView1.Rows(i).Cells(3).Value & "','" & DataGridView1.Rows(i).Cells(4).Value & "','" & _
            DataGridView1.Rows(i).Cells(5).Value & "')"
        MySqlCmd = New MySqlCommand …
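A common first fix is to stop concatenating values into the SQL (which is also a SQL injection risk), reuse one parameterized command, and commit once inside a transaction instead of once per row. A minimal VB.NET sketch; the connection string and column order are assumptions:

    ' One transaction, one prepared command, parameter reuse per row.
    Imports System.Windows.Forms
    Imports MySql.Data.MySqlClient

    Module BulkGridInsert
        Sub InsertRows(grid As DataGridView, connStr As String)
            Using conn As New MySqlConnection(connStr)
                conn.Open()
                Using tran = conn.BeginTransaction()
                    Dim sql = "INSERT INTO tbl_shipdetails (ship_date, item_type, item_code, imei1, imei2) " &
                              "VALUES (@d, @t, @c, @i1, @i2)"
                    Using cmd As New MySqlCommand(sql, conn, tran)
                        cmd.Parameters.Add("@d", MySqlDbType.VarChar)
                        cmd.Parameters.Add("@t", MySqlDbType.VarChar)
                        cmd.Parameters.Add("@c", MySqlDbType.VarChar)
                        cmd.Parameters.Add("@i1", MySqlDbType.VarChar)
                        cmd.Parameters.Add("@i2", MySqlDbType.VarChar)
                        For i As Integer = 0 To grid.Rows.Count - 1
                            For p As Integer = 0 To 4
                                cmd.Parameters(p).Value = grid.Rows(i).Cells(p + 1).Value
                            Next
                            cmd.ExecuteNonQuery()
                        Next
                    End Using
                    tran.Commit()   ' one commit instead of 10,000
                End Using
            End Using
        End Sub
    End Module

If that is still too slow, MySqlBulkLoader (the LOAD DATA INFILE wrapper in MySql.Data) is the closer analogue to SQL Server's bulk insert.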

Bulk Insert Multiple XML files with SSIS 2008

你。 submitted on 2019-12-01 14:00:47
I have a folder with multiple XML files. I need to bulk insert each one into a table in SQL Server. I am at a complete loss as to how to get this to work, as I am new to SSIS. Currently, my SSIS package pulls the files off an FTP server and uses a command line to unzip the XML (they come as .xml.gz). This all works great, but now I'm at a loss as to how to get the files into the database, as the Bulk Insert Task only takes delimited files. Suggestions? You can accomplish this by using a ForEach Loop Container with an enumerator type of file. If the XML files are complex, you can use an XML Task.
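Inside the ForEach Loop, an Execute SQL Task can load each whole file into an XML column with OPENROWSET ... SINGLE_BLOB. A minimal sketch; the staging table is assumed, and since OPENROWSET(BULK ...) requires a literal path, the statement would be built from the loop's file-name variable via an SSIS expression:

    -- Load one complete XML file into an XML column.
    INSERT INTO dbo.XmlStaging (Doc)
    SELECT CAST(BulkColumn AS XML)
    FROM OPENROWSET(BULK 'C:\files\current.xml', SINGLE_BLOB) AS x;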

python sqlalchemy insert multiple lines in a tuple data structure

拥有回忆 submitted on 2019-12-01 11:50:23
I have been researching how to insert a list of ~500 tuples (rows), each with 7 elements (columns), into a database. I have read through various posts on Stack Overflow as well as other forums. I found the following, and it suggests using the executemany() method, but it's not so clear to me how. Do I need to convert my object from a tuple to a dictionary? The problem is I don't have a name:value type of data structure. "How to use SQLAlchemy to dump an SQL file from query expressions to bulk-insert into a DBMS?" Here is an example:

    engine = create_engine('sqlite:///:memory:', echo=True)
    metadata = …
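For tuples with no names, the usual move with SQLAlchemy Core is to zip each tuple against the column names and pass the resulting list of dicts to execute(), which performs an executemany under the hood. A minimal sketch; the table and columns are placeholders:

    from sqlalchemy import create_engine, MetaData, Table, Column, Integer, String

    engine = create_engine('sqlite:///:memory:', echo=True)
    metadata = MetaData()
    users = Table('users', metadata,
                  Column('id', Integer, primary_key=True),
                  Column('name', String),
                  Column('city', String))
    metadata.create_all(engine)

    rows = [(1, 'alice', 'nyc'), (2, 'bob', 'sfo')]   # plain tuples, no names
    keys = ('id', 'name', 'city')
    with engine.begin() as conn:
        # Passing a list of dicts triggers executemany on the DBAPI driver.
        conn.execute(users.insert(), [dict(zip(keys, r)) for r in rows])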
