SqlBulkCopy

Weird "OLE DB provider 'STREAM' for linked server '(null)' returned invalid data for column '[!BulkInsert].Value' error

Submitted by 白昼怎懂夜的黑 on 2021-02-07 11:24:59
Question: Software used: Windows 7 64-bit Ultimate, .NET 4, SQL Server 2008 R2. select @@version returns: Microsoft SQL Server 2008 R2 (RTM) - 10.50.1617.0 (X64) Apr 22 2011 19:23:43 Copyright (c) Microsoft Corporation Developer Edition (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1) (Hypervisor). To reproduce, assuming you have a local SQL Server 2008 R2 instance, paste the following code into LINQPad and run it as a Program. It blows up with: OLE DB provider 'STREAM' for linked server …
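The repro code from the question is cut off above. As a rough illustration only (not the original repro), the sketch below shows the SqlBulkCopy pattern the question describes; one commonly reported trigger for this particular error is streaming a value such as double.NaN into a SQL float column, so the sample deliberately includes one. The database, table, and column names are placeholders.

// Hypothetical sketch: bulk-copying a DataTable that contains double.NaN into a
// float column is one commonly reported way to hit the "OLE DB provider 'STREAM'
// ... returned invalid data" error. All names below are placeholders.
using System.Data;
using System.Data.SqlClient;

class BulkInsertSketch
{
    static void Main()
    {
        var table = new DataTable();
        table.Columns.Add("Value", typeof(double));
        table.Rows.Add(1.0);
        table.Rows.Add(double.NaN); // not representable as a SQL float

        using (var connection = new SqlConnection(@"Data Source=.;Initial Catalog=TestDb;Integrated Security=true"))
        {
            connection.Open();
            using (var bulk = new SqlBulkCopy(connection))
            {
                bulk.DestinationTableName = "dbo.BulkTarget";
                bulk.WriteToServer(table); // fails when the server rejects the streamed value
            }
        }
    }
}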

Weird "OLE DB provider 'STREAM' for linked server '(null)' returned invalid data for column '[!BulkInsert].Value' error

岁酱吖の 提交于 2021-02-07 11:24:54
问题 Software used: Windows 7 64 bit Ultimate, .Net 4, SQL Server 2008 R2. select @@version returns: Microsoft SQL Server 2008 R2 (RTM) - 10.50.1617.0 (X64) Apr 22 2011 19:23:43 Copyright (c) Microsoft Corporation Developer Edition (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1) (Hypervisor) To reproduce, and assuming you have a local sql server 2008 R2 instance, paste the following code in linqpad and run it as a Program. It blows up with: OLE DB provider 'STREAM' for linked server

Parallel Bulk Inserting with SqlBulkCopy and Azure

Submitted by 馋奶兔 on 2021-02-07 06:42:25
Question: I have an Azure app in the cloud with a SQL Azure database. I have a worker role that needs to parse and process a file (up to ~30 million rows), so I can't directly use BCP or SSIS. I'm currently using SqlBulkCopy, but this seems too slow, as I've seen load times of up to 4-5 minutes for 400k rows. I want to run my bulk inserts in parallel; however, reading through the articles on importing data in parallel and controlling lock behaviour, it says that SqlBulkCopy requires that the table …
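The excerpt is cut off, but the goal is parallelising the load. A minimal sketch, assuming the rows have already been partitioned into DataTable chunks: each chunk gets its own connection and its own SqlBulkCopy, and the default row-level locking is kept (TableLock is deliberately not requested) so the parallel streams do not block one another. The table name, batch size, and degree of parallelism are placeholders.

// Hypothetical sketch: one SqlBulkCopy per chunk, each on its own connection,
// run in parallel. TableLock is not used so concurrent streams don't serialize.
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Threading.Tasks;

static class ParallelBulkLoader
{
    public static void Load(IReadOnlyList<DataTable> chunks, string connectionString)
    {
        Parallel.ForEach(chunks, new ParallelOptions { MaxDegreeOfParallelism = 4 }, chunk =>
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (var bulk = new SqlBulkCopy(connection)) // default options: no TableLock
                {
                    bulk.DestinationTableName = "dbo.TargetTable"; // placeholder name
                    bulk.BatchSize = 10000;   // commit in batches to limit transaction size
                    bulk.BulkCopyTimeout = 0; // no timeout for large loads
                    bulk.WriteToServer(chunk);
                }
            }
        });
    }
}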

Can I use SqlBulkCopy with Azure SQL PaaS?

Submitted by 非 Y 不嫁゛ on 2021-01-28 11:40:32
Question: Can I use SqlBulkCopy with Azure SQL PaaS? I have an app that does a bulk copy to a database, and we are testing it with SQL PaaS. It appears to be failing on the SqlBulkCopy. I thought I read somewhere that this is not supported, but I do not see it in the Azure SQL documentation. Is this still a limitation? Where is that documented? I am using .NET code to do the bulk copy (not SSIS or any other tool); it is a .NET app, if it matters. Answer 1: Short answer is, yes, you can use SqlBulkCopy to insert data …
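A minimal sketch, assuming an ordinary ADO.NET connection string for an Azure SQL Database; the server, database, credentials, and table name are placeholders:

// Minimal sketch: SqlBulkCopy against Azure SQL Database works with a regular
// connection string. All names and credentials below are placeholders.
using System.Data;
using System.Data.SqlClient;

static class AzureBulkCopyExample
{
    public static void CopyTo(DataTable rows)
    {
        const string connectionString =
            "Server=tcp:yourserver.database.windows.net,1433;" +
            "Database=yourdb;User ID=youruser;Password=yourpassword;Encrypt=True;";

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var bulk = new SqlBulkCopy(connection))
            {
                bulk.DestinationTableName = "dbo.TargetTable";
                bulk.WriteToServer(rows);
            }
        }
    }
}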

SqlBulkCopy with ObjectReader - Failed to convert parameter value from a String to a Int32

Submitted by 社会主义新天地 on 2020-07-19 18:26:33
Question: I am using SqlBulkCopy (.NET) with ObjectReader (FastMember) to perform an import from an XML-based file. I have added the proper column mappings. In certain instances I get an error: Failed to convert parameter value from a String to a Int32. I'd like to understand how to: 1. trace the actual table column that failed, and 2. get the "current" record on the ObjectReader. Sample code: using (ObjectReader reader = genericReader.GetReader()) { try { sbc.WriteToServer(reader); // sbc is the SqlBulkCopy instance …
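The sample code is cut off above. Since the conversion error does not name the offending column or record, one possible diagnostic approach (a sketch, not from the question) is to buffer the source, attempt the normal bulk copy, and on failure retry record by record so the failing record and its mapped values can be logged. The wrapper below assumes FastMember's ObjectReader.Create and TypeAccessor; note that the fallback leaves previously copied records in the table, so it is only suitable for diagnosis.

// Hypothetical diagnostic sketch: bulk-copy normally, and on a conversion
// failure retry one record at a time so the failing record and the value of
// each mapped member can be printed.
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using FastMember;

static class BulkCopyDiagnostics
{
    public static void WriteWithDiagnostics<T>(
        SqlBulkCopy bulk, IEnumerable<T> items, params string[] members)
    {
        var buffered = new List<T>(items);
        try
        {
            using (var reader = ObjectReader.Create(buffered, members))
                bulk.WriteToServer(reader);
        }
        catch (InvalidOperationException)
        {
            // Retry one record at a time to locate the one that fails to convert.
            foreach (var item in buffered)
            {
                try
                {
                    using (var single = ObjectReader.Create(new[] { item }, members))
                        bulk.WriteToServer(single);
                }
                catch (InvalidOperationException ex)
                {
                    var accessor = TypeAccessor.Create(typeof(T));
                    foreach (var member in members)
                        Console.WriteLine($"{member} = {accessor[item, member]}");
                    Console.WriteLine($"Record above failed: {ex.Message}");
                    throw;
                }
            }
        }
    }
}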

Bulk C# DataTable to PostgreSQL table

Submitted by 依然范特西╮ on 2020-06-16 03:43:08
Question: I have a DataTable with thousands of records and a Postgres table with the same fields as the DataTable. Every day I want to truncate this table and fill it again with the data from the DataTable. I have looked at SQL bulk copy, but it is not available on Postgres. So, which is the most effective way? One insert per record; a multi-row insert: insert into table values (1,1),(1,2),(1,3),(2,1); select from the DataTable and insert into Postgres with LINQ? No idea... Thanks. Answer 1: PostgreSQL …
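The answer excerpt is cut off at "PostgreSQL". As a sketch of the usual PostgreSQL counterpart to SqlBulkCopy (an assumption, not necessarily what this answer goes on to recommend), Npgsql's binary COPY streams a whole DataTable in one operation and is far faster than per-row inserts. It assumes Npgsql 4 or later (where Complete() commits the import); the table name, column names, and types are placeholders.

// Hypothetical sketch using Npgsql's binary COPY for a daily truncate-and-reload.
// Table name, column names, and column types are placeholders.
using System.Data;
using Npgsql;
using NpgsqlTypes;

static class PostgresBulkLoader
{
    public static void Reload(DataTable table, string connectionString)
    {
        using (var connection = new NpgsqlConnection(connectionString))
        {
            connection.Open();

            // Empty the target table before the daily reload.
            using (var truncate = new NpgsqlCommand("TRUNCATE TABLE target_table", connection))
                truncate.ExecuteNonQuery();

            // Stream all rows in one binary COPY operation.
            using (var writer = connection.BeginBinaryImport(
                "COPY target_table (id, name) FROM STDIN (FORMAT BINARY)"))
            {
                foreach (DataRow row in table.Rows)
                {
                    writer.StartRow();
                    writer.Write((int)row["id"], NpgsqlDbType.Integer);
                    writer.Write((string)row["name"], NpgsqlDbType.Text);
                }
                writer.Complete(); // commits the import (Npgsql 4+)
            }
        }
    }
}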