SqlBulkCopy

SqlBulkCopy using SQL CE

一曲冷凌霜 posted on 2019-11-27 04:25:44
Question: Is it possible to use SqlBulkCopy with SQL Server Compact Edition (*.sdf) files? I know it works with SQL Server 2000 and up, but I wanted to check CE compatibility. If it doesn't work, does anyone know the fastest way of getting a CSV-type file into SQL Server CE without using DataSets (puke here)? Answer 1: BULKCOPY is not supported in SQL CE. Here is the fastest way if you have a huge number of rows in your table; INSERT is too slow! using (SqlCeConnection cn = new SqlCeConnection(yourConnectionString)) { if (cn.State == ConnectionState.Closed) cn.Open(); using (SqlCeCommand cmd = new SqlCeCommand()) { cmd…
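The answer's code is cut off above; for reference, a complete sketch of the table-direct approach it starts, assuming the SQL Server Compact runtime (System.Data.SqlServerCe) is referenced and that each input row's values line up with the table's column ordinals:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlServerCe; // requires the SQL Server Compact runtime

static void BulkInsert(string connectionString, string tableName, IEnumerable<object[]> rows)
{
    using (SqlCeConnection cn = new SqlCeConnection(connectionString))
    {
        if (cn.State == ConnectionState.Closed) cn.Open();
        using (SqlCeCommand cmd = new SqlCeCommand())
        {
            cmd.Connection = cn;
            cmd.CommandText = tableName;
            cmd.CommandType = CommandType.TableDirect; // open the table directly, bypassing the query processor
            using (SqlCeResultSet rs = cmd.ExecuteResultSet(ResultSetOptions.Updatable))
            {
                foreach (object[] row in rows)
                {
                    SqlCeUpdatableRecord rec = rs.CreateRecord();
                    for (int i = 0; i < row.Length; i++)
                        rec.SetValue(i, row[i]); // ordinal i must match the table's column order
                    rs.Insert(rec);
                }
            }
        }
    }
}
```

Because TableDirect skips SQL parsing entirely, this is typically far faster in CE than issuing one INSERT per row.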

SqlBulkCopy - The given value of type String from the data source cannot be converted to type money of the specified target column

老子叫甜甜 posted on 2019-11-26 22:22:01
Question: I'm getting this exception when trying to do a SqlBulkCopy from a DataTable. Error message: "The given value of type String from the data source cannot be converted to type money of the specified target column." Target site: System.Object ConvertValue(System.Object, System.Data.SqlClient._SqlMetaData, Boolean, Boolean ByRef, Boolean ByRef). I understand what the error is saying, but how can I get more information, such as the row/field this is happening on? The DataTable is populated by a 3rd…
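One way to recover the missing row/field detail is to pre-validate the DataTable before calling WriteToServer, since SqlBulkCopy's own exception does not say which row failed. A minimal sketch (the column name passed in is whatever your money-typed destination column maps from; `FindBadMoneyValue` is a hypothetical helper, not part of any library):

```csharp
using System;
using System.Data;
using System.Globalization;

// Returns a description of the first row whose value cannot be converted
// to decimal (the CLR type behind SQL money), or null if all rows are clean.
static string FindBadMoneyValue(DataTable table, string moneyColumn)
{
    foreach (DataRow row in table.Rows)
    {
        object v = row[moneyColumn];
        if (v == DBNull.Value) continue;
        decimal ignored;
        if (!decimal.TryParse(Convert.ToString(v, CultureInfo.InvariantCulture),
                              NumberStyles.Number, CultureInfo.InvariantCulture, out ignored))
            return string.Format("Row {0}: column '{1}' has unconvertible value '{2}'",
                                 table.Rows.IndexOf(row), moneyColumn, v);
    }
    return null;
}
```

Run this once before the bulk copy and log the result; it is O(n) but trivially cheap next to the copy itself.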

Bulk insert is not working properly in Azure SQL Server

本秂侑毒 posted on 2019-11-26 21:55:40
Question: I'm not able to insert a bulk amount of data into an Azure SQL Server DB using a C# Web API. Say I want to insert more than 60K rows. On my local SQL Server there is no problem, but on Azure SQL the connection times out. My approaches (all work on local SQL Server but not on Azure SQL Server): 1) Tried EF, which inserts records one by one (approx. 10 min for 10,000, mostly timing out). 2) Tried a bulk insert extension along with EF. 3) Tried SqlBulkCopy. 4) Tried increasing…
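For what it's worth, timeouts against Azure SQL with SqlBulkCopy often come from the default 30-second BulkCopyTimeout rather than the insert path itself. A hedged sketch of the relevant settings (destination table name is a placeholder):

```csharp
using System.Data;
using System.Data.SqlClient;

static void BulkInsertToAzure(string connectionString, DataTable dataTable)
{
    using (var bulk = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock))
    {
        bulk.DestinationTableName = "dbo.TargetTable"; // placeholder name
        bulk.BulkCopyTimeout = 0;    // 0 = no timeout; the default is 30 seconds
        bulk.BatchSize = 5000;       // commit in chunks instead of one huge transaction
        bulk.EnableStreaming = true; // stream rather than buffer the whole source
        bulk.WriteToServer(dataTable);
    }
}
```

Smaller batches also play better with Azure SQL's throttling, since each batch is an independently committed unit the service can pace.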

Bulk inserts taking longer than expected using Dapper

人盡茶涼 posted on 2019-11-26 21:41:49
After reading this article I decided to take a closer look at the way I was using Dapper. I ran this code on an empty database: var members = new List<Member>(); for (int i = 0; i < 50000; i++) { members.Add(new Member() { Username = i.ToString(), IsActive = true }); } using (var scope = new TransactionScope()) { connection.Execute(@"insert Member(Username, IsActive) values(@Username, @IsActive)", members); scope.Complete(); } It took about 20 seconds. That's 2,500 inserts/second. Not bad, but not great either, considering the blog was achieving 45k inserts/second. Is there a more efficient way…
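Dapper's list overload above still issues one parameterized INSERT per element; for raw throughput the usual move is to hand the rows to SqlBulkCopy instead. A sketch under that assumption (the Member shape mirrors the question; table and connection names are placeholders):

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

class Member { public string Username; public bool IsActive; }

// Flatten the list into a DataTable whose columns match the Member table.
static DataTable ToDataTable(IEnumerable<Member> members)
{
    var table = new DataTable();
    table.Columns.Add("Username", typeof(string));
    table.Columns.Add("IsActive", typeof(bool));
    foreach (var m in members)
        table.Rows.Add(m.Username, m.IsActive);
    return table;
}

static void BulkInsertMembers(string connectionString, DataTable table)
{
    using (var bulk = new SqlBulkCopy(connectionString))
    {
        bulk.DestinationTableName = "Member";
        bulk.BatchSize = 5000;
        bulk.WriteToServer(table); // one bulk-load stream instead of 50k round trips
    }
}
```

The blog-level numbers (tens of thousands of rows/second) generally come from this bulk-load path, not from batching INSERT statements.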

How to prevent duplicate records being inserted with SqlBulkCopy when there is no primary key

无人久伴 posted on 2019-11-26 20:54:23
Question: I receive a daily XML file that contains thousands of records, each a business transaction that I need to store in an internal database for use in reporting and billing. I was under the impression that each day's file contained only unique records, but I have discovered that my definition of "unique" is not exactly the same as the provider's. The current application that imports this data is a C# .NET 3.5 console application; it does so using SqlBulkCopy into a MS SQL Server 2008 database…
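A common pattern for this shape of problem is to keep SqlBulkCopy but aim it at a staging table, then let the server filter duplicates in one set-based statement. A hedged sketch, where all table and column names (Transactions, Transactions_Staging, TxnId, Amount, TxnDate) are hypothetical and the WHERE clause should compare whatever columns define "unique" for your feed:

```csharp
using System.Data;
using System.Data.SqlClient;

static void ImportWithoutDuplicates(string connectionString, DataTable batch)
{
    using (var cn = new SqlConnection(connectionString))
    {
        cn.Open();
        // 1) Bulk copy the day's file into an empty staging table of the same shape.
        using (var bulk = new SqlBulkCopy(cn))
        {
            bulk.DestinationTableName = "dbo.Transactions_Staging";
            bulk.WriteToServer(batch);
        }
        // 2) Move only rows not already present, then clear the staging table.
        const string merge = @"
            INSERT INTO dbo.Transactions (TxnId, Amount, TxnDate)
            SELECT s.TxnId, s.Amount, s.TxnDate
            FROM dbo.Transactions_Staging s
            WHERE NOT EXISTS (SELECT 1 FROM dbo.Transactions t
                              WHERE t.TxnId = s.TxnId AND t.TxnDate = s.TxnDate);
            TRUNCATE TABLE dbo.Transactions_Staging;";
        using (var cmd = new SqlCommand(merge, cn))
            cmd.ExecuteNonQuery();
    }
}
```

This keeps the fast bulk-load path and avoids needing a primary key on the destination, since the duplicate test is an arbitrary column comparison.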

Get an IDataReader from a typed List

烈酒焚心 posted on 2019-11-26 19:53:16
I have a List<MyObject> with a million elements. (It is actually a SubSonic collection, but it is not loaded from the database.) I'm currently using SqlBulkCopy as follows: private string FastInsertCollection(string tableName, DataTable tableData) { string sqlConn = ConfigurationManager.ConnectionStrings[SubSonicConfig.DefaultDataProvider.ConnectionStringName].ConnectionString; using (SqlBulkCopy s = new SqlBulkCopy(sqlConn, SqlBulkCopyOptions.TableLock)) { s.DestinationTableName = tableName; s.BatchSize = 5000; s.BulkCopyTimeout = SprocTimeout; // must be set before WriteToServer to take effect s.WriteToServer(tableData); s.Close(); } return…
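To skip the intermediate DataTable entirely, note that SqlBulkCopy.WriteToServer also accepts an IDataReader. Writing one by hand means implementing a couple of dozen IDataRecord members, so in practice this is usually delegated to the FastMember NuGet package, whose ObjectReader wraps any IEnumerable<T>. A sketch under that assumption:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using FastMember; // NuGet package: FastMember

static void FastInsertList<T>(string connectionString, string tableName, IEnumerable<T> items)
{
    using (var bulk = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock))
    // ObjectReader exposes the list as a forward-only IDataReader, so the
    // million elements are streamed instead of copied into a DataTable first.
    using (var reader = ObjectReader.Create(items)) // member names can be passed to restrict/order columns
    {
        bulk.DestinationTableName = tableName;
        bulk.BatchSize = 5000;
        bulk.WriteToServer(reader);
    }
}
```

This roughly halves peak memory for large lists, since only one row is materialized at a time on the managed side.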

SqlBulkCopy Insert with Identity Column

牧云@^-^@ posted on 2019-11-26 16:24:42
I am using the SqlBulkCopy object to insert a couple million generated rows into a database. The only problem is that the table I am inserting into has an identity column. I have tried setting the SqlBulkCopyOptions to SqlBulkCopyOptions.KeepIdentity and setting the identity column to 0s, DbNull.Value, and null. None of these have worked. I feel like I am missing something pretty simple; if someone could enlighten me, that would be fantastic. Thanks! Edit: To clarify, I do not have the identity values set in the DataTable I am importing. I want them to be generated as part of the import. Edit 2…
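To have the server generate identity values, the usual fix is the opposite of KeepIdentity: omit that option, leave the identity column out of the source DataTable altogether, and map only the remaining columns explicitly. A sketch under that assumption (table name is a placeholder):

```csharp
using System.Data;
using System.Data.SqlClient;

// Assumes dataTable contains only the non-identity columns.
static void BulkInsertGeneratingIdentity(string connectionString, DataTable dataTable)
{
    // No SqlBulkCopyOptions.KeepIdentity: the server assigns identity values itself.
    using (var bulk = new SqlBulkCopy(connectionString))
    {
        bulk.DestinationTableName = "dbo.TargetTable"; // placeholder
        // Explicit mappings for the source columns; the destination's identity
        // column stays unmapped, so SQL Server generates it during the import.
        foreach (DataColumn col in dataTable.Columns)
            bulk.ColumnMappings.Add(col.ColumnName, col.ColumnName);
        bulk.WriteToServer(dataTable);
    }
}
```

Without explicit mappings SqlBulkCopy maps by ordinal, which is why a DataTable that is "one column short" of the destination otherwise misaligns the data.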

Mapping columns in a DataTable to a SQL table with SqlBulkCopy

こ雲淡風輕ζ posted on 2019-11-26 11:09:25
Question: I would like to know how I can map columns in a database table to the DataTable in C# before adding the data to the database. using (SqlBulkCopy s = new SqlBulkCopy(conn)) { s.DestinationTableName = destination; s.WriteToServer(Ads_api_ReportData); } Answer 1: You probably need something like public void BatchBulkCopy(DataTable dataTable, string DestinationTbl, int batchSize) { // Get the DataTable DataTable dtInsertRows = dataTable; using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString,…
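The key piece the truncated answer is heading toward is SqlBulkCopy.ColumnMappings, which pairs source DataTable columns with destination table columns by name (or ordinal). A sketch against the question's own snippet, with hypothetical column names:

```csharp
using System.Data;
using System.Data.SqlClient;

static void BulkCopyWithMappings(SqlConnection conn, DataTable source, string destination)
{
    using (SqlBulkCopy s = new SqlBulkCopy(conn))
    {
        s.DestinationTableName = destination;
        // Explicit source -> destination pairs; without any mappings,
        // SqlBulkCopy matches columns by ordinal position, which silently
        // misfiles data when the column orders differ.
        s.ColumnMappings.Add("AdId", "ad_id");                // names are placeholders
        s.ColumnMappings.Add("Impressions", "impression_count");
        s.WriteToServer(source);
    }
}
```

Once any mapping is added, every column you want copied must be mapped; unmapped source columns are simply ignored.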
