SqlBulkCopy

“ERROR: extra data after last expected column” when using PostgreSQL COPY

Submitted by 别说谁变了你拦得住时间么 on 2019-12-03 13:43:59
Please bear with me as this is my first post. I'm trying to run the COPY command in PostgreSQL 9.2 to add a tab-delimited table from a .txt file to a PostgreSQL database, like so: COPY raw_data FROM '/home/Projects/TestData/raw_data.txt' WITH (DELIMITER ' '); I've already created an empty table called "raw_data" in the database using the SQL command: CREATE TABLE raw_data (); I keep getting the following error message when trying to run the COPY command: ERROR: extra data after last expected column CONTEXT: COPY raw_data, line 1: " 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24
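
Two details in the excerpt explain the error: CREATE TABLE raw_data () declares a table with zero columns, so every field in the file counts as "extra data", and the quoted delimiter needs to be an actual tab. Below is a minimal sketch of the corrected statements, shown through the Npgsql driver purely for illustration; the driver, connection string, and column layout are assumptions, not part of the original question.

```csharp
// Sketch only: Npgsql, the connection string, and the column names are assumed.
using Npgsql;

class RawDataCopy
{
    static void Main()
    {
        using var conn = new NpgsqlConnection("Host=localhost;Database=testdb;Username=postgres");
        conn.Open();

        // The target table must declare one column per field in the file;
        // CREATE TABLE raw_data () declares zero columns, which is exactly why
        // COPY reports "extra data after last expected column".
        using (var ddl = new NpgsqlCommand(
            "CREATE TABLE IF NOT EXISTS raw_data (c1 int, c2 int, c3 int)", conn))
        {
            ddl.ExecuteNonQuery();
        }

        // E'\t' passes a real tab character as the delimiter; a quoted space does not.
        using (var copy = new NpgsqlCommand(
            @"COPY raw_data FROM '/home/Projects/TestData/raw_data.txt' WITH (DELIMITER E'\t')", conn))
        {
            copy.ExecuteNonQuery();
        }
    }
}
```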

How to add CsvHelper records to DataTable to use for SqlBulkCopy to the database

Submitted by 99封情书 on 2019-12-03 12:04:50
I am trying to read a CSV file with CsvHelper, load each record into a DataTable, and then use SqlBulkCopy to insert the data into a database table. With the current code, I get an exception when adding a row to the DataTable. The exception is: "Unable to cast object of type 'MvcStockAnalysis.Models.StockPrice' to type 'System.IConvertible'. Couldn't store in Date Column. Expected type is DateTime." The example CSV file is from Yahoo Finance, for example: http://ichart.yahoo.com/table.csv?s=MMM&a=0&b=1&c=2010&d=0&e=17&f=2014&g=d&ignore=.csv The CSV file contains the following header: Date Open
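
That cast failure usually means the whole StockPrice record was stored into a single DataRow cell instead of its individual property values. Below is a minimal sketch of loading the records property by property; the StockPrice shape and column list are assumptions based on the Yahoo header, and the CsvReader constructor shown is the one current CsvHelper versions expose (it takes a CultureInfo).

```csharp
// Sketch: CsvHelper is the real library, but the StockPrice shape and columns are assumed.
using System;
using System.Data;
using System.Globalization;
using System.IO;
using CsvHelper;

class StockPrice
{
    public DateTime Date { get; set; }
    public decimal Open { get; set; }
    public decimal High { get; set; }
    public decimal Low { get; set; }
    public decimal Close { get; set; }
    public long Volume { get; set; }
}

class CsvToDataTable
{
    static DataTable Load(string path)
    {
        var table = new DataTable();
        table.Columns.Add("Date", typeof(DateTime));
        table.Columns.Add("Open", typeof(decimal));
        table.Columns.Add("High", typeof(decimal));
        table.Columns.Add("Low", typeof(decimal));
        table.Columns.Add("Close", typeof(decimal));
        table.Columns.Add("Volume", typeof(long));

        using var reader = new StreamReader(path);
        using var csv = new CsvReader(reader, CultureInfo.InvariantCulture);
        foreach (var record in csv.GetRecords<StockPrice>())
        {
            // Add the property values, not the StockPrice object itself --
            // storing the object in the DateTime column is what raises the
            // "not IConvertible / expected DateTime" exception.
            table.Rows.Add(record.Date, record.Open, record.High,
                           record.Low, record.Close, record.Volume);
        }
        return table;
    }
}
```

The resulting table can then go straight to SqlBulkCopy.WriteToServer, provided the column order or explicit ColumnMappings line up with the destination table.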

How to automatically truncate string when do bulk insert?

Submitted by 好久不见. on 2019-12-03 11:51:34
I want to insert many rows (constructed from Entity Framework objects) into SQL Server. The problem is that some string properties exceed the length of the corresponding column in the database, which causes an exception, and then none of the rows can be inserted. So I wonder if there is a way to tell SqlBulkCopy to automatically truncate any over-length values? Of course, I could check and substring each property that exceeds the limit before inserting it into a DataTable, but that would slow down the whole program. Always use a staging/load table for bulk actions. Then you can process
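
SqlBulkCopy has no truncate-on-overflow option, so the values have to be shortened before they reach the server (or loaded into a wider staging table first, as the answer above suggests). A minimal sketch of clamping strings while building the DataTable; the entity shape and column widths are assumptions.

```csharp
// Sketch: entity shape and column widths are assumed, not taken from the original code.
using System.Collections.Generic;
using System.Data;

class Customer
{
    public string Name { get; set; }
    public string Address { get; set; }
}

static class BulkLoadHelpers
{
    // Clamp a string to the destination column width.
    static string Truncate(string value, int maxLength) =>
        value != null && value.Length > maxLength ? value.Substring(0, maxLength) : value;

    public static DataTable Build(IEnumerable<Customer> customers)
    {
        var table = new DataTable();
        table.Columns.Add("Name", typeof(string));     // nvarchar(50) in the database
        table.Columns.Add("Address", typeof(string));  // nvarchar(200) in the database

        foreach (var c in customers)
            table.Rows.Add(Truncate(c.Name, 50), Truncate(c.Address, 200));

        return table;
    }
}
```

In practice the per-value Substring check is cheap compared with the network and I/O cost of the bulk copy itself.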

The given ColumnMapping does not match up with any column in the source or destination

Submitted by ﹥>﹥吖頭↗ on 2019-12-03 10:07:42
I don't know why I am getting the above exception; could someone please take a look? DataTable DataTable_Time = new DataTable("Star_Schema__Dimension_Time"); DataColumn Sowing_Day = new DataColumn(); Sowing_Day.ColumnName = "Sowing_Day"; DataColumn Sowing_Month = new DataColumn(); Sowing_Month.ColumnName = "Sowing_Month"; DataColumn Sowing_Year = new DataColumn(); Sowing_Year.ColumnName = "Sowing_Year"; DataColumn Visit_Day = new DataColumn(); Visit_Day.ColumnName = "Visit_Day"; DataColumn Visit_Month = new DataColumn(); Visit_Month.ColumnName = "Visit_Month"; DataColumn Visit_Year = new DataColumn(
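
For reference, this exception is thrown when a name added to SqlBulkCopy.ColumnMappings does not exactly match a column in the source DataTable or in the destination table; a case mismatch is the most commonly reported cause. A minimal sketch of mapping every source column explicitly; the destination table is assumed to use the same column names as the DataTable above.

```csharp
// Sketch: the destination table and column names are assumed to mirror the DataTable.
using System.Data;
using System.Data.SqlClient;

class TimeDimensionLoader
{
    static void Write(DataTable dataTableTime, string connectionString)
    {
        using var bulk = new SqlBulkCopy(connectionString)
        {
            DestinationTableName = "Star_Schema__Dimension_Time"
        };

        // Every mapping must name a real column on BOTH sides, with matching case.
        foreach (DataColumn column in dataTableTime.Columns)
            bulk.ColumnMappings.Add(column.ColumnName, column.ColumnName);

        bulk.WriteToServer(dataTableTime);
    }
}
```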

SqlBulkCopy performance

Submitted by 人走茶凉 on 2019-12-03 07:10:17
I am working to increase the performance of bulk loads: hundreds of millions of records and more, daily. I moved this over to use the IDataReader interface instead of DataTables and got a noticeable performance boost (500,000 more records a minute). The current setup is: a custom cached reader to parse the delimited files; the stream reader wrapped in a buffered stream; a custom object reader class that enumerates over the objects and implements the IDataReader interface; then SqlBulkCopy writes to the server. The bulk of the performance bottleneck is directly in SqlBulkCopy.WriteToServer. If I
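
For readers hitting the same wall: once the reader side streams, most of the remaining WriteToServer headroom comes from the copy options themselves: a table lock (which, recovery model permitting, allows minimally logged inserts into a heap), a sensible BatchSize, EnableStreaming, and an index-free staging table as the destination. A minimal sketch of that configuration; the table name and batch size are assumptions.

```csharp
// Sketch: connection string, table name, and the custom IDataReader are assumed.
using System.Data;
using System.Data.SqlClient;

class BulkLoader
{
    static void Load(IDataReader source, string connectionString)
    {
        using var bulk = new SqlBulkCopy(
            connectionString,
            SqlBulkCopyOptions.TableLock)   // bulk-update lock; helps enable minimal logging on a heap
        {
            DestinationTableName = "dbo.StagingTable",
            BatchSize = 100_000,            // commit in chunks rather than one huge batch
            BulkCopyTimeout = 0,            // no timeout for very large loads
            EnableStreaming = true          // stream from the IDataReader instead of buffering
        };

        bulk.WriteToServer(source);
    }
}
```

A common next step beyond this is running several WriteToServer calls in parallel against separate staging tables and merging afterwards.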

Real-time unidirectional synchronization from sql-server to another data repository

Submitted by 二次信任 on 2019-12-02 09:38:41
In my previous question on this portal, I asked for some insight into syncing data between SQL Server and key-value-based data repositories. For the same problem (one-way, real-time synchronization from SQL to HBase or any other database), I need to take care of some performance and latency considerations and have not found a very foolproof way of doing it. We have multiple SQL 2008 data shards where data is updated from various sources and processed by many processes at the same time (and the UI reads from the same shards). The goal is to get all updates in selected tables at any point

SqlBulkCopy.WriteToServer() keep getting “connection is closed”

Submitted by 北战南征 on 2019-12-02 03:22:44
Question: This makes no sense to me, but maybe someone with keener eyes can spot the problem. I have a Windows service that uses FileSystemWatcher. It processes some files and uploads data to an MSSQL database. It works totally fine on my machine -- detached from Visual Studio (i.e. not debugging) and running as a service. If I copy this compiled code to our server, and have it point to the same database, and even the same files (!), I get this error every single time: System.InvalidOperationException:
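
Whatever the environment difference turns out to be, the exception itself means WriteToServer found its SqlConnection closed: when SqlBulkCopy is constructed over an existing connection, that connection must be opened before the copy and stay open until it finishes. A minimal sketch of both safe patterns; the table name and data source are placeholders, not taken from the original service.

```csharp
// Sketch: connection string, table name, and data are placeholders.
using System.Data;
using System.Data.SqlClient;

class UploadJob
{
    static void Upload(DataTable rows, string connectionString)
    {
        // Pattern 1: let SqlBulkCopy own the connection (it opens and closes it itself).
        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.ProcessedFiles";
            bulk.WriteToServer(rows);
        }

        // Pattern 2: pass an existing connection, but open it before WriteToServer
        // and keep it open until the copy finishes.
        using (var connection = new SqlConnection(connectionString))
        using (var bulk = new SqlBulkCopy(connection))
        {
            connection.Open();
            bulk.DestinationTableName = "dbo.ProcessedFiles";
            bulk.WriteToServer(rows);
        }
    }
}
```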

sqlbulkcopy from Excel via ACE.OLEDB truncates text to 255 chars

Submitted by 你说的曾经没有我的故事 on 2019-12-02 02:54:39
Pretty straightforward import using SqlBulkCopy: string excelConnectionString = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + filePath + ";Extended Properties=\"Excel 12.0 Xml;HDR=YES;IMEX=1;\""; using (OleDbConnection excelConnection = new OleDbConnection(excelConnectionString)) { excelConnection.Open(); OleDbCommand cmd = new OleDbCommand("Select " + fileID.ToString() + " as [FileID], * from [Sheet1$] where [Text] IS NOT NULL", excelConnection); OleDbDataReader dReader = cmd.ExecuteReader(); using (SqlBulkCopy sqlBulk = new SqlBulkCopy(ConfigurationManager.ConnectionStrings[
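
The 255-character limit comes from the ACE driver's type guessing rather than from SqlBulkCopy: it samples only the first few rows (the TypeGuessRows setting) and, if no longer value appears in that sample, types the column as a 255-character string. A short sketch of the usual workaround; the registry path varies by Office version, so treat it as an assumption to verify for your install.

```csharp
// Sketch: the file path is a placeholder; the fix itself is configuration, not code.
class ExcelImportNotes
{
    static string BuildConnectionString(string filePath) =>
        // IMEX=1 keeps mixed-type columns as text. On top of that, set the
        // TypeGuessRows registry value to 0 so ACE scans a much larger sample of
        // rows before deciding whether a column is a 255-char string or a memo
        // column. The key lives under a version-dependent path, e.g.:
        // HKLM\SOFTWARE\Microsoft\Office\14.0\Access Connectivity Engine\Engines\Excel
        "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + filePath +
        ";Extended Properties=\"Excel 12.0 Xml;HDR=YES;IMEX=1;\"";
}
```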

Fire trigger for every inserted row using SqlBulkCopy

Submitted by 余生长醉 on 2019-12-01 21:33:54
Question: I am using the SqlBulkCopy class to insert 50k rows at a time into the table tbl_records. I have set an AFTER INSERT trigger on this table and am using the following code: SqlBulkCopy SqlBc1 = new SqlBulkCopy(strConnString, SqlBulkCopyOptions.FireTriggers); // Set DataReader For SqlBulkCopy sqlComm = new SqlCommand(strQuery, sqlTemCon); sqlComm.CommandTimeout = 3600000; sqlComm.CommandType = System.Data.CommandType.Text; SqlDataReader dReader = sqlComm.ExecuteReader(); SqlBc1.WriteToServer(dReader); But after
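
One behaviour worth knowing before debugging further: with SqlBulkCopyOptions.FireTriggers the AFTER INSERT trigger fires once per batch, not once per row, and it sees all of that batch's rows in the inserted pseudo-table, so a trigger written for single-row inserts will appear to miss data. A minimal sketch of the client side with that caveat spelled out; the batch size and object names are assumptions.

```csharp
// Sketch: connection strings, query, and table name are placeholders.
using System.Data;
using System.Data.SqlClient;

class TriggeredBulkInsert
{
    static void Copy(string sourceQuery, string sourceConn, string destConn)
    {
        using var src = new SqlConnection(sourceConn);
        src.Open();
        using var cmd = new SqlCommand(sourceQuery, src) { CommandTimeout = 3600 };
        using IDataReader reader = cmd.ExecuteReader();

        using var bulk = new SqlBulkCopy(destConn, SqlBulkCopyOptions.FireTriggers)
        {
            DestinationTableName = "dbo.tbl_records",
            // The AFTER INSERT trigger runs once per batch; it must process every
            // row in the "inserted" pseudo-table (set-based), not a single row.
            BatchSize = 10_000
        };
        bulk.WriteToServer(reader);
    }
}
```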

Select into statement where source is other database

Submitted by 蓝咒 on 2019-12-01 17:23:49
How do I copy data from one DB into another DB with the same table structure and keep the key identities? I use SQL Server 2012 "Denali" and I want to copy some data from a SQL Server 2008 DB. The tables I have are exactly the same, but I want the data from the old DB in the new "Denali" DB. The databases are on different servers. So I want something like USE newDB; GO SELECT * INTO newTable FROM OldDb.oldTable WITH (KEEPIDENTITY); GO Does anyone have a suggestion to make this workable? Configure a linked server and reference it in your query. You may need to use IDENTITY_INSERT as well. The SSIS
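
Following the linked-server suggestion, here is a sketch of the copy driven from the Denali side. The destination table is assumed to already exist with the same schema (including the IDENTITY column), and the linked server name, columns, and connection string are placeholders. SELECT ... INTO does not carry over the IDENTITY property when the source is a remote data source, which is why IDENTITY_INSERT is used instead.

```csharp
// Sketch: linked server name, database, table, and column list are assumptions.
using System.Data.SqlClient;

class CrossServerCopy
{
    static void Copy(string denaliConnectionString)
    {
        // IDENTITY_INSERT requires an explicit column list in the INSERT statement.
        const string sql = @"
            SET IDENTITY_INSERT dbo.newTable ON;

            INSERT INTO dbo.newTable (Id, Name, CreatedOn)
            SELECT Id, Name, CreatedOn
            FROM   [OLD2008SERVER].[OldDb].[dbo].[oldTable];

            SET IDENTITY_INSERT dbo.newTable OFF;";

        using var conn = new SqlConnection(denaliConnectionString);
        conn.Open();
        using var cmd = new SqlCommand(sql, conn) { CommandTimeout = 0 };
        cmd.ExecuteNonQuery();
    }
}
```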