bulkinsert

GtkTreeView insert/update performance penalty because of sorting

空扰寡人 posted on 2019-12-07 02:07:26
Question: I'm having performance problems when inserting many rows into a GTK treeview (using PyGTK), or when modifying many rows. The problem is that the model seems to get re-sorted after each change (insert/modification). This causes the GUI to hang for multiple seconds. Leaving the model unsorted by commenting out model.set_sort_column_id(SOME_ROW_INDEX, gtk.SORT_ASCENDING) eliminates these problems. Therefore, I would like to disable the sorting while I'm inserting into or modifying the model, and re…
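The usual remedy is to switch the model to GTK's "unsorted" pseudo-column (and optionally detach it from the view) for the duration of the bulk operation, then restore the sort once at the end. A minimal PyGTK sketch, assuming a sorted ListStore named model attached to treeview:

```python
import gtk

def bulk_insert(treeview, model, rows):
    # Remember the current sort settings so they can be restored afterwards
    sort_col, sort_order = model.get_sort_column_id()

    # Detach the model and disable sorting: each append is now cheap
    treeview.set_model(None)
    model.set_sort_column_id(gtk.TREE_SORTABLE_UNSORTED_SORT_COLUMN_ID,
                             gtk.SORT_ASCENDING)

    for row in rows:
        model.append(row)

    # One re-sort at the end instead of one per inserted row
    if sort_col is not None:
        model.set_sort_column_id(sort_col, sort_order)
    treeview.set_model(model)
```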

How to insert an array of objects (bulk-insert) into neo4j with bolt protocol (javascript)

杀马特。学长 韩版系。学妹 posted on 2019-12-07 02:05:36
Question: 1. Send an HTTP POST with an array of objects to the server: [{id:1, title: 'one'}, {id:2, title: 'two'}]. 2. Receive the post on the server and bulk insert into neo4j with bolt: let data = req.body; //set up bolt let db = require('neo4j-driver').v1; let driver = db.driver('bolt://localhost', db.auth.basic('neo4j', 'neo4j')); let session = driver.session(); 3. Set up statements for execution: // start transaction for (var i = 0; i < data.length; i++) { //add CREATE statements to bolt session ??? "CREATE (r:Record {id:1, …
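Rather than building one CREATE statement per element, Cypher can take the whole array as a parameter and UNWIND it, so the entire batch runs as a single statement in one round trip. A sketch against the driver version shown in the question (the credentials are placeholders):

```javascript
// Bulk-create nodes in one round trip with UNWIND
const db = require('neo4j-driver').v1;
const driver = db.driver('bolt://localhost', db.auth.basic('neo4j', 'neo4j'));
const session = driver.session();

function bulkInsert(records) {
  // UNWIND expands the array parameter into one row per element,
  // and CREATE runs once per row, all inside a single transaction
  return session
    .run('UNWIND $records AS r CREATE (:Record {id: r.id, title: r.title})',
         { records })
    .then(result => {
      session.close();
      driver.close();
      return result.summary.counters;
    });
}

bulkInsert([{ id: 1, title: 'one' }, { id: 2, title: 'two' }]);
```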

bulk insert with or without index

半城伤御伤魂 posted on 2019-12-07 00:58:57
Question: In a comment I read: "Just as a side note, it's sometimes faster to drop the indices of your table and recreate them after the bulk insert operation." Is this true? Under which circumstances? Answer 1: As with Joel, I will echo the statement that yes, it can be true. I've found that the key to identifying the scenario he mentioned is all in the distribution of the data and the size of the index(es) you have on the specific table. In an application that I used to support that did a regular bulk…
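The pattern in question looks like the sketch below (table, index, and file names are placeholders). The potential win comes from building the index once over the loaded data instead of maintaining the B-tree row by row during the load:

```sql
-- Drop the index, load, then rebuild in one pass
DROP INDEX IX_MyTable_Col1 ON MyTable;

BULK INSERT MyTable
FROM 'C:\data\big_load.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

CREATE NONCLUSTERED INDEX IX_MyTable_Col1 ON MyTable (Col1);
```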

MySQL LOAD DATA INFILE Data too long for column exception

a 夏天 posted on 2019-12-06 16:41:55
I'm using MySQL's LOAD DATA INFILE command to bulk insert data into a table. Here's how I am doing it: LOAD DATA INFILE 'MyFile.csv' INTO TABLE `dbname`.`tablename` FIELDS TERMINATED BY '\t' ENCLOSED BY '"' LINES TERMINATED BY '\r\n' ; When I run it from our C# project I get a "Data too long for column xxx" exception for a char(50) column whose provided data is shorter than 50 characters (but it is in Persian), yet when I use a MySQL client such as SQLyog it works fine. Here's how I am running this command: private static void RunCommand(string command, params object[] args) { if (args !…
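Persian text is multi-byte, so if the connection or statement assumes a single-byte charset such as latin1, each character is counted more than once and the char(50) limit overflows. One hedged fix is to declare the file's encoding explicitly in the statement, assuming the file is UTF-8:

```sql
LOAD DATA INFILE 'MyFile.csv'
INTO TABLE `dbname`.`tablename`
CHARACTER SET utf8          -- match the file's actual encoding
FIELDS TERMINATED BY '\t' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
```

Adding CharSet=utf8 to the C# connection string may also be needed, since GUI clients like SQLyog often negotiate the connection charset for you.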

bulk insert and update with ADO.NET Entity Framework

徘徊边缘 posted on 2019-12-06 15:14:41
I am writing a small application that does a lot of feed processing. I want to use LINQ to Entities for this, as speed is not an issue: it is a single-user app and, in the end, will only be used once a month. My question revolves around the best way to do bulk inserts using LINQ to Entities. After parsing the incoming data stream I end up with a List of values. Since the end user may end up trying to import some duplicate data, I would like to "clean" the data during insert rather than reading all the records, doing a for loop, rejecting records, then finally importing the remainder. This is what I am currently…
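One way to "clean during insert" without a per-record query is to fetch the already-stored keys in a single set-based lookup and skip matches while adding. A sketch with hypothetical FeedContext/FeedItem/ExternalId names, assuming the DbContext-style EF API:

```csharp
using System.Collections.Generic;
using System.Linq;

public static class FeedImporter
{
    public static void ImportFeed(List<FeedItem> incoming)
    {
        using (var context = new FeedContext())
        {
            var keys = incoming.Select(i => i.ExternalId).ToList();

            // One round trip: which incoming keys already exist?
            var existing = new HashSet<string>(
                context.FeedItems
                       .Where(f => keys.Contains(f.ExternalId))
                       .Select(f => f.ExternalId));

            foreach (var item in incoming)
                if (!existing.Contains(item.ExternalId))
                    context.FeedItems.Add(item);

            context.SaveChanges(); // all inserts committed in one transaction
        }
    }
}
```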

Get SCOPE_IDENTITY value when inserting bulk records for SQL TableType

穿精又带淫゛_ posted on 2019-12-06 11:54:48
Question: I have the following table structure; for convenience I am only listing the relevant columns: Table_A (Id, Name, Desc), Table_1 (Id, an identity column, Name, ...), Table_2 (Id, an identity column, Table_A_Id, Table_1_Id). The relationship between Table_1 and Table_2 is 1...*. Now I have created a table type for Table_A called TType_Table_A (which only contains Id as a column; from my C# app I send multiple records). I have achieved this bulk insert functionality as desired.
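SCOPE_IDENTITY() only returns the last identity generated, so for a set-based insert the OUTPUT clause is the usual tool: it captures every generated Id alongside the inserted values. A T-SQL sketch with placeholder column and parameter names:

```sql
-- Collect all generated identities, not just the last one
DECLARE @NewIds TABLE (Table_1_Id INT, Name NVARCHAR(100));

INSERT INTO Table_1 (Name)
OUTPUT inserted.Id, inserted.Name INTO @NewIds
SELECT Name
FROM @TType_Table_A_Param;   -- the table-valued parameter sent from C#

-- @NewIds now maps each new row to its identity, ready for the
-- follow-up insert into Table_2
SELECT Table_1_Id, Name FROM @NewIds;
```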

Why does this oracle bulk insert not work?

半世苍凉 posted on 2019-12-06 10:04:53
Question: I am trying to bulk insert some data into an Oracle DB. I followed the example in the documentation: this.DataBaseAccess = new OracleConnection(connString); var dataAdapter = new OracleDataAdapter(); var insertCmd = DataBaseAccess.CreateCommand(); insertCmd.CommandType = CommandType.Text; insertCmd.BindByName = true; var names = new List<string>(); foreach (DataTable table in product.Contracts.Tables) { foreach (DataRow row in table.Rows) { names.Add(row["Contract"].ToString()); } const…
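For reference, ODP.NET's bulk path here is array binding: bind whole arrays as parameter values and set ArrayBindCount, so the INSERT statement executes once for N rows. A sketch with placeholder table and column names:

```csharp
using Oracle.DataAccess.Client;

static class ContractLoader
{
    public static void InsertContracts(string connString, string[] names)
    {
        using (var conn = new OracleConnection(connString))
        using (var cmd = conn.CreateCommand())
        {
            conn.Open();
            cmd.CommandText = "INSERT INTO contracts (name) VALUES (:name)";
            cmd.BindByName = true;
            cmd.ArrayBindCount = names.Length;   // one round trip, N rows

            cmd.Parameters.Add(new OracleParameter("name", OracleDbType.Varchar2)
            {
                Value = names                    // the whole array, not a scalar
            });

            cmd.ExecuteNonQuery();
        }
    }
}
```

A common mistake with this API is leaving ArrayBindCount at its default of 0, in which case the bound arrays are not treated as bulk rows.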

Cannot bulk load because the file could not be opened. Operating system error code 1326(Logon failure: unknown user name or bad password.)

两盒软妹~` posted on 2019-12-06 09:34:17
Bulk upload from the CSV test file "\\servername\wwwroot\Upload\LDSAgentsMap.txt": SET QUOTED_IDENTIFIER ON SET ANSI_NULLS ON GO CREATE PROCEDURE [dbo].[sp_CSVTest_BulkInsert] ( @Path NVARCHAR(128) ) AS DECLARE @Sql NVARCHAR(256) SET @Sql = 'BULK INSERT CSVTest FROM ''' + @Path + ''' WITH ( FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'' )' --PRINT @Sql EXEC(@Sql) GO The path is "\\servername\wwwroot\Upload\LDSAgentsMap.txt". Note this is on shared hosting and the database user has the bulkadmin and public server roles. This can occur when the Windows user account that SQL Server runs under (e.g. SqlServerAccount) doesn…
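Error 1326 is an authentication failure at the file share, not a SQL permission problem: BULK INSERT opens the file as the SQL Server service account, so bulkadmin on the database side is not enough; the share itself must grant that account read access. A hypothetical call once access is granted:

```sql
-- The service account running SQL Server must be able to read this share
EXEC [dbo].[sp_CSVTest_BulkInsert]
     @Path = N'\\servername\wwwroot\Upload\LDSAgentsMap.txt';
```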

Bulk Insert Failed “Bulk load data conversion error (truncation)”

£可爱£侵袭症+ posted on 2019-12-06 08:54:30
I've done data imports with SQL Server's BULK INSERT task hundreds of times, but this time I'm receiving an error that's unfamiliar and that I've tried troubleshooting, to no avail, with Google. Below is the code I use with a comma-delimited file where the new rows are indicated by newline characters: BULK INSERT MyTable FROM 'C:\myflatfile.txt' WITH ( FIELDTERMINATOR = ',' ,ROWTERMINATOR = '/n') GO It consistently works, yet now on a simple file with a date and rate, it's failing with the error " Msg 4863, Level 16, State 1, Line 1 Bulk load data conversion error (truncation) for row…
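The likely culprit is visible in the snippet itself: '/n' (forward slash) is not a newline, so the row terminator never matches, the whole file is read as one enormous row, and the first column overflows, producing the truncation error. A corrected statement:

```sql
BULK INSERT MyTable
FROM 'C:\myflatfile.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'   -- backslash-n; try '0x0a' if '\n' still fails
);
```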

OracleBulkCopy Memory Leak(OutOfMemory Exception)

China☆狼群 posted on 2019-12-06 08:22:04
Question: Below is the code I used to bulk-copy data from a temp DataTable (dataTable) into a destination table (destTable) in an Oracle database. The dataTable has about 2 million records. using (OracleBulkCopy bulkCopy = new OracleBulkCopy(VMSDATAConnectionString)) { try { foreach (OracleBulkCopyColumnMapping columnMapping in columnMappings) bulkCopy.ColumnMappings.Add(columnMapping); bulkCopy.DestinationTableName = destTableName; //bulkCopy.BatchSize = dataTable.Rows.Count; //bulkCopy.BulkCopyTimeout = 100; int defaultSize =…
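Note that the commented-out BatchSize = dataTable.Rows.Count would send all 2 million rows as a single batch, keeping the entire copy buffer alive at once. A hedged variant that flushes in bounded batches, using properties OracleBulkCopy actually exposes:

```csharp
using System.Collections.Generic;
using System.Data;
using Oracle.DataAccess.Client;

static class BulkLoader
{
    public static void CopyInBatches(string connectionString, string destTableName,
                                     DataTable dataTable,
                                     IEnumerable<OracleBulkCopyColumnMapping> columnMappings)
    {
        using (var bulkCopy = new OracleBulkCopy(connectionString))
        {
            foreach (var mapping in columnMappings)
                bulkCopy.ColumnMappings.Add(mapping);

            bulkCopy.DestinationTableName = destTableName;
            bulkCopy.BatchSize = 50000;      // flush every 50k rows, not all 2M at once
            bulkCopy.BulkCopyTimeout = 600;  // seconds

            bulkCopy.WriteToServer(dataTable);
        }
    }
}
```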