bulkinsert

Copying data between Oracle schemas using SQL

戏子无情 submitted on 2019-12-04 19:09:42
Question: I'm trying to copy data from one Oracle schema ( CORE_DATA ) into another ( MY_DATA ) using an INSERT INTO (...) SQL statement. What would the SQL statement look like?

Answer 1: Prefix your table names with the schema names when logged in as a user with access to both:

    insert into MY_DATA.table_name select * from CORE_DATA.table_name;

Assuming that the tables are defined identically in both schemas, the above will copy all records from the table named table_name in CORE_DATA to the table named table_name in MY_DATA.
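If the two tables are not defined identically, a safer variant lists the columns explicitly on both sides. A minimal sketch, with hypothetical column names (id, name, created_at) standing in for the real ones:

    insert into MY_DATA.table_name (id, name, created_at)
    select id, name, created_at
    from CORE_DATA.table_name;

In most Oracle clients DML is not committed automatically, so issue a COMMIT once the copy finishes.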

Get SCOPE_IDENTITY value when inserting bulk records for SQL TableType

一笑奈何 submitted on 2019-12-04 16:57:45
I have the following table structure; for convenience I am only listing the relevant columns:

    Table_A ( Id, Name, Desc )
    Table_1 ( Id [identity column], Name, ... )
    Table_2 ( Id [identity column], Table_A_Id, Table_1_Id )

The relationship between Table_1 and Table_2 is 1...*. Now I have created a table type for Table_A called TType_Table_A (which only contains Id as a column; from my C# app I send multiple records). I have achieved this bulk insert functionality as desired. What I need is: when I insert records into Table_2 from TType_Table_A, say with the statements below, I would like to get back the identity ( Id ) values generated for each inserted row.
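SCOPE_IDENTITY() returns only the last identity value from a multi-row insert, so the usual set-based answer is the OUTPUT clause, which captures every generated Id. A minimal T-SQL sketch, assuming the columns above; the @input variable and the @Table_1_Id value are hypothetical placeholders for what the C# app would pass in:

    DECLARE @input TType_Table_A;           -- populated from the C# app in practice
    DECLARE @Table_1_Id INT = 1;            -- hypothetical parent row
    DECLARE @new_ids TABLE (Id INT, Table_A_Id INT);

    INSERT INTO Table_2 (Table_A_Id, Table_1_Id)
    OUTPUT INSERTED.Id, INSERTED.Table_A_Id INTO @new_ids (Id, Table_A_Id)
    SELECT a.Id, @Table_1_Id
    FROM @input AS a;

    -- One row per inserted record, each paired with its generated identity
    SELECT Id, Table_A_Id FROM @new_ids;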

Bulk inserts into sqlite db on the iphone

℡╲_俬逩灬. submitted on 2019-12-04 16:23:23
I'm inserting a batch of 100 records, each containing a dictionary of arbitrarily long HTML strings, and by god, it's slow. On the iPhone, the run loop is blocking for several seconds during this transaction. Is my only recourse to use another thread? I'm already using several for acquiring data from HTTP servers, and the sqlite documentation explicitly discourages threading with the database, even though it's supposed to be thread-safe... Is there something I'm doing extremely wrong that, if fixed, would drastically reduce the time it takes to complete the whole operation? NSString*
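The classic cause of slow SQLite batch inserts is that every statement outside an explicit transaction gets its own implicit commit, and every commit forces a disk sync. Wrapping the whole batch in a single transaction, with the statement prepared once and re-bound per record, usually shrinks the runtime dramatically. A sketch of the SQL involved, with a hypothetical records table:

    BEGIN TRANSACTION;
    -- Prepared once in the app and re-executed with bound
    -- parameters for each of the 100 records:
    INSERT INTO records (id, html) VALUES (?, ?);
    COMMIT;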

OracleBulkCopy Memory Leak(OutOfMemory Exception)

点点圈 submitted on 2019-12-04 14:51:36
Below is the code I used to bulk-copy data from a temp DataTable (dataTable) into destTable in an Oracle database. The dataTable has about 2 million records.

    using (OracleBulkCopy bulkCopy = new OracleBulkCopy(VMSDATAConnectionString))
    {
        try
        {
            foreach (OracleBulkCopyColumnMapping columnMapping in columnMappings)
                bulkCopy.ColumnMappings.Add(columnMapping);

            bulkCopy.DestinationTableName = destTableName;
            //bulkCopy.BatchSize = dataTable.Rows.Count;
            //bulkCopy.BulkCopyTimeout = 100;

            // Caution: if the app setting is missing or unparsable,
            // TryParse sets defaultSize to 0, silently discarding the 5000 default.
            int defaultSize = 5000;
            int.TryParse(ConfigurationManager.AppSettings["OracleBulkCopyBatchSize"], out defaultSize);
            bulkCopy.BatchSize = defaultSize;      // assumed continuation of the truncated excerpt
            bulkCopy.WriteToServer(dataTable);
        }
        catch
        {
            throw;   // original error handling elided in the excerpt
        }
    }

HyperSQL (HSQLDB): massive insert performance

江枫思渺然 submitted on 2019-12-04 14:19:06
Question: I have an application that has to insert about 13 million rows, each of about 10 average-length strings, into an embedded HSQLDB. I've been tweaking things (batch size, single-threaded/multithreaded, cached/non-cached tables, MVCC transactions, log_size/no logs, regular calls to CHECKPOINT, ...) and it still takes 7 hours on a 16-core, 12 GB machine. I chose HSQLDB because I figured I might get a substantial performance gain if I put all of those cores to good use, but I'm seriously starting to question that choice.
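For loads of this size, the transaction log and the table storage mode are the usual HSQLDB levers. A hedged sketch of the relevant statements (HSQLDB 2.x syntax; the table definition is hypothetical):

    -- Turn the transaction log off for the duration of the load
    SET FILES LOG FALSE;

    -- CACHED tables keep most row data on disk rather than in memory
    CREATE CACHED TABLE items (id BIGINT PRIMARY KEY, payload VARCHAR(100));

    -- Flush and compact periodically during a very large load
    CHECKPOINT;

    -- Restore durability once the load completes
    SET FILES LOG TRUE;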

Import bulk data into MySQL

倾然丶 夕夏残阳落幕 submitted on 2019-12-04 13:08:50
So I'm trying to import some sales data into my MySQL database. The data is originally in the form of a raw CSV file, which my PHP application needs to process first, then save the processed sales data to the database. Initially I was doing individual INSERT queries, which I realized was incredibly inefficient (~6000 queries taking almost 2 minutes). I then generated a single large query and INSERTed the data all at once. That gave us a 3400% increase in efficiency and reduced the query time to just over 3 seconds. But as I understand it, LOAD DATA INFILE is supposed to be even quicker.
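For reference, a minimal LOAD DATA sketch; the table name, column list, and file path are hypothetical stand-ins for the already-processed output of the PHP step:

    LOAD DATA LOCAL INFILE '/tmp/sales_processed.csv'
    INTO TABLE sales
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
    (sale_date, product_id, quantity, unit_price);

Note that LOCAL requires local_infile to be enabled on both the client and the server.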

Need recommendations on pushing the envelope with SqlBulkCopy on SQL Server

痞子三分冷 submitted on 2019-12-04 10:52:50
I am designing an application, one aspect of which is that it is supposed to be able to receive massive amounts of data into a SQL database. I designed the database structure as a single table with a bigint identity, something like this one:

    CREATE TABLE MainTable
    (
        _id bigint IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,
        field1, field2, ...
    )

I will omit how I intend to perform queries, since it is irrelevant to the question I have. I have written a prototype which inserts data into this table using SqlBulkCopy. It seemed to work very well in the lab; I was able to insert tens of millions of rows.
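On the server side, throughput for loads like this usually hinges on minimal logging, which requires the database to be in SIMPLE or BULK_LOGGED recovery and the load to take a table lock (SqlBulkCopy exposes the latter as SqlBulkCopyOptions.TableLock). A hedged T-SQL sketch of the equivalent server-side settings, with a hypothetical database name and file path:

    -- Minimal logging needs SIMPLE or BULK_LOGGED recovery
    ALTER DATABASE StagingDb SET RECOVERY BULK_LOGGED;

    -- T-SQL equivalent of a table-locked, batched bulk load
    BULK INSERT MainTable
    FROM '\\server\share\data.dat'
    WITH (TABLOCK, BATCHSIZE = 100000);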

Postgres insert optimization

倾然丶 夕夏残阳落幕 submitted on 2019-12-04 08:42:05
Question: I have a script that generates tens of thousands of inserts into a Postgres db through a custom ORM. As you can imagine, it's quite slow. This is used for development purposes in order to create dummy data. Is there a simple optimization I can do at the Postgres level to make this faster? It's the only script running, sequentially, and requires no thread safety. Perhaps I can turn off all locking, safety checks, triggers, etc.? Just looking for a quick and dirty solution that will greatly speed things up.
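Two quick wins at the Postgres level for disposable dev data: stop waiting for the WAL flush on each commit, and skip WAL entirely for throwaway tables. A sketch, with a hypothetical dummy_data table:

    -- Per-session: commits return without waiting for the WAL flush
    SET synchronous_commit = off;

    -- Throwaway tables can skip WAL entirely
    CREATE UNLOGGED TABLE dummy_data (id BIGINT, name TEXT, created_at TIMESTAMP);

    -- Where the ORM can be bypassed, COPY beats row-by-row INSERTs
    COPY dummy_data (id, name, created_at)
    FROM '/tmp/dummy_data.csv' WITH (FORMAT csv, HEADER true);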

INSERT Batch, and if duplicate key Update in codeigniter

为君一笑 submitted on 2019-12-04 06:49:13
Is there any way of performing a batch INSERT query and, if the key already exists, UPDATEing that row in CodeIgniter? I have gone through the documentation and found only insert_batch and update_batch. But how do I update a row with a duplicate key in Active Record? And what happens if one row fails to be inserted or updated in insert_batch: does the whole insert fail, or only that row?

You will have to go with a slightly custom query by adding an "ON DUPLICATE" clause:

    $sql = $this->db->insert_string('YourTable', $data) . ' ON DUPLICATE KEY UPDATE duplicate=duplicate+1';
    $this->db->query($sql);
    $id = $this->db->insert_id();
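The snippet above upserts one row at a time; the batch form is plain MySQL, so CodeIgniter would need to emit SQL like the following. A sketch with a hypothetical table whose unique key is on sku (VALUES(col) refers to the value that row would have inserted):

    INSERT INTO YourTable (sku, name, qty)
    VALUES ('A1', 'Widget', 5),
           ('B2', 'Gadget', 3)
    ON DUPLICATE KEY UPDATE
        name = VALUES(name),
        qty  = qty + VALUES(qty);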

What permissions are required to bulk insert in SQL Server from a network share with Windows authentication?

拈花ヽ惹草 submitted on 2019-12-04 06:22:15
I am working on an application which bulk-loads data into a SQL Server 2008 database. It writes a CSV file to a network share, then calls a stored procedure which contains a BULK INSERT command. I'm migrating the application to what amounts to a completely new network. In this new world, bulk insertion fails with this error:

    Msg 4861, Level 16, State 1, Line 1
    Cannot bulk load because the file "\\myserver\share\subfolder\filename" could not be opened.
    Operating system error code 5(failed to retrieve text for this error. Reason: 15105).

I connect to the database using Windows Authentication,
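Operating system error code 5 is "access denied", and two separate checks are involved: the caller needs bulk-load rights inside SQL Server, and the file on the share is opened under the caller's Windows token, so the double hop from client to SQL Server to the file server also requires Kerberos delegation to be configured for the SQL Server service account. The server-side grants look like this, with the account name as a placeholder:

    -- Server-level right to run BULK INSERT (granted in master)
    GRANT ADMINISTER BULK OPERATIONS TO [DOMAIN\AppUser];

    -- Plus ordinary INSERT permission on the target table
    GRANT INSERT ON dbo.TargetTable TO [DOMAIN\AppUser];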