bulkinsert

Efficient way to bulk insert with get_or_create() in Django (SQL, Python, Django)

Submitted by 删除回忆录丶 on 2019-11-28 19:22:42
Is there a more efficient way of doing this?

    for item in item_list:
        e, new = Entry.objects.get_or_create(
            field1=item.field1,
            field2=item.field2,
        )

You can't do decent bulk insertions with get_or_create (or even create), and there's no API for doing this easily. If your table is simple enough that creating rows with raw SQL isn't too much of a pain, it's not too hard; something like:

    INSERT INTO site_entry (field1, field2)
    (
        SELECT i.field1, i.field2
        FROM (VALUES %s) AS i(field1, field2)
        LEFT JOIN site_entry AS existing
            ON (existing.field1 = i.field1 AND existing.field2 = i.field2)
        WHERE …
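The raw SQL in the answer is cut off above. As a hedged illustration of the same idea (not the answer's exact statement), an "insert only the missing rows" query can be written with NOT EXISTS in PostgreSQL-flavoured SQL; the site_entry table name comes from the excerpt, while the VALUES rows are purely illustrative:

    -- Sketch: insert only the (field1, field2) pairs that are not already present.
    -- The literal VALUES rows stand in for the data Django would pass in.
    INSERT INTO site_entry (field1, field2)
    SELECT v.field1, v.field2
    FROM (VALUES ('a', '1'), ('b', '2')) AS v (field1, field2)
    WHERE NOT EXISTS (
        SELECT 1
        FROM site_entry e
        WHERE e.field1 = v.field1
          AND e.field2 = v.field2
    );

On Django 2.2 or later, bulk_create(..., ignore_conflicts=True) achieves a similar effect through the ORM when the duplicate check is backed by a unique constraint.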

Bulk insert, SQL Server 2000, unix linebreaks

Submitted by 寵の児 on 2019-11-28 18:12:58
I am trying to insert a .csv file with Unix line breaks into a database. The command I am running is:

    BULK INSERT table_name FROM 'C:\file.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

If I convert the file to Windows format the load works, but I don't want that extra step if it can be avoided. Any ideas?

I felt compelled to contribute as I was having the same issue, and I need to read two UNIX files from SAP at least a couple of times a day. Therefore, instead of using unix2dos, I needed something with less manual intervention that could be automated programmatically. As noted, the …
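A commonly cited workaround for SQL Server 2000 is to build the statement dynamically so the row terminator is a literal line-feed character (CHAR(10)) rather than the escaped '\n'. A hedged sketch, reusing the table name and file path from the question:

    -- Sketch: pass a bare LF as the row terminator by concatenating CHAR(10)
    -- into a dynamically built BULK INSERT statement (SQL Server 2000 compatible).
    DECLARE @cmd VARCHAR(1000);
    SET @cmd = 'BULK INSERT table_name FROM ''C:\file.csv'' '
             + 'WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''' + CHAR(10) + ''')';
    EXEC (@cmd);

Later versions of SQL Server also accept a hexadecimal terminator (ROWTERMINATOR = '0x0a'), which avoids the dynamic SQL entirely.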

MySQL Insert 20K rows in single insert

Submitted by 核能气质少年 on 2019-11-28 12:14:11
In my table I insert around 20,000 rows on each load. Right now I am doing it one by one. From the MySQL website I learned that inserting multiple rows with a single INSERT query is faster. Can I insert all 20,000 in a single query? What will happen if there are errors within these 20,000 rows? How will MySQL handle that?

If you are inserting the rows from some other table, then you can use the INSERT ... SELECT pattern. However, if you are inserting the values using the INSERT ... VALUES pattern, then you are bound by the max_allowed_packet limit. Also from the docs: "To optimize insert speed, …"
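For reference, a hedged sketch of the multi-row INSERT form, using a hypothetical my_table; with a transactional engine such as InnoDB the statement is atomic, so an error in any row rolls back the whole statement (unless INSERT IGNORE or ON DUPLICATE KEY UPDATE is used):

    -- Sketch: one statement carrying many rows; my_table and its columns are illustrative.
    -- Keep the full statement below the max_allowed_packet size, batching if necessary.
    INSERT INTO my_table (id, name) VALUES
        (1, 'alpha'),
        (2, 'beta'),
        (3, 'gamma');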

Bulk Insertion in MySQL from XML Files

Submitted by 折月煮酒 on 2019-11-28 11:50:39
How can we load data into MySQL tables from XML files? Is there any way to read data from XML files and write it to a MySQL database? I have a bulk of data in XML files. Thanks in advance for the help.

Try the LOAD XML statement (MySQL 6.0). Here's the sample code from the reference manual, using an XML document person.xml containing:

    <?xml version="1.0"?>
    <list>
      <person person_id="1" fname="Pekka" lname="Nousiainen"/>
      <person person_id="2" fname="Jonas" lname="Oreland"/>
      <person person_id="3"><fname>Mikael</fname><lname>Ronström</lname></person>
      <person person_id="4"><fname>Lars</fname><lname>Thalmann<…
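The sample XML above is cut off by the excerpt. The statement the reference manual pairs with this file has the following shape, assuming a person table whose columns match the person_id, fname, and lname fields:

    -- Sketch, following the MySQL reference manual example for LOAD XML.
    LOAD XML LOCAL INFILE 'person.xml'
        INTO TABLE person
        ROWS IDENTIFIED BY '<person>';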

Elasticsearch bulk insert with NEST returns es_rejected_execution_exception

Submitted by 本秂侑毒 on 2019-11-28 11:03:12
Question: I am trying to do a bulk insert using the .NET API in Elasticsearch, and this is the error I get while performing the operation:

    Error {Type: es_rejected_execution_exception
    Reason: "rejected execution of org.elasticsearch.transport.TransportService$6@604b47a4 on
    EsThreadPoolExecutor[bulk, queue capacity = 50,
    org.elasticsearch.common.util.concurrent.EsThreadPoolExecutor@51f4f734[Running, pool size = 4,
    active threads = 4, queued tasks = 50, completed tasks = 164]]"
    CausedBy: ""}

Nest …

Is SQL Server Bulk Insert Transactional?

Submitted by 家住魔仙堡 on 2019-11-28 07:02:35
Question: If I run the following query in SQL Server 2000 Query Analyzer:

    BULK INSERT OurTable FROM 'c:\OurTable.txt'
    WITH (CODEPAGE = 'RAW', DATAFILETYPE = 'char', FIELDTERMINATOR = '\t', ROWS_PER_BATCH = 10000, TABLOCK)

on a text file that conforms to OurTable's schema for 40 lines but then changes format for the last 20 lines (let's say the last 20 lines have fewer fields), I receive an error. However, the first 40 lines are committed to the table. Is there something about the way I'm calling Bulk …
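A hedged sketch of one way to make the load all-or-nothing on SQL Server 2000: run the BULK INSERT inside an explicit transaction and roll back if it reports an error. The exact behaviour still depends on the error's severity and on settings such as XACT_ABORT, so treat this as a starting point rather than a guarantee:

    -- Sketch: wrap the load in an explicit transaction (SQL Server 2000 syntax).
    BEGIN TRANSACTION;
    BULK INSERT OurTable FROM 'c:\OurTable.txt'
    WITH (CODEPAGE = 'RAW', DATAFILETYPE = 'char', FIELDTERMINATOR = '\t', TABLOCK);
    IF @@ERROR <> 0
        ROLLBACK TRANSACTION;
    ELSE
        COMMIT TRANSACTION;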

SQL Server Destination vs OLE DB Destination

Submitted by 有些话、适合烂在心里 on 2019-11-28 05:43:32
Question: I was using the OLE DB destination for bulk importing multiple flat files. After some tuning, I found the SQL Server destination to be 25–50% faster. However, I am confused about this destination, as there is contradictory information on the web: some advise against it, some suggest using it. I would like to know: are there any serious pitfalls before I deploy it to production? Thanks

Answer 1: In this answer, I will try to provide information from the official documentation of SSIS, and I will …

Bulk Insert of Generic List C# into SQL Server

Submitted by 寵の児 on 2019-11-28 05:28:21
Question: How can I bulk insert a generic list in C# into SQL Server, rather than looping through the list and inserting items one at a time? I currently have this:

    private void AddSnapshotData()
    {
        var password = Cryptography.DecryptString("vhx7Hv7hYD2bF9N4XhN5pkQm8MRfxi+kogALYqwqSuo=");
        var figDb = "ZEUS";
        var connString = String.Format(
            "Data Source=1xx.x.xx.xxx;Initial Catalog={0};;User ID=appuser;Password={1};MultipleActiveResultSets=True",
            figDb, password);
        var myConnection = new …
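One alternative to looping (independent of the code above) is a table-valued parameter: define a table type and a procedure on the SQL Server side, then pass the whole list from ADO.NET as a single structured parameter. A hedged sketch of the SQL side for SQL Server 2008 or later; every name here is hypothetical:

    -- Sketch: table type + procedure that a SqlCommand can call with one
    -- SqlDbType.Structured parameter holding the whole list.
    CREATE TYPE dbo.SnapshotRowType AS TABLE
    (
        Id    INT           NOT NULL,
        Value NVARCHAR(100) NOT NULL
    );
    GO

    CREATE PROCEDURE dbo.InsertSnapshotRows
        @Rows dbo.SnapshotRowType READONLY
    AS
    BEGIN
        INSERT INTO dbo.SnapshotData (Id, Value)
        SELECT Id, Value FROM @Rows;
    END;

SqlBulkCopy fed from a DataTable is the other common route and usually wins for very large lists.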

C# Bulk Insert SQLBulkCopy - Update if Exists [duplicate]

Submitted by 拥有回忆 on 2019-11-28 05:27:46
Possible Duplicate: Any way to SQLBulkCopy "insert or update if exists"?

I am using SqlBulkCopy to insert bulk records. How can I perform an update (rather than an insert) on records that already exist? Is this possible with SqlBulkCopy? This is my code for SqlBulkCopy:

    using (var bulkCopy = new SqlBulkCopy(
        ConfigurationManager.ConnectionStrings["ConnectionString"].ConnectionString,
        SqlBulkCopyOptions.KeepNulls & SqlBulkCopyOptions.KeepIdentity))
    {
        bulkCopy.BatchSize = CustomerList.Count;
        bulkCopy.DestinationTableName = "dbo.tCustomers";
        bulkCopy.ColumnMappings.Clear();
        bulkCopy.ColumnMappings…
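SqlBulkCopy itself only inserts, so the usual workaround is to bulk copy into a staging table first and then upsert from it in one statement. A hedged sketch of that server-side step (SQL Server 2008+); the dbo.tCustomers name comes from the code above, while the staging table and column names are illustrative:

    -- Sketch: upsert from a staging table that SqlBulkCopy has just filled.
    MERGE dbo.tCustomers AS target
    USING #CustomerStaging AS source
        ON target.CustomerId = source.CustomerId
    WHEN MATCHED THEN
        UPDATE SET target.Name  = source.Name,
                   target.Email = source.Email
    WHEN NOT MATCHED THEN
        INSERT (CustomerId, Name, Email)
        VALUES (source.CustomerId, source.Name, source.Email);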

How do I use bulk collect and insert in PL/SQL

Submitted by 倖福魔咒の on 2019-11-28 05:24:10
Question: I want to fetch around 6 million rows from one table and insert them all into another table. How do I do it using BULK COLLECT and FORALL?

Answer 1:

    declare
        -- define array type of the new table
        TYPE new_table_array_type IS TABLE OF NEW_TABLE%ROWTYPE INDEX BY BINARY_INTEGER;
        -- define array object of new table
        new_table_array_object new_table_array_type;
        -- fetch size on bulk operation; scale the value to tweak
        -- performance optimization over IO and memory usage
        fetch_size NUMBER := 5000;
        -- …
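The answer's block is cut off above. A self-contained hedged sketch of the BULK COLLECT / FORALL pattern, assuming hypothetical OLD_TABLE and NEW_TABLE with identical column lists:

    -- Sketch: copy rows in batches using BULK COLLECT ... LIMIT and FORALL.
    DECLARE
        TYPE row_array_t IS TABLE OF NEW_TABLE%ROWTYPE INDEX BY BINARY_INTEGER;
        l_rows  row_array_t;
        l_limit PLS_INTEGER := 5000;   -- fetch size; tune for memory vs. round trips
        CURSOR c_src IS SELECT * FROM OLD_TABLE;
    BEGIN
        OPEN c_src;
        LOOP
            FETCH c_src BULK COLLECT INTO l_rows LIMIT l_limit;
            EXIT WHEN l_rows.COUNT = 0;
            FORALL i IN 1 .. l_rows.COUNT
                INSERT INTO NEW_TABLE VALUES l_rows(i);
            COMMIT;   -- per-batch commit; move outside the loop for a single transaction
        END LOOP;
        CLOSE c_src;
    END;
    /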