bulkinsert

php PDO insert batch multiple rows with placeholders

Posted by 淺唱寂寞╮ on 2019-11-26 22:17:13
Question: I am looking to do multiple inserts using PHP PDO. The closest answer I have found is how-to-insert-an-array-into-a-single-mysql-prepared-statement, but the example given there uses ?? instead of real placeholders. I have looked at the examples on the PHP doc site for placeholders (php.net pdo.prepared-statements):

    $stmt = $dbh->prepare("INSERT INTO REGISTRY (name, value) VALUES (:name, :value)");
    $stmt->bindParam(':name', $name);
    $stmt->bindParam(':value', $value);

Now lets
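A minimal sketch of the usual multi-row approach, assuming a PDO connection in $pdo and an array of name/value pairs in $rows (both hypothetical): repeat one "(?, ?)" group per row and bind the flattened values positionally.

    // Hypothetical input data; in practice $rows comes from your application.
    $rows = [
        ['name' => 'a', 'value' => 'one'],
        ['name' => 'b', 'value' => 'two'],
    ];

    // One "(?, ?)" group per row.
    $placeholders = implode(', ', array_fill(0, count($rows), '(?, ?)'));
    $sql = "INSERT INTO REGISTRY (name, value) VALUES $placeholders";

    // Flatten the values in the same order as the placeholders.
    $params = [];
    foreach ($rows as $row) {
        $params[] = $row['name'];
        $params[] = $row['value'];
    }

    $stmt = $pdo->prepare($sql);
    $stmt->execute($params);

Named placeholders work too, but each one must be unique (:name0, :value0, :name1, ...), which is why positional ? markers are the more common choice for batches.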

Bulk Insertion on Android device

Posted by て烟熏妆下的殇ゞ on 2019-11-26 21:27:15
I want to bulk insert about 700 records into the Android database on my next upgrade. What's the most efficient way to do this? From various posts, I know that if I use INSERT statements, I should wrap them in a transaction. There's also a post about using your own database, but I need this data to go into my app's standard Android database. Note that this would only be done once per device. Some ideas: put a bunch of SQL statements in a file, read them in one line at a time, and exec the SQL; put the data in a CSV file, or JSON, or YAML, or XML, or whatever, read a line at a time and do db
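A rough Java sketch of the transaction-wrapped approach, assuming an open SQLiteDatabase in db, a hypothetical items(name, value) table, and a list of plain Record objects to load:

    SQLiteStatement stmt = db.compileStatement(
            "INSERT INTO items (name, value) VALUES (?, ?)");
    db.beginTransaction();
    try {
        for (Record r : records) {
            stmt.clearBindings();
            stmt.bindString(1, r.name);
            stmt.bindLong(2, r.value);
            stmt.executeInsert();      // reuses the compiled statement per row
        }
        db.setTransactionSuccessful(); // commit everything in one go
    } finally {
        db.endTransaction();
    }

Running all 700 inserts inside one transaction avoids a disk sync per row, which is where most of the time normally goes.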

SqlBulkCopy from a List<>

Posted by 谁说胖子不能爱 on 2019-11-26 19:53:49
How can I make a big insertion with SqlBulkCopy from a List<> of simple objects? Do I implement my own custom IDataReader? Simply create a DataTable from your list of objects and call SqlBulkCopy.WriteToServer, passing the data table. You might find the following useful: Adding columns to a DataTable: add a column for each property/field you wish to write. Adding rows to a DataTable: add a row for each object in your list. For maximum performance with SqlBulkCopy, you should set an appropriate BatchSize. 10,000 seems to work well, but profile for your data. You might also observe better
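A hedged C# sketch of that answer, assuming a hypothetical Person class with Name and Age properties, a people list, and a matching dbo.People destination table:

    var table = new DataTable();
    table.Columns.Add("Name", typeof(string));   // one column per property you write
    table.Columns.Add("Age", typeof(int));

    foreach (var p in people)
        table.Rows.Add(p.Name, p.Age);           // one row per object in the list

    using (var bulk = new SqlBulkCopy(connectionString))
    {
        bulk.DestinationTableName = "dbo.People";
        bulk.BatchSize = 10000;                  // starting point; profile for your data
        bulk.ColumnMappings.Add("Name", "Name");
        bulk.ColumnMappings.Add("Age", "Age");
        bulk.WriteToServer(table);
    }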

How do I temporarily disable triggers in PostgreSQL?

Posted by ぐ巨炮叔叔 on 2019-11-26 18:55:33
Question: I'm bulk loading data and can re-calculate all trigger modifications much more cheaply after the fact than on a row-by-row basis. How can I temporarily disable all triggers in PostgreSQL?

Answer 1: Alternatively, if you want to disable all triggers, not just those on the USER table, you can use:

    SET session_replication_role = replica;

This disables triggers for the current session. To re-enable for the same session:

    SET session_replication_role = DEFAULT;

Source: http://koo.fi/blog/2013/01
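In context, a bulk load would be wrapped like this (a sketch; my_table is a placeholder, and the per-table ALTER TABLE form is the alternative the answer alludes to):

    -- Session-wide: skip ordinary (non-replica) triggers for this session only.
    SET session_replication_role = replica;
    -- ... run the bulk load here ...
    SET session_replication_role = DEFAULT;

    -- Per-table alternative (affects all sessions until re-enabled):
    ALTER TABLE my_table DISABLE TRIGGER USER;   -- user-defined triggers only
    -- ... bulk load ...
    ALTER TABLE my_table ENABLE TRIGGER USER;

Note that replica mode also skips the internal triggers that enforce foreign keys, so the data being loaded has to be trusted.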

Slow bulk insert for table with many indexes

Posted by 柔情痞子 on 2019-11-26 18:03:41
Question: I am trying to insert millions of records into a table that has more than 20 indexes. In the last run it took more than 4 hours per 100,000 rows, and the query was cancelled after 3½ days... Do you have any suggestions about how to speed this up? (I suspect the many indexes are the cause. If you also think so, how can I automatically drop the indexes before the operation and then re-create the same indexes afterwards?) Extra info: The space used by the indexes is about 4 times the space used by
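If this is SQL Server, one common pattern (sketched here with placeholder index and table names) is to disable the nonclustered indexes, load, then rebuild them in a single pass rather than maintaining 20+ indexes row by row:

    -- Disable each nonclustered index (keeps the definition, drops the index data).
    -- Do NOT disable the clustered index, or the table becomes unreadable.
    ALTER INDEX IX_BigTable_Col1 ON dbo.BigTable DISABLE;
    ALTER INDEX IX_BigTable_Col2 ON dbo.BigTable DISABLE;

    -- ... bulk insert the millions of rows here ...

    -- Rebuild everything, including the disabled indexes, in one pass.
    ALTER INDEX ALL ON dbo.BigTable REBUILD;

The disable/rebuild statements can also be generated automatically from the system catalog (sys.indexes on SQL Server) if the index list changes often.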

SQL Server Bulk insert of CSV file with inconsistent quotes

Posted by 夙愿已清 on 2019-11-26 17:35:57
Is it possible to BULK INSERT (SQL Server) a CSV file in which the fields are only OCCASIONALLY surrounded by quotes? Specifically, quotes only surround those fields that contain a ",". In other words, I have data that looks like this (the first row contains headers):

    id, company, rep, employees
    729216,INGRAM MICRO INC.,"Stuart, Becky",523
    729235,"GREAT PLAINS ENERGY, INC.","Nelson, Beena",114
    721177,GEORGE WESTON BAKERIES INC,"Hogan, Meg",253

Because the quotes aren't consistent, I can't use '","' as a delimiter, and I don't know how to create a format file that accounts for this. I tried
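If the target is SQL Server 2017 or later, BULK INSERT itself can parse RFC-4180 style quoting; a hedged sketch (the file path and table name are placeholders, and the table must already exist):

    BULK INSERT dbo.Companies
    FROM 'C:\data\companies.csv'
    WITH (
        FORMAT = 'CSV',        -- handles fields that are only sometimes quoted
        FIELDQUOTE = '"',
        FIRSTROW = 2,          -- skip the header row
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n'
    );

Older versions have no such option, which is why the usual workarounds are a format file, pre-processing the CSV, or importing through SSIS or OPENROWSET.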

Bulk Insert records into Active Record table

Posted by 陌路散爱 on 2019-11-26 17:28:19
Question: I found that my Model.create! statements were taking a very long time to run when I added a large number of records at once. I looked at ActiveRecord-Import, but it didn't work with an array of hashes (which is what I have, and which I think is pretty common). How can I improve the performance? Answer 1: Use the activerecord-import gem. Let us say you are reading a CSV file and generating a Product catalogue, and you want to insert records in batches of 1000:

    batch, batch_size = [], 1_000
    CSV.foreach("
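A fuller sketch of the batching pattern the truncated snippet above starts (the Product model, the CSV file name, and its columns are hypothetical):

    require 'csv'

    batch, batch_size = [], 1_000

    CSV.foreach("products.csv", headers: true) do |row|
      batch << Product.new(name: row["name"], price: row["price"])
      if batch.size >= batch_size
        Product.import(batch)   # one multi-row INSERT via activerecord-import
        batch = []
      end
    end
    Product.import(batch) unless batch.empty?   # flush the final partial batch

Depending on the gem version, Product.import can also accept an array of hashes directly, which matches the asker's original data shape.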

How can I Insert many rows into a MySQL table and return the new IDs?

Posted by 若如初见. on 2019-11-26 17:18:27
Normally I can insert a row into a MySQL table and get the last_insert_id back. Now, though, I want to bulk insert many rows into the table and get back an array of IDs. Does anyone know how I can do this? There are some similar questions, but they are not exactly the same. I don't want to insert the new IDs into any temporary table; I just want to get back the array of IDs. Similar: "Can I retrieve the lastInsertId from a bulk insert?" and "Mysql multiple row insert-select statement with last_insert_id()". Dag Sondre Hansen: Old thread, but I just looked into this, so here goes: if you are using InnoDB on a recent
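The idea the answer is heading toward, as a hedged sketch (my_table is a placeholder; it assumes InnoDB with innodb_autoinc_lock_mode set to 0 or 1, the "consecutive" modes, so one multi-row INSERT is allocated a contiguous block of auto-increment values):

    INSERT INTO my_table (name) VALUES ('a'), ('b'), ('c');

    -- Must be the very next statement on the same connection:
    SELECT LAST_INSERT_ID() AS first_id,   -- ID assigned to the FIRST row of the batch
           ROW_COUNT()      AS n_rows;     -- rows just inserted

    -- The generated IDs are the contiguous range first_id .. first_id + n_rows - 1.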

mongodb: insert if not exists

Posted by 两盒软妹~` on 2019-11-26 17:10:36
Every day, I receive a stock of documents (an update). What I want to do is insert each item that does not already exist. I also want to keep track of the first time I inserted them, and the last time I saw them in an update. I don't want to have duplicate documents. I don't want to remove a document which has previously been saved, but is not in my update. 95% (estimated) of the records are unmodified from day to day. I am using the Python driver (pymongo). What I currently do is (pseudo-code):

    for each document in update:
        existing_document = collection.find_one(document)
        if not existing
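A hedged pymongo sketch of the usual alternative to find-then-insert: one upserted update per document, with $setOnInsert for the first-seen fields and $set for the last-seen timestamp (the external_id key and the collection name are made up for illustration):

    import datetime
    from pymongo import MongoClient, UpdateOne

    client = MongoClient()
    collection = client.mydb.items

    now = datetime.datetime.utcnow()
    ops = []
    for doc in update:  # `update` is the day's incoming batch of documents
        ops.append(UpdateOne(
            {"external_id": doc["external_id"]},             # the document's natural key
            {
                "$setOnInsert": {**doc, "first_seen": now},  # written only on first insert
                "$set": {"last_seen": now},                  # refreshed on every run
            },
            upsert=True,
        ))
    if ops:
        collection.bulk_write(ops, ordered=False)

A unique index on the key field keeps concurrent upserts from creating duplicates.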

How to create and populate a table in a single step as part of a CSV import operation?

Posted by ⅰ亾dé卋堺 on 2019-11-26 16:15:57
Question: I am looking for a quick-and-dirty way to import CSV files into SQL Server without having to create the table beforehand and define its columns. Each imported CSV would be imported into its own table. We are not concerned about data-type inferencing. The CSVs vary in structure and layout, and all of them have many, many columns, yet we are only concerned with a few of them: street addresses and zipcodes. We just want to get the CSV data into the SQL database quickly and extract the relevant
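One quick-and-dirty possibility, sketched under the assumption that the Microsoft ACE OLE DB provider is installed and 'Ad Hoc Distributed Queries' is enabled (the folder, file, and table names are placeholders): SELECT ... INTO creates and populates the table in one statement, with every column imported as text.

    SELECT *
    INTO dbo.Import_Addresses                        -- table is created on the fly
    FROM OPENROWSET(
            'Microsoft.ACE.OLEDB.12.0',
            'Text;Database=C:\csv_drop\;HDR=YES',    -- folder holding the CSV files
            'SELECT * FROM addresses.csv');

From there, the few columns of interest (street addresses, zipcodes) can be queried out of the wide staging table.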