bulkinsert

SQLAlchemy: prevent automatic closing

拟墨画扇 submitted on 2019-11-29 13:44:11
Question: I need to insert/update bulk rows via SQLAlchemy and get the inserted rows back. I tried to do it with session.execute: >>> posts = db.session.execute(Post.__table__.insert(), [{'title': 'dfghdfg', 'content': 'sdfgsdf', 'topic': topic}]*2) >>> posts.fetchall() ResourceClosedError Traceback (most recent call last) And with the engine: In [17]: conn = db.engine.connect() In [18]: result = conn.execute(Post.__table__.insert(), [{'title': 'title', 'content': 'content', 'topic': topic}]*2) In [19]: print …
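A common reason for the ResourceClosedError is that an executemany-style insert (a list of parameter dictionaries) produces no result set, so there is nothing to fetch. One workaround, sketched below rather than taken from the question, is a single multi-row VALUES insert with RETURNING; this assumes a backend that supports RETURNING (e.g. PostgreSQL) and that Post has an id primary-key column:

```python
# Uses Post, db and topic from the question; everything else is illustrative.
stmt = (
    Post.__table__.insert()
    .values([
        {'title': 'dfghdfg', 'content': 'sdfgsdf', 'topic': topic},
        {'title': 'dfghdfg', 'content': 'sdfgsdf', 'topic': topic},
    ])
    .returning(Post.__table__.c.id)           # makes the statement hand rows back
)

result = db.session.execute(stmt)
inserted_ids = [row[0] for row in result]     # fetch works: RETURNING produced rows
db.session.commit()
```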

Is SQL Server Bulk Insert Transactional?

痴心易碎 submitted on 2019-11-29 13:26:57
If I run the following query in SQL Server 2000 Query Analyzer: BULK INSERT OurTable FROM 'c:\OurTable.txt' WITH (CODEPAGE = 'RAW', DATAFILETYPE = 'char', FIELDTERMINATOR = '\t', ROWS_PER_BATCH = 10000, TABLOCK) on a text file that conforms to OurTable's schema for 40 lines but then changes format for the last 20 lines (let's say the last 20 lines have fewer fields), I receive an error. However, the first 40 lines are committed to the table. Is there something about the way I'm calling BULK INSERT that makes it not transactional, or do I need to do something explicit to force it to roll back?
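Depending on how the load is batched, rows that went in before the error can remain committed. One way to force all-or-nothing behaviour is to run the statement inside an explicit transaction and roll back on failure. A minimal sketch of that idea from Python with pyodbc (the connection string is a placeholder; the statement itself is the one from the question):

```python
import pyodbc

# autocommit=False keeps the whole load inside one explicit transaction.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=.;DATABASE=MyDb;Trusted_Connection=yes",
    autocommit=False,
)
try:
    conn.cursor().execute(
        "BULK INSERT OurTable FROM 'c:\\OurTable.txt' "
        "WITH (CODEPAGE = 'RAW', DATAFILETYPE = 'char', "
        "FIELDTERMINATOR = '\\t', ROWS_PER_BATCH = 10000, TABLOCK)"
    )
    conn.commit()        # reached only if the whole file loaded
except pyodbc.Error:
    conn.rollback()      # nothing from the failed load is kept
    raise
```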

SQL Server Destination vs OLE DB Destination

人走茶凉 submitted on 2019-11-29 11:57:01
I was using the OLE DB destination for bulk import of multiple flat files. After some tuning I found the SQL Server destination to be 25–50% faster. However, I am confused about this destination, as there is contradictory information on the web: some advise against it, some suggest using it. I would like to know: are there any serious pitfalls before I deploy it to production? Thanks. In this answer, I will try to provide information from the official SSIS documentation, and I will mention my personal experience with the SQL Server destination. 1. SQL Server Destination According to the official …

MySQL LOAD DATA Error (Errcode: 2 - “No such file or directory”)

笑着哭i submitted on 2019-11-29 11:12:43
I am trying to load data into a table of my MySQL database and getting this error: LOAD DATA LOCAL INFILE 'C:\Users\Myself\Desktop\Blah Blah\LOAD DATA\week.txt' INTO TABLE week; Reference: this. The path is one hundred percent correct; I copied it by pressing Shift and clicking "Copy as path" and checked it many times. So any tips on this will be much appreciated. My research: seeing this answer, I tried changing C:\Users to C:\\Users. It did not work for me. Secondly, is there a way to use some kind of relative path (rather than an absolute path) here? I don't know what version of MySQL …
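Errcode 2 usually means the file path could not be resolved: inside a quoted SQL string the backslashes in C:\Users\... are treated as escape sequences, so the path the server receives is not the one on disk. Doubling the backslashes or using forward slashes avoids this, and LOCAL has to be allowed on both client and server. A minimal sketch with mysql-connector-python (the connection details are placeholders; the path is the one from the question, rewritten with forward slashes):

```python
import mysql.connector

# LOCAL INFILE must be allowed on the client (and enabled on the server).
conn = mysql.connector.connect(
    host="localhost",
    user="me",
    password="secret",
    database="mydb",
    allow_local_infile=True,
)
cur = conn.cursor()

# Forward slashes (or doubled backslashes) keep MySQL from eating the path.
cur.execute(
    "LOAD DATA LOCAL INFILE "
    "'C:/Users/Myself/Desktop/Blah Blah/LOAD DATA/week.txt' "
    "INTO TABLE week"
)
conn.commit()
```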

Special characters displaying incorrectly after BULK INSERT

本小妞迷上赌 submitted on 2019-11-29 11:00:29
Question: I'm using BULK INSERT to import a CSV file. One of the columns in the CSV file contains values with fractions (e.g. 1m½f). I don't need to do any mathematical operations on the fractions, as the values will only be used for display purposes, so I have set the column to nvarchar. The BULK INSERT works, but when I view the records within SQL the fraction has been replaced with a cent symbol (¢), so the displayed text is 1m¢f. I'm interested to understand why this is happening.
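The usual cause is a code-page mismatch rather than anything about nvarchar: if the CSV was saved in the Windows ANSI code page (1252), ½ is the single byte 0xBD, and reading the file with an OEM code page such as 850 turns that byte into the cent sign. A tiny Python round-trip that reproduces the symptom (the specific code pages are an assumption about this particular file):

```python
# ½ is byte 0xBD in Windows-1252; the same byte is ¢ in OEM code page 850.
raw = "1m½f".encode("cp1252")
print(raw)                    # b'1m\xbdf'
print(raw.decode("cp850"))    # 1m¢f  -- what ends up in the table
print(raw.decode("cp1252"))   # 1m½f  -- what an ANSI CODEPAGE setting would give
```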

How do I bulk insert with SQLite?

眉间皱痕 submitted on 2019-11-29 10:23:42
How do I bulk insert with SQLite? I looked it up and it seems I need to do an insert with a select statement. I googled and looked at the examples, but they all either copy data from one table to another or are not compatible with SQLite. I want to do something like "INSERT INTO user_msg_media (recipientId, mediaId, catagory, current_media_date) " + "VALUES(@mediaId, @catagory, @current_media_date)"; where the value of recipientId is the watcher from each row of "SELECT watcher FROM userwatch WHERE watched=@watched"; I tried the code below and I get the error "SQLite error no such column: …"
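An INSERT ... SELECT lets recipientId come from the userwatch sub-query while the remaining columns stay as bound parameters, so a single statement inserts one row per matching watcher. A minimal sketch with Python's sqlite3 module (the table and column names are from the question; the database file and bound values are placeholders):

```python
import sqlite3

conn = sqlite3.connect("app.db")          # placeholder database file
cur = conn.cursor()

# recipientId comes from the sub-query; the other three columns are bound
# parameters repeated for every matching watcher row.
cur.execute(
    """
    INSERT INTO user_msg_media (recipientId, mediaId, catagory, current_media_date)
    SELECT watcher, ?, ?, ?
    FROM userwatch
    WHERE watched = ?
    """,
    (42, "video", "2019-11-29", "someUser"),   # placeholder values
)
conn.commit()
```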

How to bulk insert a CSV file into SQLite C#

隐身守侯 submitted on 2019-11-29 07:56:21
I have seen similar questions (1, 2), but none of them discuss how to insert CSV files into SQLite. About the only thing I could think of doing is to use a CSVDataAdapter to fill an SQLiteDataSet, then use the SQLiteDataSet to update the tables in the database. The only DataAdapter for CSV files I found is not actually available: CSVDataAdapter CSVda = new CSVDataAdapter(@"c:\MyFile.csv"); CSVda.HasHeaderRow = true; DataSet ds = new DataSet(); // <-- Use an SQLiteDataSet instead CSVda.Fill(ds); To write to a CSV file: CSVDataAdapter CSVda = new CSVDataAdapter(@"c:\MyFile.csv"); bool …
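The question is about C#, but the pattern that usually performs well is the same in any driver: read the CSV and insert the rows in batches with a parameterized statement inside a single transaction, rather than one auto-committed INSERT per row. Purely as an illustration of that pattern, a sketch using Python's csv and sqlite3 modules (database file, table, and columns are placeholders; the CSV path is the one from the question):

```python
import csv
import sqlite3

conn = sqlite3.connect("MyData.db")       # placeholder database
conn.execute("CREATE TABLE IF NOT EXISTS people (name TEXT, age INTEGER)")

with open(r"c:\MyFile.csv", newline="", encoding="utf-8") as f:
    reader = csv.reader(f)
    next(reader)                          # skip the header row
    with conn:                            # one transaction for the whole load
        conn.executemany(
            "INSERT INTO people (name, age) VALUES (?, ?)",
            ((row[0], int(row[1])) for row in reader),
        )
```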

Doctrine 2: weird behavior while batch processing inserts of entities that reference other entities

爷,独闯天下 submitted on 2019-11-29 07:18:46
I am trying out the batch processing method described here: http://docs.doctrine-project.org/projects/doctrine-orm/en/latest/reference/batch-processing.html My code looks like this: $limit = 10000; $batchSize = 20; $role = $this->em->getRepository('userRole')->find(1); for($i = 0; $i <= $limit; $i++) { $user = new \Entity\User; $user->setName('name'.$i); $user->setEmail('email'.$i.'@email.blah'); $user->setPassword('pwd'.$i); $user->setRole($role); $this->em->persist($user); if (($i % $batchSize) == 0) { $this->em->flush(); $this->em->clear(); } } The problem is that after the first call to em …
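The usual explanation for trouble at that point is that $em->clear() detaches every managed entity, including $role, so the next persisted user references a detached object; re-fetching the role (or using a reference proxy) after each clear() avoids it. Purely to illustrate the same flush-and-clear batching idea outside Doctrine, a rough SQLAlchemy sketch with hypothetical User and Role models:

```python
# Hypothetical SQLAlchemy models and session; this mirrors the batching
# pattern, it is not the Doctrine fix itself.
batch_size = 20
role = session.query(Role).get(1)

for i in range(10001):
    user = User(name=f"name{i}", email=f"email{i}@email.blah",
                password=f"pwd{i}", role=role)
    session.add(user)
    if i % batch_size == 0:
        session.flush()
        session.expunge_all()               # like $em->clear(): everything is detached
        role = session.query(Role).get(1)   # re-load the shared row before reusing it

session.commit()
```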

Bulk Parameterized Inserts

风格不统一 submitted on 2019-11-29 06:23:38
I'm trying to switch some hard-coded queries to use parameterized inputs, but I've run into a problem: how do you format the input for parameterized bulk inserts? Currently, the code looks like this: $data_insert = "INSERT INTO my_table (field1, field2, field3) "; $multiple_inserts = false; while ($my_condition) { if ($multiple_inserts) { $data_insert .= " UNION ALL "; } $data_insert .= " SELECT myvalue1, myvalue2, myvalue3 "; } $recordset = sqlsrv_query($my_connection, $data_insert); A potential solution (modified from "How to insert an array into a single MySQL Prepared statement w/ PHP and …")
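The usual approach is to build one (?, ?, ?) placeholder group per row and pass a flattened parameter array, instead of concatenating literal SELECT ... UNION ALL pieces. The question uses sqlsrv/PHP; the sketch below shows the same placeholder-building pattern in Python with sqlite3, purely to illustrate the shape of the statement (table and column names are from the question; the row data is made up):

```python
import sqlite3

rows = [
    ("a1", "b1", "c1"),
    ("a2", "b2", "c2"),
    ("a3", "b3", "c3"),
]

# One "(?, ?, ?)" group per row, then flatten the values to match the placeholders.
placeholders = ", ".join(["(?, ?, ?)"] * len(rows))
sql = f"INSERT INTO my_table (field1, field2, field3) VALUES {placeholders}"
params = [value for row in rows for value in row]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_table (field1, field2, field3)")
conn.execute(sql, params)
conn.commit()
```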

Bulk insert in MongoDB using mongoose

耗尽温柔 submitted on 2019-11-29 06:22:04
I currently have a collection in MongoDB, say "Collection1". I have the following array of objects that need to be inserted into MongoDB. I am using the Mongoose API. For now, I am iterating through the array and inserting each object into Mongo. This is OK for now, but will be a problem when the data gets too big. I need a way of inserting the data in bulk into MongoDB without repetition. I am not sure how to do this; I could not find a bulk option in Mongoose. My code below: myData = [Obj1,Obj2,Obj3.......] myData.forEach(function(ele){ //console.log(ele) saveToMongo(ele); }); function …
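Mongoose does expose a bulk call for this: Model.insertMany(myData) inserts the whole array in one round trip instead of one save per element. For comparison, the same bulk pattern through PyMongo looks like the sketch below (connection string, database, and document contents are placeholders):

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")   # placeholder connection string
collection = client["mydb"]["Collection1"]

my_data = [
    {"name": "Obj1"},
    {"name": "Obj2"},
    {"name": "Obj3"},
]

# ordered=False keeps inserting the remaining documents even if one fails
# (e.g. a duplicate key), instead of stopping at the first error.
result = collection.insert_many(my_data, ordered=False)
print(len(result.inserted_ids), "documents inserted")
```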