bulkinsert

Bulk insert from CSV file - skip duplicates

喜夏-厌秋 submitted on 2019-12-05 19:19:30
UPDATE: I ended up using this method created by Johnny Bubriski and modified it a bit to skip duplicates. It works like a charm and is apparently quite fast. Link: http://johnnycode.com/2013/08/19/using-c-sharp-sqlbulkcopy-to-import-csv-data-sql-server/

I have been searching for an answer to this but cannot seem to find it. I am doing a T-SQL bulk insert to load data from a CSV file into a table in a local database. My statement looks like this:

BULK INSERT Orders FROM 'csvfile.csv' WITH (FIELDTERMINATOR = ';', ROWTERMINATOR = '0x0a', FORMATFILE = 'formatfile.fmt', ERRORFILE = 'C:\\ProgramData\
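
A common way to skip duplicates with plain BULK INSERT (without the SqlBulkCopy approach from the link) is to load into a staging table first and copy across only the rows that are not already present. A minimal sketch, where Orders_Staging and the OrderId/CustomerName columns are assumed names, not from the question:

-- Load everything into a staging table with the same shape as Orders.
BULK INSERT Orders_Staging
FROM 'csvfile.csv'
WITH (FIELDTERMINATOR = ';', ROWTERMINATOR = '0x0a');

-- Copy only rows whose key is not already in the target table.
INSERT INTO Orders (OrderId, CustomerName)
SELECT s.OrderId, s.CustomerName
FROM Orders_Staging AS s
WHERE NOT EXISTS (SELECT 1 FROM Orders AS o WHERE o.OrderId = s.OrderId);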

BULK INSERT missing last row?

五迷三道 submitted on 2019-12-05 19:03:06
I use BULK INSERT for my text files. Everything works fine except for one thing I discovered: if I give the final column of the final line a value, the line imports, but if that column is blank, the line is discarded, even though the destination column allows NULLs. The text file uses a tab delimiter; here is an example of the last row:

Mike Johnson 1/29/1987 M

If the last column has any value, the row is inserted, for example:

Mike Johnson 1/29/1987 M test

This is my BULK INSERT: BULK INSERT ##TEMP_TEXT FROM '#uncdir#\#cffile.ServerFile#' WITH (
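
The usual cause of this symptom is that the final line has no trailing row terminator, so the empty last field is never terminated and SQL Server drops the row. One common fix is to make sure the file ends with a newline and to spell out the terminators explicitly. A sketch, with a hypothetical file path standing in for the ColdFusion variables above:

-- The source file must end with a final '\n' so the blank last field
-- on the last row is still terminated.
BULK INSERT ##TEMP_TEXT
FROM 'C:\uploads\people.txt'   -- hypothetical path
WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR   = '\n',
    KEEPNULLS                  -- keep empty fields as NULL, not column defaults
);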

Mongodb bulk insert limit in Python

北城余情 submitted on 2019-12-05 13:21:31
Is there a limit to the number of documents one can bulk insert with PyMongo? I don't mean MongoDB's 16 MB limit on document size, but the size of the list of documents I wish to insert in bulk through Python.

There is no limit on the number of documents for a bulk insert via PyMongo. According to the docs, you can provide an iterable to collection.insert, and it will insert each document in the iterable, sending only a single command to the server. The key point is that PyMongo will try to perform your insert by sending one single message to the MongoDB server. MongoDB itself
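
In current PyMongo, insert_many replaces collection.insert and automatically splits oversized batches to fit the server's message limits; chunking the iterable yourself just keeps peak memory predictable. A minimal sketch, with hypothetical database/collection names and an arbitrary chunk size:

from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
coll = client.testdb.docs   # hypothetical database and collection names

def chunks(iterable, size=10000):
    """Yield lists of up to `size` items from any iterable."""
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

docs = ({"n": i} for i in range(1000000))   # sample documents

for batch in chunks(docs):
    coll.insert_many(batch, ordered=False)  # unordered: keep going past errors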

laravel 5.6 bulk inserting json data

荒凉一梦 submitted on 2019-12-05 10:02:37
I am trying to build an API to store and retrieve MCQ exam papers. I am using Laravel resource classes to handle JSON data. I need to insert 40 records into a MySQL database in a single query, without using multi-dimensional arrays. Is there any method available? Sample data from the front end:

{ "data":[ { "paper_id":"5", "question_no":"2", "question":"test insert code", "answer1":"answer1", "answer2":"answer2 ", "answer3":"answer3 ", "answer4":"Answer4 ", "answerC":"Correct Answer", "knowarea":"who knows!" }, { "paper_id":"5", "question_no":"3", "question":"test insert code", "answer1":"answer1"
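
The decoded "data" array is already in the row-per-element shape that Laravel's query builder accepts, so a single insert() call issues one multi-row INSERT. A sketch, where the questions table name and the route are assumptions and the JSON keys are taken to match the column names:

// At the top of the controller:
use Illuminate\Http\Request;
use Illuminate\Support\Facades\DB;

// Controller action; 'questions' is an assumed table name.
public function store(Request $request)
{
    $rows = $request->input('data');        // array of question rows from the JSON body
    DB::table('questions')->insert($rows);  // single multi-row INSERT
    return response()->json(['inserted' => count($rows)], 201);
}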

Bulk Insert from table to table

二次信任 submitted on 2019-12-05 06:08:09
I am implementing an A/B/view scenario: the view points to table A while table B is updated; then a switch occurs and the view points to table B while table A is loaded. The switch occurs daily. There are millions of rows to update and thousands of users looking at the view. I am on SQL Server 2012. My questions are: how do I insert data into a table from another table in the fastest possible way (within a stored procedure)? Is there any way to use BULK INSERT? Or is a regular INSERT/SELECT the fastest way to go?

You could do a SELECT ColA, ColB INTO DestTable_New FROM SrcTable.
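
BULK INSERT itself only reads from files, so a table-to-table load comes down to INSERT ... SELECT. On SQL Server 2012 that statement can be minimally logged when the target is locked and the database uses the simple or bulk-logged recovery model, which is usually the fastest option inside a stored procedure. A sketch with placeholder table and column names:

-- TABLOCK enables minimal logging into a heap (or an empty clustered
-- table) under the simple/bulk-logged recovery model.
INSERT INTO TableB WITH (TABLOCK) (ColA, ColB)
SELECT ColA, ColB
FROM TableA;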

How to insert an array of objects (bulk-insert) into neo4j with bolt protocol (javascript)

我们两清 submitted on 2019-12-05 05:59:42
1. Send an HTTP POST with an array of objects to the server: [{id: 1, title: 'one'}, {id: 2, title: 'two'}]
2. Receive the POST on the server and bulk insert into neo4j with Bolt:

let data = req.body;
// set up bolt
let db = require('neo4j-driver').v1;
let driver = db.driver('bolt://localhost', db.auth.basic('neo4j', 'neo4j'));
let session = driver.session();

3. Set up the statements for execution:

// start transaction
for (var i = 0; i < data.length; i++) {
  // add CREATE statements to bolt session ???
  // "CREATE (r:Record {id:1, title:'one'})"
  // "CREATE (r:Record {id:2, title:'two'})"
  ...
}
// execute
session.run(???);
// stop
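
Rather than building one CREATE string per element, the usual pattern is to pass the whole array as a query parameter and UNWIND it server-side, so the batch travels over Bolt in a single round trip. A sketch against the v1 driver shown above, assuming a Neo4j 3.x server (which supports $-style parameters):

// Pass the array as a parameter; Cypher UNWIND creates one node per element.
const neo4j = require('neo4j-driver').v1;
const driver = neo4j.driver('bolt://localhost', neo4j.auth.basic('neo4j', 'neo4j'));
const session = driver.session();

const data = [{id: 1, title: 'one'}, {id: 2, title: 'two'}];

session
  .run('UNWIND $rows AS row CREATE (r:Record {id: row.id, title: row.title})',
       {rows: data})
  .then(() => session.close())
  .then(() => driver.close());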

Fastest way to create large file in c++?

北城余情 submitted on 2019-12-05 00:48:52
Question: Create a flat text file in C++ of around 50-100 MB, where the content 'Added first line' is inserted into the file 4 million times.

Answer 1: Using old-style file I/O: fopen the file for write, fseek to the desired file size minus 1, fwrite a single byte, fclose the file.

Answer 2: The fastest way to create a file of a certain size is to simply create a zero-length file using creat() or open() and then change the size using chsize(). This will simply allocate blocks on the disk for the file, the
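
Both answers preallocate space rather than write the requested content. If the 4 million copies of the line really have to end up in the file, the dominant cost is per-call overhead, so a sketch that batches many lines per fwrite (chunk sizes are arbitrary; 17 bytes x 4M lines is roughly 68 MB):

#include <cstdio>
#include <string>

int main() {
    const std::string line = "Added first line\n";
    std::string buffer;
    buffer.reserve(line.size() * 10000);
    for (int i = 0; i < 10000; ++i) buffer += line;   // 10,000 lines per chunk

    std::FILE* f = std::fopen("big.txt", "wb");
    if (!f) return 1;
    for (int i = 0; i < 400; ++i)                     // 400 * 10,000 = 4 million lines
        std::fwrite(buffer.data(), 1, buffer.size(), f);
    std::fclose(f);
    return 0;
}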

bulk insert mysql - can i use ignore clause? is there a limit to no. of records for bulk insert?

我是研究僧i submitted on 2019-12-05 00:06:58
I have a bunch of data that I want to insert, and I have decided to use a bulk insert for MySQL:

insert into friends (requestor, buddy) values (value1, value2), (value2, value1), (value3, value4), (value4, value3), ...

I would like to know the following: 1) Can I use IGNORE? E.g.

insert ignore into friends (requestor, buddy) values (value1, value2), (value2, value1), (value3, value4), (value4, value3), ...

What happens if I have a duplicate? Will it a) not insert anything, b) insert the records before the duplicate record and STOP processing the data after that, or c) ignore the duplicate and carry on
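
INSERT IGNORE behaves like option (c): rows that hit a duplicate-key error are skipped with a warning and the rest of the statement still runs. As for the second question in the title, there is no fixed row-count limit on a multi-row VALUES list; the whole statement just has to fit within max_allowed_packet. A sketch, assuming a unique key over (requestor, buddy):

-- Assumes UNIQUE KEY (requestor, buddy); the repeated (1, 2) is skipped.
INSERT IGNORE INTO friends (requestor, buddy)
VALUES (1, 2), (2, 1), (1, 2), (3, 4);
-- Result: 3 rows inserted, 1 warning, no error.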

Hibernate saveOrUpdate large data

百般思念 submitted on 2019-12-04 21:29:42
I am trying to insert or update a large amount of data with Hibernate. I have a list containing 350k objects, and when I use Hibernate's saveOrUpdate() it takes hours to insert all the data. I am using the code below for this operation. My development environment is JDK 1.4 and an Oracle database.

public void saveAll(List list) throws HibernateException {
    Session session = HibernateUtil.getEYSSession();
    Iterator it = list.iterator();
    int i = 0;
    while (it.hasNext()) {
        i++;
        Object obj = it.next();
        session.saveOrUpdate(obj);
        if (i % 50 == 0) {
            session.flush();
            session.clear();
        }
    }
}

I am using batch update and also
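
Flushing and clearing alone does not make Hibernate batch the JDBC statements; that also needs hibernate.jdbc.batch_size in the configuration, and the loop should run inside one transaction instead of autocommitting row by row. A sketch of the same loop with those changes; the property value and the flush interval of 50 are assumptions matched to each other:

// In hibernate.cfg.xml (value is an assumption, typically equal to the flush interval):
//   <property name="hibernate.jdbc.batch_size">50</property>

public void saveAll(List list) throws HibernateException {
    Session session = HibernateUtil.getEYSSession();
    Transaction tx = session.beginTransaction();  // one transaction for the whole load
    Iterator it = list.iterator();
    int i = 0;
    while (it.hasNext()) {
        session.saveOrUpdate(it.next());
        if (++i % 50 == 0) {   // flush a batch of statements, then detach the objects
            session.flush();
            session.clear();
        }
    }
    tx.commit();
}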

Bulk insert from C# list into SQL Server into multiple tables with foreign key constraints

巧了我就是萌 submitted on 2019-12-04 19:53:26
I am completely clueless with this problem; any help would be highly appreciated. I have two tables: one is the master data table (Table A), and the other (Table B) has a foreign key relationship to it, with multiple entries (18, to be specific) for each entry in Table A. I receive the data in a list and wish to insert it into a SQL Server database. I am currently using the pattern below, but it takes 14 minutes to insert 100 rows into Table A and the corresponding 18*100 rows into Table B.

using (SqlConnection conn = new SqlConnection(conStr)) { foreach (var ticket in Tickets) { sql = string.Format
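
One way out of the row-by-row pattern is to stage both tables as DataTables with client-generated keys (e.g. GUIDs, since SqlBulkCopy cannot return identity values) and push each table in a single SqlBulkCopy call, parent first so the foreign key is satisfied. A sketch; the table names and the step that builds tableAData/tableBData from Tickets are assumptions:

// Parent rows first, then children, all in one transaction.
// tableAData and tableBData are DataTables built from Tickets,
// sharing client-generated keys (e.g. Guid.NewGuid()).
using (var conn = new SqlConnection(conStr))
{
    conn.Open();
    using (var tx = conn.BeginTransaction())
    {
        using (var bulkA = new SqlBulkCopy(conn, SqlBulkCopyOptions.Default, tx))
        {
            bulkA.DestinationTableName = "TableA";
            bulkA.WriteToServer(tableAData);
        }
        using (var bulkB = new SqlBulkCopy(conn, SqlBulkCopyOptions.Default, tx))
        {
            bulkB.DestinationTableName = "TableB";
            bulkB.WriteToServer(tableBData);   // 18 child rows per parent
        }
        tx.Commit();
    }
}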