bulkinsert

Performance of bcp/BULK INSERT vs. Table-Valued Parameters

自闭症网瘾萝莉.ら submitted on 2020-01-26 21:50:54

Question: I'm about to have to rewrite some rather old code that uses SQL Server's BULK INSERT command because the schema has changed, and it occurred to me that maybe I should consider switching to a stored procedure with a TVP instead, but I'm wondering what effect that might have on performance. Some background that might help explain why I'm asking: the data actually comes in via a web service. The web service writes a text file to a shared folder on the database server which

Insert array of records into mysql with Node JS

随声附和 submitted on 2020-01-23 06:58:05

Question: I have an array of data, something like:

    var records = [
        {Name: '', Id: 1}, {Name: '', Id: 2}, {Name: '', Id: 3},
        {Name: '', Id: 4}, {Name: '', Id: 5}, {Name: '', Id: 6}
    ];

There could be thousands of items inside the records array... Question 1: Can we create a stored procedure that accepts an array of objects in MySQL? Question 2: Is there a way to bulk insert this data into MySQL with Node JS? Answer 1: You can bulk insert the array of records, but before that you might need to convert it into an array of
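The answer's suggestion — convert the array of objects into an array of value tuples, then insert them all in one call — can be sketched as follows. This is a minimal Python sketch using the stdlib sqlite3 module as a stand-in for MySQL; table and column names are illustrative. With the Node `mysql` package, the analogous call is `connection.query('INSERT INTO records (Id, Name) VALUES ?', [rows])`.

```python
import sqlite3

# Records as they arrive from the application layer (a stand-in for
# the question's JavaScript array of objects).
records = [{"Name": "a", "Id": 1}, {"Name": "b", "Id": 2}, {"Name": "c", "Id": 3}]

# Convert the array of objects into an array of tuples, as the answer suggests.
rows = [(r["Id"], r["Name"]) for r in records]

# One bulk call instead of one INSERT per record.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (Id INTEGER PRIMARY KEY, Name TEXT)")
conn.executemany("INSERT INTO records (Id, Name) VALUES (?, ?)", rows)
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM records").fetchone()[0])  # 3
```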

How do I capture the data passed in SqlBulkCopy using the Sql Profiler?

ε祈祈猫儿з submitted on 2020-01-22 11:59:38

Question: I use SQL Profiler all the time to capture SQL statements and rerun problematic ones. Very useful. However, some code uses the SqlBulkCopy API, and I have no idea how to capture those statements. I see the creation of temp tables, but nothing that populates them. It seems that SqlBulkCopy bypasses SQL Profiler, or I am not capturing the right events. Answer 1: Capturing event info for bulk insert operations (BCP.EXE, SqlBulkCopy, and I assume BULK INSERT and OPENROWSET(BULK...)) is possible, but you won't

Python Postgres Best way to insert data from table on one DB to another table on another DB

折月煮酒 submitted on 2020-01-17 05:50:10

Question: I have the following Python code that copies the contents of a table on Postgres DB1 and inserts them into a similar table on Postgres DB2. I want to speed it up by using bulk inserts. How do I achieve this?

    import psycopg2
    import sys
    import os

    all_data = []
    try:
        connec = psycopg2.connect("host = server1 dbname = DB1 ")
        connecc = psycopg2.connect("host = server2 dbname = DB2 ")
        connec.autocommit = True
        connecc.autocommit = True
    except:
        print("I am unable to connect to the database.")
    cur = connec.cursor
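A common way to speed up this kind of table-to-table copy is to stream the source in batches and bulk insert each batch, instead of issuing one INSERT per row. Below is a minimal sketch of that pattern using two in-memory sqlite3 databases as stand-ins for the two Postgres servers; with psycopg2 the same loop applies, with `cur.fetchmany` on the source and `psycopg2.extras.execute_values` (or `executemany`) on the destination.

```python
import sqlite3

# Two in-memory sqlite3 databases stand in for the two Postgres servers.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE t (id INTEGER, val TEXT)")
dst.execute("CREATE TABLE t (id INTEGER, val TEXT)")
src.executemany("INSERT INTO t VALUES (?, ?)", [(i, str(i)) for i in range(1000)])

# Stream the source in chunks; bulk insert each chunk on the destination.
cur = src.execute("SELECT id, val FROM t")
while True:
    batch = cur.fetchmany(200)   # pull up to 200 rows at a time
    if not batch:
        break
    dst.executemany("INSERT INTO t VALUES (?, ?)", batch)
dst.commit()
print(dst.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 1000
```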

INSERT multiple records in MySQL with one PHP form

匆匆过客 submitted on 2020-01-15 11:37:08

Question: INSERT multiple records into MySQL with one PHP form. Simple form:

    <form action="process.php" method="post">
        <p><label>Beamline ID</label> <input type="text" name="bline_id[][bline_id]" />
           <label>Flow</label> <input type="text" name="flow[][flow]" /></p>
        <p><label>Beamline ID</label> <input type="text" name="bline_id[][bline_id]" />
           <label>Flow</label> <input type="text" name="flow[][flow]" /></p>
        <p><label>Beamline ID</label> <input type="text" name="bline_id[][bline_id]" />
           <label>Flow<
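On the server side, the usual pattern is to pair up the posted parallel arrays and issue a single multi-row INSERT rather than one statement per `<p>` row. Here is a hedged Python sketch of that idea; the field names and values are hypothetical, sqlite3 stands in for MySQL, and the PHP equivalent builds the same VALUES list with a prepared statement.

```python
import sqlite3

# Hypothetical values as process.php would receive them from the posted
# bline_id[] and flow[] fields (parallel arrays, one entry per form row).
bline_ids = ["BL-1", "BL-2", "BL-3"]
flows = [10.5, 11.2, 9.8]

# Pair the parallel arrays into rows, then build one multi-row INSERT.
rows = list(zip(bline_ids, flows))
placeholders = ", ".join(["(?, ?)"] * len(rows))
sql = f"INSERT INTO readings (bline_id, flow) VALUES {placeholders}"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (bline_id TEXT, flow REAL)")
conn.execute(sql, [v for row in rows for v in row])  # flatten params
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0])  # 3
```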

Bulk Insert from CSV file to MS SQL Database

爱⌒轻易说出口 submitted on 2020-01-14 05:14:15

Question: I have this working script that I use to BULK INSERT a CSV file. The code is:

    ' OPEN DATABASE
    dim objConn,strQuery,objBULK,strConnection
    set objConn = Server.CreateObject("ADODB.Connection")
    objConn.ConnectionString = "Driver={SQL Server Native Client 11.0};Server=DemoSrvCld;Trusted_Connection=no;UID=dcdcdcdcd;PWD=blabla;database=demotestTST;"
    objConn.Open strConnection
    set objBULK = Server.CreateObject("ADODB.Recordset")
    set objBULK.ActiveConnection = objConn
    dim strAPPPATH
    strAPPPATH="C:
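Since the excerpt cuts off before the BULK INSERT statement itself, here is a sketch of how the T-SQL such a script typically sends could be assembled. The table name and file path are hypothetical stand-ins, and the statement is only constructed here, not executed against a server.

```python
# Build the T-SQL a script like this would send to SQL Server.
# dbo.DemoTable and the CSV path are illustrative assumptions.
def build_bulk_insert(table, csv_path, field_sep=",", row_sep="\\n", first_row=2):
    # FIRSTROW = 2 skips a header line; FIELDTERMINATOR and ROWTERMINATOR
    # describe the CSV layout to SQL Server.
    return (
        f"BULK INSERT {table} FROM '{csv_path}' "
        f"WITH (FIRSTROW = {first_row}, "
        f"FIELDTERMINATOR = '{field_sep}', "
        f"ROWTERMINATOR = '{row_sep}')"
    )

sql = build_bulk_insert("dbo.DemoTable", r"C:\data\demo.csv")
print(sql)
```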

bulk insert and update with ADO.NET Entity Framework

て烟熏妆下的殇ゞ submitted on 2020-01-14 03:00:07

Question: I am writing a small application that does a lot of feed processing. I want to use LINQ to Entities (EF) for this since speed is not an issue; it is a single-user app and, in the end, will only be run once a month. My question revolves around the best way to do bulk inserts using LINQ to EF. After parsing the incoming data stream I end up with a List of values. Since the end user may try to import some duplicate data, I would like to "clean" the data during the insert rather than reading all the records,
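One way to "clean" during insert without first reading every existing record is to let a unique key reject the duplicates at insert time. Below is a minimal sqlite3 sketch of that idea; table and column names are illustrative. In EF/SQL Server terms, this maps to a unique index plus a staging table and MERGE, or an INSERT ... WHERE NOT EXISTS.

```python
import sqlite3

# A unique key (here, the PRIMARY KEY on ext_id) rejects duplicate feed
# rows at insert time, so no pre-read of existing records is needed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE feed (ext_id TEXT PRIMARY KEY, payload TEXT)")

# Incoming parsed feed data, including duplicates of 'a' and 'b'.
incoming = [("a", "1"), ("b", "2"), ("a", "1-dup"), ("c", "3"), ("b", "2-dup")]
conn.executemany("INSERT OR IGNORE INTO feed VALUES (?, ?)", incoming)
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM feed").fetchone()[0])  # 3
```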