bulkinsert

INSERT multiple entries from Android -> PHP -> MySQL

Submitted by 狂风中的少年 on 2019-12-08 05:06:48
I am trying to insert multiple (1-50) entries from an Android application into an external MySQL database. I got a PHP script working perfectly for single INSERT queries, but so far I am failing to make it work for a whole array of entries, most likely due to my limited understanding of PHP. Android code:
List<NameValuePair> upload_array = new ArrayList<NameValuePair>();
upload_array.add(new BasicNameValuePair("mFirstname[0]", "FirstName 1"));
upload_array.add(new BasicNameValuePair("mFirstname[1]", "FirstName 2"));
upload_array.add(new BasicNameValuePair("mLastname[0]", "LastName 1"));
upload
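On the PHP side, the posted mFirstname[]/mLastname[] arrays ultimately have to be turned into a single statement. A minimal sketch of the multi-row MySQL INSERT such a script could build (the names table and its columns are assumptions, and the values would still need to be escaped or bound in PHP):

    -- Sketch only: one multi-row INSERT built from the posted arrays.
    -- Table and column names are assumed for illustration.
    INSERT INTO names (firstname, lastname)
    VALUES ('FirstName 1', 'LastName 1'),
           ('FirstName 2', 'LastName 2');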

How to pass a string to BULK INSERT instead of a file?

Submitted by 老子叫甜甜 on 2019-12-08 04:03:48
Question: I used to use the BULK INSERT command to load a CSV file into a table. Recently I saved a CSV file as a VARBINARY value in SQL Server. I can now get the data back from the VARBINARY column by casting it to VARCHAR with the CAST and CONVERT functions, but I cannot load this VARCHAR string containing the CSV content into a table using BULK INSERT. Can anyone help me? My example code is given below:
-- @String contains the varchar value of the CSV file content.
SET @sql = 'BULK INSERT TempCsv FROM ''' + @String +
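BULK INSERT only accepts a file path, not inline text, so passing the CSV content itself into the FROM clause cannot work. One commonly suggested workaround, sketched here under the assumption that xp_cmdshell is available (the database, table, column, and path names are hypothetical), is to write the stored CSV back to disk first and then bulk load the file:

    -- Sketch only: export the stored CSV text to a file with bcp, then BULK INSERT it.
    -- Assumes xp_cmdshell is enabled; MyDb.dbo.CsvStore, CsvData and the path are placeholders.
    DECLARE @cmd varchar(1000);
    SET @cmd = 'bcp "SELECT CAST(CsvData AS varchar(max)) FROM MyDb.dbo.CsvStore WHERE Id = 1" '
             + 'queryout "C:\temp\restored.csv" -c -T -S localhost';
    EXEC master..xp_cmdshell @cmd;

    BULK INSERT TempCsv
    FROM 'C:\temp\restored.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');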

BULK INSERT of a txt file fails on ROWTERMINATOR

Submitted by 会有一股神秘感。 on 2019-12-08 02:39:31
I have a txt file and have to load it into SQL Server with a bulk insert:
BULK INSERT table FROM '\\01cends5\TestBulk\a.txt'
WITH ( DATAFILETYPE = 'char' FIELDTERMINATOR = '|' ROWTERMINATOR = '\n', FIRSTROW = 1, LASTROW = 15 )
But it does not accept the row terminator as the end of a line. I have tried everything ({CR}{LF}, {LF}{CR}, \n, \r, \r\n, \n\r) and nothing works. My txt format is:
0 | 20276708598 | 119302 | 201101 | 000000 | 000000
vic123: It looks like something is wrong with the '\r' translation to 0x0A, at least in my case. http://dbaspot.com/sqlserver-programming/463913-bulk-insert-rowterminator-failing.html
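For reference, a hedged version of the statement with the commas that the WITH clause above is missing and the hexadecimal form of the line-feed terminator, which is the usual suggestion when the textual terminators fail (the table name is a placeholder, and whether this fixes this particular file is not confirmed):

    -- Sketch only: commas between the WITH options and the hex LF terminator.
    -- Replace dbo.MyTable with the real target table.
    BULK INSERT dbo.MyTable
    FROM '\\01cends5\TestBulk\a.txt'
    WITH (
        DATAFILETYPE    = 'char',
        FIELDTERMINATOR = '|',
        ROWTERMINATOR   = '0x0a',   -- bare LF; use '\r\n' if the file ends lines with CR LF
        FIRSTROW = 1,
        LASTROW  = 15
    );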

BULK INSERT fails with row terminator on last row

Submitted by 心已入冬 on 2019-12-08 00:11:42
Question: I'm importing a CSV compiled using Cygwin shell commands into MS SQL 2014 using:
BULK INSERT import FROM 'D:\tail.csv' WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\r', FIRSTROW = 1)
GO
I have confirmed that each row contains a \r\n. If I leave a CR/LF on the last row, the bulk import fails with Msg 4832: Bulk load: An unexpected end of file was encountered in the data file. If I end the file at the end of the last data row, the bulk import succeeds. For very large CSVs a kludgy way
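Since every data row ends in CR LF, one hedged adjustment (not confirmed for this exact file) is to declare the full two-character terminator, so the trailing line break on the final row is consumed as part of that row instead of being read as the start of an empty extra row:

    -- Sketch only: use the full CR LF terminator that the rows actually end with.
    BULK INSERT import
    FROM 'D:\tail.csv'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\r\n',
        FIRSTROW        = 1
    );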

OrientDB GraphED - SQL insert edge between two (select vertex RID)s? Or alternative approach for very large import

Submitted by 醉酒当歌 on 2019-12-07 17:42:41
Question: For example, two simple vertices in an OrientDB graph:
orientdb> CREATE DATABASE local:/databases/test admin admin local graph;
Creating database [local:/databases/test] using the storage type [local]...
Database created successfully.
Current database is: local:/graph1/databases/test
orientdb> INSERT INTO V (label,in,out) VALUES ('vertexOne',[],[]);
Inserted record 'V#6:0{label:vertexOne,in:[0],out:[0]} v0' in 0.001000 sec(s).
orientdb> INSERT INTO V (label,in,out) VALUES ('vertexTwo',[],[]);
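A minimal sketch of the edge-creation side of the question, assuming a release that supports OrientDB's CREATE EDGE SQL command, which connects the two vertices through inline sub-selects on their RIDs instead of editing the in/out collections by hand:

    -- Sketch only (OrientDB SQL): create an edge between the two vertices
    -- selected by their label property.
    CREATE EDGE E
      FROM (SELECT FROM V WHERE label = 'vertexOne')
      TO   (SELECT FROM V WHERE label = 'vertexTwo');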

Bulk insert, ASP.NET

Submitted by 纵饮孤独 on 2019-12-07 15:31:25
Question: I need to take in a list of ID numbers corresponding to a member. There can be anywhere from 10 to 10,000 being processed at any given time. I have no problem collecting the data, parsing it, and loading it into a DataTable or anything else (C#), but I want to do some operations in the database. What is the best way to insert all of this data into a table? I am pretty sure I don't want to run a foreach statement and insert 10,000 times. Answer 1: I've used the SqlBulkCopy class before to do a
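The truncated answer points toward SqlBulkCopy on the client. For the database-side operations, one alternative sketch (with hypothetical type, procedure, and table names) is a table-valued parameter, so the whole ID list arrives in a single set-based call:

    -- Sketch only: table-valued parameter objects; all names are hypothetical.
    CREATE TYPE dbo.MemberIdList AS TABLE (MemberId int NOT NULL PRIMARY KEY);
    GO
    CREATE PROCEDURE dbo.ProcessMemberIds
        @Ids dbo.MemberIdList READONLY
    AS
    BEGIN
        -- one set-based insert instead of thousands of single-row statements
        INSERT INTO dbo.MemberStaging (MemberId)
        SELECT MemberId FROM @Ids;
    END;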

TRY doesn't CATCH error in BULK INSERT

Submitted by 痴心易碎 on 2019-12-07 14:48:07
Question: Why doesn't TRY catch the error in the following code, and how can I catch it?
BEGIN TRY
BULK INSERT [dbo].[tblABC] FROM 'C:\temp.txt'
WITH (DATAFILETYPE = 'widechar', FIELDTERMINATOR = ';', ROWTERMINATOR = '\n')
END TRY
BEGIN CATCH
select error_message()
END CATCH
I just get this: Msg 4860, Level 16, State 1, Line 2: Cannot bulk load. The file "C:\temp.txt" does not exist.
Answer 1: This is one option that helps to catch this error:
BEGIN TRY
DECLARE @cmd varchar(1000)
SET @cmd = 'BULK INSERT
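The answer above is cut off; a sketch of where it appears to be heading is below. Running the statement through EXEC moves it into a child batch, so the missing-file error surfaces at execution time and the outer TRY...CATCH can see it (the file path and table are the ones from the question):

    -- Sketch only: wrap BULK INSERT in dynamic SQL so error 4860 becomes catchable.
    BEGIN TRY
        DECLARE @cmd varchar(1000);
        SET @cmd = 'BULK INSERT [dbo].[tblABC] FROM ''C:\temp.txt'' '
                 + 'WITH (DATAFILETYPE = ''widechar'', FIELDTERMINATOR = '';'', ROWTERMINATOR = ''\n'')';
        EXEC (@cmd);
    END TRY
    BEGIN CATCH
        SELECT ERROR_MESSAGE();
    END CATCH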

Bulk insert from CSV file - skip duplicates

Submitted by 牧云@^-^@ on 2019-12-07 13:04:47
Question: UPDATE: I ended up using this method created by Johnny Bubriski and then modified it a bit to skip duplicates. It works like a charm and is apparently quite fast. Link: http://johnnycode.com/2013/08/19/using-c-sharp-sqlbulkcopy-to-import-csv-data-sql-server/
I have been searching for an answer to this but cannot seem to find it. I am doing a T-SQL bulk insert to load data into a table in a local database from a CSV file. My statement looks like this:
BULK INSERT Orders FROM 'csvfile.csv' WITH
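The update says the final solution was the modified SqlBulkCopy approach from the linked article. A purely T-SQL sketch of the same idea (staging table, then copy only unseen keys; Orders_Staging and the column names are assumptions) looks like this:

    -- Sketch only: bulk load into a staging table, then insert rows whose key
    -- is not already present. Orders_Staging and the columns are assumed names.
    BULK INSERT Orders_Staging FROM 'csvfile.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

    INSERT INTO Orders (OrderId, CustomerId, OrderDate)
    SELECT s.OrderId, s.CustomerId, s.OrderDate
    FROM Orders_Staging AS s
    WHERE NOT EXISTS (SELECT 1 FROM Orders AS o WHERE o.OrderId = s.OrderId);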

Bulk Insert from table to table

Submitted by 和自甴很熟 on 2019-12-07 03:34:07
Question: I am implementing an A/B/View scenario, meaning that the view points to table A while table B is updated; then a switch occurs and the view points to table B while table A is loaded. The switch occurs daily. There are millions of rows to update and thousands of users looking at the view. I am on SQL Server 2012. My questions are: how do I insert data into a table from another table in the fastest possible way (within a stored proc)? Is there any way to use BULK INSERT? Or, is using regular
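BULK INSERT reads only from data files, so for table-to-table loads the usual answer is a set-based INSERT...SELECT; with a TABLOCK hint it can qualify for minimal logging, depending on the recovery model and indexes. A sketch with placeholder table names:

    -- Sketch only: reload the inactive table in one set-based statement.
    -- dbo.TableA / dbo.TableB are placeholders for the two swapped tables.
    TRUNCATE TABLE dbo.TableB;

    INSERT INTO dbo.TableB WITH (TABLOCK)
    SELECT *
    FROM dbo.TableA;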

Bulk insert to MySQL in Entity Framework Core

Submitted by 老子叫甜甜 on 2019-12-07 02:46:46
Question: I have a list of ~10,000 objects (let's say of class Person) that I need to insert into a MySQL table. If I use the regular DbContext.SaveChanges(), it takes 60-70 seconds, which I need to reduce drastically. I've found several extensions for bulk inserts: EF Extensions (not free, so not an option), BulkExtensions (no MySQL, only SQL Server), EFBulkInsert (no MySQL, only SQL Server), ... Unfortunately, none seem to exist for MySQL databases. Does anybody know of one for MySQL?
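Whichever library ends up being used, a MySQL bulk load generally comes down to one of two statements under the hood; a sketch of both follows, with the Person table and its columns assumed for illustration:

    -- Sketch only: the SQL a MySQL bulk load typically boils down to.
    -- Option 1: batch many rows into one multi-row INSERT.
    INSERT INTO Person (FirstName, LastName)
    VALUES ('Ada', 'Lovelace'),
           ('Alan', 'Turing'),
           ('Grace', 'Hopper');

    -- Option 2: dump the objects to CSV and stream the file with LOAD DATA.
    LOAD DATA LOCAL INFILE '/tmp/persons.csv'
    INTO TABLE Person
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
    (FirstName, LastName);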