bulkinsert

How to import bulk data into DocumentDB from Excel?

荒凉一梦 submitted on 2019-12-01 11:02:16
I have searched for a day for how to bulk insert data into DocumentDB from an Excel file, but I didn't find any information. I am able to read the data from the Excel file and insert it one row at a time into DocumentDB:

    Service service = new Service();
    foreach (var data in exceldata) // exceldata contains the set of rows
    {
        var student = new Student();
        student.id = "";
        student.name = data.name;
        student.age = data.age;
        student.class = data.class;
        student.id = service.savetoDocumentDB(collectionLink, student); // collectionLink is a string stored in web.config
        students.Add(student);
    }

    Class Service { public async Task<string>
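
The Service class above is cut off. Purely as an illustration (not the thread's answer), here is a minimal sketch of one common way to speed this up with the DocumentDB .NET SDK (Microsoft.Azure.DocumentDB): issue the CreateDocumentAsync calls concurrently and await them together rather than one by one. The Student shape loosely mirrors the question (the class field is left out because class is a reserved word in C#), and collectionLink is the same collection link string from web.config:

    using System.Collections.Generic;
    using System.Linq;
    using System.Threading.Tasks;
    using Microsoft.Azure.Documents.Client;

    public class Student
    {
        public string id { get; set; }
        public string name { get; set; }
        public int age { get; set; }
    }

    public static class BulkImporter
    {
        // Fires all inserts concurrently. For very large files, send the rows in
        // chunks of a few hundred documents to avoid request-rate throttling.
        public static async Task ImportAsync(DocumentClient client,
                                             string collectionLink,
                                             IEnumerable<Student> students)
        {
            var tasks = students
                .Select(s => client.CreateDocumentAsync(collectionLink, s))
                .ToList();
            await Task.WhenAll(tasks);
        }
    }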

How to INSERT an array of values in SQL Server 2005?

瘦欲@ submitted on 2019-12-01 10:44:39
How do I write the SQL code to INSERT (or UPDATE) an array of values (probably with an attendant array of field names, or with a matrix holding both) without simple iteration?

I construct the list as an XML string and pass it to the stored procs. SQL Server 2005 has enhanced XML functionality to parse the XML and do a bulk insert. Check this post: Passing lists to SQL Server 2005 with XML Parameters

A simple way is to concatenate the values into a list and pass it to the stored procedure. In the stored procedure, use a dbo.Split UDF to convert it back into a result set (table). Create this function: CREATE FUNCTION dbo.Split(@String
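
The dbo.Split function above is truncated. As an illustration of the XML approach mentioned in the first answer (table, column and element names are invented for the example), a stored procedure can shred an xml parameter with nodes()/value() and insert the whole set in one statement on SQL Server 2005:

    CREATE PROCEDURE dbo.InsertValues
        @items xml
    AS
    BEGIN
        INSERT INTO dbo.MyTable (Id, Value)
        SELECT
            x.item.value('(id)[1]',    'int'),
            x.item.value('(value)[1]', 'varchar(100)')
        FROM @items.nodes('/items/item') AS x(item);
    END

    -- Example call:
    -- EXEC dbo.InsertValues
    --     @items = '<items><item><id>1</id><value>val1</value></item>
    --               <item><id>2</id><value>val2</value></item></items>';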

mysql LOAD DATA INFILE NA to NULL transformation

亡梦爱人 submitted on 2019-12-01 09:09:33
Is there an option in the MySQL LOAD DATA INFILE command to take a .tsv file as input and transform every 'NA' field in that file to NULL in MySQL? As a bonus, can it also handle several different markers, like 'NaN', 'NA', '--', etc., and transform all of them into NULL?

You can use variables:

    LOAD DATA LOCAL INFILE 'file.tsv'
    INTO TABLE my_table
    FIELDS TERMINATED BY ','
    ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    (@col1, @col2, @col3)
    SET col1 = CASE WHEN @col1 NOT IN ('NA', 'NaN', '--') THEN @col1 END,
        col2 = CASE WHEN @col2 NOT IN ('NA', 'NaN', '--') THEN
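
The statement above is cut off. Purely for illustration, a complete version following the same pattern could look like the following (column names are invented, and since the input is a .tsv the field terminator would normally be a tab rather than a comma). Each CASE has no ELSE branch, so any of the markers 'NA', 'NaN' or '--' falls through to NULL:

    LOAD DATA LOCAL INFILE 'file.tsv'
    INTO TABLE my_table
    FIELDS TERMINATED BY '\t'
    LINES TERMINATED BY '\n'
    (@col1, @col2, @col3)
    SET col1 = CASE WHEN @col1 NOT IN ('NA', 'NaN', '--') THEN @col1 END,
        col2 = CASE WHEN @col2 NOT IN ('NA', 'NaN', '--') THEN @col2 END,
        col3 = CASE WHEN @col3 NOT IN ('NA', 'NaN', '--') THEN @col3 END;

An equivalent, slightly shorter form uses nested NULLIF calls, e.g. col1 = NULLIF(NULLIF(NULLIF(@col1, 'NA'), 'NaN'), '--').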

bulk updating a list of values from a list of ids

偶尔善良 submitted on 2019-12-01 06:11:05
As an Oracle user playing around with MySQL, I frequently face this issue. Consider the following situation:

- a list of ids (1, 2, 3, ..., n)
- a list of values ('val1', 'val2', 'val3', ..., 'valn') [the actual values are obviously completely different from these]

The two lists are passed in order, meaning the value passed first corresponds to the id passed first. The objective is to update all the values of the table `value` that have the corresponding id: val1 should update id 1, val2 should update id 2, etc., in only ONE query. The easy solution is to update n times: UPDATE `value` SET `value`='val1'
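
The question is cut off at the per-row UPDATE and the thread's answers are not shown here, but for illustration, two common single-statement approaches in MySQL, assuming `id` is the primary key of the `value` table, are INSERT ... ON DUPLICATE KEY UPDATE and an UPDATE with a CASE expression:

    INSERT INTO `value` (id, `value`) VALUES
        (1, 'val1'),
        (2, 'val2'),
        (3, 'val3')
    ON DUPLICATE KEY UPDATE `value` = VALUES(`value`);

    UPDATE `value`
    SET `value` = CASE id
        WHEN 1 THEN 'val1'
        WHEN 2 THEN 'val2'
        WHEN 3 THEN 'val3'
    END
    WHERE id IN (1, 2, 3);

The WHERE clause in the second form matters: without it, rows whose id is not in the list would have their value set to NULL.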

Pymongo bulk inserts not working

白昼怎懂夜的黑 submitted on 2019-12-01 05:46:24
I am following the tutorial at http://api.mongodb.org/python/current/tutorial.html for bulk inserts. However, I am getting the error listed below. What am I missing? reviews_array is a JSON array.

    client = MongoClient()
    client = MongoClient('localhost', 27017)
    db = client.is_proj
    db_handle = db.reviews
    self.db_handle.insert_many(reviews_array)

The error:

    TypeError: 'Collection' object is not callable. If you meant to call the 'insert_many' method on a 'Collection' object it is failing because no such method exists.

In pymongo, before v3.0, you use insert for both single-doc and
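
The answer is cut off above. For reference, here is a minimal sketch that works on PyMongo 3.x, where insert_many exists; reviews_array must be a list of dicts, one per document, and the snippet assumes a local mongod on the default port as in the question:

    from pymongo import MongoClient

    client = MongoClient('localhost', 27017)
    db = client.is_proj
    reviews = db.reviews

    # Placeholder data: insert_many expects a list of dicts, not a JSON string.
    reviews_array = [{"text": "great"}, {"text": "ok"}]

    result = reviews.insert_many(reviews_array)
    print(result.inserted_ids)

    # On PyMongo < 3.0 insert_many does not exist; the older bulk form was:
    # reviews.insert(reviews_array)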

Thousands of ORMLite raw inserts taking several minutes on Android

∥☆過路亽.° submitted on 2019-12-01 04:23:50
I'm trying to pre-populate an Android SQLite database using ORMLite. The problem is that this operation is too slow: it takes several minutes. The code below shows how it happens.

    RuntimeExceptionDao<Company, Integer> companyDao = ORMLiteHelper.getInstance(context).getCompanyRuntimeDao();
    AssetManager am = context.getAssets();
    try {
        InputStream instream = am.open("companies.sqlite");
        if (instream != null) {
            InputStreamReader inputreader = new InputStreamReader(instream);
            BufferedReader buffreader = new BufferedReader(inputreader);
            String line;
            try {
                while ((line = buffreader.readLine()) != null)
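
The question's code is truncated above. As a hedged sketch of one common fix (not necessarily the thread's accepted answer): wrap the raw inserts in a single batch with callBatchTasks so ORMLite does not commit after every statement. The sketch reuses the names from the question (ORMLiteHelper, Company, companies.sqlite) and assumes the asset file holds one INSERT statement per line:

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.util.concurrent.Callable;

    import android.content.Context;
    import com.j256.ormlite.dao.RuntimeExceptionDao;

    public class CompanyImporter {

        // Runs all raw INSERTs inside one batch/transaction instead of
        // auto-committing each statement individually.
        public static void importCompanies(Context context) throws IOException {
            final RuntimeExceptionDao<Company, Integer> companyDao =
                    ORMLiteHelper.getInstance(context).getCompanyRuntimeDao();
            final BufferedReader buffreader = new BufferedReader(
                    new InputStreamReader(context.getAssets().open("companies.sqlite")));

            try {
                companyDao.callBatchTasks(new Callable<Void>() {
                    @Override
                    public Void call() throws Exception {
                        String line;
                        while ((line = buffreader.readLine()) != null) {
                            companyDao.executeRaw(line); // each line holds one INSERT
                        }
                        return null;
                    }
                });
            } finally {
                buffreader.close();
            }
        }
    }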
