bulkinsert

Dynamics CRM 2011 Bulk Update

痴心易碎 submitted on 2019-12-03 06:00:13
Running Dynamics CRM 2011 Rollup 3. We need to update millions of customer records periodically (delta updates). Using the standard update (one record at a time) takes a few weeks, and we don't want to touch the database directly, as that may break things in the future. Is there a bulk update method in the Dynamics CRM 2011 web service/REST API we can use?

I realize this post is over two years old, but I can add to it in case someone else reads it and has a similar need. Peter Majeed's answer is on target in that CRM processes requests one record at a time; there is no bulk edit that works the way you …
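
For what it's worth, later update rollups added ExecuteMultipleRequest, which batches many updates into one web service call; the server still processes the requests one at a time, but the round trips are collapsed. A minimal C# sketch, assuming an IOrganizationService named service and a hypothetical changedRecords collection (ExecuteMultipleRequest requires Update Rollup 12 or later, so not plain Rollup 3):

    using Microsoft.Xrm.Sdk;
    using Microsoft.Xrm.Sdk.Messages;

    // Batch up to 1,000 update requests into a single round trip.
    var batch = new ExecuteMultipleRequest
    {
        Settings = new ExecuteMultipleSettings
        {
            ContinueOnError = true,   // don't abort the batch on a single failure
            ReturnResponses = false   // skip per-record responses to shrink the payload
        },
        Requests = new OrganizationRequestCollection()
    };

    foreach (Entity record in changedRecords)               // hypothetical delta set
        batch.Requests.Add(new UpdateRequest { Target = record });

    var result = (ExecuteMultipleResponse)service.Execute(batch);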

Entity Framework and Table-Valued Parameter

限于喜欢 submitted on 2019-12-03 05:13:28
Question: I'm trying to call a stored procedure from Entity Framework which uses a table-valued parameter, but when I try to do the function import I keep getting a warning message saying: "The function 'InsertPerson' has a parameter 'InsertPerson_TVP' at parameter index 0 that has a data type 'table type' which is currently not supported for the target .NET Framework version. The function was excluded." I did an initial search here and found a few posts saying it's possible in Entity Framework with some workarounds, and a few saying it's not supported in current versions. Does anyone know a better approach or …
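
One workaround that does not rely on the function import is to skip the EDMX mapping and execute the procedure through the context with a structured SqlParameter. A rough sketch, assuming a user-defined table type named dbo.PersonTVP with FirstName/LastName columns and a DbContext called PersonContext (all hypothetical names):

    using System.Data;
    using System.Data.SqlClient;

    // DataTable whose columns mirror the user-defined table type dbo.PersonTVP.
    var tvp = new DataTable();
    tvp.Columns.Add("FirstName", typeof(string));
    tvp.Columns.Add("LastName", typeof(string));
    tvp.Rows.Add("Ada", "Lovelace");
    tvp.Rows.Add("Alan", "Turing");

    var tvpParam = new SqlParameter("@InsertPerson_TVP", SqlDbType.Structured)
    {
        TypeName = "dbo.PersonTVP",   // the table type declared in the database
        Value = tvp
    };

    using (var context = new PersonContext())
    {
        // Bypass the excluded function import and call the procedure directly.
        context.Database.ExecuteSqlCommand(
            "EXEC dbo.InsertPerson @InsertPerson_TVP", tvpParam);
    }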

Postgres insert optimization

允我心安 submitted on 2019-12-03 00:32:06
I have a script that generates tens of thousands of inserts into a Postgres database through a custom ORM. As you can imagine, it's quite slow. It is used for development purposes, to create dummy data. Is there a simple optimization I can do at the Postgres level to make this faster? It's the only script running, sequentially, and requires no thread safety. Perhaps I can turn off all locking, safety checks, triggers, etc.? Just looking for a quick and dirty solution that will greatly speed up this process. Thanks.

If you don't need that kind of functionality in the production environment, I'd …
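
For a dev-only dummy-data load, the usual quick wins at the Postgres level are: do everything in one transaction, relax synchronous_commit for the session, and, if you can dump the rows to a file, use COPY instead of per-row INSERTs. A minimal sketch with a hypothetical dummy_data table and file path:

    -- Don't wait for the WAL flush on each commit; acceptable when losing
    -- the data on a crash doesn't matter (dev/dummy data only).
    SET synchronous_commit = off;

    BEGIN;

    -- Either keep the ORM's INSERT statements here, so tens of thousands of
    -- rows share a single commit, or load them in one shot with COPY:
    COPY dummy_data (code, name)
    FROM '/tmp/dummy_data.csv'
    WITH (FORMAT csv);

    COMMIT;

Dropping indexes and disabling triggers on the target table before the load, then recreating them afterwards, also helps; fsync = off at the server level is the bluntest (dev-only) instrument.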

How to use the bulkInsert() function in Android?

橙三吉。 submitted on 2019-12-03 00:13:58
I want only a single notification after all of the bulk insertion into the database is done. Please provide an example of how to use bulkInsert(); I cannot find a proper example on the internet.

This is bulkInsert() using a ContentProvider:

    public int bulkInsert(Uri uri, ContentValues[] values) {
        int numInserted = 0;
        String table;
        int uriType = sURIMatcher.match(uri);
        switch (uriType) {
            case PEOPLE:
                table = TABLE_PEOPLE;
                break;
            default:
                throw new UnsupportedOperationException("Unknown URI: " + uri);
        }
        SQLiteDatabase sqlDB = database.getWritableDatabase();
        sqlDB.beginTransaction();
        try {
            for (ContentValues cv : values) {
                long newID = sqlDB.insertOrThrow(table, null, cv);
                numInserted++;
            }
            sqlDB.setTransactionSuccessful();
        } finally {
            sqlDB.endTransaction();
        }
        // Notify observers once, after the whole batch has been committed.
        getContext().getContentResolver().notifyChange(uri, null);
        return numInserted;
    }
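
For completeness, the caller goes through ContentResolver, so the whole array arrives in one bulkInsert() call and observers of the URI get exactly one notifyChange(). A sketch from inside an Activity; PeopleProvider.CONTENT_URI, the names list, and the column name are illustrative:

    // Build the batch once, then hand it to the provider in a single call.
    ContentValues[] batch = new ContentValues[names.size()];
    for (int i = 0; i < names.size(); i++) {
        ContentValues cv = new ContentValues();
        cv.put("name", names.get(i));
        batch[i] = cv;
    }
    int inserted = getContentResolver().bulkInsert(PeopleProvider.CONTENT_URI, batch);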

Why aren't my triggers firing during an insert by SSIS?

夙愿已清 submitted on 2019-12-02 23:47:20
I have an SSIS data flow task with an OLE DB Destination component that inserts records into a table with a trigger. When I execute a normal INSERT statement against this table, the trigger fires; when I insert records through the SSIS task, it does not. How can I get the trigger to fire in SSIS?

Because the OLE DB Destination uses a bulk insert, triggers are not fired by default. From BULK INSERT (MSDN): "If FIRE_TRIGGERS is not specified, no insert triggers execute." You must manually specify FIRE_TRIGGERS as part of the OLE DB component through its Advanced Editor. Then add …
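
The same behaviour is easy to reproduce in plain T-SQL, which makes the fix visible: the trigger only runs when FIRE_TRIGGERS appears in the WITH clause. Table name and file path below are illustrative; in SSIS the equivalent is adding FIRE_TRIGGERS to the OLE DB Destination's FastLoadOptions property in the Advanced Editor.

    -- Trigger does NOT fire (default bulk insert behaviour):
    BULK INSERT dbo.TargetTable
    FROM 'C:\loads\records.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

    -- Trigger fires:
    BULK INSERT dbo.TargetTable
    FROM 'C:\loads\records.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRE_TRIGGERS);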

How to perform a bulk update of documents in MongoDB with Java

老子叫甜甜 submitted on 2019-12-02 20:48:31
I'm using MongoDB 3.2 and MongoDB Java Driver 3.2. I have an array of a couple of hundred updated documents which should now be saved/stored in MongoDB. To do that, I iterate over the array and call the updateOne() method for each document. Now I want to re-implement this logic as a bulk update. I tried to find an example of a bulk update in MongoDB 3.2 with MongoDB Java Driver 3.2, and I tried this code:

    MongoClient mongo = new MongoClient("localhost", 27017);
    DB db = (DB) mongo.getDB("test1");
    DBCollection collection = db.getCollection("collection");
    …
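
With the 3.2 driver the natural fit is MongoCollection.bulkWrite() rather than the legacy DB/DBCollection classes shown above. A sketch, assuming the updated documents sit in a List<Document> called updatedDocs and are matched by _id; swap ReplaceOneModel for UpdateOneModel if you only want partial updates:

    import com.mongodb.MongoClient;
    import com.mongodb.bulk.BulkWriteResult;
    import com.mongodb.client.MongoCollection;
    import com.mongodb.client.model.BulkWriteOptions;
    import com.mongodb.client.model.Filters;
    import com.mongodb.client.model.ReplaceOneModel;
    import com.mongodb.client.model.WriteModel;
    import org.bson.Document;
    import java.util.ArrayList;
    import java.util.List;

    MongoClient mongo = new MongoClient("localhost", 27017);
    MongoCollection<Document> collection =
            mongo.getDatabase("test1").getCollection("collection");

    List<WriteModel<Document>> ops = new ArrayList<>();
    for (Document doc : updatedDocs) {                 // updatedDocs: your updated array
        ops.add(new ReplaceOneModel<>(
                Filters.eq("_id", doc.get("_id")),     // match the existing document
                doc));                                 // replace it with the new version
    }

    // ordered(false) lets the server keep going if one operation fails.
    BulkWriteResult result = collection.bulkWrite(ops, new BulkWriteOptions().ordered(false));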

SQL Server BCP / BULK INSERT: pipe-delimited file with a text qualifier, using a format file

有些话、适合烂在心里 submitted on 2019-12-02 20:17:50
Question: I have a CSV file which is vertical-pipe delimited, with every column also wrapped in a text qualifier of ". I have been trying for ages to get the BCP format file to work, but no luck. I have the following staging table:

    [ID] [VARCHAR](100) NULL,
    [SUB_ID] [NUMERIC](18, 0) NULL,
    [CODE1] [VARCHAR](20) NULL,
    [CODE2] [NUMERIC](18, 0) NULL,
    [DATE] [DATE] NULL

Data in the CSV:

    "ID"|"SUB_ID"|"CODE1"|"CODE2"|"DATE"
    "HAJHD87SADAD9A87SD9ADAS978DAA89D09AS"|"7510"|"N04FY-1"|"359420013"|"08/08/2018"

Format …
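
One approach that usually works for quote-qualified, pipe-delimited files is a non-XML format file that folds the qualifiers into the terminators: a zero-length dummy field swallows the opening quote of each row, the middle fields terminate on "|" (quote-pipe-quote), and the last field terminates on the closing quote plus the line break. The sketch below targets the five-column staging table above; the 12.0 version line, the field lengths, and the file paths are assumptions to adjust for your environment:

    12.0
    6
    1   SQLCHAR   0   0     "\""        0   DUMMY    ""
    2   SQLCHAR   0   100   "\"|\""     1   ID       SQL_Latin1_General_CP1_CI_AS
    3   SQLCHAR   0   20    "\"|\""     2   SUB_ID   ""
    4   SQLCHAR   0   20    "\"|\""     3   CODE1    SQL_Latin1_General_CP1_CI_AS
    5   SQLCHAR   0   20    "\"|\""     4   CODE2    ""
    6   SQLCHAR   0   12    "\"\r\n"    5   DATE     ""

    -- Then load with the format file, skipping the quoted header row:
    BULK INSERT dbo.Staging
    FROM 'C:\loads\data.csv'
    WITH (FORMATFILE = 'C:\loads\data.fmt', FIRSTROW = 2);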

How to Bulk insert with a dynamic value for a column

前提是你 submitted on 2019-12-02 09:50:00
Question: The situation is this: I have 200 TXT files with different names like 601776.txt. Each file's name is actually an ID_foo, and each file contains data like this (2 columns):

    04004 Albánchez
    04006 Albox
    04008 Alcóntar
    04009 Alcudia de Monteagud
    ...

Now I want to BULK INSERT these TXT files into a SQL Server table which has 3 columns, one of which should be the name of the TXT file. I'm using a PHP script, so I made a loop to get the file names, and then what? BULK INSERT Employee_Table FROM '. …
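
BULK INSERT won't take the file name from a variable, nor write it into a column, so the usual pattern is: let the PHP loop splice each file name into the statement, land the rows in a two-column staging table that matches the file layout, then copy them into the three-column target while supplying the file name. A sketch with hypothetical column names, assuming the files are tab-delimited:

    -- Staging table shaped like the txt files (two columns).
    CREATE TABLE #staging (code VARCHAR(20), name NVARCHAR(200));

    -- The PHP loop substitutes the current file name into this statement.
    BULK INSERT #staging
    FROM 'C:\data\601776.txt'
    WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');

    -- Copy into the real table, adding the file name (minus extension) as the extra column.
    INSERT INTO Employee_Table (code, name, file_id)
    SELECT code, name, '601776'
    FROM #staging;

    DROP TABLE #staging;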

Bulk insert from DataTable to SQLCE DataSource

只谈情不闲聊 submitted on 2019-12-02 06:14:33
Question: This is a C# WPF application with SQL CE as the data source. I have a DataTable (displayed in a DataGrid) and a SQL CE data source. I populate my DataTable from SQL CE using a DataAdapter, DataSet and DataTable, then bind my DataGrid to the DataTable. I may add many rows (>10,000) to my DataTable, and the data may be edited before I propagate all my changes together to the SQL CE data source. My current approach is DROP TABLE, CREATE TABLE, and re-INSERT the rows by brute force into SQL CE. SQL CE has no bulk …
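
SQL CE has no SqlBulkCopy, but it does have an updatable SqlCeResultSet opened in TableDirect mode, which skips per-row SQL parsing and is usually much faster than issuing INSERT statements (and avoids the DROP/CREATE round trip). A sketch, assuming the DataTable's column order matches the target table; "MyTable" is a placeholder:

    using System.Data;
    using System.Data.SqlServerCe;

    static void BulkInsertIntoSqlCe(string connectionString, DataTable table)
    {
        using (var conn = new SqlCeConnection(connectionString))
        {
            conn.Open();
            using (var cmd = conn.CreateCommand())
            {
                cmd.CommandType = CommandType.TableDirect;
                cmd.CommandText = "MyTable";   // target table, opened directly (no SQL parsing)
                using (var rs = cmd.ExecuteResultSet(ResultSetOptions.Updatable))
                {
                    foreach (DataRow row in table.Rows)
                    {
                        SqlCeUpdatableRecord record = rs.CreateRecord();
                        for (int i = 0; i < table.Columns.Count; i++)
                            record.SetValue(i, row[i]);   // copy each column value across
                        rs.Insert(record);
                    }
                }
            }
        }
    }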