bulkinsert

SSIS - OleDb Fast Load vs. Bulk Insert Task

魔方 西西 submitted on 2019-12-10 23:55:27
Question: I have done research, including threads on this forum, but can't seem to find an answer. I am loading text files with 40 columns, with no transformation at this time. There are 8 files of ~25 MB each, 1,400,000 rows in total. Using the Bulk Insert task, the load completes in 3 minutes. Using an OleDb destination with a flat file connection manager, the load completes in 30 minutes. From all I have read, SSIS should be using bulk inserts behind the OleDb connection. If so, why is there such a dramatic
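A gap like this usually comes down to how many rows are sent per round trip: a row-by-row destination pays per-statement overhead 1.4 million times, while a fast-load/bulk path ships large batches. The sketch below illustrates the two patterns in Python with sqlite3 as a stand-in (it is not SSIS; the table `t` and batch size are invented for the demo):

```python
import sqlite3

def insert_per_row(conn, rows):
    # Row-by-row: one INSERT execution per row (the slow pattern).
    for r in rows:
        conn.execute("INSERT INTO t(a, b) VALUES (?, ?)", r)
    conn.commit()

def insert_batched(conn, rows, batch_size=10000):
    # Batched: rows handed to the driver in large chunks, which is the
    # rough analogue of a fast-load/bulk destination.
    for i in range(0, len(rows), batch_size):
        conn.executemany("INSERT INTO t(a, b) VALUES (?, ?)",
                         rows[i:i + batch_size])
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER, b TEXT)")
rows = [(i, "row%d" % i) for i in range(50000)]
insert_batched(conn, rows)
count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)
```

In SSIS terms, the things to check are that the OleDb destination's data access mode is actually "Table or view - fast load" and that the commit/batch sizes are not forcing tiny batches.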

Passing multiple values to single parameter in SQL inline query using C#

放肆的年华 submitted on 2019-12-10 21:09:54
Question: I am new to coding and looking for some help on how to pass multiple values to a single parameter in an inline SQL query. I have framed the query below, but I have heard this could lead to a SQL-injection issue. Kindly help with how I can rewrite it using parameters. string query = "Select ID, email FROM DBTABLE WHERE email in ("; var stringBuilder = new StringBuilder(); using (StringReader stringReader = new StringReader(DownloadIDtextBox.Text)) { string line; string
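The usual fix is to generate one placeholder per value and bind each value as its own parameter, rather than splicing user text into the SQL string. A minimal sketch in Python with sqlite3 (the `DBTABLE` schema is improvised from the question; in C#/ADO.NET the same idea means adding one SqlParameter per value):

```python
import sqlite3

def select_by_emails(conn, emails):
    # One "?" placeholder per value; the driver escapes each bound
    # value, so user input never becomes part of the SQL text.
    placeholders = ", ".join("?" for _ in emails)
    sql = "SELECT ID, email FROM DBTABLE WHERE email IN (%s)" % placeholders
    return sorted(conn.execute(sql, emails).fetchall())

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE DBTABLE (ID INTEGER, email TEXT)")
conn.executemany("INSERT INTO DBTABLE VALUES (?, ?)",
                 [(1, "a@x.com"), (2, "b@x.com"), (3, "c@x.com")])
print(select_by_emails(conn, ["a@x.com", "c@x.com"]))
```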

Bulk inserts of heavily indexed child items (Sql Server 2008)

让人想犯罪 __ submitted on 2019-12-10 20:23:26
Question: I'm trying to create a data import mechanism for a database that must stay highly available to readers while serving irregular bulk loads of new data as they are scheduled. The new data involves just three tables: new datasets are added, many new dataset items are referenced by them, and a few dataset-item metadata rows reference those. Datasets may have tens of thousands of dataset items. The dataset items are heavily indexed on several combinations of columns with the
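A common pattern for this situation is to land the rows in an unindexed staging table first, then move them into the heavily indexed table with a single set-based statement, so the indexes are maintained once per batch instead of once per row. A sketch in Python with sqlite3 as a stand-in (table and column names are invented; on SQL Server 2008 the analogous tools are a staging table plus INSERT...SELECT, or partition switching):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dataset_item (dataset_id INTEGER, col1 TEXT, col2 TEXT);
    CREATE INDEX ix_item_cols ON dataset_item (col1, col2);
    -- The staging table has no indexes, so the raw load is cheap.
    CREATE TABLE staging_item (dataset_id INTEGER, col1 TEXT, col2 TEXT);
""")

new_rows = [(1, "a%d" % i, "b%d" % i) for i in range(10000)]
conn.executemany("INSERT INTO staging_item VALUES (?, ?, ?)", new_rows)

# Move everything into the indexed table in one set-based statement,
# inside one transaction, instead of touching the indexes row by row.
with conn:
    conn.execute("INSERT INTO dataset_item SELECT * FROM staging_item")
    conn.execute("DELETE FROM staging_item")

count = conn.execute("SELECT COUNT(*) FROM dataset_item").fetchone()[0]
print(count)
```

Readers keep querying the target table during step one, and only the final set-based insert competes with them.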

Loading large amounts of data to an Oracle SQL Database

六月ゝ 毕业季﹏ submitted on 2019-12-10 17:57:32
Question: I was wondering if anyone had any experience with what I am about to embark on. I have several csv files, each around a GB in size, that I need to load into an Oracle database. While most of my work after loading will be read-only, I will need to load updates from time to time. Basically I just need a good tool for loading several rows of data at a time into my db. Here is what I have found so far: I could use SQL Loader to do a lot of the work, or I could use Bulk-Insert
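Whatever tool ends up doing the load, the key to handling GB-sized CSVs is streaming: read the file row by row and insert in fixed-size batches so memory stays flat. A hedged sketch in Python with the standard csv module and sqlite3 as a stand-in (table `t` and the batch size are invented; against Oracle the same shape works with a driver's executemany, and SQL*Loader does this server-side):

```python
import csv
import io
import sqlite3

def load_csv(conn, fileobj, batch_size=5000):
    # Stream the CSV and flush fixed-size batches, so memory use stays
    # flat even for files around a GB in size.
    batch, total = [], 0
    for row in csv.reader(fileobj):
        batch.append(row)
        if len(batch) >= batch_size:
            conn.executemany("INSERT INTO t(a, b) VALUES (?, ?)", batch)
            total += len(batch)
            batch.clear()
    if batch:  # flush the final partial batch
        conn.executemany("INSERT INTO t(a, b) VALUES (?, ?)", batch)
        total += len(batch)
    conn.commit()
    return total

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a TEXT, b TEXT)")
n = load_csv(conn, io.StringIO("x,1\ny,2\nz,3\n"))
print(n)
```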

Is it possible to insert a row with an embedded ID into a table with HQL?

点点圈 submitted on 2019-12-10 17:22:54
Question: I'm able to insert rows with HQL. Example: insert into MyMappedTable(field,field,field) select c.x, c.y, c.z from Object c where .... But my requirement is to insert with an embedded ID: @JoinColumn(insertable = false, name = "CATEGORYID", referencedColumnName = "ID", updatable = false) @ManyToOne(fetch = FetchType.EAGER, optional = false) private Category category; @EmbeddedId protected CategoryProductPK categoryProductPK; @Basic(optional = true) @Column(name = "POSITION") private Integer

Copy data from one database to another in Oracle

浪子不回头ぞ submitted on 2019-12-10 13:43:43
Question: I have 2 Oracle databases, and I frequently copy data from the prod DB to the test DB using TOAD by generating insert scripts from the prod DB and running them on the test DB later. I am trying to do this faster through a batch file. I think I can use this solution, but the DB has an auto-increment column. If I use this solution, would that column be affected? Do I need to change the script in some way? I haven't tried this so far, as I have no access to the DB and would be able to test this only on Monday
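The usual way to keep an auto-increment column out of trouble is to copy every column except the identity column and let the destination generate fresh values. A small sketch of that idea in Python with two sqlite3 connections as stand-ins for the two Oracle databases (the `emp` table is invented; with Oracle the destination's sequence/trigger plays the role of AUTOINCREMENT here):

```python
import sqlite3

src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for c in (src, dst):
    # id is generated by the database; each side keeps its own counter.
    c.execute(
        "CREATE TABLE emp (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT)")
src.executemany("INSERT INTO emp (name) VALUES (?)", [("ann",), ("bob",)])

# Copy only the non-identity columns: the destination generates fresh
# ids, so its own sequence keeps working and never collides.
rows = src.execute("SELECT name FROM emp ORDER BY id").fetchall()
dst.executemany("INSERT INTO emp (name) VALUES (?)", rows)

names = [r[0] for r in dst.execute("SELECT name FROM emp ORDER BY id")]
print(names)
```

The trade-off is that the copied rows get new ids, so any foreign keys referencing the old ids would need remapping during the copy.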

SQLite performance issue - can a code change speed things up?

微笑、不失礼 submitted on 2019-12-10 12:00:01
Question: I use the following code to add rows to my database: public void insert(String kern, String woord) { SQLiteDatabase db = getWritableDatabase(); ContentValues values = new ContentValues(); values.put(KERN, kern); values.put(WOORD, woord); db.insertOrThrow(TABLE_NAME, null, values); } Currently, I'm invoking this insert() 3,455 times to add all words to the database, using: insert("Fruits", "Banana"); It takes forever. How can I change this code to work faster? I'm thinking in the
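Almost all of that time goes into committing each insert individually. Wrapping the whole loop in one transaction is the standard fix; on Android that means beginTransaction()/setTransactionSuccessful()/endTransaction around the loop. A sketch of the same idea in Python's sqlite3 (the `words` table and the generated word list are stand-ins for the question's data):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE words (kern TEXT, woord TEXT)")
pairs = [("Fruits", "word%d" % i) for i in range(3455)]

# One transaction around all 3,455 inserts: without it, every single
# INSERT pays the cost of its own commit, which is what takes forever.
with conn:
    conn.executemany("INSERT INTO words (kern, woord) VALUES (?, ?)", pairs)

count = conn.execute("SELECT COUNT(*) FROM words").fetchone()[0]
print(count)
```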

OracleBulkCopy does not insert entries to table

心已入冬 submitted on 2019-12-10 11:15:41
Question: Here's the code I'm executing: public static void Main(string[] args) { var connectionString = "Data Source=dbname;User Id=usrname;Password=pass;"; DataTable dt = new DataTable("BULK_INSERT_TEST"); dt.Columns.Add("N", typeof(double)); var row = dt.NewRow(); row["N"] = 1; using (var connection = new OracleConnection(connectionString)){ connection.Open(); using(var bulkCopy = new OracleBulkCopy(connection, OracleBulkCopyOptions.UseInternalTransaction)) { bulkCopy.DestinationTableName = dt

Cannot bulk load because the file could not be opened. Operating system error code 1326(Logon failure: unknown user name or bad password.)

不问归期 submitted on 2019-12-10 10:54:56
Question: Bulk upload from the CSV test file "\\servername\wwwroot\Upload\LDSAgentsMap.txt": SET QUOTED_IDENTIFIER ON SET ANSI_NULLS ON GO CREATE PROCEDURE [dbo].[sp_CSVTest_BulkInsert] ( @Path NVARCHAR(128) ) AS DECLARE @Sql NVARCHAR(256) SET @Sql = 'BULK INSERT CSVTest FROM ''' + @Path + ''' WITH ( FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'' )' --PRINT @Sql EXEC(@Sql) GO The path is "\\servername\wwwroot\Upload\LDSAgentsMap.txt". Note this is on shared hosting and the database user has bulkadmin and public

SQL Import skip duplicates

限于喜欢 submitted on 2019-12-10 10:19:16
Question: I am trying to do a bulk upload into a SQL Server DB. The source file has duplicates which I want to remove, so I was hoping the operation would automatically load the first one and discard the rest (I've set a unique key constraint). The problem is, the moment a duplicate upload is attempted, the whole thing fails and gets rolled back. Is there any way I can just tell SQL to keep going? Answer 1: Try to bulk insert the data into a temporary table and then SELECT DISTINCT as @madcolor
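Besides the staging-table-plus-SELECT-DISTINCT approach in the answer, some engines can skip conflicting rows at insert time. SQLite's INSERT OR IGNORE demonstrates the behavior the question asks for; the sketch below is Python/sqlite3, not SQL Server (the `users` table is invented; a SQL Server analogue would be a unique index created with IGNORE_DUP_KEY = ON, or an INSERT...SELECT with a NOT EXISTS filter):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT PRIMARY KEY, name TEXT)")

rows = [("a@x.com", "Ann"), ("b@x.com", "Bob"), ("a@x.com", "Ann dup")]

# INSERT OR IGNORE keeps the first row per unique key and silently
# skips later duplicates instead of rolling back the whole batch.
conn.executemany("INSERT OR IGNORE INTO users VALUES (?, ?)", rows)

kept = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
print(kept)
```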