bulk

How to convert text lines to variables?

て烟熏妆下的殇ゞ submitted on 2019-12-11 12:35:19

Question: Check out this list. I need each entry turned into a variable and set equal to 0. For example, `1-Methyoxy-2-Propanol` would become `$OneMethoxyTwoPropanol = 0`, and `1,2-BUTADIENE` would become `$OneTwoButadiene = 0`. Assigning them to a variable wouldn't be a problem, but there are 1500 of them. Answer 1: If I had to do this work, I'd do it this way: change the case of each word (regex to change to sentence case), then make a search-and-replace: `1` -> `One`, `2` -> `Two`, ..., `{underscore}` -> `''`, `{space}` -> `''`, ... Then in
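The answer's case-change plus search-and-replace idea can be sketched in Python. This is a minimal illustration, not the asker's AutoIt-style environment; the digit table only covers 0-9 and the separator set is an assumption:

```python
import re

# Digit-to-word table for the search-and-replace step (0-9 only, for illustration).
DIGITS = {"0": "Zero", "1": "One", "2": "Two", "3": "Three", "4": "Four",
          "5": "Five", "6": "Six", "7": "Seven", "8": "Eight", "9": "Nine"}

def to_variable_name(chemical: str) -> str:
    """Turn e.g. '1,2-BUTADIENE' into '$OneTwoButadiene'."""
    parts = re.split(r"[-,_\s]+", chemical)        # strip hyphens, commas, underscores, spaces
    words = []
    for part in parts:
        if part.isdigit():
            words.append("".join(DIGITS[d] for d in part))
        else:
            words.append(part.capitalize())        # the "sentence case" step
    return "$" + "".join(words)

for name in ["1-Methyoxy-2-Propanol", "1,2-BUTADIENE"]:
    print(f"{to_variable_name(name)} = 0")
```

Run over the 1500-line list, this emits one `$Name = 0` declaration per chemical.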

bulk user import for joomla 2.5

情到浓时终转凉″ submitted on 2019-12-11 12:23:35

Question: Can someone help me with this PHP script? It was written for Joomla 1.5 and I am using 2.5, so it is not compatible, and I am confused by the tables it uses, because 1.5 and 2.5 have different table names and some 1.5 tables do not exist in 2.5. For context, the script is a bulk user import that loads a CSV file into the SQL database. `<?php // Handle form upload if(isset($_POST['import'])) { $mysql_host = trim($_POST['mysql_host']); $mysql_user = trim($_POST['mysql_username']); $mysql_password = trim($_POST['mysql`
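The core of any CSV-to-database user import is the same regardless of Joomla version: read the rows, then insert them with parameterized queries in one batch. A minimal Python sketch, using sqlite3 as a stand-in for MySQL and a simplified `users` table (the real Joomla 2.5 table is the prefixed `#__users` with more columns, which this sketch does not reproduce):

```python
import csv
import io
import sqlite3

# Stand-in schema; the actual Joomla 2.5 users table has more columns.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, username TEXT, email TEXT)")

csv_data = io.StringIO(
    "name,username,email\n"
    "Alice,alice,a@example.com\n"
    "Bob,bob,b@example.com\n"
)
rows = [(r["name"], r["username"], r["email"]) for r in csv.DictReader(csv_data)]

# executemany with placeholders: batched inserts, no SQL injection from CSV content.
conn.executemany("INSERT INTO users (name, username, email) VALUES (?, ?, ?)", rows)
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM users").fetchone()[0])  # 2
```

The version-specific work is only in mapping CSV columns onto the 2.5 table names; the batching pattern stays the same.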

Fastest OLEDB read from ORACLE

戏子无情 submitted on 2019-12-11 07:24:56

Question: What would be the fastest way of retrieving data from an Oracle DB via OLEDB? It has to be portable (it must also work on Postgres and MS SQL), and only one column is transferred (the ID from a large table). Current performance is 100k rows/sec. Am I expecting too much if I want it to go faster? Clarification: the table has 23M records, the query is `SELECT ID FROM OBJECTS`, and the bottleneck is the transfer from Oracle to the client software, which is C++/OLEDB. Answer 1: What the heck, I'll take a chance. Edit: As far as
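One portable lever for this kind of single-column transfer is the fetch batch size: pulling thousands of rows per round trip rather than one. The Python DB-API exposes this as `cursor.arraysize` + `fetchmany()`, which is the same knob OLEDB exposes as rowset size; the sketch below uses sqlite3 as a stand-in, since the pattern, not the driver, is the point:

```python
import sqlite3

# sqlite3 stands in for the Oracle/Postgres/MS SQL connection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE objects (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO objects (id) VALUES (?)", [(i,) for i in range(100_000)])

cur = conn.cursor()
cur.arraysize = 10_000            # fetch in large batches, not row by row
cur.execute("SELECT id FROM objects")

total = 0
while True:
    batch = cur.fetchmany()       # fetchmany() honors cur.arraysize
    if not batch:
        break
    total += len(batch)
print(total)  # 100000
```

Whether this beats 100k rows/sec against the real 23M-row table depends on the network and driver, which the question leaves open.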

How can a CSV file with counter columns be loaded to a Cassandra CQL3 table

ⅰ亾dé卋堺 submitted on 2019-12-11 04:27:41

Question: I have a CSV file in the following format: `key1,key2,key3,counter1,counter2,counter3,counter4` `1,2,1,0,0,0,1` `1,2,2,0,1,0,4` The CQL3 table has all value columns of type `counter`. When I try to use the COPY command to load the CSV, I get the usual error asking for an UPDATE instead of an INSERT. The question is: how can I tell CQL to use an UPDATE? Is there any other way to do this? Answer 1: Using sstables solved this issue. Although a little slower than I expected, it does the job. Answer 2:
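Besides the sstable route, one straightforward alternative is to generate the counter `UPDATE` statements yourself, since counter columns only accept `UPDATE ... SET c = c + n`. A sketch that turns each CSV row into one CQL statement (the table name `t` is a placeholder):

```python
import csv
import io

CSV = "1,2,1,0,0,0,1\n1,2,2,0,1,0,4\n"
KEYS = ["key1", "key2", "key3"]
COUNTERS = ["counter1", "counter2", "counter3", "counter4"]

def csv_to_updates(text, table="t"):
    """Emit one counter UPDATE per CSV row (counters reject INSERT/COPY)."""
    stmts = []
    for row in csv.reader(io.StringIO(text)):
        sets = ", ".join(f"{c} = {c} + {v}" for c, v in zip(COUNTERS, row[3:]))
        where = " AND ".join(f"{k} = {v}" for k, v in zip(KEYS, row[:3]))
        stmts.append(f"UPDATE {table} SET {sets} WHERE {where};")
    return stmts

for s in csv_to_updates(CSV):
    print(s)
```

The generated statements can then be executed through a driver or piped into `cqlsh`.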

Linq to Nhibernate Bulk Update Query Equivalent?

你。 submitted on 2019-12-10 19:28:16

Question: Not sure if I'm missing anything here. Basically, I am looking for Linq to NHibernate to do the equivalent of the following SQL statement: `update SomeTable set SomeInteger = (SomeInteger + 1) where SomeInteger > @NotSoMagicNumber` Is there any way to do that? Thanks! Answer 1: Linq (not Linq to NHibernate, Linq in general) does not have a bulk-update verb like SQL has. If you need the efficiency of a bulk update statement like yours, I'd just stick to SQL. Answer 2: Late answer, but it now exists in NHibernate 5.0. // /

Javamail, Transport.send() very slow

99封情书 submitted on 2019-12-10 14:46:00

Question: I have written a method for sending emails in bulk, but it is very slow (around 3 mails every 10 seconds). I want to send thousands of mails. Is there any way to do this much faster? I am using Gmail now, but only for testing; eventually I want to send through my own SMTP server. Here is the code: `public boolean sendMessages() { try { Session session = Session.getInstance(this._properties, new javax.mail.Authenticator() { @Override protected PasswordAuthentication getPasswordAuthentication()`
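A common cause of this symptom is that each `Transport.send()` call opens, authenticates, and closes a fresh SMTP connection; in JavaMail the usual fix is to hold one `Transport` open and call `sendMessage()` repeatedly. The same idea, sketched with Python's `smtplib` (host and addresses are placeholders):

```python
import smtplib
from email.message import EmailMessage

def build_messages(sender, recipients, subject, body):
    """Build one message per recipient up front."""
    msgs = []
    for rcpt in recipients:
        m = EmailMessage()
        m["From"], m["To"], m["Subject"] = sender, rcpt, subject
        m.set_content(body)
        msgs.append(m)
    return msgs

def send_all(messages, host, port=587, user=None, password=None):
    """Open ONE SMTP connection and push every message through it,
    instead of connect/auth/quit per message."""
    with smtplib.SMTP(host, port) as smtp:
        smtp.starttls()
        if user:
            smtp.login(user, password)
        for m in messages:
            smtp.send_message(m)

msgs = build_messages("me@example.com",
                      ["a@example.com", "b@example.com"],
                      "hello", "bulk test")
print(len(msgs))  # 2
# send_all(msgs, "smtp.example.com")  # requires a reachable SMTP server
```

Note that Gmail also rate-limits bulk senders, so connection reuse helps most once the mail moves to the asker's own SMTP server.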

Cannot bulk load because the file could not be opened. Operating system error code 1326 (Logon failure: unknown user name or bad password.)

不问归期 submitted on 2019-12-10 10:54:56

Question: Bulk upload from the CSV test file `\\servername\wwwroot\Upload\LDSAgentsMap.txt`: `SET QUOTED_IDENTIFIER ON SET ANSI_NULLS ON GO CREATE PROCEDURE [dbo].[sp_CSVTest_BulkInsert] ( @Path NVARCHAR(128) ) AS DECLARE @Sql NVARCHAR(256) SET @Sql = 'BULK INSERT CSVTest FROM ''' + @Path + ''' WITH ( FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'' )' --PRINT @Sql EXEC(@Sql) GO` The path is `\\servername\wwwroot\Upload\LDSAgentsMap.txt`. Note that this is on shared hosting and the database user has the bulkadmin and public

A way out from getting System.OutOfMemoryException while importing a large text file into a database

老子叫甜甜 submitted on 2019-12-08 08:37:13

Question: We're using ZyWALL to guard our servers from external intrusions. It generates daily log files of huge size, over a GB, sometimes 2 GB, and they usually contain more than 10 million lines. My task is to write an application that imports these lines into an Oracle database; I'm writing it in C#. What I'm currently doing is reading the log file line by line, so I do not load the whole file at once: `using(StreamReader reader=new StreamReader("C:\ZyWall.log")) { while ((line=reader`
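Reading line by line is the right start; the out-of-memory risk usually comes from accumulating all the rows before inserting. Keeping a bounded batch and flushing it keeps memory flat no matter how big the file is. A Python sketch of the pattern (sqlite3 stands in for Oracle; in the asker's C# the batch flush would be array binding or a bulk-copy API):

```python
import io
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log_lines (line TEXT)")

BATCH = 1000  # tunable; bounds memory regardless of file size

def import_stream(stream, conn):
    batch = []
    for line in stream:                  # one line in memory at a time
        batch.append((line.rstrip("\n"),))
        if len(batch) >= BATCH:
            conn.executemany("INSERT INTO log_lines VALUES (?)", batch)
            batch.clear()                # release references before the next chunk
    if batch:                            # flush the final partial batch
        conn.executemany("INSERT INTO log_lines VALUES (?)", batch)
    conn.commit()

fake_log = io.StringIO("".join(f"entry {i}\n" for i in range(2500)))
import_stream(fake_log, conn)
print(conn.execute("SELECT COUNT(*) FROM log_lines").fetchone()[0])  # 2500
```

With 10 million lines this holds at most `BATCH` rows at once, so the working set stays in the megabytes.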

Upload of huge file using a web application

半城伤御伤魂 submitted on 2019-12-08 06:09:33

Question: The environment for this objective is not currently available, so I can't try things out and have to rely on analysis alone! My objective breaks down into the following distinct steps: uploading huge files (up to 100 GB) through a dumb 'Upload File' page; there is no escape from this, as the users want a (dumb) front end and are not willing to FTP the file, etc. The web application providing this front end will be hosted on a low-end machine: 2 GB RAM and 40 GB HDD
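Whatever framework ends up handling the upload, the one non-negotiable on a 2 GB machine is streaming: the request body must be copied to disk in fixed-size chunks, never buffered whole. A framework-agnostic Python sketch of that core loop, with a checksum so a 100 GB transfer can be verified afterwards:

```python
import hashlib
import io

CHUNK = 8 * 1024 * 1024  # 8 MiB: memory use is bounded by one chunk

def stream_to_disk(src, dst, chunk_size=CHUNK):
    """Copy an upload stream to disk chunk by chunk.
    Returns (total_bytes, sha256_hex) so the transfer can be verified."""
    digest = hashlib.sha256()
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)
        digest.update(chunk)
        total += len(chunk)
    return total, digest.hexdigest()

payload = b"x" * (3 * 1024 + 7)          # tiny stand-in for a huge upload
n, h = stream_to_disk(io.BytesIO(payload), io.BytesIO(), chunk_size=1024)
print(n)  # 3079
```

Note the 40 GB HDD is the harder limit here: a 100 GB file cannot land on that box at all, so the stream would have to be forwarded to larger storage rather than written locally.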

Shredding XML from DB using SSIS

半腔热情 submitted on 2019-12-08 05:38:17

Question: I am looking for a way to pull XML from a SQL database and shred it via SSIS in bulk. I currently have a package that pulls XML from the database and passes it to a stored procedure, via a variable, for shredding, but this only works one record at a time. When processing 100,000 records this becomes quite time-consuming, so I would like to shred multiple XML values at once using SSIS. Is this possible with SSIS? Perhaps something in a Data Flow Task where all the XML values are
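"Shredding" here means flattening each XML document into relational rows, and the speedup comes from doing it for the whole result set in one pass instead of one stored-procedure call per record. The XML shape and column names below are hypothetical; this Python sketch just shows the batch-shred pattern that an SSIS Data Flow (or a set-based `CROSS APPLY` on the server) would replace the per-row calls with:

```python
import xml.etree.ElementTree as ET

# Hypothetical shape: each DB row holds an <order> document with <item> children.
xml_rows = [
    "<order id='1'><item sku='A' qty='2'/><item sku='B' qty='1'/></order>",
    "<order id='2'><item sku='C' qty='5'/></order>",
]

def shred(xml_values):
    """Shred a whole batch of XML values into flat (order_id, sku, qty)
    tuples in one pass, instead of one procedure call per row."""
    out = []
    for doc in xml_values:
        root = ET.fromstring(doc)
        for item in root.iter("item"):
            out.append((root.get("id"), item.get("sku"), int(item.get("qty"))))
    return out

rows = shred(xml_rows)
print(rows)  # [('1', 'A', 2), ('1', 'B', 1), ('2', 'C', 5)]
```

The flattened tuples can then be bulk-inserted into the destination table in one operation.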