bulkinsert

Optimizing InnoDB Insert Queries

会有一股神秘感。 submitted on 2019-12-11 05:47:08
Question: According to the slow query log, the following query (and similar queries) would occasionally take around 2 s to execute:

INSERT INTO incoming_gprs_data (data,type) VALUES ('3782379837891273|890128398120983891823881abcabc','GT100');

Table structure:

CREATE TABLE `incoming_gprs_data` (
  `id` int(200) NOT NULL AUTO_INCREMENT,
  `dt` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
  `data` text NOT NULL,
  `type` char(10) NOT NULL,
  `test_udp_id` int(20) NOT NULL,
  `parse_result` text NOT NULL,
  `completed`
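A minimal sketch of one common mitigation for intermittent slow single-row inserts: batch several rows into one multi-row INSERT inside a single transaction, so the commit and flush cost is paid once per batch instead of once per row. The second and third payload strings below are placeholders, and whether this helps depends on where the 2 s spikes actually come from (for example redo-log flushes or contention rather than the statement itself).

START TRANSACTION;
INSERT INTO incoming_gprs_data (data, type) VALUES
  ('3782379837891273|890128398120983891823881abcabc', 'GT100'),
  ('<second payload>', 'GT100'),
  ('<third payload>',  'GT100');
COMMIT;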

How to prevent SQL injection when doing batch insert in PostgreSQL?

守給你的承諾、 submitted on 2019-12-11 05:27:39
Question: I have up to 100 items I would like to insert in one batch operation. I am doing it like this:

INSERT INTO MyTable (f1, f2, ..., fk)
VALUES (v11, v12, ..., v1k),
       (v21, v22, ..., v2k),
       ...
       (vn1, vn2, ..., vnk)

All is fine, but I am building this string by concatenating the values as-is, which means my code is vulnerable to SQL injection. How can I keep using the bulk insert syntax and still be protected from SQL injection? EDIT 1: I would like to provide a bit more context. The
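The real protection comes from binding values as parameters on the client rather than splicing them into the statement text. A purely SQL sketch of that shape is a prepared statement that takes array parameters and expands them with unnest. MyTable and f1..fk come from the question; the three-column layout and the text[] types are assumptions for illustration, and the literal arrays in the EXECUTE line only stand in for client-bound parameters:

PREPARE bulk_ins (text[], text[], text[]) AS
  INSERT INTO MyTable (f1, f2, f3)
  SELECT * FROM unnest($1, $2, $3);

EXECUTE bulk_ins (ARRAY['v11','v21'], ARRAY['v12','v22'], ARRAY['v13','v23']);
DEALLOCATE bulk_ins;

Because the values travel as parameters instead of being concatenated into the SQL, they cannot alter the statement, however many rows the arrays carry.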

MySQL query: select keys and insert

允我心安 submitted on 2019-12-11 05:08:35
Question: I have two tables: Articles, which stores information about articles, and PageLinks, which stores hyperlinks between pages. The schema is as below.

CREATE TABLE `Articles` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `slug` varchar(255) CHARACTER SET utf8 COLLATE utf8_bin NOT NULL,
  `label` varchar(255) DEFAULT NULL,
  PRIMARY KEY (`id`),
  UNIQUE KEY `slug_UNIQUE` (`slug`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8

CREATE TABLE `PageLinks` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `from_id` int
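A set-based way to resolve slugs to ids while inserting is INSERT ... SELECT with Articles joined once per endpoint. The excerpt cuts off before the rest of the PageLinks definition, so the to_id column and the example slugs below are assumptions, not the actual schema:

INSERT INTO PageLinks (from_id, to_id)
SELECT a_from.id, a_to.id
FROM Articles AS a_from
JOIN Articles AS a_to
  ON a_to.slug = 'target-article-slug'
WHERE a_from.slug = 'source-article-slug';

The unique key on slug means both lookups are indexed, so no per-row SELECT round trips are needed before the insert.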

How to use BULK INSERT of a CSV into SQL Server with the correct datetime format?

ぐ巨炮叔叔 submitted on 2019-12-11 04:46:26
Question: I want to use BULK INSERT to load a CSV file into SQL Server 2012. Some columns contain datetime values, but the datetime format is not handled correctly by BULK INSERT, and I cannot use SSIS. Example table:

CREATE TABLE [dbo].[scanindex_test](
  [request_no] [varchar](13) NOT NULL,
  [request_date] [datetime] NULL,
  [id_card] [varchar](20) NULL,
  [firstname] [varchar](100) NULL,
  [surname] [varchar](100) NULL
)

Query (SQL Server 2012):

declare @path varchar(255), @sql varchar(5000)
SET @path = 'C:\Test\TESTFILE.csv'
set @sql = 'BULK
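One workaround, sketched under the assumption that the file is comma-separated with a header row and dd/mm/yyyy dates, is to BULK INSERT into a staging table that keeps request_date as plain text, then convert it with an explicit style when copying into the real table:

CREATE TABLE dbo.scanindex_stage (
    request_no   varchar(13)  NOT NULL,
    request_date varchar(30)  NULL,   -- loaded as text first
    id_card      varchar(20)  NULL,
    firstname    varchar(100) NULL,
    surname      varchar(100) NULL
);

BULK INSERT dbo.scanindex_stage
FROM 'C:\Test\TESTFILE.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

INSERT INTO dbo.scanindex_test (request_no, request_date, id_card, firstname, surname)
SELECT request_no,
       CONVERT(datetime, request_date, 103),  -- style 103 = dd/mm/yyyy; pick the style matching the file
       id_card, firstname, surname
FROM dbo.scanindex_stage;

The terminators, FIRSTROW, and style number are assumptions; they have to be adjusted to the actual file layout.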

Invalid character value for cast specification

痴心易碎 submitted on 2019-12-11 03:39:56
Question: I am inserting data into SQL Server 2005 using SQLXMLBulkLoad.SQLXMLBulkload.3.0. My data table has the following column:

objDataTable.Columns.Add("TaskDateTime", System.Type.GetType("System.DateTime"))

My bulk insert schema has the following definition:

<xsd:element name="DepartureTime" type="xsd:date" />

(Using xmlns:xsd="http://www.w3.org/2001/XMLSchema")

And I am getting an 'Invalid character value for cast specification' exception. Any advice?

Answer 1: Solved! Changed the column type from: objDataTable.Columns

xsd schema file must be annotated in SQLXMLBULKLOADLib.SQLXMLBulkLoad4Class?

删除回忆录丶 submitted on 2019-12-11 03:32:15
Question: Here is an example of using SQLXMLBULKLOADLib.SQLXMLBulkLoad4Class:

[STAThread]
static void Main(string[] args)
{
    try
    {
        SQLXMLBULKLOADLib.SQLXMLBulkLoad4Class objBL = new SQLXMLBULKLOADLib.SQLXMLBulkLoad4Class();
        objBL.ConnectionString = "Provider=sqloledb;server=server;database=databaseName;integrated security=SSPI";
        objBL.ErrorLogFile = "error.xml";
        objBL.KeepIdentity = false;
        objBL.Execute("schema.xml", "data.xml");
    }
    catch (Exception e)
    {
        Console.WriteLine(e.ToString());
    }
}

It seems that

Can't identify reason for BULK INSERT errors

孤街醉人 submitted on 2019-12-11 02:38:58
Question: I'm trying to run this query (I also tried it without specifying FIELDTERMINATOR and ROWTERMINATOR). It uses a data file that I create manually beforehand (not with bcp out).

BULK INSERT FS.dbo.Termination_Call_Detail
FROM 'C:\Termination_Call_Detail__1317841711.dat'
WITH (
    FORMATFILE = 'C:\Termination_Call_Detail__update_TerminationCallDetailData.fmt',
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\r\n'
)

The errors I'm getting:

Server message number=4864 severity=16 state=1 line=1 server
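Error 4864 is a bulk load data conversion error, so the real question is which row and column fail to convert. A sketch of one way to narrow it down (the error-file path below is hypothetical, and with a format file the terminators are normally taken from the .fmt file rather than from FIELDTERMINATOR/ROWTERMINATOR) is to let BULK INSERT set the rejected rows aside instead of aborting on the first one:

BULK INSERT FS.dbo.Termination_Call_Detail
FROM 'C:\Termination_Call_Detail__1317841711.dat'
WITH (
    FORMATFILE = 'C:\Termination_Call_Detail__update_TerminationCallDetailData.fmt',
    ERRORFILE  = 'C:\tcd_rejected_rows.log',  -- rejected rows land here, with a companion .Error.Txt describing the reason
    MAXERRORS  = 10
);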

How to import CSV with unrecognized datetime format?

前提是你 submitted on 2019-12-11 02:06:05
Question: This is what the table looks like:

CREATE TABLE [dbo].[temptable] (
  [id] [nvarchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
  [datetime] [datetime] NOT NULL,
  [status] [nvarchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
  [col4] [money] NULL,
  [col5] [float] NULL,
  [col6] [money] NULL,
  [col7] [float] NULL,
  [col8] [money] NULL,
  [total] [money] NOT NULL
)

This is what the CSV looks like:

"ID","Date","status","Total"
"1611120001","12/11/2016 10:06 AM","closed","8.15"
"1611120002",
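Assuming the values arrive as text like '12/11/2016 10:06 AM' in month/day order, a small conversion sketch is to pin the date order with SET DATEFORMAT and use TRY_CONVERT (available from SQL Server 2012) so unparseable values surface as NULL instead of aborting the load. The quoted fields themselves still need a format file, or FORMAT = 'CSV' with FIELDQUOTE on SQL Server 2017 and later:

SET DATEFORMAT mdy;
SELECT CAST('12/11/2016 10:06 AM' AS datetime);       -- 2016-12-11 10:06:00
SELECT TRY_CONVERT(datetime, '12/11/2016 10:06 AM');  -- returns NULL on bad input instead of raising an error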

T-SQL Insert into multiple linked tables using a condition and without using a cursor

爷,独闯天下 submitted on 2019-12-11 01:39:23
Question: T-SQL insert into multiple linked tables using a condition and without using a cursor. Hello, I have the following tables:

CREATE TABLE [dbo].[TestMergeQuote](
  [uid] [uniqueidentifier] NOT NULL,
  [otherData] [nvarchar](50) NULL,
  CONSTRAINT [PK_TestMergeQuote] PRIMARY KEY CLUSTERED
  (
    [uid] ASC
  ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

ALTER TABLE [dbo].[TestMergeQuote] ADD CONSTRAINT [DF
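Because the key is a uniqueidentifier, one cursor-free pattern is to generate the keys up front into a temp table and then run one set-based INSERT per linked table. Everything below other than dbo.TestMergeQuote is hypothetical (the source table, the condition, and the linked table name), so treat it as a sketch of the shape rather than the actual schema:

SELECT NEWID() AS uid, s.otherData, s.childData
INTO   #staged
FROM   dbo.SourceRows AS s              -- hypothetical source table
WHERE  s.someCondition = 1;             -- hypothetical condition

INSERT INTO dbo.TestMergeQuote (uid, otherData)
SELECT uid, otherData FROM #staged;

INSERT INTO dbo.TestMergeQuoteDetail (quoteUid, childData)  -- hypothetical linked table
SELECT uid, childData FROM #staged;

Pre-generating the keys is what lets both inserts stay set-based: the second table can reference the same uid values without any row-by-row lookup of identity or OUTPUT results.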

iOS Core Data large data set insert

眉间皱痕 submitted on 2019-12-11 00:09:47
Question: I have been stuck on the same problem for days. The time per insert increases gradually, and on lower-end iPads the app also crashes with a memory problem. Inserting 20k records takes 4-5 minutes. Will a background thread improve efficiency? Is there any way I can optimize this? Please help if you can.

+(BOOL) addObjectToProfessionalsDBWithDict:(NSArray*)profArray {
    if (!([profArray count] > 0 && profArray)) {
        return NO;
    }
    NSManagedObjectContext *thisContext = [self getManagedObjectContext];
    for (int i=0; i<