bulkinsert

("IID_IColumnsInfo") error with SQL Server BULK INSERT of CSV file

折月煮酒 submitted on 2019-12-11 16:55:36
Question: I'm new to SQL Server, so forgive me for being a bit of a noob here. The code below returns the following error: "Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)"."

Code:

    BULK INSERT testingtable
    FROM 'D:\TimeLords\data\db-test-file.csv'
    WITH (FORMAT = 'CSV', FIELDQUOTE = '"', FIRSTROW = 2,
          FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK)

I've tried using ROWTERMINATOR = '0x0a' and ROWTERMINATOR = '\r\n'. This is the CSV …
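The excerpt above cuts off before any answer, but a common cause worth checking (an assumption, not from the excerpt) is that the FORMAT = 'CSV' and FIELDQUOTE options are only recognized by SQL Server 2017 and later; on earlier versions the parser fails with exactly this IID_IColumnsInfo error. A hedged sketch of the same load for a pre-2017 server, dropping those options:

```sql
-- Sketch for pre-2017 servers: omit FORMAT/FIELDQUOTE and parse the file as
-- plain character data. Table and path are taken from the question; this
-- assumes the CSV has no quoted fields containing embedded commas.
BULK INSERT testingtable
FROM 'D:\TimeLords\data\db-test-file.csv'
WITH (
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',   -- LF line endings; use '\r\n' for CRLF files
    TABLOCK
);
```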

Merge two database into a single database

你说的曾经没有我的故事 submitted on 2019-12-11 12:49:30
Question: I have two dumps of the same database from different MySQL servers. I need to combine these two dumps into a single database automatically, adjusting the primary keys and foreign keys. For example, consider two SQL dumps, say mysqldump1 and mysqldump2, each containing two tables, country_table and child_table. Here is the data of the two dumps:

Dump: mysqldump1.dmp
Table1: country_table

    +----+---------+
    | id | Country |
    +----+---------+
    | 1  | India   |
    | 2  | China   |
    | 3  | USA     |
    | 4  | …
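The excerpt is truncated before any answer. One common approach (an assumption, not taken from the excerpt) is to offset the second dump's primary keys by the maximum id in the first dump, and rewrite the foreign keys in the child table with the same offset. A minimal Python sketch, with rows modeled as dicts and all data made up for illustration:

```python
# Hypothetical sketch: merge two dumps of the same schema by shifting the
# second dump's ids past the first dump's. country_table rows have keys
# {"id", "Country"}; child_table rows have {"id", "country_id"}.
def merge_dumps(countries1, countries2, children1, children2):
    c_off = max((r["id"] for r in countries1), default=0)
    ch_off = max((r["id"] for r in children1), default=0)
    merged_countries = countries1 + [
        {"id": r["id"] + c_off, "Country": r["Country"]} for r in countries2
    ]
    merged_children = children1 + [
        # shift the child's own key AND its foreign key to the new country id
        {"id": r["id"] + ch_off, "country_id": r["country_id"] + c_off}
        for r in children2
    ]
    return merged_countries, merged_children
```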

BULK INSERT returns error “Access is denied”

Deadly submitted on 2019-12-11 11:48:39
Question: When running a bulk insert

    BULK INSERT MyDatabase.dbo.MyTable
    FROM '\\Mylaptop\UniversalShare\SQLRuleOutput.csv'
    WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

on a remote SQL Server, I get this error: "Cannot bulk load because the file "\\MyLaptop\UniversalShare\SQLRuleOutput.csv" could not be opened. Operating system error code 5 (Access is denied.)." The share is open to all. I have run PowerShell Invoke-SQLCMD scripts on that SQL Server where it connects to that same …

How to run DB2 import/load using Perl

和自甴很熟 submitted on 2019-12-11 08:33:27
Question: Has anyone ever tried to use DB2 import from within a Perl program? My application is inserting 10 million rows, and connecting via DBI and doing the insert row by row apparently takes forever. DB2 import/load from the command line works great, but is there a better way than making system calls from the Perl program to invoke:

    use IPC::System::Simple qw( systemx );
    use autodie;
    systemx( "db2 connect ..." );
    systemx( "db2 import ..." );

etc.? Thanks!

Answer 1: I have actually had …
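The answer is truncated above. One alternative to shelling out (a hedged suggestion, not from the excerpt) is to keep everything in DBI and batch the inserts with execute_array, which sends many rows per round trip instead of one. Database name, credentials, and table/column names below are hypothetical:

```perl
# Hedged sketch: batched inserts through DBI instead of "db2 import".
use strict;
use warnings;
use DBI;

my @ids   = ( 1 .. 1000 );                   # made-up data for illustration
my @names = map { "row$_" } @ids;

my $dbh = DBI->connect( 'dbi:DB2:mydb', 'user', 'pass',
                        { RaiseError => 1, AutoCommit => 0 } );
my $sth = $dbh->prepare('INSERT INTO mytable (id, name) VALUES (?, ?)');

# One call binds a whole column array per placeholder; per-row status lands
# in @status so failed tuples can be inspected.
$sth->execute_array( { ArrayTupleStatus => \my @status }, \@ids, \@names );
$dbh->commit;
$dbh->disconnect;
```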

Optimizing MySQL inserts to handle a data stream

不打扰是莪最后的温柔 submitted on 2019-12-11 07:37:47
Question: I am consuming a high-rate data stream and doing the following steps to store the data in a MySQL database. For each newly arriving item: (1) parse the incoming item; (2) execute several "INSERT ... ON DUPLICATE KEY UPDATE" statements. I have used INSERT ... ON DUPLICATE KEY UPDATE to eliminate one additional round trip to the database. While trying to improve overall performance, I have considered doing bulk updates in the following way: (1) parse the incoming item; (2) generate an SQL statement with "INSERT ... ON …
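The batched approach the question describes can be sketched as follows: accumulate parsed items, then emit one multi-row INSERT ... ON DUPLICATE KEY UPDATE statement. Table and column names here are hypothetical; values go through %s placeholders so a driver such as PyMySQL can escape them:

```python
# Sketch: build one multi-row upsert statement for a batch of parsed items.
def build_bulk_upsert(rows):
    """rows: list of (key, value) tuples -> (sql, flat_params)."""
    placeholders = ", ".join(["(%s, %s)"] * len(rows))
    sql = (
        "INSERT INTO metrics (k, v) VALUES "
        + placeholders
        + " ON DUPLICATE KEY UPDATE v = VALUES(v)"
    )
    # Flatten [(k1, v1), (k2, v2)] -> [k1, v1, k2, v2] for the driver.
    params = [p for row in rows for p in row]
    return sql, params
```

One statement per batch trades a little string-building for far fewer round trips, which is usually the dominant cost at high ingest rates.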

Bulk Import Unicode with SQL Server 2016

喜夏-厌秋 submitted on 2019-12-11 07:27:22
Question: Since we migrated to SQL Server 2016, we have been trying to import Unicode characters into a table via BULK INSERT, using non-XML format files and UTF-8 encoded data files (with Unix (LF) newlines). The format files specify the host file data length but not the terminator. The host file data type is SQLCHAR. My BULK INSERT statement looks like:

    SET @cmd = N'Bulk Insert myTable from ''D:\DATA\datafile''
        with (DATAFILETYPE =''widechar'', KEEPNULLS,
        FORMATFILE = ''D:\DATA\fmt\formatfile.ftm'' …
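A likely mismatch in the statement above (a hedged observation, since the excerpt has no answer): DATAFILETYPE = 'widechar' tells SQL Server to expect UTF-16 input, not UTF-8. For UTF-8 data files, SQL Server 2016 accepts CODEPAGE = '65001' instead, which is worth verifying on your build. A sketch using the paths from the question:

```sql
-- Hedged sketch: declare the file as UTF-8 via CODEPAGE rather than
-- treating it as UTF-16 ('widechar').
BULK INSERT myTable
FROM 'D:\DATA\datafile'
WITH (
    CODEPAGE = '65001',   -- UTF-8
    KEEPNULLS,
    FORMATFILE = 'D:\DATA\fmt\formatfile.ftm'
);
```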

Laravel add dynamic input fields

。_饼干妹妹 submitted on 2019-12-11 07:23:40
Question: I want to insert dynamic input fields into the DB. I'm using the following code, but it does not work as I expect.

    <html>
    <input id="reporting" type="text" value="salman" name="reporting[]">
    <input id="reporting" type="text" value="ankur" name="reporting[]">
    </html>

    <?php
    $report = Input::get('reporting');
    for($i=0; $i<=count($report);$i++) {
        $news = new Reporting();
        $news->user_id = 1;
        $news->reporting = $report;
        $news->save();
    }
    ?>

Expected result:

    user_id || reporting
    1       || Salman
    1       || Ankur

Can you guys …
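Two bugs stand out in the question's loop: it assigns the whole $report array to each row instead of one element, and the `<=` bound iterates one time past the end. A hedged fix, reusing the Reporting model and column names from the question:

```php
<?php
// Sketch of a fix: iterate over the submitted values themselves, saving
// one row per input field.
$report = Input::get('reporting');
foreach ($report as $value) {
    $news = new Reporting();
    $news->user_id   = 1;
    $news->reporting = $value;  // the individual string, not the array
    $news->save();
}
```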

How to save bulk records as transaction to SQL server more efficiently?

◇◆丶佛笑我妖孽 submitted on 2019-12-11 07:19:28
Question: I am working on a C# MVC application. In this application the user uploads data from an Excel spreadsheet, and the data is shown in a grid. Once it is in the grid, the user hits a 'validate data' button. The application needs to perform UI validation (data length, empty fields, data formats, etc.), and SQL validations are also required, e.g. the record should not already exist, any constraints, etc. After validation, the data is displayed to the user with any errors associated with each row, …
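The excerpt ends before any answer, but a common pattern for the final save step (an assumption, not from the excerpt) is SqlBulkCopy inside an explicit transaction, so either every validated row commits or none do. The connection string, destination table, and validatedTable DataTable below are hypothetical:

```csharp
// Hypothetical sketch: bulk-save a validated DataTable atomically.
using System.Data;
using System.Data.SqlClient;

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var tx = conn.BeginTransaction())
    using (var bulk = new SqlBulkCopy(conn, SqlBulkCopyOptions.Default, tx))
    {
        bulk.DestinationTableName = "dbo.UploadedRecords";
        try
        {
            bulk.WriteToServer(validatedTable);  // all rows in one batch
            tx.Commit();
        }
        catch
        {
            tx.Rollback();  // nothing is persisted on failure
            throw;
        }
    }
}
```

SqlBulkCopy avoids the per-row round trips of individual INSERTs, which matters once spreadsheet uploads grow past a few thousand rows.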

Bulk insert images into SQL Server database

对着背影说爱祢 submitted on 2019-12-11 07:06:35
Question: I want to insert images in this way:

    DECLARE @lpath varchar(100)
    SET @lpath = 'd:\Photo\5604.jpg'
    --insert into Photos(id, Photo, Path)
    SELECT 4144, *, @lpath FROM OpenRowSet(BULK @lpath, Single_blob) AS i

but it's not working. If I execute the code like this:

    SELECT 1, *, @lpath FROM OpenRowSet(BULK N'd:\Photo\5604.jpg', Single_blob) AS i

it works well. How can I execute the script the first way?

Answer 1: You cannot use variables in OpenRowSet; try to use dynamic SQL like this:

    DECLARE @lpath …
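The answer is truncated above; a sketch of the dynamic SQL pattern it begins (the exact quoting below is an assumption extrapolated from the answer's first line):

```sql
-- Sketch: OPENROWSET(BULK ...) only accepts a literal path, so build the
-- whole statement as a string and execute it with sp_executesql.
DECLARE @lpath varchar(100) = 'd:\Photo\5604.jpg';
DECLARE @sql nvarchar(max) =
    N'SELECT 4144, *, ''' + @lpath + N''' ' +
    N'FROM OPENROWSET(BULK N''' + @lpath + N''', SINGLE_BLOB) AS i';
EXEC sp_executesql @sql;
```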

insert multiple rows via a php array into mysql

杀马特。学长 韩版系。学妹 submitted on 2019-12-11 05:48:26
Question: I'm passing a large dataset into a MySQL table via PHP using insert commands, and I'm wondering if it's possible to insert approximately 1000 rows at a time via a query, other than appending each value to the end of a mile-long string and then executing it. I am using the CodeIgniter framework, so its functions are also available to me.

Answer 1: Assembling one INSERT statement with multiple rows is much faster in MySQL than one INSERT statement per row. That said, it sounds like you might be running …
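Since the question mentions CodeIgniter, its Query Builder method insert_batch() does exactly what the answer describes: it assembles multi-row INSERT statements and escapes the values for you. Table and column names below are hypothetical:

```php
<?php
// Sketch using CodeIgniter's Query Builder. $rows is assumed to be the
// parsed dataset, one array per record.
$data = array();
foreach ($rows as $row) {
    $data[] = array('name' => $row[0], 'value' => $row[1]);
}
// insert_batch() splits the data into multi-row INSERTs internally,
// avoiding one mile-long string and one-query-per-row alike.
$this->db->insert_batch('mytable', $data);
```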