bulkinsert

Bulk Insert with filename parameter [duplicate]

只愿长相守 submitted on 2019-11-27 22:47:00
This question already has an answer here: How to cast variables in T-SQL for bulk insert? (6 answers) I need to load a couple of thousand data files into a SQL Server table, so I wrote a stored procedure that receives just one parameter: the file name. But the following doesn't work; the "compiler" complains about the @FileName parameter and wants just a plain string like 'file.txt'. Thanks in advance. Ilan.

    BULK INSERT TblValues
    FROM @FileName
    WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' )

The syntax for the BULK INSERT statement is: BULK INSERT [ database_name. [ schema_name ] . | schema_name. ]
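A common workaround here is that BULK INSERT only accepts a string literal for the file path, so the statement is built as dynamic SQL inside the procedure and executed. A minimal sketch, with the procedure name invented for illustration:

    -- Sketch of the dynamic-SQL workaround; dbo.LoadFile is an illustrative name.
    CREATE PROCEDURE dbo.LoadFile
        @FileName NVARCHAR(260)
    AS
    BEGIN
        DECLARE @sql NVARCHAR(MAX);

        -- BULK INSERT will not take a variable, so the statement is assembled
        -- as text and run with sp_executesql.
        SET @sql = N'BULK INSERT TblValues
                     FROM ''' + REPLACE(@FileName, '''', '''''') + N'''
                     WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'');';

        EXEC sp_executesql @sql;
    END

The REPLACE doubles any stray single quotes in the path, but since the file name is still concatenated into SQL it should only come from a trusted source.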

Sybase JConnect: ENABLE_BULK_LOAD usage

白昼怎懂夜的黑 submitted on 2019-11-27 21:57:16
Question: Can anyone out there provide an example of bulk inserts via JConnect (with ENABLE_BULK_LOAD) to Sybase ASE? I've scoured the internet and found nothing. Answer 1: I got in touch with one of the engineers at Sybase and they provided me with a code sample, so I get to answer my own question. Basically, here is a rundown, as the code sample is pretty large... It assumes a lot of pre-initialized variables; otherwise it would be a few hundred lines. Anyone interested should get the idea. This can

Memory leak with large Core Data batch insert in Swift

青春壹個敷衍的年華 submitted on 2019-11-27 20:49:14
I am inserting tens of thousands of objects into my Core Data entity. I have a single NSManagedObjectContext and I call save() on it every time I add an object. It works, but while the import is running the memory keeps climbing from about 27 MB to 400 MB, and it stays at 400 MB even after the import is finished. There are a number of SO questions about batch inserting, and everyone says to read Efficiently Importing Data, but it's in Objective-C and I am having trouble finding real examples in Swift that solve this problem. Suragch: There are a few things you should change:

How to use BULK INSERT when rows depend on foreign keys values?

安稳与你 submitted on 2019-11-27 18:11:28
Question: My question is related to this one I asked on ServerFault. Based on that, I've considered using BULK INSERT. I now understand that I have to prepare a file for each entity I want to save into the database. Regardless, I still wonder whether BULK INSERT will avoid the memory issue on my system described in the referenced ServerFault question. As for the Streets table, it's quite simple: I have only two cities and five sectors to care about as the foreign keys. But
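A common pattern for this situation (not spelled out in the truncated excerpt above) is to bulk load the raw file into a constraint-free staging table and then resolve the foreign keys with a set-based INSERT ... SELECT. A rough sketch, with all table and column names invented for illustration:

    -- Hypothetical staging table: plain text columns, no keys or constraints.
    CREATE TABLE dbo.StreetsStaging
    (
        StreetName VARCHAR(200),
        CityName   VARCHAR(100),
        SectorName VARCHAR(100)
    );

    BULK INSERT dbo.StreetsStaging
    FROM 'C:\import\streets.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

    -- Resolve the foreign keys in one pass by joining on the natural keys.
    INSERT INTO dbo.Streets (StreetName, CityId, SectorId)
    SELECT s.StreetName, c.CityId, se.SectorId
    FROM dbo.StreetsStaging AS s
    JOIN dbo.Cities  AS c  ON c.Name  = s.CityName
    JOIN dbo.Sectors AS se ON se.Name = s.SectorName;

With only two cities and five sectors, the two lookup joins stay cheap and the whole load remains set-based rather than row-by-row.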

Bulk Load Files into SQL Azure?

僤鯓⒐⒋嵵緔 submitted on 2019-11-27 17:36:02
Question: I have an ASP.NET app that takes multi-megabyte file uploads, writes them to disk, and later has MSSQL 2008 load them with BCP. I would like to move the whole thing to Azure, but since there are no "files" for BCP, can anyone comment on how to get bulk data from an Azure app into SQL Azure? I did see http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx but am not sure if it applies. Thanks. Answer 1: BCP is one way to do it. This post explains it in three easy steps: Bulk
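A later option worth noting: current Azure SQL Database can run BULK INSERT directly against files in Azure Blob Storage through an external data source, which sidesteps the "no local files" problem entirely. A sketch under that assumption, with the storage account, container, credential, and table names all made up:

    -- Assumes the uploads are written to a blob container and a SAS token is available.
    CREATE DATABASE SCOPED CREDENTIAL UploadBlobCredential
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
         SECRET   = '<sas-token-without-the-leading-question-mark>';

    CREATE EXTERNAL DATA SOURCE UploadBlobStorage
    WITH (TYPE = BLOB_STORAGE,
          LOCATION = 'https://myaccount.blob.core.windows.net/uploads',
          CREDENTIAL = UploadBlobCredential);

    BULK INSERT dbo.UploadedRows
    FROM 'data/batch-001.csv'
    WITH (DATA_SOURCE = 'UploadBlobStorage',
          FIELDTERMINATOR = ',',
          ROWTERMINATOR = '\n');

This feature postdates the original question, so for the SQL Azure of that era the SqlBulkCopy route linked above (streaming the parsed rows from the app) was the more direct fit.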

How do I temporarily disable triggers in PostgreSQL?

白昼怎懂夜的黑 submitted on 2019-11-27 17:21:55
I'm bulk loading data and can re-calculate all trigger modifications much more cheaply after the fact than on a row-by-row basis. How can I temporarily disable all triggers in PostgreSQL? Alternatively, if you want to disable all triggers, not just those on the USER table, you can use:

    SET session_replication_role = replica;

This disables triggers for the current session. To re-enable them for the same session:

    SET session_replication_role = DEFAULT;

Source: http://koo.fi/blog/2013/01/08/disable-postgresql-triggers-temporarily/

PostgreSQL knows the ALTER TABLE tblname DISABLE TRIGGER USER
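The per-table variant that the last (truncated) answer starts to describe can be wrapped around the load itself; a minimal sketch, where the COPY line stands in for whatever bulk-load statement is actually used:

    -- Disable only user-defined triggers on one table for the duration of the load.
    BEGIN;
    ALTER TABLE tblname DISABLE TRIGGER USER;

    COPY tblname FROM '/tmp/bulk_data.csv' WITH (FORMAT csv);  -- placeholder load step

    ALTER TABLE tblname ENABLE TRIGGER USER;
    COMMIT;

Note that ALTER TABLE takes a lock on the table and requires ownership, whereas the session_replication_role route above needs no DDL but silently skips triggers on every table the session touches, so the after-the-fact recalculation really has to cover everything.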

Bulk Insert Sql Server millions of records

♀尐吖头ヾ submitted on 2019-11-27 14:07:18
I have a Windows Service application that receives a stream of data in the following format:

    IDX|20120512|075659|00000002|3|AALI |Astra Agro Lestari Tbk. |0|ORDI_PREOPEN|12 |00000001550.00|00000001291.67|00001574745000|00001574745000|00500|XDS1BXO1| |00001574745000|›§
    IDX|20120512|075659|00000022|3|ALMI |Alumindo Light Metal Industry Tbk. |0|ORDI |33 |00000001300.00|00000001300.00|00000308000000|00000308000000|00500|--U3---2| |00000308000000|õÄ

This data comes in millions of rows, in sequence 00000002....00198562, and I have to parse the rows and insert them into a database according to that sequence
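If the rows are staged through SQL Server's own bulk machinery rather than row-by-row INSERTs, the pipe-delimited layout maps straight onto a staging table. A hedged sketch: the column names and field count below are read off the two sample rows (19 pipe-separated fields, all kept as plain text) and would need to be verified against the real feed specification:

    -- Generic staging table: one VARCHAR column per pipe-delimited field.
    CREATE TABLE dbo.IdxFeedStaging
    (
        Field01 VARCHAR(100), Field02 VARCHAR(100), Field03 VARCHAR(100), Field04 VARCHAR(100),
        Field05 VARCHAR(100), Field06 VARCHAR(100), Field07 VARCHAR(200), Field08 VARCHAR(100),
        Field09 VARCHAR(100), Field10 VARCHAR(100), Field11 VARCHAR(100), Field12 VARCHAR(100),
        Field13 VARCHAR(100), Field14 VARCHAR(100), Field15 VARCHAR(100), Field16 VARCHAR(100),
        Field17 VARCHAR(100), Field18 VARCHAR(100), Field19 VARCHAR(100)
    );

    -- Load a whole file in one shot; Field04 carries the sequence number, so
    -- ordering can be restored with ORDER BY when the rows are processed.
    BULK INSERT dbo.IdxFeedStaging
    FROM 'C:\feed\idx_20120512.txt'
    WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', TABLOCK);

From the service itself, the equivalent programmatic route would be streaming the parsed rows in batches (e.g. via a bulk-copy API) instead of issuing one INSERT per row.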

How to create and populate a table in a single step as part of a CSV import operation?

喜夏-厌秋 submitted on 2019-11-27 13:15:21
I am looking for a quick-and-dirty way to import CSV files into SQL Server without having to create the table beforehand and define its columns. Each imported CSV would be imported into its own table. We are not concerned about data-type inference. The CSVs vary in structure and layout, and all of them have many, many columns, yet we are only concerned with a few of them: street addresses and zip codes. We just want to get the CSV data into the SQL database quickly and extract the relevant columns. I'd like to supply the FieldTerminator and RowTerminator, point it at the CSV, and have the
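BULK INSERT itself always needs an existing target table, so the usual one-step create-and-populate route goes through OPENROWSET with a text provider plus SELECT ... INTO, which creates the table from whatever columns the file has. A sketch under those assumptions (the ACE OLE DB provider must be installed, 'Ad Hoc Distributed Queries' must be enabled, and delimiter details may need a schema.ini next to the file); folder, file, and table names are illustrative:

    SELECT *
    INTO dbo.Import_Addresses          -- table is created on the fly by SELECT ... INTO
    FROM OPENROWSET(
            'Microsoft.ACE.OLEDB.12.0',
            'Text;Database=C:\ImportFolder\;HDR=YES;FMT=Delimited',
            'SELECT * FROM [addresses.csv]');

Once the wide table exists, the street-address and zip-code columns can be pulled out with an ordinary SELECT and the rest ignored or dropped.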

Efficient way to bulk insert with get_or_create() in Django (SQL, Python, Django)

久未见 submitted on 2019-11-27 12:13:16
Question: Is there a more efficient way of doing this?

    for item in item_list:
        e, new = Entry.objects.get_or_create(
            field1=item.field1,
            field2=item.field2,
        )

Answer 1: You can't do decent bulk insertions with get_or_create (or even create), and there's no API for doing this easily. If your table is simple enough that creating rows with raw SQL isn't too much of a pain, it's not too hard; something like: INSERT INTO site_entry (field1, field2) ( SELECT i.field1, i.field2 FROM (VALUES %s) AS i(field1,
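The answer's SQL is cut off above; a completed sketch of the same pattern (PostgreSQL-flavoured, with the VALUES list that the client would normally substitute for %s written out literally) looks like this:

    -- Insert only the (field1, field2) pairs that are not already present.
    INSERT INTO site_entry (field1, field2)
    SELECT i.field1, i.field2
    FROM (VALUES
            ('a', 'x'),
            ('b', 'y')
         ) AS i(field1, field2)
    WHERE NOT EXISTS (
        SELECT 1
        FROM site_entry e
        WHERE e.field1 = i.field1
          AND e.field2 = i.field2
    );

This is not concurrency-safe on its own; with a unique constraint on (field1, field2), adding ON CONFLICT DO NOTHING (or, in Django itself, bulk_create with ignore_conflicts on newer versions) is the sturdier variant.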

SQL Bulk import from CSV

℡╲_俬逩灬. submitted on 2019-11-27 12:06:38
I need to import a large CSV file into SQL Server. I'm using this:

    BULK INSERT CSVTest
    FROM 'c:\csvfile.txt'
    WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' )
    GO

The problem is that all my fields are surrounded by quotes (" "), so a row actually looks like:

    "1","","2","","sometimes with comma , inside", ""

Can I somehow bulk import them and tell SQL to use the quotes as field delimiters? Edit: The problem with using '","' as the delimiter, as in the suggested examples, is this: what most examples do is import the data including the first " in the first column and the last " in the last, then
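On SQL Server 2017 and later, BULK INSERT can parse the quoting itself, which avoids the '","' terminator trick and the stray leading/trailing quotes it leaves behind. A sketch reusing the table and file from the question:

    -- FORMAT = 'CSV' turns on RFC-4180-style parsing; FIELDQUOTE names the quote character.
    BULK INSERT CSVTest
    FROM 'c:\csvfile.txt'
    WITH (
        FORMAT = 'CSV',
        FIELDQUOTE = '"',
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n'
    );

On older versions the usual fallbacks are a bcp format file that models the quotes, or loading into a staging table and stripping the quotes afterwards.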