bulk

Restful PATCH on collection to update sorting parameter in bulk

若如初见 · Submitted 2020-02-03 04:48:46

Question: We have a big list ("collection") containing a number of entities ("items"), all managed via a RESTful interface. The items are manually sortable via an order property on each item. When queried, the database lists all items in a collection by that order. We now want to expose this mechanism to users so they can update the complete sorting of all items in one call. The database does not allow the same order twice within one collection (unique collection_id + order), so you can't (and …
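
One common way around the unique (collection_id, order) constraint is a two-phase update inside a single transaction: first move every row to a temporary slot (e.g. negated order values), then write the final positions. The sketch below illustrates this with Python's sqlite3; the table and column names (`item`, `sort_order`) are hypothetical stand-ins for the question's schema, and a PATCH handler on the collection would simply call `reorder` with the submitted id list.

```python
import sqlite3

# Hypothetical schema mirroring the question: items are ordered within a
# collection, and (collection_id, sort_order) must be unique.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE item (
        id INTEGER PRIMARY KEY,
        collection_id INTEGER NOT NULL,
        sort_order INTEGER NOT NULL,
        UNIQUE (collection_id, sort_order)
    )
""")
conn.executemany(
    "INSERT INTO item (id, collection_id, sort_order) VALUES (?, 1, ?)",
    [(1, 1), (2, 2), (3, 3)],
)

def reorder(conn, collection_id, ordered_ids):
    """Apply a complete new ordering in one transaction.

    The unique constraint forbids transient duplicates, so every row is
    first parked on a negative slot, then given its final position.
    """
    with conn:  # commits (or rolls back) as one transaction
        conn.execute(
            "UPDATE item SET sort_order = -sort_order WHERE collection_id = ?",
            (collection_id,),
        )
        conn.executemany(
            "UPDATE item SET sort_order = ? WHERE collection_id = ? AND id = ?",
            [(pos, collection_id, item_id)
             for pos, item_id in enumerate(ordered_ids, start=1)],
        )

# A PATCH on the collection might carry a body like {"item_ids": [3, 1, 2]}:
reorder(conn, 1, [3, 1, 2])
print([r[0] for r in conn.execute(
    "SELECT id FROM item WHERE collection_id = 1 ORDER BY sort_order")])
```

In PostgreSQL the same problem can instead be solved with a deferrable unique constraint, which postpones the check to commit time.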

Header “To:” for a Bulk Email Sender [duplicate]

一笑奈何 · Submitted 2020-01-22 02:16:07

Question: This question already has answers here: How to send email to multiple recipients using python smtplib? (13 answers). Closed 12 months ago. I'm trying to write a Python script that sends a newsletter to people who have signed up to a list. My problem is with the "To:" header: I can't put the emails in a list as the "To:" address, and when recipients open the email, they don't see their own address in the "To:" header. Here is a screenshot of what I mean: http://tinypic.com/r/zlr7sl/9 I'm not a …
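
The usual fix for a newsletter is to build one message per subscriber, so each recipient sees only their own address in "To:". A minimal sketch with the standard-library `email` and `smtplib` modules (the addresses and SMTP host are placeholders):

```python
from email.message import EmailMessage

def build_messages(sender, recipients, subject, body):
    """Build one message per subscriber so each sees only their own
    address in the To: header (instead of the whole list, or nothing)."""
    messages = []
    for addr in recipients:
        msg = EmailMessage()
        msg["From"] = sender
        msg["To"] = addr          # personalised To: header
        msg["Subject"] = subject
        msg.set_content(body)
        messages.append(msg)
    return messages

msgs = build_messages("news@example.com",
                      ["a@example.com", "b@example.com"],
                      "Newsletter", "Hello!")

# Sending is commented out because it needs a reachable SMTP server:
# import smtplib
# with smtplib.SMTP("localhost") as s:
#     for m in msgs:
#         s.send_message(m)

print([m["To"] for m in msgs])
```

Passing a Python list directly as `msg["To"]` is what produces the broken header the question describes; the header must be a string (one address here, or a comma-joined string if all recipients should be visible to each other).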

SCOPE_IDENTITY() for bulk insert in SSIS

点点圈 · Submitted 2020-01-17 09:09:51

Question: In SSIS I am able to insert rows and retrieve their SCOPE_IDENTITY using an OLE DB Command task that calls a stored procedure, but that is not a bulk insert; it is a slow, row-by-row load. Is it possible to get the ids of inserted rows when using a bulk insert in SSIS? Example: when inserting a Customer, I first have to insert a record into the Person table and then use that FK in the Customer table. UPDATE: Here is the structure of the Person and Customer tables that need to be populated from an external source. One option is to have …

How to change slow parametrized inserts into fast bulk copy (even from memory)

梦想与她 · Submitted 2020-01-12 10:09:31

Question: I had something like this in my code (.NET 2.0, MS SQL): SqlConnection connection = new SqlConnection(@"Data Source=localhost;Initial Catalog=DataBase;Integrated Security=True"); connection.Open(); SqlCommand cmdInsert = connection.CreateCommand(); SqlTransaction sqlTran = connection.BeginTransaction(); cmdInsert.Transaction = sqlTran; cmdInsert.CommandText = @"INSERT INTO MyDestinationTable " + "(Year, Month, Day, Hour, ...) " + "VALUES " + "(@Year, @Month, @Day, @Hour, ...)"; cmdInsert …
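
The question is about .NET, where the standard answer is `SqlBulkCopy` fed from a `DataTable` or an `IDataReader` built over the in-memory rows. The underlying pattern, though, is language-agnostic: replace N parametrized round trips with one batched call inside a single transaction. As a rough illustration of that pattern (in Python with sqlite3, purely for demonstration, with made-up table and column names):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dest (year INT, month INT, day INT, hour INT)")

# In-memory rows, analogous to the data the question builds up before inserting.
rows = [(2020, 1, d % 28 + 1, d % 24) for d in range(10_000)]

# One batched, parametrized call in a single transaction, instead of
# executing the INSERT once per row (the slow pattern in the question).
with conn:
    conn.executemany(
        "INSERT INTO dest (year, month, day, hour) VALUES (?, ?, ?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM dest").fetchone()[0]
print(count)
```

In the .NET case the gain is larger still, because `SqlBulkCopy` uses the bulk-load TDS path rather than issuing individual statements.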

How to use elasticsearch.helpers.streaming_bulk

送分小仙女□ · Submitted 2020-01-11 08:25:31

Question: Can someone advise how to use the function elasticsearch.helpers.streaming_bulk instead of elasticsearch.helpers.bulk for indexing data into Elasticsearch? If I simply substitute streaming_bulk for bulk, nothing gets indexed, so I guess it needs to be used in a different form. The code below creates the index and type and indexes data from a CSV file in chunks of 500 elements into Elasticsearch. It works properly, but I am wondering whether the performance can be increased. That's why I want to try out streaming …
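
The likely cause of "nothing gets indexed": unlike `bulk`, `streaming_bulk` returns a generator, and no documents are sent until that generator is consumed. A hedged sketch of the expected usage (the index name, id scheme, and CSV layout here are assumptions, not from the question's code):

```python
import csv

def generate_actions(rows, index="my-index"):
    """Yield one action dict per CSV row, in the shape the bulk helpers expect."""
    for i, row in enumerate(rows):
        yield {"_index": index, "_id": i, "_source": dict(row)}

def index_csv(client, path, index="my-index"):
    # streaming_bulk returns a *generator* of (ok, result) tuples.
    # Swapping it in for bulk() without iterating it sends nothing --
    # consuming the generator is what drives the indexing.
    from elasticsearch.helpers import streaming_bulk  # lazy import for this sketch
    with open(path, newline="") as f:
        actions = generate_actions(csv.DictReader(f), index)
        for ok, result in streaming_bulk(client, actions, chunk_size=500):
            if not ok:
                print("failed:", result)

# The action generator itself can be exercised without a running cluster:
actions = list(generate_actions([{"a": "1"}, {"a": "2"}]))
print(len(actions), actions[0]["_index"])
```

The performance benefit of `streaming_bulk` is mainly memory: it consumes the action iterable lazily instead of materialising every chunk's payload up front.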

bulk insert a date in YYYYMM format to date field in MS SQL table

有些话、适合烂在心里 · Submitted 2020-01-10 05:47:06

Question: I have a large text file (more than 300 million records). One field contains a date in YYYYMM format; the target field is of type date, and I'm using MS SQL 2008 R2. Due to the huge amount of data I'd prefer to use bulk insert. Here's what I've already done: bulk insert Tabela_5 from 'c:\users\...\table5.csv' with ( rowterminator = '\n', fieldterminator = ',', tablock ) select * from Tabela_5 The value 201206 in the file turns out to be 2020-12-06 on the server, whereas I'd like it to be 2012-06-01 (I don …
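
The misparse happens because SQL Server has no date style for a bare YYYYMM; a common fix is to bulk insert into a varchar staging column and convert afterwards (e.g. `CONVERT(date, col + '01', 112)` in T-SQL), or to pre-process the file. The intended interpretation, sketched in Python:

```python
from datetime import datetime

def yyyymm_to_date(value):
    """Interpret '201206' as 2012-06-01 (first day of that month),
    not as 2020-12-06."""
    return datetime.strptime(value, "%Y%m").date()

print(yyyymm_to_date("201206"))
```

Appending `'01'` before conversion in SQL mirrors exactly what `%Y%m` plus an implicit day-of-one does here.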

Postgres bulk insert/update that's injection-safe. Perhaps a function that takes an array? [duplicate]

六月ゝ 毕业季﹏ · Submitted 2020-01-06 07:19:13

Question: This question already has answers here: Improving a function that UPSERTs based on an input array (2 answers). Closed 5 months ago. I'm working on paying back some technical debt this week, and it hit me that I have no idea how to make multi-value inserts safe from accidental or malicious SQL injection. We're on Postgres 11.4. I've got a test bed that includes a small table with about 26K rows; here's the declaration of the small table I'm using for testing: BEGIN; DROP TABLE IF …
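
The core safety principle is that every value travels as a bound parameter and is never spliced into the SQL string; for Postgres 11.4 specifically, `psycopg2.extras.execute_values` or a function taking arrays unpacked with `unnest($1::text[], ...)` are the usual vehicles. A minimal illustration of a parameterized bulk upsert, shown here with sqlite3 (whose `ON CONFLICT ... DO UPDATE` mirrors the Postgres syntax; table and column names are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE facts (
        key TEXT PRIMARY KEY,
        val TEXT NOT NULL
    )
""")

def upsert_many(conn, pairs):
    """Bulk upsert in which every value is a bound parameter, so user
    input can never terminate the statement or inject new SQL."""
    with conn:
        conn.executemany(
            """INSERT INTO facts (key, val) VALUES (?, ?)
               ON CONFLICT (key) DO UPDATE SET val = excluded.val""",
            pairs,
        )

upsert_many(conn, [("a", "1"), ("b", "2")])
# A hostile value is stored verbatim as data, not executed:
upsert_many(conn, [("a", "1'); DROP TABLE facts; --"), ("c", "3")])
print(sorted(conn.execute("SELECT key FROM facts")))
```

The same shape in Postgres would pass the rows to `execute_values` with a single `INSERT ... ON CONFLICT` template, keeping one round trip while staying fully parameterized.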

BCP file format for SQL bulk insert of CSV file

梦想与她 · Submitted 2020-01-05 08:12:29

Question: I'm trying to bulk insert a CSV file into a SQL table using BCP, but I can't fix this error: "The column is too long in the data file for row 1, column 2. Verify that the field terminator and row terminator are specified correctly." Can anyone help, please? Here's my SQL code: BULK INSERT UKPostCodeStaging FROM 'C:\Users\user\Desktop\Data\TestFileOf2Records.csv' WITH ( DATAFILETYPE='char', FIRSTROW = 1, FORMATFILE = 'C:\Users\User\UKPostCodeStaging.fmt'); Here's my test data contained in …

Update the results of a SELECT statement

江枫思渺然 · Submitted 2020-01-01 03:19:09

Question: Oracle lets you update the results of a SELECT statement: UPDATE (<SELECT statement>) SET <column_name> = <value> WHERE <column_name> <condition> <value>; I suppose this could be used for updating columns in one table based on the value of a matching row in another table. What is this feature called, can it be used efficiently for large updates, does it work when the SELECT joins multiple tables, and if so, how? Answer 1: I haven't seen a formal name for this. The Oracle SQL Reference just …
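
In Oracle this construct updates through an inline view, and it is only allowed when the join is key-preserved (each target row maps to exactly one source row). Databases without updatable join views reach the same effect with a correlated subquery; a small sketch using Python's sqlite3 with made-up tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE t   (id INTEGER PRIMARY KEY, val TEXT);
    CREATE TABLE src (id INTEGER PRIMARY KEY, val TEXT);
    INSERT INTO t   VALUES (1, 'old'), (2, 'old');
    INSERT INTO src VALUES (1, 'new');
""")

# Oracle's UPDATE (SELECT ... FROM t JOIN src ...) SET ... over a
# key-preserved join has no direct SQLite counterpart; a correlated
# subquery plus an EXISTS guard achieves the same row-matched update.
conn.execute("""
    UPDATE t
       SET val = (SELECT s.val FROM src s WHERE s.id = t.id)
     WHERE EXISTS (SELECT 1 FROM src s WHERE s.id = t.id)
""")
print(conn.execute("SELECT id, val FROM t ORDER BY id").fetchall())
```

The `EXISTS` guard matters: without it, unmatched target rows would have their column overwritten with NULL from the empty subquery result.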