bulk

How to bulk insert only new rows in PostgreSQL

Submitted by 孤者浪人 on 2019-12-28 04:27:08
Question: I have a list of products (3 million items) without IDs, only titles. I don't know which titles already exist in the DB. The new products (about 2.9 million items) must be added to the DB, and afterwards I need the ID of every product, new and existing. What is the fastest way to do this in PostgreSQL? I can change the DB as needed (add default values, add columns, etc.).
Answer 1: Import data: COPY everything to a temporary staging table and insert only new titles into your target table. CREATE TEMP TABLE tmp
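A minimal sketch of the staging approach the answer describes, assuming a target table products(id, title) and a one-column input file (both assumptions, not taken from the original post):

```sql
BEGIN;

-- Stage the raw titles; the temp table disappears when the transaction ends.
CREATE TEMP TABLE tmp_products (title text) ON COMMIT DROP;

-- Server-side COPY shown here; use \copy (psql) or the driver's copy support from a client machine.
COPY tmp_products (title) FROM '/path/to/titles.txt';

-- Insert only the titles that are not in the target table yet.
INSERT INTO products (title)
SELECT DISTINCT t.title
FROM tmp_products t
LEFT JOIN products p ON p.title = t.title
WHERE p.title IS NULL;

-- IDs for every title, new and pre-existing, can then be read back with a join.
SELECT p.id, p.title
FROM products p
JOIN tmp_products t ON t.title = p.title;

COMMIT;
```

With a unique index on products(title), INSERT ... ON CONFLICT (title) DO NOTHING is an equivalent alternative on PostgreSQL 9.5+.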

Put files automatically in folders

Submitted by 余生长醉 on 2019-12-25 10:54:23
Question: I have thousands of JPGs named like "aaa0001.jpg, aaa0002.jpg, aaa0003.jpg, bbb0001.jpg, bbb0002.jpg, bbb0003.jpg, ccc0001.jpg, ccc0002.jpg, ccc0003.jpg", etc., all in one folder. I have created 26 folders named aaa, bbb, ccc, ddd, and so on. Is it possible to create a script that moves every image into the appropriate folder? The result should be "aaa0001.jpg, aaa0002.jpg, aaa0003.jpg" in folder "aaa", "bbb0001.jpg, bbb0002.jpg, bbb0003.jpg" in folder "bbb", etc. Thank you! My system is Windows XP Professional SP3.
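Not part of the original post, but a short sketch of how this could be scripted; it assumes PowerShell is available on the XP machine and that the photos live in C:\photos (both assumptions):

```powershell
# Move each *.jpg into the folder named after the first three characters of its file name.
Get-ChildItem -Path 'C:\photos' -Filter '*.jpg' | ForEach-Object {
    $folder = Join-Path 'C:\photos' $_.Name.Substring(0, 3)
    if (-not (Test-Path $folder)) {
        New-Item -ItemType Directory -Path $folder | Out-Null   # create the folder if it is missing
    }
    Move-Item -LiteralPath $_.FullName -Destination $folder
}
```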

Programmatically delete Azure blob storage objects in bulk

Submitted by 為{幸葍}努か on 2019-12-25 08:49:29
Question: Is it possible to programmatically delete Azure blob objects in bulk? Deleting objects one by one sometimes takes us several days. We put a lot of new files into Azure Blob Storage, and we want to delete all the outdated files to avoid unwanted charges. I have searched the web/MSDN/Stack Overflow and found only one MSDN topic, from 2014, which pointed to creating a feature request on the Microsoft site.
Answer 1: Is it possible to programmatically delete Azure blob objects in bulk? No. You
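The excerpt's answer reflects the SDKs of that time; for reference only, a rough sketch with the newer Python azure-storage-blob v12 client, which does expose a batch delete (the connection string, container name, and "archive/" prefix are placeholders, not from the original thread):

```python
from azure.storage.blob import ContainerClient

# Placeholder credentials and container name.
container = ContainerClient.from_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...",
    container_name="my-container",
)

# Collect the names of the outdated blobs (here: everything under an "archive/" prefix).
names = [blob.name for blob in container.list_blobs(name_starts_with="archive/")]

# delete_blobs() sends one batch request per call; the service caps a batch at 256 blobs.
for i in range(0, len(names), 256):
    container.delete_blobs(*names[i:i + 256])
```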

PowerShell Regex Bulk Replace Filenames [duplicate]

Submitted by ε祈祈猫儿з on 2019-12-24 20:51:22
Question: This question already has answers here: PowerShell: Quoting -replace & variables (4 answers). Closed last year. I am trying to rename the files in a given folder in PowerShell, using a regular expression both as the filter and in the new file names. For example, a file named "CEX-13" should be renamed to "C-0013"; "CEX-14" should become "C-0014", and so on. I have something working, but I get an error saying I cannot create a file that already exists. My current code is: foreach ($file in get-childitem | where
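The asker's code is cut off above; a minimal sketch of the rename under the stated pattern, assuming the files really are named CEX-<digits> and the numbers should be padded to four digits:

```powershell
Get-ChildItem -File -Filter 'CEX-*' | ForEach-Object {
    if ($_.BaseName -match '^CEX-(\d+)$') {
        # Rebuild the name as C-<number padded to four digits>, keeping the extension (if any).
        $newName = ('C-{0:D4}' -f [int]$Matches[1]) + $_.Extension
        # Skip files that already carry the target name, which helps avoid the
        # "file already exists" error when the script is run more than once.
        if ($_.Name -ne $newName) {
            Rename-Item -LiteralPath $_.FullName -NewName $newName
        }
    }
}
```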

Add N tags to a LOT of objects using acts_as_taggable_on

Submitted by 别说谁变了你拦得住时间么 on 2019-12-24 14:50:35
Question: I'm using the acts_as_taggable_on gem in a Rails (Ruby 1.9.3) project. I provide a form that lets my admins add 1..n tags to a list of resources (let's say clients). I didn't find a way to do this in bulk. Right now I loop over every client, add one tag, then save the object. This hits the server hard when I try it on several thousand clients and eventually causes a timeout. I was wondering if there is a way to apply a tag to an ActiveRecord collection or something similar. If this is documented
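Not from the original thread, but one common workaround is to skip the per-record save and write the join rows directly. A rough sketch against the gem's Tag/Tagging models, written for a modern Rails with insert_all and assuming none of the clients already carry the tag (the tag name and the Client scope are made up; on older Rails the same idea can be done with a hand-built multi-row INSERT):

```ruby
# Create (or find) the tag once, then bulk-insert the taggings join rows
# instead of tagging and saving every client individually.
tag = ActsAsTaggableOn::Tag.find_or_create_by!(name: "newsletter")

Client.where(active: true).in_batches(of: 1_000) do |batch|
  rows = batch.ids.map do |client_id|
    {
      tag_id:        tag.id,
      taggable_id:   client_id,
      taggable_type: "Client",
      context:       "tags",
      created_at:    Time.current
    }
  end
  # One multi-row INSERT per batch; validations and uniqueness checks are bypassed.
  ActsAsTaggableOn::Tagging.insert_all(rows)
end
```

Counter caches such as taggings_count are not maintained by raw inserts, so they may need to be recalculated afterwards.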

Bulk Indexing in Elasticsearch using the ElasticLowLevelClient

Submitted by 独自空忆成欢 on 2019-12-24 06:24:08
Question: I'm using the ElasticLowLevelClient to index Elasticsearch data, because the documents have to be indexed as raw strings; I don't have access to the POCO objects. I can successfully index an individual object by calling: client.Index<object>(indexName, message.MessageType, message.Id, new Elasticsearch.Net.PostData<object>(message.MessageJson)); How can I do a bulk insert into the index using the ElasticLowLevelClient? The bulk insert APIs all require a POCO of the indexed document, which I
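A sketch of one way to call the low-level bulk endpoint with pre-serialized JSON (not from the original post): it reuses the client, indexName, and message objects from the question, mirrors the PostData<object> usage shown above, and builds the newline-delimited action/source body that the _bulk API expects, assuming MessageType and Id need no JSON escaping:

```csharp
using System.Text;
using Elasticsearch.Net;

// Build the newline-delimited bulk body: one action/metadata line
// followed by the raw JSON source line for every document.
var body = new StringBuilder();
foreach (var message in messages)   // messages: the same objects used for single indexing
{
    body.Append("{\"index\":{\"_index\":\"").Append(indexName)
        .Append("\",\"_type\":\"").Append(message.MessageType)
        .Append("\",\"_id\":\"").Append(message.Id).Append("\"}}\n");
    body.Append(message.MessageJson).Append('\n');   // MessageJson is already serialized
}

// Send the whole batch in one request to the _bulk endpoint.
var response = client.Bulk<object>(new Elasticsearch.Net.PostData<object>(body.ToString()));
```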

How to handle a Symfony form collection with 500+ items

Submitted by 人走茶凉 on 2019-12-23 09:31:56
Question: I have a form collection that needs to handle more than 500 entity instances. After I increased the timeout to 60s and raised max_input_vars, the form works, but it is annoyingly slow. Rendering the form is slow, and submitting such a big form is a real pain. I considered building a plain HTML form instead, but that has other drawbacks, such as losing validation. So, is there a proper way to handle that big a set of data via a Symfony form? CONTROLLER: public function ratesCardAction() { $bannerList = $this-
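A rough sketch of the "plain HTML form" route the asker mentions, keeping its main drawback in check by running the Validator component on each updated entity instead of the Form component; the entity, field, and parameter names are assumptions, since the original controller is cut off above:

```php
use App\Entity\Rate;                                   // assumed entity
use Doctrine\ORM\EntityManagerInterface;
use Symfony\Bundle\FrameworkBundle\Controller\AbstractController;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\Validator\Validator\ValidatorInterface;

class RatesCardController extends AbstractController
{
    public function save(Request $request, EntityManagerInterface $em, ValidatorInterface $validator): JsonResponse
    {
        $errors = [];

        // The plain HTML form posts rates[<id>][amount] inputs.
        foreach ($request->request->all()['rates'] ?? [] as $id => $fields) {
            $rate = $em->find(Rate::class, (int) $id);
            if (!$rate) {
                continue;
            }
            $rate->setAmount((float) $fields['amount']);

            // Keep validation by checking the entity's constraints directly.
            $violations = $validator->validate($rate);
            if (count($violations) > 0) {
                $errors[$id] = (string) $violations;
            }
        }

        if (!$errors) {
            $em->flush();   // one flush for all updated rows
        }

        return $this->json(['errors' => $errors]);
    }
}
```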

Bulk download of PDFs with Scrapy and Python 3

Submitted by 与世无争的帅哥 on 2019-12-23 04:30:05
Question: I would like to bulk download free-to-download PDFs (copies of an old newspaper called Gaceta, published from 1843 to 1900) from this website of the Nicaraguan National Assembly with Python 3 / Scrapy. I am an absolute beginner in programming and Python, but I tried to start with an (unfinished) script: #!/usr/bin/env python3 from urllib.parse import urlparse import scrapy from scrapy.http import Request class gaceta(scrapy.Spider): name = "gaceta" allowed_domains = ["digesto.asamblea.gob.ni"] start_urls =
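A minimal sketch of how such a spider could fetch and save the PDFs. The start URL is a placeholder (the real one is cut off in the excerpt), and it assumes the listing page links to the documents with plain hrefs ending in .pdf, which the actual site may not do:

```python
#!/usr/bin/env python3
import os

import scrapy


class GacetaSpider(scrapy.Spider):
    name = "gaceta"
    allowed_domains = ["digesto.asamblea.gob.ni"]
    # Placeholder: the original start URL is not shown in the excerpt above.
    start_urls = ["http://digesto.asamblea.gob.ni/consultas/coleccion/"]

    def parse(self, response):
        # Follow every link that ends in .pdf and hand it to save_pdf().
        for href in response.css("a::attr(href)").getall():
            if href.lower().endswith(".pdf"):
                yield scrapy.Request(response.urljoin(href), callback=self.save_pdf)

    def save_pdf(self, response):
        # Write each PDF to the working directory under its own file name.
        filename = os.path.basename(response.url)
        with open(filename, "wb") as f:
            f.write(response.body)
```

Scrapy's built-in FilesPipeline is the more idiomatic tool for bulk downloads; the manual callback just keeps the sketch short.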

PowerShell Command To Bulk Rename Files Sequentially

Submitted by 天大地大妈咪最大 on 2019-12-22 18:43:31
Question: I am trying to make batch file naming easier for my end users. When photographing our locations we can end up with anywhere from 1 photo to 500, depending on the size of the location. The code below works beautifully to bulk rename our photos, padding the sequence number based on how many files are in the directory: $prefix = "[SomePrefix]" $files = Get-ChildItem $id = 1 $files | foreach { Rename-Item -Path $_.fullname -NewName ( $prefix + ((($id++).tostring()).padleft(($files.count.tostring()).length) -replace ' ','0' ) + $_
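The one-liner above is cut off; a cleaner sketch of the same idea (sequential numbers padded to the width of the file count, keeping each file's extension; the prefix is a placeholder):

```powershell
$prefix = "[SomePrefix]"
$files  = Get-ChildItem -File
$width  = $files.Count.ToString().Length   # pad every number to the width of the total count
$id     = 1

foreach ($file in $files) {
    $number  = $id.ToString().PadLeft($width, '0')
    $newName = "{0}{1}{2}" -f $prefix, $number, $file.Extension
    Rename-Item -LiteralPath $file.FullName -NewName $newName
    $id++
}
```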

PostgreSQL: how to get primary keys of rows inserted with a bulk copy_from?

Submitted by 痞子三分冷 on 2019-12-22 10:27:54
Question: The goal is this: I have a set of values to go into table A, and a set of values to go into table B. The values going into B reference values in A (via a foreign key), so after inserting the A values I need to know how to reference them when inserting the B values. I need this to be as fast as possible. I insert the B values with a bulk copy_from: def bulk_insert_copyfrom(cursor, table_name, field_names, values): if not values: return print "bulk copy from prepare..." str_vals = "\n"
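COPY itself cannot return generated keys, so one common pattern (not from the original post; the table and column names below are assumptions) is to COPY the A values into a temp staging table and then INSERT ... RETURNING to get their ids back before building the B rows. A Python 3 / psycopg2 sketch:

```python
import io


def insert_a_returning_ids(conn, names):
    """COPY the A values into a temp table, insert them into table_a,
    and return a {name: id} mapping for use when building the B rows."""
    buf = io.StringIO("".join(name + "\n" for name in names))
    with conn.cursor() as cur:
        cur.execute("CREATE TEMP TABLE a_stage (name text) ON COMMIT DROP")
        cur.copy_from(buf, "a_stage", columns=("name",))
        cur.execute(
            "INSERT INTO table_a (name) "
            "SELECT name FROM a_stage "
            "RETURNING id, name"
        )
        return {name: row_id for row_id, name in cur.fetchall()}
```

The B rows can then be built with the returned mapping and loaded with a second copy_from.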