pool

Criticize this Python code (crawler with thread pool)

人走茶凉 · Submitted on 2019-12-11 14:45:37
Question: How good is this Python code? I need criticism. There is an error in it: sometimes the script prints "ALL WAIT - CAN FINISH!" and freezes (no further actions happen), but I can't find the reason why this happens. Site crawler with a thread pool:

    import sys
    from urllib import urlopen
    from BeautifulSoup import BeautifulSoup, SoupStrainer
    import re
    from Queue import Queue, Empty
    from threading import Thread

    W_WAIT = 1
    W_WORK = 0

    class Worker(Thread):
        """Thread executing tasks from a given tasks…
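The freeze described above is the classic symptom of a hand-rolled wait/flag protocol (the W_WAIT/W_WORK states) racing against itself. A minimal sketch of an alternative shutdown scheme, using the standard library queue with sentinel values instead of state flags (written for Python 3; the worker count and the callable tasks are illustrative):

    import queue
    import threading

    def worker(tasks):
        while True:
            task = tasks.get()
            if task is None:        # sentinel: no more work will arrive
                break
            try:
                task()              # e.g. fetch and parse one URL
            finally:
                tasks.task_done()   # count the task even if it raised

    tasks = queue.Queue()
    threads = [threading.Thread(target=worker, args=(tasks,)) for _ in range(4)]
    for t in threads:
        t.start()

    # ... enqueue crawl jobs with tasks.put(some_callable) as URLs are found ...

    tasks.join()                    # blocks until every enqueued task is done
    for _ in threads:
        tasks.put(None)             # then tell each worker to exit
    for t in threads:
        t.join()

In the real crawler the workers themselves would keep putting newly discovered URLs onto the queue, and tasks.join() only returns once the queue drains completely, so there is no separate "all waiting" check to get wrong.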

Python & Redis: Manager/Worker application best practices

我是研究僧i · Submitted on 2019-12-11 12:11:18
Question: I have a few general questions about using Python and Redis to create a job-queue application for running asynchronous commands. Here is the code I have written so far:

    def queueCmd(cmd):
        r_server.rpush("cmds", cmd)

    def printCmdQueue():
        print r_server.lrange("cmds", 0, -1)

    def work():
        print "command being consumed: ", r_server.lpop("cmds")
        return -1

    def boom(info):
        print "pop goes the weasel"

    if __name__ == '__main__':
        r_server = redis.Redis("localhost")
        queueCmd("ls -la;sleep 10;ls")…
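One common refinement for a worker like the one above is to block on the queue instead of polling it with lpop, using redis-py's blpop. A minimal sketch (Python 3; the loop body is illustrative):

    import redis

    r_server = redis.Redis("localhost")

    def work_forever():
        while True:
            # blpop blocks until an item is available, so the worker never busy-waits
            _key, cmd = r_server.blpop("cmds")
            print("command being consumed:", cmd)

blpop returns a (key, value) pair because it can wait on several queues at once; only the value matters here.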

Bouncing ball in a circle (Python Turtle)

落爺英雄遲暮 · Submitted on 2019-12-11 11:55:14
Question: I am currently working on a circular billiards program using Turtle. My problem is that I can't figure out what angle or position I need to give Python once the ball has reached the side of the circle in order to make it bounce. Here is the part of my program that needs to be fixed:

    while nbrebonds >= 0:
        forward(1)
        if distance(0, y) > rayon:  # rayon means radius
            print(distance(0, y))
            left(2 * angleinitial)  # I put this angle as a test but it doesn't work
            forward(1)
            nbrebonds += (-1)

Answer 1: From what I…
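The usual fix for a ball bouncing inside a circle is to mirror the heading across the wall's normal at the point of impact: for a circle centred on the origin, the outward normal at (x, y) has angle atan2(y, x), and the reflected heading is 2·normal − heading + 180°. A self-contained sketch (the radius, step size, and bounce count are illustrative, not the asker's values):

    import math
    from turtle import (forward, setheading, heading, position,
                        penup, pendown, goto, speed, done)

    radius = 200          # the circle's radius ("rayon" in the question)
    bounces = 50

    speed(0)
    penup(); goto(0, -radius + 5); pendown()
    setheading(60)        # arbitrary starting direction

    while bounces >= 0:
        forward(2)
        x, y = position()
        if math.hypot(x, y) >= radius:                # hit the wall
            normal = math.degrees(math.atan2(y, x))   # outward normal at impact
            setheading(2 * normal - heading() + 180)  # mirror heading across it
            forward(2)                                # step back inside the circle
            bounces -= 1
    done()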

Multiprocessing pool and queues

巧了我就是萌 · Submitted on 2019-12-11 09:59:05
Question: I am using multiprocessing with pools. I need to pass a structure as an argument to a function that is to be used in separate processes. I am facing an issue with the mapping functions of multiprocessing.Pool, since I can duplicate neither Pool.Queue nor Pool.Array. This structure is to be used on the fly to log the result of each terminated process. Here is my code:

    import multiprocessing
    from multiprocessing import Process, Manager, Queue, Array
    import itertools
    import time

    def do…
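A plain multiprocessing.Queue (or Array) cannot be pickled into the arguments of Pool.map, which is typically what produces errors like the one described. The usual workaround is a Manager().Queue(), whose proxy can be pickled. A minimal sketch (the function and its arguments are illustrative):

    import multiprocessing

    def do(args):
        value, queue = args
        queue.put(value * value)   # log this task's result into the shared queue

    if __name__ == "__main__":
        manager = multiprocessing.Manager()
        queue = manager.Queue()    # proxy queue: picklable, unlike multiprocessing.Queue
        with multiprocessing.Pool(4) as pool:
            pool.map(do, [(i, queue) for i in range(10)])
        while not queue.empty():
            print(queue.get())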

Caused by: javax.naming.NameNotFoundException: pools

萝らか妹 · Submitted on 2019-12-11 08:42:52
Question: I have been fighting this error for a few days already, and I can't find a reasonable explanation for it on the Internet. I use Glassfish 3.1.2.2 (build 5):

    2012-12-19T17:54:26.046+0200 javax.enterprise.resource.resourceadapter.com.sun.enterprise.connectors.util WARNING: RAR8068: Using default datasource : __ds_jdbc_ra for pool : JavaHelpPool
    2012-12-19T17:54:26.093+0200 javax.enterprise.resource.resourceadapter.com.sun.enterprise.connectors.service SEVERE: RAR8061: failed to load resource-adapter-config…

Multiprocessing Error with Results

我的未来我决定 · Submitted on 2019-12-11 06:57:47
Question: I have a small problem in a big program, so I made a small example that shows my problem:

    import multiprocessing

    class class1():
        def classfunction1(self, a):
            self.x = a

    class class2():
        def classfunction2(self, a):
            self.y = a

    def test(i):
        print("I'm in the Testfunction")
        b = i * class1.x * class2.y
        return b

    class1 = class1()
    class2 = class2()

    if __name__ == "__main__":
        x = 1
        y = 2
        class1.classfunction1(x)
        class2.classfunction2(y)
        print("This variable is callable", class1.x)
        print("And this…
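Assuming the example goes on to call Pool.map(test, ...), the usual explanation is that state assigned under if __name__ == "__main__" is not replayed in the child processes (under the spawn start method the module is simply re-imported), so the workers see the unconfigured objects. A sketch of the standard fix, seeding each worker through the pool's initializer (names mirror the question but are simplified):

    import multiprocessing

    def init_worker(a, b):
        # runs once inside each worker process, before any tasks
        global x, y
        x, y = a, b

    def test(i):
        return i * x * y

    if __name__ == "__main__":
        with multiprocessing.Pool(2, initializer=init_worker, initargs=(1, 2)) as pool:
            print(pool.map(test, range(5)))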

Getting erratic runtime exceptions trying to access persistent data in multiprocessing.Pool worker processes

你。 · Submitted on 2019-12-11 06:09:13
Question: Inspired by this solution, I am trying to set up a multiprocessing pool of worker processes in Python. The idea is to pass some data to the worker processes before they actually start their work and to reuse it eventually. It is intended to minimize the amount of data that needs to be packed/unpacked for every call into a worker process (i.e., to reduce the inter-process communication overhead). My MCVE looks like this:

    import multiprocessing as mp
    import numpy as np

    def create_worker_context():
        global…
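The pattern the question builds on is usually written like the sketch below: the context is created once per worker via the pool's initializer and parked in a module-level global, so it never travels over IPC again. The array and its size here are placeholders for the real, expensive data:

    import multiprocessing as mp
    import numpy as np

    def create_worker_context():
        global context                   # one private copy per worker process
        context = np.random.rand(1000)   # placeholder for the costly setup

    def work(i):
        # reuse the per-worker context instead of shipping it with every call
        return float(context[i % context.size])

    if __name__ == "__main__":
        with mp.Pool(4, initializer=create_worker_context) as pool:
            print(pool.map(work, range(8)))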

Python multiprocessing.Pool & memory

北慕城南 · Submitted on 2019-12-11 05:48:33
Question: I'm using Pool.map for a scoring procedure:

1. a "cursor" with millions of arrays from a data source
2. calculation
3. save the result in a data sink

The results are independent. I'm just wondering if I can avoid the memory demand. At first it seems that every array goes into Python, and only then do steps 2 and 3 proceed. Anyway, I do get a speed improvement.

    # data source and sink are in MongoDB
    def scoring(some_arguments):
        ### some stuff and finally persist ###
        collection.update({uid: _uid}, {'$set': res_profile}…
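If the goal is to keep the millions of cursor rows from being materialized at once, a common alternative to Pool.map (sketched here; the chunk size, worker count, and dummy cursor are arbitrary) is imap_unordered, which consumes the iterable lazily and streams results back as they finish:

    import multiprocessing as mp

    def scoring(doc):
        # ... compute the profile and persist it to the sink here ...
        return None                      # return nothing, so no result list builds up

    if __name__ == "__main__":
        cursor = iter(range(1_000_000))  # stand-in for the real MongoDB cursor
        with mp.Pool(4) as pool:
            # imap_unordered pulls from the cursor on demand; chunksize batches the IPC
            for _ in pool.imap_unordered(scoring, cursor, chunksize=100):
                pass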

Is there ever a reason to set maxIdle > maxActive for connection pools?

巧了我就是萌 · Submitted on 2019-12-11 01:58:29
Question: I'm just learning about connection pools, and I was wondering if there's ever any reason to set maxIdle > maxActive. This is my understanding: idle connections are connections that have been created and are waiting to be used. A connection becomes active once a client borrows it. minIdle determines the number of initial connections to create in a pool. When a client tries to use the pool, an idle connection is handed out; if no idle connection is available, the pool will create one.
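One way to see why maxIdle > maxActive buys nothing: idle connections only appear when borrowed connections are returned, and at most maxActive connections can exist at any time, so the idle list can never need more than maxActive slots. A toy pool in Python illustrating the accounting (a conceptual illustration, not any real pool library):

    class ToyPool:
        def __init__(self, max_active, max_idle):
            self.max_active, self.max_idle = max_active, max_idle
            self.idle, self.active = [], 0

        def borrow(self):
            if self.idle:
                self.active += 1
                return self.idle.pop()
            if self.active < self.max_active:
                self.active += 1
                return object()          # stand-in for opening a connection
            raise RuntimeError("pool exhausted")

        def release(self, conn):
            self.active -= 1
            if len(self.idle) < self.max_idle:
                self.idle.append(conn)   # keep the connection warm
            # else drop it; active + idle never exceeds max_active here,
            # so any max_idle above max_active is simply never reached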

Creating and reusing objects in Python processes

你说的曾经没有我的故事 · Submitted on 2019-12-10 19:48:53
Question: I have an embarrassingly parallelizable problem consisting of a bunch of tasks that are solved independently of each other. Solving each task is quite lengthy, so this is a prime candidate for multiprocessing. The problem is that solving my tasks requires creating a specific object that is very time-consuming on its own but can be reused for all the tasks (think of an external binary program that needs to be launched), so in the serial version I do something like this:

    def costly…
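For this shape of problem, the usual suggestion is to build the costly object once per worker in the pool's initializer and keep it in a module-level global, so each worker pays the construction cost exactly once. A sketch (the Solver class and its run method are stand-ins for the real expensive setup, e.g. launching the external binary):

    import multiprocessing as mp

    class Solver:                        # stand-in for the expensive object
        def run(self, task):
            return task * 2

    def init_worker():
        global solver
        solver = Solver()                # imagine this construction is very slow

    def solve_task(task):
        return solver.run(task)          # reused for every task in this worker

    if __name__ == "__main__":
        tasks = range(100)               # placeholder task list
        with mp.Pool(4, initializer=init_worker) as pool:
            results = pool.map(solve_task, tasks)
            print(results[:5])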