Question
Upon looking up event handler modules, I came across pydispatcher, which seemed beginner friendly. My use case for the library is that I want to send a signal if my queue size is over a threshold. The handler function can then start processing and removing items from the queue (and subsequently do a bulk insert into the database).
I would like the handler function to run in the background. I am aware that I could simply override the queue.append() method to check the queue size and call the handler function asynchronously, but I would like to implement the listener-dispatcher model to keep the logic clean and separated.
Does pydispatcher do this out of the box? If not, is there another module that can help me do this? And would I need to manage access to the queue myself, since multiple threads might be processing and appending to it at the same time?
Note that in my use case there is only a single dispatcher and event handler.
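To make the intent concrete, here's a rough sketch of the wiring I have in mind (the signal name, threshold, and helper functions are just placeholders): pydispatcher's dispatcher.connect/dispatcher.send for the listener-dispatcher part, a thread-safe queue.Queue for storage, and the receiver handing the work off to a thread itself, in case pydispatcher doesn't do that for me:
import threading
from queue import Queue

from pydispatch import dispatcher

QUEUE_OVER_THRESHOLD = "queue-over-threshold"   # placeholder signal name
THRESHOLD = 100                                  # placeholder threshold

work_queue = Queue()   # queue.Queue locks internally, so put()/get() need no extra lock

def handle_backlog(sender):
    # Handler body: process and remove items from the queue, then bulk-insert them.
    pass

def on_threshold(sender):
    # Receiver: hand the work off to a background thread so the thread that
    # called send() is not blocked while the backlog is processed.
    threading.Thread(target=handle_backlog, args=(sender,), daemon=True).start()

dispatcher.connect(on_threshold, signal=QUEUE_OVER_THRESHOLD, sender=dispatcher.Any)

def enqueue(item):
    # Producer side: append, then emit the signal once the threshold is crossed.
    work_queue.put(item)
    if work_queue.qsize() > THRESHOLD:
        dispatcher.send(signal=QUEUE_OVER_THRESHOLD, sender=enqueue)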
Answer 1:
I've recently released the Akuanduba module, which may help you with this task. There's a single example in the repository that should help you understand how it works, and it seems similar to what you want.
Anyway, I'll try to explain here a way of implementing your code with Akuanduba:
- First, you could make a dataframe that would hold your queue:
# Mandatory imports
from Akuanduba.core.messenger.macros import *
from Akuanduba.core.constants import *
from Akuanduba.core import NotSet, AkuandubaDataframe

# Your imports go here:
from queue import Queue

class MyQueue (AkuandubaDataframe):

    def __init__(self, name):
        # Mandatory stuff
        AkuandubaDataframe.__init__(self, name)
        self.__queue = Queue()

    def getQueue (self):
        return self.__queue

    def putQueue (self, val):
        self.__queue.put(val)

    def getQueueSize (self):
        return self.__queue.qsize()

    #
    # "toRawObj" is a mandatory method that delivers a dict with the desired data
    # for file saving
    #
    def toRawObj(self):
        d = {
            "Queue" : self.getQueue(),
        }
        return d
- Then you could make a TriggerCondition that would check the queue size:
from Akuanduba.core import StatusCode, NotSet, StatusTrigger
from Akuanduba.core.messenger.macros import *
from Akuanduba.core import TriggerCondition
import time

class CheckQueueSize (TriggerCondition):

    def __init__(self, name, maxSize):
        TriggerCondition.__init__(self, name)
        self._name = name
        self._maxSize = maxSize

    def initialize(self):
        return StatusCode.SUCCESS

    def execute (self):
        # Compare the current queue size against the threshold passed to the constructor
        size = self.getContext().getHandler("MyQueue").getQueueSize()
        if (size > self._maxSize):
            return StatusTrigger.TRIGGERED
        else:
            return StatusTrigger.NOT_TRIGGERED

    def finalize(self):
        return StatusCode.SUCCESS
- Make a tool that would be your handler function:
# Mandatory imports
from Akuanduba.core import AkuandubaTool, StatusCode, NotSet, retrieve_kw

# Your imports go here:

class SampleTool(AkuandubaTool):

    def __init__(self, name, **kw):
        # Mandatory stuff
        AkuandubaTool.__init__(self, name)

    def initialize(self):
        # Lock the initialization. After that, this tool can not be initialized once again
        self.init_lock()
        return StatusCode.SUCCESS

    def execute(self, context):
        #
        # DO SOMETHING HERE (e.g. drain the queue and bulk-insert;
        # see the sketch at the end of this answer)
        #
        # Always return SUCCESS
        return StatusCode.SUCCESS

    def finalize(self):
        self.fina_lock()
        return StatusCode.SUCCESS
- And finally, make a main script in order to make it all work together:
# Akuanduba imports
from Akuanduba.core import Akuanduba, LoggingLevel, AkuandubaTrigger
from Akuanduba import ServiceManager, ToolManager, DataframeManager

# This sample's imports (assuming each class lives in a module of the same name)
from MyQueue import MyQueue
from CheckQueueSize import CheckQueueSize
from SampleTool import SampleTool

# Queue-size threshold for the trigger (example value, pick whatever fits your use case)
MAX_QUEUE_SIZE = 100

# Creating your handler
your_handler = SampleTool ("Your Handler's name")

# Creating dataframes
queue = MyQueue ("MyQueue")

# Creating trigger
trigger = AkuandubaTrigger("Sample Trigger Name", triggerType = 'or')

# Append conditions and tools to the trigger just by adding them.
# Tools appended to the trigger will only run when the trigger is StatusTrigger.TRIGGERED,
# and will run in the order they've been appended
trigger += CheckQueueSize( "CheckQueueSize condition", MAX_QUEUE_SIZE )
trigger += your_handler

# Creating Akuanduba
manager = Akuanduba("Akuanduba", level=LoggingLevel.INFO)

# Appending tools
#
# ToolManager += TOOL_1
# ToolManager += TOOL_2
#
ToolManager += trigger

# Appending dataframes
DataframeManager += queue

# Initializing
manager.initialize()
manager.execute()
manager.finalize()
That way, you'd have clean and separated code.
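As for the "DO SOMETHING HERE" block in SampleTool.execute above, the processing itself can stay framework-agnostic. Here's a minimal sketch of draining the queue and writing everything to the database in one batch; bulk_insert is a hypothetical stand-in for whatever bulk-insert call your database driver provides (e.g. cursor.executemany):
from queue import Queue, Empty

def drain_and_store(q, bulk_insert):
    # Pull everything currently in the queue without blocking...
    items = []
    while True:
        try:
            items.append(q.get_nowait())
        except Empty:
            break
    # ...and hand it to the database in a single batch instead of row by row.
    if items:
        bulk_insert(items)
    return len(items)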
Source: https://stackoverflow.com/questions/57402381/does-pydispatcher-run-the-handler-function-in-a-background-thread