queue

Laravel Queued Job doesn't wait for exec to complete

╄→гoц情女王★ Submitted on 2019-12-24 15:15:12

Question: I have my queues set up and working (the jobs get run), but the script doesn't seem to wait for my exec line to finish before moving on to the next line of code. This means I'm getting exceptions in the next few lines (because it's looking for a file that hasn't been produced yet). My closure is: Queue::push(function($job) use ($gid,$eid) { $phantomLoc = base_path()."/vendor/bin/phantomjs"; $scriptLoc = app_path()."/libraries/makeVideo.js"; $pageAddress = route('image_maker_video', array(

Python: interdependent process/thread queues

青春壹個敷衍的年華 Submitted on 2019-12-24 12:45:33

Question: I have four queues that each have multiple processes/threads and that are interdependent in the following way: Queue 1 reads a file from disk and copies it to RAM; Queue 2 takes the file in RAM and performs an operation on it; Queue 3 takes the result of Queue 2 and performs a separate operation on it; Queue 4 writes the end result back to disk. I would like these four queues to operate in parallel as much as possible, with the caveat that Queue 2 has to wait for Queue 1 to place at least one process
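The four-stage layout described above can be sketched with standard-library threads and queues. This is an illustration, not the asker's code: the stage functions below (doubling, incrementing, appending) are placeholders for the real read/process/write work, and the queue size of 4 is an arbitrary backpressure limit.

```python
import threading
import queue

def stage(in_q, out_q, work):
    """Pull items from in_q, apply work, push to out_q; None is the shutdown signal."""
    while True:
        item = in_q.get()
        if item is None:
            if out_q is not None:
                out_q.put(None)   # propagate shutdown downstream
            break
        result = work(item)
        if out_q is not None:
            out_q.put(result)

# Queues connecting the four stages; maxsize applies backpressure so a fast
# stage cannot run far ahead of a slow one.
q12, q23, q34 = (queue.Queue(maxsize=4) for _ in range(3))
results = []

threads = [
    threading.Thread(target=stage, args=(q12, q23, lambda x: x * 2)),   # stand-in for "operation"
    threading.Thread(target=stage, args=(q23, q34, lambda x: x + 1)),   # stand-in for "separate operation"
    threading.Thread(target=stage, args=(q34, None, results.append)),   # stand-in for "write back to disk"
]
for t in threads:
    t.start()

for chunk in range(5):    # stands in for "reading a file from disk"
    q12.put(chunk)
q12.put(None)             # begin shutdown once all input is queued
for t in threads:
    t.join()

print(results)            # [1, 3, 5, 7, 9]
```

Because each stage blocks on its input queue, Queue 2's work naturally waits until Queue 1 has placed at least one item, while all stages still run concurrently on different items.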

Bull Queue is not getting completed

浪尽此生 Submitted on 2019-12-24 11:27:39

Question: I'm new to Bull. I have tried running Bull based on their documentation code. The process is starting, but my job is not getting completed, and I'm not sure whether it's triggering the completed event or not. I'm not sure where I'm making a mistake. Attaching my code below: const Queue = require('bull'); const myFirstQueue = new Queue('my-first-queue', { redis: { port: Config.redis.port, host: Config.redis.host, password: Config.redis.password }, }); (async function ad() { const job = await myFirstQueue

Laravel 5.5 redis queue is too slow

人盡茶涼 Submitted on 2019-12-24 11:01:34

Question: I have a dispatch call in an action: dispatch(new ProcessVideo($video)); logger('After dispatch at ' . Carbon::now()->format('H:i:s.u')); and a job: public function handle() : void { logger('ProcessVideo@handle at ' . Carbon::now()->format('H:i:s.u')); } In the logs we can see that the interval between dispatching and handling from the queue is more than 2.5 seconds! [2017-10-11 00:02:55] local.DEBUG: After dispatch at 00:02:55.423141 [2017-10-11 00:02:58] local.DEBUG: ProcessVideo@handle at 00:02:58.071249 What the

Is the order of batches guaranteed in Keras' OrderedEnqueuer?

空扰寡人 Submitted on 2019-12-24 10:44:24

Question: I have a custom keras.utils.sequence which generates batches in a specific (and critical) order. However, I need to parallelise batch generation across multiple cores. Does the name 'OrderedEnqueuer' imply that the order of batches in the resulting queue is guaranteed to be the same as the order of the original keras.utils.sequence? My reasons for thinking that this order is not guaranteed: OrderedEnqueuer uses Python multiprocessing's apply_async internally. Keras' docs explicitly say
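The ordering property at issue can be illustrated without Keras. This is not OrderedEnqueuer's internals, just the underlying pattern: apply_async returns one result handle per submission, and consuming those handles in submission order yields results in the original order even when workers finish out of order. A ThreadPool is used here only so the sketch runs anywhere; multiprocessing.Pool exposes the same apply_async API.

```python
import random
import time
from multiprocessing.pool import ThreadPool  # same apply_async API as Pool

def slow_square(x):
    time.sleep(random.uniform(0, 0.02))  # tasks finish in arbitrary order
    return x * x

with ThreadPool(4) as pool:
    handles = [pool.apply_async(slow_square, (i,)) for i in range(8)]
    ordered = [h.get() for h in handles]  # .get() blocks in submission order

print(ordered)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

So apply_async alone does not guarantee completion order, but collecting the per-call handles in submission order restores it.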

Solution for simple grid computing in local network

痞子三分冷 Submitted on 2019-12-24 10:39:23

Question: I'd like to develop a simple solution using .NET for the following problem. We have several computers in a local network: 10 client computers that may need to execute a program that is only installed on two workstations; the two workstations, which are only used to execute the defined program; and a server that can be used to install a service available from all the previously described computers. When a client computer needs to execute the program, it would send a request to the server, and the server

Ruby on Rails - Users Scheduling Tasks

可紊 Submitted on 2019-12-24 09:57:56

Question: I am working on an application which people can sign up to and schedule tasks to be done. When the user schedules a task, they enter a date and time. I want the application to wake up and send an email to the user at the date and time they entered. I am unsure what gems/plugins to use in Rails to achieve this; does anyone have a suggestion? Cheers, Eef Answer 1: delayed_job should do the trick. It allows you to schedule tasks to run at a particular time. All you need to do is schedule a function

Can't put Future in Manager().Queue in Python

筅森魡賤 Submitted on 2019-12-24 09:28:26

Question: I have a bunch of client processes which make queries to a single worker process (which then processes them in batches) using a shared queue. However, the clients need to know when the results are ready, so I tried putting a tuple with a concurrent.futures.Future along with the request in the queue, but Python throws a "cannot pickle" exception (this only happens with the Future object). Is there a picklable version of Future that I can use for this purpose, or a better design altogether? Source:

If Javascript is not multithreaded, is there any reason to implement asynchronous Ajax Queuing?

最后都变了- Submitted on 2019-12-24 07:17:51

Question: I am having issues with my PHP server (my computer is the only connection). I initially thought part of the reason was too many Ajax requests (I have a script that executes an Ajax request per keystroke), so I implemented a design to control the flow of Ajax requests with a queue. Below is my code: //global vars: activeAjaxThread = 0; //var describing ajax thread state ajaxQue = []; //array of ajax request objects in queue to be fired after the completion of the previous request

How to use Queue in concurrent.futures.ProcessPoolExecutor()?

断了今生、忘了曾经 Submitted on 2019-12-24 04:12:32

Question: Disclaimer: I'm new to Python in general. I have a little experience with Go, where implementing a queue using a channel is really easy. I want to know how I can implement a queue with ProcessPoolExecutor in Python 3. I want my N processes to access a single queue, so that I can insert many jobs into the queue from the main thread and the processes will just grab the jobs from the queue. Or, if there is a better way to share a list/queue between multiple processes. (Job Queue/ Worker