jobs

Laravel 5.1 failed queued job fails on failed() method, preventing the queue failure event handler from being called

一笑奈何 submitted on 2019-12-03 14:43:23
I am testing the queue functions in Laravel 5.1. I can make jobs queue up in my db table, called jobs, and I can get them to run successfully. I also created a queue failure table called failed_jobs. To test it, I manipulate the payload data inside the jobs table to make the job fail, then I run the queue worker daemon as follows so that it puts the job into the failed_jobs table after one failed attempt:

    php artisan queue:work --daemon --tries=1 --queue=myqueue

When the job fails, it is immediately put into the failed_jobs table as expected. FYI, I have set things up just as the Laravel 5.1 docs recommend:
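For reference, a minimal sketch of what such a queued job class with a failed() hook looks like in Laravel 5.1 (the class name SendReminderEmail and its contents are hypothetical; the framework calls failed() once the job exhausts its allowed attempts):

    <?php

    namespace App\Jobs;

    use Illuminate\Queue\SerializesModels;
    use Illuminate\Queue\InteractsWithQueue;
    use Illuminate\Contracts\Queue\ShouldQueue;

    class SendReminderEmail extends Job implements ShouldQueue
    {
        use InteractsWithQueue, SerializesModels;

        public function handle()
        {
            // Normal job logic runs here.
        }

        // Called by the queue worker when the job fails
        // (with --tries=1, after a single failed attempt).
        public function failed()
        {
            // Notify someone or log the failure; the job record itself
            // still lands in the failed_jobs table.
        }
    }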

SQL Agent Job - “Run As” drop down list is empty

ε祈祈猫儿з submitted on 2019-12-03 10:33:53
Why is the "Run As" drop-down list always empty when I try to set up a SQL Agent job? I am trying to set up some SQL Agent jobs to run using a proxy account. I am a member of the SQLAgentUserRole, SQLAgentReaderRole, and SQLAgentOperatorRole. When I try to add a step to the job, I select SQL Server Integration Services Package, and the Run As drop-down list is empty. Anyone who is a sysadmin can view the proxy. Shouldn't I be able to use the proxy as a member of SQLAgentUserRole, SQLAgentReaderRole, and SQLAgentOperatorRole? What am I missing here? (The proxy account is active to the subsystem:

MongoDB status of index creation job

老子叫甜甜 submitted on 2019-12-03 09:10:52
Question: I'm using MongoDB and have a collection with roughly 75 million records. I have added a compound index on two fields using the following command:

    db.my_collection.ensureIndex({"data.items.text":1, "created_at":1},{background:true})

Two days later I'm trying to see the status of the index creation. Running db.currentOp() returns {}, but when I try to create another index I get this error message: "cannot add index with a background operation in progress". Is there a way to check the
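A hedged sketch of one way to surface a background index build from the shell (in MongoDB versions of the ensureIndex era, the build reports its progress in the msg field of currentOp):

    // Include idle and system operations, then print only those whose
    // "msg" mentions an index build; the message carries a progress counter.
    db.currentOp(true).inprog.forEach(function (op) {
        if (op.msg && op.msg.indexOf("Index Build") !== -1) {
            printjson(op);
        }
    });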

What's the best way to start a background process that can be accessed later on?

旧街凉风 submitted on 2019-12-03 08:51:13
I am currently developing a RubyGem that provides an executable. The executable keeps track of the state of some log files using the FSSM gem. This executable should be started, do something in the background, and be stopped later on. For example:

    $ my_executable start
    # do something different...
    $ my_executable stop

I would first start a new process inside the start method that does the file watching. But I don't know how to address this process later in order to stop it. What's the best way to provide such behavior? Regards

    pid = Process.fork { exec 'gcalctool' } # don't use 'system' or
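A minimal sketch of the usual PID-file approach (the file location and the watch loop are assumptions; the real loop would use FSSM):

    PID_FILE = '/tmp/my_executable.pid' # assumed location

    case ARGV.first
    when 'start'
      pid = Process.fork do
        Process.setsid        # detach from the controlling terminal
        loop do
          # watch the log files here (e.g. with the FSSM gem)
          sleep 1
        end
      end
      Process.detach(pid)     # reap the child so it never becomes a zombie
      File.write(PID_FILE, pid)
    when 'stop'
      pid = File.read(PID_FILE).to_i
      Process.kill('TERM', pid)
      File.delete(PID_FILE)
    end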

Automatically pulling data from a PowerShell job while running

前提是你 submitted on 2019-12-03 08:38:53
While trying to do something quite possibly beyond the means of PowerShell, I seem to have run into a brick wall. I have a main form script, which orchestrates most of my functions, but I need another script to open a listener (System.Net.Sockets.UdpClient.Receive) and keep feeding information into a textbox in the main form for the entire program's runtime. For the life of me I can't get around the daft non-child environment that jobs suffer from: no dot sourcing, no globally scoped variables, nothing. I can register an object event listener for StateChanged to completion and then open
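A hedged sketch of one workaround: instead of waiting for the job to finish, poll it with Receive-Job while it is still running (the port and the textbox wiring are assumptions):

    # Background job that emits each received UDP datagram as output.
    $job = Start-Job -ScriptBlock {
        $client   = New-Object System.Net.Sockets.UdpClient(12345)  # assumed port
        $endpoint = New-Object System.Net.IPEndPoint([System.Net.IPAddress]::Any, 0)
        while ($true) {
            $bytes = $client.Receive([ref]$endpoint)
            [System.Text.Encoding]::ASCII.GetString($bytes)  # goes to the job's output stream
        }
    }

    # In the main form, a timer or loop drains new output as it arrives.
    while ($job.State -eq 'Running') {
        Receive-Job -Job $job | ForEach-Object {
            # $textBox.AppendText("$_`r`n")  # append to the form's textbox
            Write-Host $_
        }
        Start-Sleep -Milliseconds 500
    }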

Online Job Portal System Use Case Diagrams

送分小仙女□ submitted on 2019-12-03 08:36:18
I want to have a correct use case diagram for an online job portal system. Here is my attempt. I have some doubts: I can't see where to put the "Login" use case, which is an important use case for this system. This use case diagram does not show the difference between a simple visitor and a registered one. The former can view vacancies and view advice without being required to have an account. The latter can view vacancies, view advice, upload a CV (after logging in), apply for a job (after logging in), and so on. Would having two actors, "Simple Visitor" and "Registered Visitor", in my diagram be correct?
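One common way to express that split, sketched here in PlantUML (the names are assumptions): model "Registered Visitor" as a specialization of "Visitor", and let the registered-only use cases include "Login":

    @startuml
    actor Visitor
    actor "Registered Visitor" as RV
    Visitor <|-- RV

    Visitor --> (View vacancies)
    Visitor --> (View advice)
    RV --> (Upload CV)
    RV --> (Apply for a job)
    (Upload CV) ..> (Login) : <<include>>
    (Apply for a job) ..> (Login) : <<include>>
    @enduml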

How to make the client download a very large file that is generated on the fly

守給你的承諾、 submitted on 2019-12-03 08:26:59
I have an export function that reads the entire database and creates a .xls file with all the records. Then the file is sent to the client. Of course, exporting the full database takes a lot of time and the request will soon end in a timeout error. What is the best solution to handle this case? I heard something about making a queue with Redis, for example, but this would require two requests: one to start the job that generates the file and a second one to download the generated file. Is this possible with a single request from the client? Excel Export: Use Streams. Following is
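A hedged sketch of the single-request streaming approach in Node (Express and the db.queryStream() helper are assumptions; the idea is to write rows to the response as they are read instead of building the whole file in memory):

    const express = require('express');
    const app = express();

    app.get('/export', (req, res) => {
        res.setHeader('Content-Type', 'text/csv');
        res.setHeader('Content-Disposition', 'attachment; filename="export.csv"');

        // Assumed: a driver API that returns a readable stream of rows.
        const rows = db.queryStream('SELECT * FROM records');
        rows.on('data', (row) => {
            res.write(Object.values(row).join(',') + '\n'); // one row at a time
        });
        rows.on('end', () => res.end());
        rows.on('error', () => res.end()); // headers already sent, just terminate
    });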

How to schedule a job once every Thursday using Kue?

↘锁芯ラ submitted on 2019-12-03 03:11:56
Using Kue, how do I schedule a job to be executed once every Thursday? The Kue readme mentions that I can delay a job, but what about repeatedly executing the job at a specific time? I can do what I want with a cron job, but I like Kue's features. What I want is to process a job once, at any time on Thursday, but only once. I had a similar question and I basically came up with the following. If anyone else has a different solution I would love to see some other ideas.

    var jobQueue = kue.createQueue();

    // Define job processor
    jobQueue.process('thursday-jobs', function (job, done) { var
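To sketch where a snippet like that is likely headed (hedged: msUntilNextThursday() and doWeeklyWork() are hypothetical helpers), one common pattern is to process the job and then re-enqueue the next occurrence with a delay until the following Thursday:

    jobQueue.process('thursday-jobs', function (job, done) {
        doWeeklyWork();                          // hypothetical work function

        // Chain the next occurrence: Kue's delay() takes milliseconds.
        jobQueue.create('thursday-jobs', {})
                .delay(msUntilNextThursday())    // hypothetical date helper
                .save();

        done();
    });

    // Older Kue versions also need the promoter running so that delayed
    // jobs are moved onto the queue when their delay expires.
    jobQueue.promote();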

Use qdel to delete all my jobs at once, not one at a time

喜你入骨 submitted on 2019-12-03 00:37:23
Question: This is a rather simple question but I haven't been able to find an answer. I have a large number of jobs running in a cluster (>20) and I'd like to delete them all and start over. According to this site I should be able to just do:

    qdel -u netid

to get rid of them all, but in my case that returns:

    qdel: invalid option -- 'u'
    usage: qdel [{ -a | -c | -p | -t | -W delay | -m message}] [<JOBID>[<JOBID>]|'all'|'ALL']...
        -a -c, -m, -p, -t, and -W are mutually exclusive

which obviously indicates
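Based on that usage string, a hedged sketch of two ways to clear everything on a Torque-style scheduler (qselect availability is an assumption):

    # The usage string above shows that 'all' is accepted directly:
    qdel all

    # Alternatively, select only your own jobs and pipe the IDs to qdel:
    qselect -u "$USER" | xargs qdel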

Spring Batch - How to generate parallel steps based on params created in a previous step

爷,独闯天下 submitted on 2019-12-02 17:30:34
Question: Introduction: I am trying to use job parameters created in a tasklet to create steps following the execution of the tasklet. A tasklet tries to find some files (findFiles()), and if it finds any it saves the filenames to a list of strings. In the tasklet I pass the data as follows:

    chunkContext.getStepContext().getStepExecution().getExecutionContext().put("files", fileNames);

The next step is a parallel flow where, for each file, a simple reader-processor-writer step will be executed (if
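For reference, a hedged sketch of the partitioning approach often used for this (the class is hypothetical; it turns the saved filenames into one partition, and hence one worker step execution, per file):

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    import org.springframework.batch.core.partition.support.Partitioner;
    import org.springframework.batch.item.ExecutionContext;

    public class FilePartitioner implements Partitioner {

        private final List<String> fileNames;

        public FilePartitioner(List<String> fileNames) {
            this.fileNames = fileNames;
        }

        @Override
        public Map<String, ExecutionContext> partition(int gridSize) {
            Map<String, ExecutionContext> partitions = new HashMap<>();
            int i = 0;
            for (String file : fileNames) {
                ExecutionContext ctx = new ExecutionContext();
                ctx.putString("fileName", file); // each worker step reads this key
                partitions.put("partition" + i++, ctx);
            }
            return partitions;
        }
    }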