Hangfire - Prevent multiples of the same job being enqueued

Submitted by 拟墨画扇 on 2020-12-29 09:34:14

Question


Scenario:

Job 1 is scheduled to run every 5 minutes, and takes ~1 minute to complete.

A lot of work piles up and Job 1 takes 15 minutes to run.

There are now three Job 1's being processed concurrently - I don't want this.


How do I prevent Job 1 being added to the queue again if it is already there?

Is there a Hangfire setting for this, or do I need to poll job statuses manually?


Answer 1:


You can use the DisableConcurrentExecution attribute to prevent multiple concurrent executions of a method. Just put this attribute above your method:

[DisableConcurrentExecution(timeoutInSeconds: 10 * 60)]
public void Job1()
{
    // Method body
}
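As a minimal sketch of how this fits into the recurring-job scenario from the question (the class and job names here are illustrative, not from the original answer):

public class OrderProcessor
{
    // Only one execution of Job1 may hold the distributed lock at a time;
    // a second worker waits up to 10 minutes for it.
    [DisableConcurrentExecution(timeoutInSeconds: 10 * 60)]
    public void Job1()
    {
        // Long-running work here.
    }
}

// At startup, schedule it every 5 minutes as in the question:
RecurringJob.AddOrUpdate<OrderProcessor>("job-1", p => p.Job1(), "*/5 * * * *");

Note that DisableConcurrentExecution makes the second worker wait for the lock (up to the timeout) rather than skip the run; if the lock is not acquired within the timeout, the job fails and may be retried. If you want overlapping runs to be skipped instead, see the filter in Answer 4.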



Answer 2:


Sounds like this could be something that you might be interested in: https://discuss.hangfire.io/t/job-reentrancy-avoidance-proposal/607/8

The discussion is about skipping jobs that would otherwise execute concurrently with an already running job.




Answer 3:


There is an attribute called DisableConcurrentExecution that prevents two jobs of the same type from running concurrently.

In your case, though, it might be best to check whether the job is already running and skip accordingly.
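One way to perform that check is through Hangfire's monitoring API, which can list the jobs currently being processed. This is a sketch only; the page size of 100 is an arbitrary assumption, and the matching on method name should be adapted to your own job signatures:

using System.Linq;
using Hangfire;

public static class JobGuard
{
    // Returns true if a job invoking the given method name is currently processing.
    public static bool IsAlreadyRunning(string methodName)
    {
        var monitoring = JobStorage.Current.GetMonitoringApi();

        // Inspect the first 100 processing jobs (assumed page size).
        var processing = monitoring.ProcessingJobs(0, 100);

        return processing.Any(j => j.Value.Job != null
                                   && j.Value.Job.Method.Name == methodName);
    }
}

You would call JobGuard.IsAlreadyRunning(nameof(Job1)) before enqueuing, and skip the enqueue if it returns true. Be aware this check-then-enqueue is not atomic, so a distributed lock (as in Answer 4) is more robust.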




Answer 4:


A bit late, but I was using this class to prevent duplicate jobs from running concurrently:

using System;
using Hangfire.Common;
using Hangfire.Server;
using Hangfire.States;

public class SkipConcurrentExecutionAttribute : JobFilterAttribute, IServerFilter, IElectStateFilter
{
    private readonly int _timeoutSeconds;
    private const string DistributedLock = "DistributedLock";

    public SkipConcurrentExecutionAttribute(int timeOutSeconds)
    {
        if (timeOutSeconds < 0) throw new ArgumentException("Timeout argument value should not be negative.");
        this._timeoutSeconds = timeOutSeconds;
    }

    public void OnPerformed(PerformedContext filterContext)
    {
        // Release the lock only if we actually acquired one
        // (OnPerforming may have canceled the job without acquiring it).
        if (filterContext.Items.TryGetValue(DistributedLock, out var lockObject)
            && lockObject is IDisposable distributedLock)
        {
            distributedLock.Dispose();
        }
    }

    public void OnPerforming(PerformingContext filterContext)
    {
        filterContext.WriteLine("Job Started");

        // Use "{TypeFullName}.{MethodName}" as the lock resource,
        // so each job method gets its own lock.
        var resource = String.Format(
                           "{0}.{1}",
                           filterContext.BackgroundJob.Job.Type.FullName,
                           filterContext.BackgroundJob.Job.Method.Name);

        var timeOut = TimeSpan.FromSeconds(_timeoutSeconds);

        filterContext.WriteLine($"Waiting for running jobs to complete. (timeout: {_timeoutSeconds})");

        try
        {
            var distributedLock = filterContext.Connection.AcquireDistributedLock(resource, timeOut);
            filterContext.Items[DistributedLock] = distributedLock;
        }
        catch (Exception ex)
        {
            // Lock acquisition timed out: another instance is running, so skip this one.
            filterContext.WriteLine(ex);
            filterContext.WriteLine("Another job is already running, aborted.");
            filterContext.Canceled = true;
        }
    }

    public void OnStateElection(ElectStateContext context)
    {
        // No state election logic is needed for this filter.
    }
}

Hope that helps, thx!
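For completeness, the filter above would be applied like any other job filter attribute (a sketch; the class and method names are illustrative). With a timeout of 0 the worker does not wait for the lock at all, so an overlapping run is skipped almost immediately, though the exact timeout semantics depend on the storage provider:

public class ReportJobs
{
    // If another instance already holds the lock, this run is canceled rather than queued behind it.
    [SkipConcurrentExecution(timeOutSeconds: 0)]
    public void GenerateReport()
    {
        // Long-running work here.
    }
}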




Answer 5:


If you want to discard attempts to run something twice while it's already running, you can always just do this (note: no attributes applied):

    // 0 = idle, 1 = running. An int is used so Interlocked can guard it.
    private static int _isRunningUpdateOrders;

    public void UpdateOrders()
    {
        // Atomically flip the flag; if it was already set, another invocation
        // is running, so skip. (A plain bool check-then-set is racy, and putting
        // the check inside the try block would let the finally clause clear the
        // flag on the skip path, unlocking the still-running instance.)
        if (Interlocked.CompareExchange(ref _isRunningUpdateOrders, 1, 0) != 0)
        {
            return;
        }

        try
        {
            // Logic...
        }
        finally
        {
            Interlocked.Exchange(ref _isRunningUpdateOrders, 0);
        }
    }



Answer 6:


Yes, it is possible, as below. Register the recurring job with the scheduling machine's name as an argument:

    RecurringJob.AddOrUpdate(Environment.MachineName, () => MyJob(Environment.MachineName), Cron.HourInterval(2));

and MyJob should be defined like this, so that only the machine that registered the job actually executes it, while any other server picking it up returns immediately:

    public void MyJob(string taskId)
    {
        if (!taskId.Equals(Environment.MachineName))
        {
            return;
        }
        //Do whatever you job should do.
    }


Source: https://stackoverflow.com/questions/45164369/hangfire-prevent-multiples-of-the-same-job-being-enqueued
