Make Parallel.ForEach wait to get work until a slot opens


There is a way built in (kinda) to support exactly the situation you are describing.

When you create the ForEach you will need to pass in a ParallelOptions with a non-standard TaskScheduler. The hard part is creating a TaskScheduler that implements that priority system for you; fortunately, Microsoft released a pack of examples called "ParallelExtensionsExtras" that contains one such scheduler, QueuedTaskScheduler (you will need to include the ParallelExtensionsExtras source in your project to use it).

private static void Main(string[] args)
{
    int totalMaxConcurrency = Environment.ProcessorCount;
    int highPriorityMaxConcurrency = totalMaxConcurrency / 2;

    if (highPriorityMaxConcurrency == 0)
        highPriorityMaxConcurrency = 1;

    QueuedTaskScheduler qts = new QueuedTaskScheduler(TaskScheduler.Default, totalMaxConcurrency);
    var highPriorityScheduler = qts.ActivateNewQueue(0);
    var lowPriorityScheduler = qts.ActivateNewQueue(1);

    BlockingCollection<Foo> highPriorityWork = new BlockingCollection<Foo>();
    BlockingCollection<Foo> lowPriorityWork = new BlockingCollection<Foo>();

    List<Task> processors = new List<Task>(2);

    processors.Add(Task.Factory.StartNew(() =>
    {
        Parallel.ForEach(highPriorityWork.GetConsumingPartitioner(),  //.GetConsumingPartitioner() is also from ParallelExtensionsExtras; it gives better performance than .GetConsumingEnumerable() with Parallel.ForEach
                         new ParallelOptions() { TaskScheduler = highPriorityScheduler, MaxDegreeOfParallelism = highPriorityMaxConcurrency }, 
                         ProcessWork);
    }, TaskCreationOptions.LongRunning));

    processors.Add(Task.Factory.StartNew(() =>
    {
        Parallel.ForEach(lowPriorityWork.GetConsumingPartitioner(), 
                         new ParallelOptions() { TaskScheduler = lowPriorityScheduler}, 
                         ProcessWork);
    }, TaskCreationOptions.LongRunning));


    //Add some work to do here to the highPriorityWork or lowPriorityWork collections


    //Lets the blocking collections know we are no longer going to be adding new items, so each `ForEach` will break out once it has finished the pending work.
    highPriorityWork.CompleteAdding();
    lowPriorityWork.CompleteAdding();

    //Waits for both processing tasks to finish (i.e. for the two collections to completely empty) before continuing
    Task.WaitAll(processors.ToArray());
}

private static void ProcessWork(Foo work)
{
    //...
}

Even though you have two instances of Parallel.ForEach running, the combined total of both of them will not use more than the value you passed in to the QueuedTaskScheduler constructor (totalMaxConcurrency). When there is work in both collections it gives preference to emptying the highPriorityWork collection first, but only up to half of the available slots so that you don't choke the low-priority queue; you can easily adjust that ratio up or down depending on your performance needs.
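
For completeness, here is a minimal sketch of what feeding the two collections might look like. The Foo type, its IsQuick flag and the incomingItems source are all illustrative assumptions, not part of the original answer:

public class Foo
{
    public bool IsQuick { get; set; }    // hypothetical flag used to pick a queue
    public string Payload { get; set; }
}

//Inside Main, before the CompleteAdding() calls:
foreach (Foo item in incomingItems)      // incomingItems = whatever produces your work
{
    if (item.IsQuick)
        highPriorityWork.Add(item);      // drained first, up to half of the slots
    else
        lowPriorityWork.Add(item);       // drained by the second Parallel.ForEach
}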

If you don't want the high priority to always win and would rather have a "round-robin" style scheduler that alternates between the two lists (so the quick items don't always win, but are shuffled in with the slow items), you can assign the same priority level to two or more queues (or just use the RoundRobinTaskSchedulerQueue, which does the same thing).
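
A rough sketch of that variant, assuming the same QueuedTaskScheduler from the example above (the queue names are mine):

QueuedTaskScheduler qts = new QueuedTaskScheduler(TaskScheduler.Default, totalMaxConcurrency);

//Queues created with the same priority value are serviced round-robin by QueuedTaskScheduler,
//so neither collection can starve the other.
var quickItemsScheduler = qts.ActivateNewQueue(0);
var slowItemsScheduler = qts.ActivateNewQueue(0);

//Use one scheduler per Parallel.ForEach exactly as in the example above.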
