Parallel ForEach wait 500 ms before spawning

Submitted by 房东的猫 on 2019-12-08 07:51:09

Question


I have this situation:

var tasks = new List<ITask> ...
Parallel.ForEach(tasks, currentTask => currentTask.Execute() );

Is it possible to instruct PLINQ to wait for 500 ms before the next thread is spawned?

System.Threading.Thread.Sleep(5000);

Answer 1:


You are using Parallel.ForEach in entirely the wrong way. You should make a special enumerator that rate-limits itself to fetching data once every 500 ms.

I made some assumptions about how your DTO works, since you didn't provide any details.

// Returning IEnumerable (rather than IEnumerator) lets this be passed
// straight to Parallel.ForEach; sleeping after each yield rate-limits
// how quickly new items are handed out.
private IEnumerable<SomeResource> GetRateLimitedResource()
{
    SomeResource someResource = null;
    do
    {
        someResource = _remoteProvider.GetData();

        if (someResource != null)
        {
             yield return someResource;
             Thread.Sleep(500);
        }
    } while (someResource != null);
}

Here is how your Parallel.ForEach call should look then:

Parallel.ForEach(GetRateLimitedResource(), SomeFunctionToProcessSomeResource);



Answer 2:


There are already some good suggestions. I would agree with others that you are using PLINQ in a manner it wasn't meant to be used.

My suggestion would be to use System.Threading.Timer. This is probably better than writing a method that returns an IEnumerable<> forcing a half-second delay, because you may not need to wait the full half second, depending on how much time has passed since your last API call.

With the timer, the delegate you provide is invoked at the interval you specify, so even if the first task isn't done, your delegate will be invoked on another thread half a second later; there is no extra waiting.

From your example code, it sounds like you have a list of tasks. In that case, I would use a System.Collections.Concurrent.ConcurrentQueue to keep track of the tasks; once the queue is empty, turn off the timer. A sketch of this approach is below.
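
A minimal sketch of this idea, assuming the ITask interface from the question; the RunWithTimer name and the local variables are made up for illustration:

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;

static void RunWithTimer(IEnumerable<ITask> source)
{
    var queue = new ConcurrentQueue<ITask>(source);
    Timer timer = null;

    // Fire the callback every 500 ms, starting 500 ms from now so the
    // timer variable is assigned before the first tick runs.
    timer = new Timer(_ =>
    {
        // Each tick dequeues one task and runs it on a thread-pool
        // thread; ticks keep arriving every 500 ms even if an earlier
        // task is still executing, so there is no extra waiting.
        if (queue.TryDequeue(out var task))
        {
            task.Execute();
        }
        else
        {
            // Queue drained: turn off the timer.
            timer.Dispose();
        }
    }, null, TimeSpan.FromMilliseconds(500), TimeSpan.FromMilliseconds(500));
}

// Usage, with the List<ITask> from the question:
// RunWithTimer(tasks);

Note that if a task takes longer than 500 ms, the next tick still fires and tasks can overlap, which is exactly the "no extra waiting" behaviour described above.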




Answer 3:


You could use Enumerable.Aggregate instead.

// Chains the tasks so each one waits 500 ms after the previous completes.
var task = tasks.Aggregate((t1, t2) =>
                                t1.ContinueWith(_ =>
                                    { Thread.Sleep(500); return t2.Result; }));

If you don't want the tasks chained, there is also the indexed overload of Select, assuming the tasks are in order of delay.

var tasks = Enumerable
              .Range(1, 10)
              .Select(x => Task.Run(() => x * 2))
              .Select((x, i) => Task.Delay(TimeSpan.FromMilliseconds(i * 500))
                                    .ContinueWith(_ => x.Result));

foreach(var result in tasks.Select(x => x.Result))
{
    Console.WriteLine(result);
}

From the comments, a better option would be to guard the resource instead of using a time delay.

static object Locker = new object();

// Serializes access to the resource: each caller holds the lock for
// the full 500 ms, so calls are naturally spaced out.
static int GetResultFromResource(int arg)
{
    lock(Locker)
    {
        Thread.Sleep(500);
        return arg * 2;
    }
}

var tasks = Enumerable
          .Range(1, 10)
          .Select(x => Task.Run(() => GetResultFromResource(x)));

foreach(var result in tasks.Select(x => x.Result))
{
    Console.WriteLine(result);
}



Answer 4:


In this case, how about a producer-consumer pattern with a BlockingCollection<T>?

var tasks = new BlockingCollection<ITask>();

// add tasks, if this is an expensive process, put it out onto a Task
// tasks.Add(x);

// we're done producin' (allows GetConsumingEnumerable to finish)
tasks.CompleteAdding();

RunTasks(tasks);

With a single consumer thread:

static void RunTasks(BlockingCollection<ITask> tasks)
{
    foreach (var task in tasks.GetConsumingEnumerable())
    {
        task.Execute();

        // this may not be as accurate as you would like
        Thread.Sleep(500);
    }
}

If you have access to .NET 4.5, you can use Task.Delay:

static void RunTasks(BlockingCollection<ITask> tasks)
{
    foreach (var task in tasks.GetConsumingEnumerable())
    {
        // ContinueWith receives the completed delay task as its argument.
        Task.Delay(500)
            .ContinueWith(_ => task.Execute())
            .Wait();
    }
}


Source: https://stackoverflow.com/questions/17655427/parallel-foreach-wait-500-ms-before-spawning
