How to reuse threads in .NET 3.5

dashton

This sounds like a fairly common requirement that can be solved with a multi-threaded producer-consumer queue. The threads are kept alive and are signalled to do work when new work is added to the queue. The work is represented by a delegate (in your case ComputePartialDataOnThread), and the data passed to the delegate is what gets queued (in your case the parameters to ComputePartialDataOnThread). The useful feature is that the management of the worker threads and the actual algorithm are kept separate. Here is the producer-consumer queue:

using System;
using System.Collections.Generic;
using System.Threading;

public class SuperQueue<T> : IDisposable where T : class
{
    readonly object _locker = new object();
    readonly List<Thread> _workers;
    readonly Queue<T> _taskQueue = new Queue<T>();
    readonly Action<T> _dequeueAction;

    /// <summary>
    /// Initializes a new instance of the <see cref="SuperQueue{T}"/> class.
    /// </summary>
    /// <param name="workerCount">The worker count.</param>
    /// <param name="dequeueAction">The dequeue action.</param>
    public SuperQueue(int workerCount, Action<T> dequeueAction)
    {
        _dequeueAction = dequeueAction;
        _workers = new List<Thread>(workerCount);

        // Create and start a separate thread for each worker
        for (int i = 0; i < workerCount; i++)
        {
            Thread t = new Thread(Consume) { IsBackground = true, Name = string.Format("SuperQueue worker {0}", i) };
            _workers.Add(t);
            t.Start();
        }
    }

    /// <summary>
    /// Enqueues the task.
    /// </summary>
    /// <param name="task">The task.</param>
    public void EnqueueTask(T task)
    {
        lock (_locker)
        {
            _taskQueue.Enqueue(task);
            Monitor.PulseAll(_locker);
        }
    }

    /// <summary>
    /// Consumes this instance.
    /// </summary>
    void Consume()
    {
        while (true)
        {
            T item;
            lock (_locker)
            {
                while (_taskQueue.Count == 0) Monitor.Wait(_locker);
                item = _taskQueue.Dequeue();
            }
            // A null item is the shutdown sentinel, so exit the worker loop.
            if (item == null) return;

            // run actual method
            _dequeueAction(item);
        }
    }

    /// <summary>
    /// Signals each worker to exit (by enqueuing one null sentinel per worker) and waits for the threads to finish.
    /// </summary>
    public void Dispose()
    {
        // Enqueue one null task per worker to make each exit.
        _workers.ForEach(thread => EnqueueTask(null));

        _workers.ForEach(thread => thread.Join());

    }
}
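
For example, usage might look something like the sketch below. The WorkItem type and ComputePartialData method are hypothetical stand-ins for your own parameter object and ComputePartialDataOnThread logic:

using System;
using System.Threading;

// Hypothetical parameter bag passed to the work delegate.
public class WorkItem
{
    public int Start { get; set; }
    public int Count { get; set; }
}

class Program
{
    static void Main()
    {
        // Four long-lived workers, each invoking ComputePartialData for every dequeued item.
        using (var queue = new SuperQueue<WorkItem>(4, ComputePartialData))
        {
            for (int i = 0; i < 16; i++)
            {
                queue.EnqueueTask(new WorkItem { Start = i * 100, Count = 100 });
            }
        }   // Dispose enqueues the null sentinels last, so it returns only after all queued work is done.
    }

    // Stand-in for your ComputePartialDataOnThread logic.
    static void ComputePartialData(WorkItem item)
    {
        Console.WriteLine("Processing {0} items from {1} on thread {2}",
            item.Count, item.Start, Thread.CurrentThread.ManagedThreadId);
    }
}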

As previous posters have said, there are many built-in structures that use the ThreadPool (have a look at the TPL, for example), which you may want to consider before implementing your own queue.
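
For instance, even on .NET 3.5 you can hand work straight to the runtime's shared pool with ThreadPool.QueueUserWorkItem instead of managing threads yourself. A minimal sketch (DoWork is a placeholder for your own method):

using System;
using System.Threading;

class PoolExample
{
    static void Main()
    {
        // Queue several pieces of work onto the shared thread pool;
        // the pool reuses its threads rather than creating one per item.
        for (int i = 0; i < 8; i++)
        {
            int id = i;  // capture the loop variable for the closure
            ThreadPool.QueueUserWorkItem(state => DoWork(id));
        }

        Console.ReadLine();  // keep the process alive while pool threads run
    }

    // Placeholder for the real computation.
    static void DoWork(int id)
    {
        Console.WriteLine("Work item {0} ran on pool thread {1}",
            id, Thread.CurrentThread.ManagedThreadId);
    }
}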

Aurojit Panda

So the usual way one would do this is to have each thread's entry point do something like the following (this is just the algorithm, not C# code):

  1. Check to see if you have work to do
  2. Do work if found
  3. Wait on a signal

On the other side, whenever you have more work for a thread, add it to the queue of work to do; the thread is then, in essence, being reused. This is pretty similar to how one would implement a thread pool yourself (if you are inside the runtime you can do a few extra things to help, but it's not a big deal).
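
A rough C# sketch of that entry-point loop, using Monitor for the "wait on a signal" step (the Worker type and Post method are illustrative, not part of any framework):

using System;
using System.Collections.Generic;
using System.Threading;

class Worker
{
    readonly object _gate = new object();
    readonly Queue<Action> _work = new Queue<Action>();

    // Entry point of the long-lived thread.
    public void Run()
    {
        while (true)
        {
            Action job;
            lock (_gate)
            {
                // Steps 1 and 3: wait on a signal while there is nothing to do.
                while (_work.Count == 0) Monitor.Wait(_gate);
                job = _work.Dequeue();
            }
            job();  // Step 2: do the work that was found.
        }
    }

    // Producer side: add work and signal the thread so it gets reused.
    public void Post(Action job)
    {
        lock (_gate)
        {
            _work.Enqueue(job);
            Monitor.Pulse(_gate);
        }
    }
}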

seairth

Here's a thread that talks about this very thing: A custom thread-pool/queue class.
