Question
A number of devices are sending messages which end up in a single Azure Service Bus queue (or topic). We want to process multiple messages in parallel, but we want to avoid concurrent processing of two messages of the same device at any given time.
The following picture illustrates the goal. There are 3 processing threads (in reality there might be several dozen, distributed across several servers). Each box denotes the processing time of a single message, and the color shows which device it belongs to.
You can see that at no point in time are there two or more overlapping messages originating from the same device.
As there are multiple processing servers involved, I can imagine that the only way to prevent concurrent processing is to partition messages with the device ID as the partitioning key, and then have only a single consumer for each partition:
So, all messages from "yellow device" go to partition 1, and so on.
I still want to run multiple processing threads within a single process. Right now we do something simple like:
var client = QueueClient.CreateFromConnectionString(connectionString, queueName);
var options = new OnMessageOptions { MaxConcurrentCalls = x };

client.OnMessage(m =>
{
    // Process...
    m.Complete();
}, options);
How do I incorporate the concurrency limitation into such code?
I can imagine some client-side solutions based on actors or other concurrency mechanisms. But is there a way to solve this at the broker level?
Answer 1:
This looks like a good candidate to take advantage of the Azure Service Bus Sessions feature. You will still be able to use the OnMessage API, but processing of a given session will only ever be done by a single consumer, not several. You'll also still be able to process concurrently, handling the load as it comes.
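To make the grouping concrete, here is a minimal sender-side sketch, assuming the queue was created with sessions required; payload and deviceId are placeholder names for whatever your sender already has at hand, not something from the question:

// Assumption: the queue was created with RequiresSession = true.
var sender = QueueClient.CreateFromConnectionString(connectionString, queueName);

var message = new BrokeredMessage(payload)
{
    // All messages of one device share a SessionId, so the broker
    // hands them to only one consumer at a time.
    SessionId = deviceId
};

sender.Send(message);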
A good starting point would be to look at the QueueClient.AcceptMessageSessionAsync API. If you want solid documentation explaining how it works, this sample is the best you'll find.
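A rough sketch of the receiving side built around that API could look like the following; the pump loop, timeouts, error handling, and the method/parameter names are my own assumptions rather than anything from the linked sample:

using System;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

static async Task RunSessionPumpsAsync(string connectionString, string queueName, int maxConcurrentSessions)
{
    var client = QueueClient.CreateFromConnectionString(connectionString, queueName);

    // Run several independent pumps; each pump owns one session (one device) at a time,
    // so messages of the same device are never processed in parallel.
    var pumps = Enumerable.Range(0, maxConcurrentSessions).Select(async _ =>
    {
        while (true)
        {
            MessageSession session;
            try
            {
                // Waits until the broker hands out a session no other consumer currently holds.
                session = await client.AcceptMessageSessionAsync(TimeSpan.FromSeconds(30));
            }
            catch (TimeoutException)
            {
                continue; // No session available right now; ask again.
            }

            BrokeredMessage message;
            while ((message = await session.ReceiveAsync(TimeSpan.FromSeconds(5))) != null)
            {
                // Process...
                await message.CompleteAsync();
            }

            // Release the session so another consumer may pick it up later.
            await session.CloseAsync();
        }
    });

    await Task.WhenAll(pumps);
}

With this shape, the MaxConcurrentCalls value from your original snippet effectively becomes the number of pumps: you still get concurrency across devices, while messages within one device (session) are processed strictly one at a time.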
Source: https://stackoverflow.com/questions/40284979/azure-service-bus-avoid-processing-messages-from-same-device-in-parallel