Client Server Command Design pattern with variable delays

Submitted by 亡梦爱人 on 2019-12-08 06:13:43

Question


I am writing a client program to control a server, which in turn controls some large hardware. The server needs to receive commands to initialize, start, stop, and otherwise control the hardware.

The connection from the client to the server is via a TCP or UDP socket. Each command is encapsulated in an appropriate message using a SCADA protocol (e.g. Modbus or DNP3).

Part of the initialization phase involves sending a sequence of commands from the client to the server. In some cases there must be a delay of several seconds between commands to prevent multiple sub-systems from being initialized at the same time. The length of the delay depends on the type of command.

I'm thinking that the Command design pattern is a good approach to follow here: the client instantiates ConcreteCommands and the Invoker places them in a queue. I'm not sure how to incorporate the variable delay, or whether there's a better pattern involving a timer and a queue for sending messages with variable delays.
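For concreteness, this is roughly the shape I have in mind (the names are just illustrative, and the delay handling is the part I haven't worked out):

using System.Collections.Generic;

public interface ICommand
{
    void Execute();   // builds and sends the protocol message (e.g. Modbus or DNP3)
}

public class InitializeSubsystemCommand : ICommand
{
    public void Execute() { /* send the initialization message to the server */ }
}

public class CommandInvoker
{
    private readonly Queue<ICommand> _queue = new Queue<ICommand>();

    public void Enqueue(ICommand command) => _queue.Enqueue(command);

    public void Run()
    {
        while (_queue.Count > 0)
        {
            ICommand command = _queue.Dequeue();
            // TODO: some commands need a delay of several seconds here,
            // and the length of the delay depends on the command type.
            command.Execute();
        }
    }
}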

I'm using C# but this is probably irrelevant since it's more of a design pattern question.


Answer 1:


It sounds like you need to store a mapping of command types to delays. When your server starts, you could cache those delay times, then call a method that processes each command after the specified delay.

When the server starts:

Dictionary<Type, int> typeToDelayMapping = GetTypeToDelayMapping();

When a command reaches the server, the server can call this:

InvokeCommand(ICommand command, int delayTimeInMilliseconds)

Like so:

InvokeCommand(command, typeToDelayMapping[command.GetType()]);
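
A minimal sketch of how that could be wired together, assuming the ICommand/ConcreteCommand types sketched in the question, that commands expose an Execute() method, and that the dictionary stores delays in milliseconds:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public class CommandProcessor
{
    private readonly Dictionary<Type, int> typeToDelayMapping = GetTypeToDelayMapping();

    // Waits the delay configured for this command's type, then executes it.
    public async Task InvokeCommand(ICommand command, int delayTimeInMilliseconds)
    {
        await Task.Delay(delayTimeInMilliseconds);
        command.Execute();
    }

    public Task Process(ICommand command) =>
        InvokeCommand(command, typeToDelayMapping[command.GetType()]);

    private static Dictionary<Type, int> GetTypeToDelayMapping()
    {
        // Illustrative values only; in practice load these from configuration.
        return new Dictionary<Type, int>
        {
            { typeof(InitializeSubsystemCommand), 5000 }   // 5 s before initialization commands
        };
    }
}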


Source: https://stackoverflow.com/questions/12016314/client-server-command-design-pattern-with-variable-delays
