Best practice for WCF service with large amounts of data?

Submitted by 时光总嘲笑我的痴心妄想 on 2019-12-03 14:19:06

Using a streaming binding configuration on client and server, with a MessageContract whose only [MessageBodyMember] is a Stream (any other metadata travels as [MessageHeader]s), lets you do the whole thing in one call without worrying about paging: use an enumerator on the server side to feed the stream, and process individual entities as they arrive on the client. The catch is that you have to roll your own framing within the stream (e.g. serialize/deserialize each entity manually with DataContractSerializer or whatever). I've done this, and it works great, but it's kind of a pain.
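
A minimal sketch of what the contract half of this could look like. The names (ExportReply, IBulkExport, EstimatedCount, BindingFactory) are illustrative, not from the answer above, and the binding shown is just one of several that support streamed transfer:

```csharp
// Hypothetical sketch: ExportReply, IBulkExport and EstimatedCount are
// illustrative names, not taken from the original answer.
using System.IO;
using System.ServiceModel;

[MessageContract]
public class ExportReply
{
    // Metadata rides in headers so the body can be a single raw Stream.
    [MessageHeader]
    public long EstimatedCount { get; set; }

    // With a streamed transfer mode, WCF streams this body instead of buffering it;
    // the framing of individual entities inside the stream is entirely up to you.
    [MessageBodyMember(Order = 1)]
    public Stream Entities { get; set; }
}

[ServiceContract]
public interface IBulkExport
{
    [OperationContract]
    ExportReply ExportAll();
}

public static class BindingFactory
{
    public static BasicHttpBinding CreateStreamedBinding()
    {
        return new BasicHttpBinding
        {
            // Only the response is streamed here; TransferMode.Streamed covers both directions.
            TransferMode = TransferMode.StreamedResponse,
            MaxReceivedMessageSize = long.MaxValue
        };
    }
}
```

On the server you would return a Stream implementation (or one end of a pipe) whose reads pull the next entity from your enumerator and serialize it with DataContractSerializer; the client reads the response stream and deserializes entity by entity using the same framing.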

If you want to do paging, the easy way is to use a sessionful WCF channel in conjunction with a snapshot transaction (if you're using SQL Server or something else that supports them as your entity source). Start the snapshot transaction on the first request and tie its lifetime to the session, so that you're looking at a stable picture of the data between page requests; the transaction is released when the session is closed (or times out, if the client disconnects unexpectedly). The client then requests the last key value it saw plus how many records it wants (careful with maxReceivedMessageSize: leave LOTS of headroom). Since you're in a snapshot, you don't have to worry about changes; you'll see a consistent view for the duration of the dump. If you can't snapshot your source data to prevent it from changing mid-download, life is a lot harder. It's always doable, but designing for that is very specific to the data.
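
A rough sketch of the sessionful, keyset-paged version, assuming SQL Server over ADO.NET and a session-capable binding (e.g. netTcpBinding). The contract name IPagedExport, the Entity type, the dbo.Entities table, and the connection string are all illustrative, and the database would need snapshot isolation enabled:

```csharp
// Hypothetical sketch; names and the SQL schema are illustrative only.
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class Entity
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Payload { get; set; }
}

[ServiceContract(SessionMode = SessionMode.Required)]
public interface IPagedExport
{
    // Opens the snapshot transaction whose lifetime is tied to this session.
    [OperationContract(IsInitiating = true)]
    void BeginExport();

    // Keyset paging: the client sends the last key it saw plus a page size.
    [OperationContract]
    IList<Entity> GetNextPage(int lastKeySeen, int pageSize);

    [OperationContract(IsTerminating = true)]
    void EndExport();
}

[ServiceBehavior(InstanceContextMode = InstanceContextMode.PerSession)]
public class PagedExportService : IPagedExport, IDisposable
{
    private SqlConnection _connection;
    private SqlTransaction _transaction;

    public void BeginExport()
    {
        _connection = new SqlConnection("...");  // connection string elided
        _connection.Open();
        // Snapshot isolation gives every subsequent page a view of the data as of this moment.
        _transaction = _connection.BeginTransaction(IsolationLevel.Snapshot);
    }

    public IList<Entity> GetNextPage(int lastKeySeen, int pageSize)
    {
        var page = new List<Entity>();
        using (var cmd = _connection.CreateCommand())
        {
            cmd.Transaction = _transaction;
            cmd.CommandText =
                "SELECT TOP (@pageSize) Id, Payload FROM dbo.Entities " +
                "WHERE Id > @lastKeySeen ORDER BY Id";
            cmd.Parameters.AddWithValue("@pageSize", pageSize);
            cmd.Parameters.AddWithValue("@lastKeySeen", lastKeySeen);
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    page.Add(new Entity { Id = reader.GetInt32(0), Payload = reader.GetString(1) });
            }
        }
        return page;
    }

    public void EndExport()
    {
        Dispose();
    }

    // WCF also disposes the per-session instance when the session closes or times out,
    // so the snapshot is released even if the client disconnects unexpectedly.
    public void Dispose()
    {
        _transaction?.Dispose();
        _connection?.Dispose();
    }
}
```

The page size plus row width is what you size maxReceivedMessageSize against on the client; keeping pages well under the quota is the "leave LOTS of headroom" part.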
