DatagramChannel, blocking mode and CPU

Submitted by 我们两清 on 2019-12-11 12:05:11

Question


I have the following code snippet:

datagramChannel = DatagramChannel
    .open(StandardProtocolFamily.INET).setOption(StandardSocketOptions.SO_REUSEADDR, true)
    .setOption(StandardSocketOptions.IP_MULTICAST_IF, networkInterface);
datagramChannel.configureBlocking(true);
datagramChannel.bind(new InetSocketAddress(filter.getType()
    .getPort(filter.getTimeFrameType())));
datagramChannel.join(group, networkInterface);
datagramChannel.receive(buffer);

This code is located in a Callable, and I create up to 12 Callables (hence 12 threads) to retrieve multicast packets with different data from 12 different ports. It only reads information which is broadcast on the network every 3-8 seconds.

When polling the 12 ports continuously (wait for the information, get the information, and so on), it eats 100% of one of my CPUs.

Profiling the execution with JVisualVM, I see that 90% of the execution time is devoted to java.nio.channels.DatagramChannel#receive(), and more precisely com.sun.nio.ch.DatagramChannelImpl#receiveIntoBuffer().

  1. I don't understand why blocking mode eats so much CPU.

  2. I have read some articles about using Selectors instead of blocking mode, but I don't really see why a while (true) loop with a Selector would consume less CPU than a blocking channel.


Answer 1:


The problem is that you are using NIO without a Selector.

NIO without a Selector is OK to use, but then Channel.receive() is constantly trying to read, which shows up as high CPU usage for one thread.

There are two solutions:

  • Use a Selector to detect when there is something to read, and call channel.receive() only when the Selector indicates that data is available.
  • Use java.net.DatagramSocket/DatagramPacket to send and receive in blocking mode.
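The first option can be sketched roughly as follows. This is a minimal, self-contained example (the class name, the loopback address, and the send-to-myself step are mine, added so it runs without a real multicast sender); a real reader would register one channel per port with a single Selector and loop over select(). The key point is that the thread sleeps inside select() until the OS reports a readable datagram, instead of spinning on receive():

```java
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.DatagramChannel;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.nio.charset.StandardCharsets;

public class SelectorReceive {
    // Waits in select() until a datagram is readable, then reads it once.
    static String receiveOnce() throws Exception {
        try (Selector selector = Selector.open();
             DatagramChannel receiver = DatagramChannel.open();
             DatagramChannel sender = DatagramChannel.open()) {
            receiver.bind(new InetSocketAddress("127.0.0.1", 0));
            receiver.configureBlocking(false);   // register() requires non-blocking mode
            receiver.register(selector, SelectionKey.OP_READ);

            // Send ourselves a datagram so the sketch is self-contained.
            sender.send(ByteBuffer.wrap("hello".getBytes(StandardCharsets.UTF_8)),
                    receiver.getLocalAddress());

            selector.select();                   // blocks here -- no busy loop, no CPU spin
            ByteBuffer buf = ByteBuffer.allocate(1500);
            receiver.receive(buf);               // guaranteed not to return null now
            buf.flip();
            return StandardCharsets.UTF_8.decode(buf).toString();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(receiveOnce());
    }
}
```

With one Selector multiplexing all 12 channels, a single thread can service all 12 ports instead of dedicating a thread per port.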

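The second option, plain blocking java.net.DatagramSocket, could look like the sketch below (again self-contained: the class name and the loopback send-to-myself are mine). DatagramSocket.receive() truly parks the thread in the kernel until a packet arrives, so the per-port-thread design from the question works without burning CPU:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

public class BlockingReceive {
    // Receives one datagram using the classic blocking java.net API.
    static String receiveOnce() throws Exception {
        try (DatagramSocket receiver =
                     new DatagramSocket(0, InetAddress.getLoopbackAddress());
             DatagramSocket sender = new DatagramSocket()) {
            // Send ourselves a datagram so the sketch is self-contained.
            byte[] payload = "hello".getBytes(StandardCharsets.UTF_8);
            sender.send(new DatagramPacket(payload, payload.length,
                    receiver.getLocalSocketAddress()));

            byte[] buf = new byte[1500];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            receiver.receive(packet);            // blocks in the kernel; no CPU spin
            return new String(packet.getData(), 0, packet.getLength(),
                    StandardCharsets.UTF_8);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(receiveOnce());
    }
}
```

For multicast specifically, java.net.MulticastSocket (a subclass of DatagramSocket) provides the equivalent of join() via joinGroup().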

Source: https://stackoverflow.com/questions/21826013/datagramchannel-blocking-mode-and-cpu
