setSoTimeout on a client socket doesn't affect the socket


Question


I have a Java application with three threads, each of which opens a socket and connects to a server on a different port. I set SO_TIMEOUT on each of these sockets after the connection to the server is established. After that, the threads block waiting on read(). Only one of the threads times out after 20 seconds (the timeout I set); the other two ignore the timeout. Is it possible that the TCP layer handles only one timeout at a time? Is there any other explanation?
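
For reference, a minimal sketch of the setup described above (the host name and ports are placeholders, not taken from the actual application):

    import java.io.InputStream;
    import java.net.Socket;

    public class ReaderThread implements Runnable {
        private final String host;
        private final int port;

        ReaderThread(String host, int port) {
            this.host = host;
            this.port = port;
        }

        @Override
        public void run() {
            try (Socket socket = new Socket(host, port)) {
                // Timeout is set after the connection is established, as in the question.
                socket.setSoTimeout(20_000);      // 20 seconds
                InputStream in = socket.getInputStream();
                byte[] buf = new byte[1024];
                int n = in.read(buf);             // should throw SocketTimeoutException after 20 s of no data
                System.out.println("Read " + n + " bytes on port " + port);
            } catch (java.net.SocketTimeoutException e) {
                System.out.println("Timed out on port " + port);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        public static void main(String[] args) {
            for (int port : new int[] {9001, 9002, 9003}) {
                new Thread(new ReaderThread("example.com", port)).start();
            }
        }
    }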


Answer 1:


The documentation says:

The option must be enabled prior to entering the blocking operation to have effect.

Maybe you should set it before the connection to the server is established, or at least before calling read() on the socket. It's hard to say for sure without seeing the code, though.
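
For example, a sketch of that ordering, with a placeholder host and port, creating the socket unconnected so the timeout can be set before connect() and read():

    import java.io.InputStream;
    import java.net.InetSocketAddress;
    import java.net.Socket;
    import java.net.SocketTimeoutException;

    public class TimeoutBeforeRead {
        public static void main(String[] args) throws Exception {
            Socket socket = new Socket();         // create unconnected, so the option can be set first
            socket.setSoTimeout(20_000);          // enable SO_TIMEOUT before any blocking operation
            socket.connect(new InetSocketAddress("example.com", 9001), 20_000);
            try (InputStream in = socket.getInputStream()) {
                int b = in.read();                // should throw SocketTimeoutException after 20 s of no data
                System.out.println("First byte: " + b);
            } catch (SocketTimeoutException e) {
                System.out.println("read() timed out as expected");
            } finally {
                socket.close();
            }
        }
    }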




Answer 2:


I've had several problems in the past dealing with SO_TIMEOUT on Windows. Setting this option is "supposed" to configure the underlying socket implementation, which can be OS-dependent and can conflict with registry settings and the like.

My advice is not to rely on SO_TIMEOUT to force an exception on a timeout. Either use non-blocking I/O, or check that bytes are available() before you read().
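
For instance, a rough sketch of the second approach, enforcing the deadline in application code with available() (the helper name and polling interval are arbitrary):

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.Socket;
    import java.net.SocketTimeoutException;

    public class PollingRead {
        // Check available() and enforce the deadline ourselves instead of relying on SO_TIMEOUT.
        static int readWithDeadline(Socket socket, byte[] buf, long timeoutMillis)
                throws IOException, InterruptedException {
            InputStream in = socket.getInputStream();
            long deadline = System.currentTimeMillis() + timeoutMillis;
            while (System.currentTimeMillis() < deadline) {
                if (in.available() > 0) {
                    return in.read(buf);          // data is buffered, so read() returns promptly
                }
                Thread.sleep(100);                // arbitrary polling interval
            }
            throw new SocketTimeoutException("no data within " + timeoutMillis + " ms");
        }
    }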



Source: https://stackoverflow.com/questions/1306119/setsotimeout-on-a-client-socket-doesnt-affect-the-socket
