packet-loss

Why is Android dropping TCP packets? (Occurs in Android 5.x, not in 4.x)

ぃ、小莉子 submitted on 2021-01-27 06:48:03
Question: I have an Android smartphone connecting via Wi-Fi to an embedded AP. I am sniffing the Wi-Fi traffic with a laptop running Tshark on Linux. I am transferring small (234-byte) TCP packets 5 times every 100 ms, followed by 500 ms with no data. Periodically, packets are ignored, forcing retransmission. Some level of packet retransmission is expected when transferring data over TCP sockets, but this is excessive, especially because the packets are received by the sniffer without problem (i.e., …

Forcing packet loss

邮差的信 submitted on 2020-01-02 05:46:14
Question: For testing purposes, to determine how a protocol implementation behaves in the presence of packet loss, I would like to force packet loss on one of my network devices. Specifically, I would like to be able to tune the packet loss anywhere between 0% and 100%. I have a little experience with iptables, and it seems to me I should be able to achieve this with it, but I haven't managed to. Achieving 100% packet loss is not a problem, though ;). Any ideas on how to do this? Answer 1: Look into …
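For context, a common way to get tunable loss on Linux is the netem queueing discipline, e.g. "tc qdisc add dev eth0 root netem loss 25%" (with eth0 standing in for the actual interface). Where kernel-level shaping isn't an option, the loss rate can also be simulated in the application itself. A minimal, hypothetical Java sketch of a per-packet drop decision (class name and structure are illustrative, not from any library):

```java
import java.util.Random;

// Application-level tunable packet loss: decide, per packet, whether to
// silently drop it. lossRate ranges from 0.0 (no loss) to 1.0 (100% loss).
public class LossGate {
    private final double lossRate;
    private final Random rng;

    public LossGate(double lossRate, long seed) {
        this.lossRate = lossRate;
        this.rng = new Random(seed); // seeded for reproducible test runs
    }

    /** Returns true if the packet should be forwarded, false if dropped. */
    public boolean shouldForward() {
        return rng.nextDouble() >= lossRate; // nextDouble() is in [0, 1)
    }
}
```

A forwarder or send wrapper would call shouldForward() before each send, giving a loss rate tunable anywhere between 0% and 100%.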

The most reliable and efficient udp packet size?

浪尽此生 submitted on 2019-12-29 03:27:25
Question: Would sending lots of small packets by UDP take more resources (CPU, compression by zlib, etc.)? I read here that sending one big packet of ~65 KB by UDP would probably fail, so I thought that sending lots of smaller packets would succeed more often, but then comes the computational overhead of using more processing power (or at least that's what I'm assuming). The question is basically this: what is the best scenario for sending the maximum number of successful packets while keeping computation down …
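The usual rule of thumb is to stay under the path MTU so datagrams are never fragmented by IP; a single lost fragment costs the whole datagram. Assuming a typical 1500-byte Ethernet MTU and plain IPv4 (both assumptions about the path, not universal constants), the arithmetic is:

```java
// Back-of-envelope UDP payload sizing: subtract the IPv4 and UDP header
// sizes from the link MTU to get the largest payload that avoids IP
// fragmentation on that link.
public class UdpSizing {
    public static final int ETHERNET_MTU = 1500; // assumed typical Ethernet MTU
    public static final int IPV4_HEADER = 20;    // IPv4 header without options
    public static final int UDP_HEADER = 8;      // fixed UDP header size

    public static int maxUnfragmentedPayload() {
        return ETHERNET_MTU - IPV4_HEADER - UDP_HEADER; // 1472 bytes
    }
}
```

Payloads at or below this size travel as one IP packet per datagram; anything larger gets fragmented and becomes more fragile, which is the trade-off the question is circling.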

Android UDP packet loss

本小妞迷上赌 submitted on 2019-12-24 11:28:12
Question: So I'm writing an app that sends 5 KB packets out 15 times a second over UDP. I understand I will lose some packets, but I seem to be losing all my packets after the first couple of seconds. Even if I slow it down to send the 5 KB packets out once every 10 seconds, I still lose them. What would cause this? Answer 1: It's not surprising that they are all dropped. A payload bigger than 512 bytes is unlikely to make it out of the network. It depends on the MTU of your router and how much bandwidth …
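One way around the fragmentation problem the answer alludes to is to split the 5 KB payload into datagram-sized chunks before sending, one send per chunk. A sketch, assuming a conservative 1400-byte chunk size (an illustrative choice that leaves headroom under a 1500-byte MTU, not a measured value):

```java
import java.util.ArrayList;
import java.util.List;

// Split a large payload into chunks small enough to each fit in a single
// datagram without IP fragmentation. The receiver reassembles in order
// (sequence numbering is omitted here for brevity).
public class Chunker {
    public static List<byte[]> split(byte[] data, int chunkSize) {
        List<byte[]> chunks = new ArrayList<>();
        for (int off = 0; off < data.length; off += chunkSize) {
            int len = Math.min(chunkSize, data.length - off);
            byte[] chunk = new byte[len];
            System.arraycopy(data, off, chunk, 0, len);
            chunks.add(chunk);
        }
        return chunks;
    }
}
```

A 5120-byte payload splits into three 1400-byte chunks plus a 920-byte tail, each of which survives or dies independently on the wire.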

Packet Loss and Packet duplication

夙愿已清 submitted on 2019-12-23 08:56:14
Question: I am trying to find out the difference between packet loss and packet duplication problems. Does anyone know what 'packet duplication' is all about? Is it the same as retransmitting packets when a loss is detected in TCP? Answer 1: No. In TCP, delivery of "packets" is reliable (I think the term "data" fits better here, since TCP is a stream-oriented protocol). Packet loss and duplication are problems related to unreliable, datagram-oriented protocols like UDP. In UDP, when you send a …
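To make the distinction concrete: a UDP application can detect both conditions itself by stamping each datagram with a sequence number. Gaps in the sequence indicate loss; repeats indicate duplicates. A hypothetical receiver-side sketch (names and structure are illustrative):

```java
import java.util.HashSet;
import java.util.Set;

// Receiver-side bookkeeping over sender-assigned sequence numbers:
// distinguishes lost datagrams (gaps) from duplicated ones (repeats).
public class SeqTracker {
    private final Set<Long> seen = new HashSet<>();
    private long highest = -1;
    private long duplicates = 0;

    /** Record an arriving sequence number; returns false for a duplicate. */
    public boolean record(long seq) {
        if (!seen.add(seq)) {
            duplicates++;
            return false;
        }
        if (seq > highest) highest = seq;
        return true;
    }

    /** Count of sequence numbers missing below the highest seen so far. */
    public long missing() {
        return (highest + 1) - seen.size();
    }

    public long duplicateCount() {
        return duplicates;
    }
}
```

This is exactly the bookkeeping TCP does internally; with UDP, the application has to do it itself if it cares.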

Android DatagramSocket receive buffer size

丶灬走出姿态 submitted on 2019-12-11 02:25:52
Question: I am trying to set/increase the receive buffer size of a datagram socket. I would like to do this because I am experiencing some random packet loss when sending data between a PC and an Android device on the same local network, and I would like to test whether increasing the buffer size has any effect in reducing this UDP packet loss. I am trying to set the buffer size using the code below: DatagramSocket socket = new DatagramSocket(); socket.setReceiveBufferSize(655360); and then …
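One caveat worth testing for here: setReceiveBufferSize() is only a hint, and the OS may silently clamp the request (on Linux, to net.core.rmem_max), so it pays to read the value back with getReceiveBufferSize() after setting it. A minimal sketch:

```java
import java.net.DatagramSocket;
import java.net.SocketException;

// Request a receive buffer size and report what the OS actually granted.
// The granted size is platform-dependent and may be smaller (or, on
// Linux, doubled) relative to the request.
public class BufferCheck {
    public static int requestReceiveBuffer(int requested) {
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.setReceiveBufferSize(requested);
            return socket.getReceiveBufferSize(); // the size actually in effect
        } catch (SocketException e) {
            throw new RuntimeException(e);
        }
    }
}
```

If the granted value comes back far below 655360, the buffer is being capped by the OS and raising the system-wide limit (not the Java call) is what would need to change.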

What are the chances of losing a UDP packet?

为君一笑 submitted on 2019-12-09 14:38:30
Question: Okay, so I am programming for my networking course and I have to implement a project in Java using UDP. We are implementing an HTTP server and client along with a 'gremlin' function that corrupts packets with a specified probability. The HTTP server has to break a large file into multiple segments at the application layer to be sent to the client over UDP, and the client must reassemble the received segments at the application layer. What I am wondering, however, is: if UDP is by definition …
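The 'gremlin' part can be as small as a function that, with probability p, mangles the payload before handing it to the socket. A hypothetical sketch; the one-byte-flip corruption policy is an assumption, and the assignment may specify something different:

```java
import java.util.Random;

// "Gremlin" corrupter: with probability p, flip all bits of one randomly
// chosen byte in a copy of the packet. The original array is not modified.
public class Gremlin {
    private final double p;
    private final Random rng;

    public Gremlin(double p, long seed) {
        this.p = p;
        this.rng = new Random(seed); // seeded for reproducible tests
    }

    public byte[] maybeCorrupt(byte[] packet) {
        byte[] out = packet.clone();
        if (out.length > 0 && rng.nextDouble() < p) {
            int i = rng.nextInt(out.length);
            out[i] ^= (byte) 0xFF; // flipping every bit guarantees a change
        }
        return out;
    }
}
```

Corruption like this is exactly why such projects also require a checksum at the application layer: UDP's own checksum is checked by the receiving stack, but a gremlin applied above the socket bypasses it.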

How to simulate network packet loss when streaming video?

杀马特。学长 韩版系。学妹 submitted on 2019-12-07 23:27:21
Question: Please help me solve this tricky problem; it has been making me suffer for almost a week. How do I make streaming video suffer packet loss? Switch: Pica8 3290. Computer: Core i7 2600, 8 GB. Link: 100 Mbps. Streaming video: RTP (1080p, 4K). I've already tried iperf, iperf3, and Packeth to generate UDP packets. However, these three tools seem to measure the residual capacity of the link first and then send only as much traffic as fits that capacity. E.g.: with no video streaming, iperf sends almost 100 Mbps; with video streaming, iperf sends only about 70 Mbps. Thus, these packet generators won't help me to …

UDP packet drops by linux kernel

人走茶凉 submitted on 2019-11-30 15:32:33
I have a server that sends UDP packets via multicast and a number of clients listening to those multicast packets. Each packet has a fixed size of 1040 bytes, and the total amount of data sent by the server is 3 GB. My environment is as follows: 1 Gbit Ethernet network; 40 nodes, 1 sender node and 39 receiver nodes. All nodes have the same hardware configuration: 2 AMD CPUs, each with 2 cores @ 2.6 GHz. On the client side, one thread reads the socket and puts the data into a queue; an additional thread pops the data from the queue and does some lightweight processing. During the …
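The reader-thread-plus-queue structure described above can be sketched as below. Using a bounded queue makes overload visible at the application layer (offer() fails) instead of letting the kernel's socket buffer overflow silently; the capacity and the drop-on-full policy are illustrative choices, not from the original setup:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Hand-off between a socket-reading thread and a processing thread.
// The reader must never block on processing, or the kernel's receive
// buffer fills and the kernel starts dropping datagrams.
public class QueuedReceiver {
    private final BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(4096);

    /** Called by the socket-reading thread. Drops (returns false) when the
     *  queue is full rather than blocking the reader: a deliberate trade-off
     *  that keeps drops countable in user space. */
    public boolean enqueue(byte[] packet) {
        return queue.offer(packet);
    }

    /** Called by the processing thread; blocks until a packet arrives. */
    public byte[] take() throws InterruptedException {
        return queue.take();
    }

    /** Non-blocking variant; returns null when the queue is empty. */
    public byte[] poll() {
        return queue.poll();
    }

    public int backlog() {
        return queue.size();
    }
}
```

If backlog() keeps growing while the kernel still drops packets, the reader thread itself is too slow; if backlog() stays near zero during drops, the bottleneck is below the socket (buffer sizes, NIC, or interrupt handling).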