Why Random loss symbol does not work well

Submitted by 浪子不回头ぞ on 2019-12-25 18:44:42

Question


I am trying to simulate random loss on a given bit stream. Assume I have a bit stream such as

10 01 10 11 00

Now I want to write code that implements the random loss. The function takes the original bit stream (plus its size) and the loss percentage as inputs, and returns the output bit stream:

#include <stdlib.h>
#include <time.h>

int* bitloss(int* orbit, int size_orbit, float loss_percent)
{
    srand(time(NULL));                 /* seed the random number generator */
    int* out_bitstream = (int*)malloc(sizeof(int) * size_orbit);
    double randval;

    for (int i = 0; i < size_orbit; i++)
    {
        /* uniform random value in [0, 1] */
        randval = (double)rand() / (double)RAND_MAX;
        if (randval < loss_percent)
            out_bitstream[i] = -1;     /* this bit is lost */
        else
            out_bitstream[i] = orbit[i];
    }
    return out_bitstream;
}

This code changes an original bit to -1 when the random value falls below loss_percent; I call a -1 bit a lost bit. So suppose loss_percent equals 20%: I expect to lose 2 bits out of the 10 original bits. But when I run it, sometimes I lose 0 bits, sometimes 4 bits, and sometimes 2 bits. It is not stable. How can I edit my code to get a stable loss? For example, if I want to lose 20%, the number of -1 bits should be exactly 2. Thank you so much.


Answer 1:


Assuming each bit has a probability p of being lost, and that bit losses are independent (this may not be the case in, for example, some fading channels where bit losses are more likely to occur in bursts), the number of bits lost out of N bits follows a binomial distribution.
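For reference, under this model the probability of losing exactly k of the N bits is

P(k) = C(N, k) * p^k * (1 - p)^(N - k)

where C(N, k) is the binomial coefficient.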

Thus, for 10 bits and a loss rate of 20%, you would get the following distribution:

[figure: binomial distribution of the number of lost bits, N = 10, p = 0.2]

Similarly, for 1000 bits and the same loss rate of 20%, you would get the following distribution:

[figure: binomial distribution of the number of lost bits, N = 1000, p = 0.2]

Note that as the total number of bits gets larger, the binomial distribution approaches a Gaussian distribution with mean Np and variance Np(1-p). Specifically, for the case N = 1000 and p = 0.2, overlaying the Gaussian distribution on the binomial distribution gives:

[figure: Gaussian overlaid on the binomial distribution, N = 1000, p = 0.2]
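Plugging in the numbers for that case gives a feel for the spread:

mu = Np = 1000 * 0.2 = 200,    sigma = sqrt(Np(1-p)) = sqrt(160) ≈ 12.6

so roughly 95% of runs lose somewhere between about 175 and 225 bits.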

As you can see, it is a pretty good approximation.
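If you want to see this spread yourself, here is a minimal stand-alone sketch in C (the constants and the trial count are just illustrative) that repeats the per-bit drop experiment many times and counts how often each number of losses occurs:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Illustrative sketch: repeat the random-drop experiment many times and
   tally how many of the N bits are lost in each trial. The histogram of
   counts approximates a binomial(N, p) distribution. */
int main(void)
{
    const int N = 10;             /* bits per trial */
    const double p = 0.2;         /* loss probability per bit */
    const int trials = 100000;    /* number of repetitions */
    int histogram[11] = {0};      /* counts of 0..N lost bits (N = 10) */

    srand((unsigned)time(NULL));

    for (int t = 0; t < trials; t++) {
        int lost = 0;
        for (int i = 0; i < N; i++) {
            double r = (double)rand() / (double)RAND_MAX;
            if (r < p)
                lost++;
        }
        histogram[lost]++;
    }

    for (int k = 0; k <= N; k++)
        printf("%2d lost bits: %6.3f%%\n", k, 100.0 * histogram[k] / trials);

    return 0;
}

Running it, the most common outcome is indeed 2 lost bits, but 0, 1, 3 or 4 losses also show up regularly, which is exactly the variation described in the question.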




Answer 2:


That's the problem with random numbers: they're random. If it always dropped 2 packets, it wouldn't be random.

If you want to always lose 2 packets out of 10, then just randomly pick those packets. Something like...

int firstLoss, secondLoss;

firstLoss = rand() % 10;        /* index of the first dropped packet */

do {
    secondLoss = rand() % 10;   /* re-roll until it differs from the first */
} while (secondLoss == firstLoss);

We need the while loop (or a similar kind of 'tweak') to avoid selecting the same packet twice...
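If you need this for an arbitrary size and loss percentage rather than exactly 2 out of 10, one possible generalization (a sketch only; fixed_bitloss and its signature are my own choice, not from the answer) is to pick exactly k = round(loss_percent * size) distinct indices with a partial Fisher-Yates shuffle, which avoids the re-rolling loop entirely:

#include <stdlib.h>

/* Sketch of a fixed-count variant: mark exactly k bits as lost,
   where k = round(loss_percent * size). Call srand() once elsewhere. */
int* fixed_bitloss(const int* orbit, int size, float loss_percent)
{
    int* out = (int*)malloc(sizeof(int) * size);
    int* idx = (int*)malloc(sizeof(int) * size);
    int k = (int)(loss_percent * size + 0.5f);   /* exact number of losses */

    for (int i = 0; i < size; i++) {
        out[i] = orbit[i];
        idx[i] = i;
    }

    /* Partial Fisher-Yates shuffle: move k distinct random indices to the
       front of idx, marking each chosen position as lost.
       (rand() % n has a slight modulo bias; fine for a sketch.) */
    for (int i = 0; i < k; i++) {
        int j = i + rand() % (size - i);
        int tmp = idx[i]; idx[i] = idx[j]; idx[j] = tmp;
        out[idx[i]] = -1;
    }

    free(idx);
    return out;
}

Called as fixed_bitloss(orbit, 10, 0.2f), this always marks exactly 2 bits as -1, which is the "stable" behaviour the question asks for.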



Source: https://stackoverflow.com/questions/24079138/why-random-loss-symbol-does-not-work-well
