I'm talking about this surprisingly simple implementation of rand() from the C standard:
static unsigned long int next = 1;

int rand(void)  /* RAND_MAX assumed to be 32767 */
{
    next = next * 1103515245 + 12345;
    return (unsigned int)(next / 65536) % 32768;
}
Early implementations tended to concern themselves with bits and bytes, playing tricks with the registers to minimize the bytes of code (before lines of code, there were bytes).
I have found only one reasonable clue, quoted below:
The output of this generator isn't very random. If we use the sample generator listed above, then the sequence of 16 key bytes will be highly non-random. For instance, it turns out that the low bit of each successive output of rand() will alternate (e.g., 0, 1, 0, 1, 0, 1, ...). Do you see why? The low bit of x * 1103515245 is the same as the low bit of x, and then adding 12345 just flips the low bit. Thus the low bit alternates. This narrows down the set of possible keys to only 2^113 possibilities; much less than the desired value of 2^128.
http://inst.eecs.berkeley.edu/~cs161/fa08/Notes/random.pdf
And two reasonable answers:
Improving a poor random number generator (1976) by Carter Bays and S. D. Durham
http://en.wikipedia.org/wiki/TRNG