PHP and the million array baby

一个人的身影 2021-02-19 19:37

Imagine you have the following array of integers:

array(1, 2, 1, 0, 0, 1, 2, 4, 3, 2, [...] );

The integers go on up to one million entries; instead of being hard-coded, they are pre-generated and stored in a file, and loading that many entries into a native PHP array is both slow and memory-hungry.

4 Answers
  •  迷失自我
    2021-02-19 20:14

    Say the integers are all 0-15. Then you can store 2 per byte:

    To run: php ints.php > ints.ser
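    The ints.php generator itself isn't shown in the answer; a minimal sketch of what it might look like (the random data and exact packing layout, high nibble for even indexes, are assumptions consistent with the loader below):

    ```php
    <?php
    // Hypothetical ints.php: emit 500,000 bytes, each packing two
    // random integers in the 0-15 range -- the even index in the
    // high nibble, the odd index in the low nibble.
    $out = '';
    for ($i = 0; $i < 500000; ++$i)
      $out .= chr((mt_rand(0, 15) << 4) | mt_rand(0, 15));
    echo $out;
    ```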

    Now you have a file with a 500,000-byte string containing 1,000,000 random integers from 0 to 15.

    To load:

    <?php
    // Read the packed bytes back in.
    $data = file_get_contents('ints.ser');
    
    // Each byte holds two values: the even index in the high nibble,
    // the odd index in the low nibble.
    function get_data_at($data, $i)
    {
      $data = ord($data[$i >> 1]);
      return ($i & 1) ? $data & 0xf : $data >> 4;
    }
    
    for ($i = 0; $i < 1000; ++$i)
      echo get_data_at($data, $i), "\n";
    

    The loading time on my machine is about 0.002 seconds.

    Of course this might not be directly applicable to your situation, but it will be much faster than a bloated PHP array of a million entries. Quite frankly, having an array that large in PHP is never the proper solution.

    I'm not saying this is the proper solution either, but it definitely is workable if it fits your parameters.

    Note that if your array had integers in the 0-255 range, you could get rid of the packing and just access the data as ord($data[$i]). In that case, your string would be 1M bytes long.

    Finally, according to the documentation of file_get_contents(), PHP will memory-map the file where supported. If so, your best performance would be to dump raw bytes to a file and use it like:

    $ints = file_get_contents('ints.raw');
    echo ord($ints[25]);
    

    This assumes that ints.raw is exactly one million bytes long.
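    A writer for that raw format isn't shown either; a minimal sketch (the file name and the one-byte-per-integer layout come from the surrounding text, the random values are an assumption) could be:

    ```php
    <?php
    // Hypothetical generator for ints.raw: one byte per integer,
    // so every value must fit in the 0-255 range.
    $out = '';
    for ($i = 0; $i < 1000000; ++$i)
      $out .= chr(mt_rand(0, 255));
    file_put_contents('ints.raw', $out); // exactly one million bytes
    ```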
