Stream remote file with PHP and Guzzle


Question


My application should stream a large file, served remotely, back to the browser. Currently the file is served from a local NodeJS server.

I am using a 25 GB VirtualBox disk image as the test file, just to be sure that it is not stored in memory while streaming. This is the code I'm struggling with:

    require __DIR__ . '/vendor/autoload.php';

    $client = new \GuzzleHttp\Client();
    logger('==== START REQUEST ====');
    $res = $client->request('GET', 'http://localhost:3002/', [
      // Forward the upstream headers to the browser as soon as they arrive.
      'on_headers' => function (\Psr\Http\Message\ResponseInterface $response) {
        $length = $response->getHeaderLine('Content-Length');
        logger('Content length is: ' . $length);
        header('Content-Description: File Transfer');
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename="testfile.zip"');
        header('Expires: 0');
        header('Cache-Control: must-revalidate');
        header('Pragma: public');
        header('Content-Length: ' . $length);

      }
    ]);

    $body = $res->getBody();
    $read = 0;
    // Copy the body to the client in 8 KB chunks.
    while (!$body->eof()) {
      logger("Reading chunk. " . $read);
      $chunk = $body->read(8192);
      $read += strlen($chunk);
      echo $chunk;
    }
    logger('Read ' . $read . ' bytes');
    logger("==== END REQUEST ====\n\n");

    function logger($string) {
      $myfile = fopen("log.txt", "a") or die ('Unable to open log file');
      fwrite($myfile, "[" . date("d/m/Y H:i:s") . "] " . $string . "\n");
      fclose($myfile);
    }

Even though $body = $res->getBody(); should return a stream, the request quickly fills the disk with swap data, meaning Guzzle is trying to buffer the whole response before streaming it back to the client. This is not the expected behavior. What am I missing?


Answer 1:


You have to specify the stream and sink options, like this:

    $res = $client->request('GET', 'http://localhost:3002/', [
        'stream' => true,
        'sink' => STDOUT, // Default output stream.
        'on_headers' => ...
    ]);

After these additions you will be able to stream the response chunk by chunk, without any extra code to copy from the response body stream to STDOUT (the manual read/echo loop above is no longer needed).
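Putting it together, here is a minimal sketch of the full script with both options set. The endpoint and filename come from the question; note that the STDOUT constant is only defined in the CLI SAPI, so under php-fpm or mod_php the equivalent php://output stream is used instead:

    require __DIR__ . '/vendor/autoload.php';

    $client = new \GuzzleHttp\Client();
    $client->request('GET', 'http://localhost:3002/', [
        // Don't buffer the whole body before returning the response.
        'stream' => true,
        // Write each received chunk straight to the client. STDOUT only
        // exists in the CLI SAPI; php://output is its web-SAPI equivalent.
        'sink' => fopen('php://output', 'w'),
        'on_headers' => function (\Psr\Http\Message\ResponseInterface $response) {
            header('Content-Type: application/octet-stream');
            header('Content-Disposition: attachment; filename="testfile.zip"');
            header('Content-Length: ' . $response->getHeaderLine('Content-Length'));
        },
    ]);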

But usually you don't want to do this, because each active download ties up one PHP process (php-fpm or Apache's mod_php) for the entire transfer.

If you just want to serve protected files, try an "internal redirect": the X-Accel-Redirect header for nginx or X-Sendfile for Apache. You will get the same behavior, but with far less resource usage (thanks to nginx's highly optimized event loop). For configuration details you can read the official documentation or, of course, other SO questions (like this one).
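As a rough sketch of the nginx variant (the /protected/ location name and the file path are assumptions for illustration; they must match an internal location in your nginx config):

    // PHP only performs authorization and emits headers; nginx serves
    // the file. Assumes an nginx location such as:
    //
    //   location /protected/ {
    //       internal;                  # not reachable from outside
    //       alias /var/www/files/;
    //   }
    //
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="testfile.zip"');
    // nginx intercepts this header and streams the file itself.
    header('X-Accel-Redirect: /protected/testfile.zip');

The PHP process finishes immediately after sending the headers, and nginx handles the actual file transfer.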



Source: https://stackoverflow.com/questions/38702570/stream-remote-file-with-php-and-guzzle
