PHPLeague Flysystem: read and write to a large file on a server

Submitted by 我与影子孤独终老i on 2019-12-07 18:26:04

Question


I am using Flysystem with an IronMQ queue and am running a DB query that will return ~1.8 million records, processed 5,000 at a time. Here is the error message I am receiving once the file reaches 50+ MB:

PHP Fatal error:  Allowed memory size of ########## bytes exhausted

Here are the steps I would like to take:

1) Get the data

2) Turn it into a CSV appropriate string (i.e. implode(',', $dataArray) . "\r\n")

3) Get the file from the server (in this case S3)

4) Read that file's contents, append this new string to it, and re-write that content to the S3 file

Here is a brief run down of the code I have:

public function fire($job, $data)
{
    // First set the headers and write the initial file to server
    $this->filesystem->write($this->filename, implode(',', $this->setHeaders($parameters)) . "\r\n", [
        'visibility' => 'public',
        'mimetype' => 'text/csv',
    ]);

    // Loop to get new sets of data
    $offset = 0;

    while ($this->exportResult) {
        $this->exportResult = $this->getData($parameters, $offset);

        if ($this->exportResult) {
            $this->writeToFile($this->exportResult);

            $offset += 5000;
        }
    }
}

private function writeToFile($contentToBeAdded = '')
{
    $content = $this->filesystem->read($this->filename);

    // Append new data
    $content .= $contentToBeAdded;

    $this->filesystem->update($this->filename, $content, [
        'visibility' => 'public'
    ]);
}
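A small sketch of why `writeToFile()` above exhausts memory: every batch re-reads the entire file into a PHP string before appending, so peak memory grows with the full file size. This is a simulation with plain strings, no Flysystem involved:

```php
// Simulates the read-append-update cycle: each iteration holds the
// whole "file" in memory, so peak usage tracks total file size,
// not batch size.
$file = '';
$batch = str_repeat('x', 1000);   // one 1 KB batch stands in for 5,000 rows

for ($i = 0; $i < 5; $i++) {
    $content = $file;             // read(): entire file into a string
    $content .= $batch;           // append one batch
    $file = $content;             // update(): rewrite everything
}
// After N batches, $content holds N * batch_size bytes in memory.
```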

I'm assuming this is not the most efficient approach? I am going off of these docs: PHPLeague Flysystem

If anyone can point me in a more appropriate direction, that would be awesome!


Answer 1:


If you are working with S3, I would use the AWS SDK for PHP directly to solve this particular problem. Appending to a file is actually very easy using the SDK's S3 stream wrapper, and doesn't force you to read the entire file into memory.

$s3 = \Aws\S3\S3Client::factory($clientConfig);
$s3->registerStreamWrapper();

// Open the object in append mode through the s3:// protocol
$appendHandle = fopen("s3://{$bucket}/{$key}", 'a');
fwrite($appendHandle, $data);
fclose($appendHandle);
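Applied to the question's chunked export, the stream-wrapper approach might look like the sketch below. A temp file stands in for the `s3://{$bucket}/{$key}` URI (with the wrapper registered, only the `fopen()` target would change), and `$getData` is a hypothetical stand-in for the asker's `getData()`:

```php
// Sketch: chunked CSV export through a single append-mode handle,
// so no batch ever requires reading the file back into memory.
// $getData is assumed to return an array of rows (empty when done).
function exportCsv(callable $getData, string $target, array $headers): void
{
    $handle = fopen($target, 'a');               // append mode: write-only
    fwrite($handle, implode(',', $headers) . "\r\n");

    $offset = 0;
    while ($rows = $getData($offset)) {          // fetch next batch of rows
        foreach ($rows as $row) {
            fwrite($handle, implode(',', $row) . "\r\n");
        }
        $offset += 5000;                         // asker's batch size
    }
    fclose($handle);
}
```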



Answer 2:


Flysystem supports reading, writing, and updating via streams.

Please check latest API https://flysystem.thephpleague.com/api/

$stream = fopen('/path/to/database.backup', 'r+');
$filesystem->writeStream('backups/'.strftime('%G-%m-%d').'.backup', $stream);

// writeStream also accepts a config array, e.g. to set visibility
// (requires: use League\Flysystem\AdapterInterface;)
$filesystem->writeStream('backups/'.strftime('%G-%m-%d').'.backup', $stream, [
    'visibility' => AdapterInterface::VISIBILITY_PRIVATE
]);

if (is_resource($stream)) {
    fclose($stream);
}

// Or update a file with stream contents (use a fresh, still-open
// stream here; the one above has already been closed)
$filesystem->updateStream('backups/'.strftime('%G-%m-%d').'.backup', $stream);

// Retrieve a read-stream
$stream = $filesystem->readStream('something/is/here.ext');
$contents = stream_get_contents($stream);
fclose($stream);

// Create or overwrite using a stream.
$putStream = tmpfile();
fwrite($putStream, $contents);
rewind($putStream);
$filesystem->putStream('somewhere/here.txt', $putStream);

if (is_resource($putStream)) {
    fclose($putStream);
}
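One memory caveat with the `readStream` snippet above: `stream_get_contents()` still pulls the whole object into a PHP string. If the goal is only to move bytes from one place to another, `stream_copy_to_stream()` keeps the copy chunked. A self-contained sketch, where `php://temp` stands in for the stream a real call to `$filesystem->readStream()` would return:

```php
// Sketch: copy one stream to another in chunks instead of buffering
// the whole contents in a string. In real use, $source would come
// from $filesystem->readStream(...).
$source = fopen('php://temp', 'r+');
fwrite($source, str_repeat("row,data\r\n", 3));
rewind($source);

$dest = fopen('php://temp', 'r+');
stream_copy_to_stream($source, $dest);   // internal chunked copy
fclose($source);

rewind($dest);                           // $dest now holds the same bytes
```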


Source: https://stackoverflow.com/questions/26020515/phpleague-flysystem-read-and-write-to-large-file-on-server
