What is solution for downloading arbitrarily large files to Cloud Storage using Google App Engine php55 or php7?

Submitted by 六月ゝ 毕业季﹏ on 2019-12-11 05:32:12

Question


I have a Google App Engine php55 service that periodically checks a public website and downloads a file. This file is typically small (<1MB). My simple app is based on the following:

<?php
$strSource = 'https://example.com/file.zip';

$strBucket = 'bucket-1234';
$strDirectory = '/path/to/file/'; // Google Cloud Storage directory
$strName = 'file.zip';
$strDestination = 'gs://' . $strBucket . '.appspot.com' . $strDirectory . $strName;

copy($strSource,$strDestination);
?>

I found that this file is occasionally larger (over the 32MB response size limit). How do I write this script so it handles the file whether it is 1MB or 100MB?

I see people recommend "Blobstore," which is something I do not have experience with. Even if I understood that solution (which seems to be focused on a very different use case), it does not appear to be available for PHP at all. Am I missing something?


Answer 1:


I would recommend using a Compute Engine instance, since GAE has the 32MB limit on response size. I found this post, where the user checks whether new files are available and, if so, uploads them directly to GCS.

To do this, and as specified in the documentation, you should create an instance in GCE, then install and configure the client library for the language you are going to use (since you mentioned in your post that you were using PHP, all the links below refer to this language, but keep in mind that you can also choose another language such as C++, Java, Python...).

You can find here an example in PHP about how to upload an object to GCS:

use Google\Cloud\Storage\StorageClient;

function upload_object($bucketName, $objectName, $source)
{
    $storage = new StorageClient();
    // Open the source as a stream so the whole file is never held in memory.
    $file = fopen($source, 'r');
    $bucket = $storage->bucket($bucketName);
    $object = $bucket->upload($file, [
        'name' => $objectName
    ]);
    printf('Uploaded %s to gs://%s/%s' . PHP_EOL, basename($source), $bucketName, $objectName);
}
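Since upload_object() streams from an already-open handle, the remaining question is getting the source file onto the instance without exhausting memory. Here is a minimal sketch (not from the original answer) that copies the remote file to a local temp file in fixed-size chunks; the function name and chunk size are my own choices:

```php
<?php
// Fetch a remote (or local) file to a temp file in fixed-size chunks,
// so memory use stays flat regardless of how large the file is.
function download_to_temp($url, $chunkBytes = 1048576)
{
    $src = fopen($url, 'rb');
    if ($src === false) {
        throw new RuntimeException("Cannot open $url");
    }
    $tmpPath = tempnam(sys_get_temp_dir(), 'dl_');
    $dst = fopen($tmpPath, 'wb');
    while (!feof($src)) {
        $chunk = fread($src, $chunkBytes);
        if ($chunk === false) {
            break;
        }
        fwrite($dst, $chunk);
    }
    fclose($src);
    fclose($dst);
    return $tmpPath; // pass this path to upload_object() as $source
}
```

With allow_url_fopen enabled (the default), the same function accepts an https:// URL as the source, so the download-then-upload pipeline never buffers more than one chunk at a time.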

You can also find other samples in the Google Cloud Platform GitHub repositories.

Hope this helps!




Answer 2:


Use Google Storage Transfer Service (STS). It can be called via the Google Cloud SDK from your existing App Engine application, and will transfer the files directly from S3 to GCS without hitting any of your App Engine limits. Based on your description, I believe it meets your requirements:

  • No data transfer limit
  • Minimal code changes
  • "Serverless" and simple to configure

STS has some additional benefits:

  • Zero runtime cost. That is, App Engine simply makes the STS API call to start the transfer job which is handled by STS, so you're not billed for the time GAE would normally use to download/upload the files itself.
  • You can go even more serverless by invoking STS from a Cloud Function triggered by Cloud Scheduler. I doubt you'd save much on costs, but it sure would be a neat setup.

The GCP docs have a guide on How to set up a Transfer from Amazon S3 to Cloud Storage.
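For the S3-to-GCS case the guide describes, a transfer job boils down to a small JSON spec submitted to the STS `transferJobs.create` REST method (or through a client library). The sketch below uses placeholder project, bucket, and credential values; adjust them to your own setup:

```json
{
  "description": "Nightly S3 to GCS sync",
  "projectId": "my-project-id",
  "status": "ENABLED",
  "transferSpec": {
    "awsS3DataSource": {
      "bucketName": "my-source-s3-bucket",
      "awsAccessKey": {
        "accessKeyId": "AWS_ACCESS_KEY_ID",
        "secretAccessKey": "AWS_SECRET_ACCESS_KEY"
      }
    },
    "gcsDataSink": {
      "bucketName": "my-dest-gcs-bucket"
    }
  },
  "schedule": {
    "scheduleStartDate": { "year": 2019, "month": 12, "day": 11 }
  }
}
```

Once created, STS runs the job on its own infrastructure on the given schedule, which is what keeps the transfer entirely outside App Engine's request and response limits.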


Additional notes:

  • You can try out Storage Transfer Service directly in the GCP Console and see whether it's right for you before committing to any major code or architecture changes.


Source: https://stackoverflow.com/questions/58458282/what-is-solution-for-downloading-arbitrarily-large-files-to-cloud-storage-using
