amazon-s3

Get file's signed URL from amazon s3 using Filesystem Laravel 5.2

Submitted by 让人想犯罪 on 2020-05-09 18:01:46
Question: I'm looking for a good solution to get a signed URL from Amazon S3. I have a working version, but it doesn't use Laravel: private function getUrl () { $distribution = $_SERVER["AWS_CDN_URL"]; $cf = Amazon::getCFClient(); $url = $cf->getSignedUrl(array( 'url' => $distribution . self::AWS_PATH.rawurlencode($this->fileName), 'expires' => time() + (session_cache_expire() * 60))); return $url; } I don't know if this is the best way to do it with Laravel, considering it has an entire file system …
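For reference, the same presigned-URL idea expressed in Python with boto3 (a minimal sketch, not the Laravel Filesystem answer the question asks for; the bucket and key names are placeholders):

    import boto3

    # Sketch: generate a time-limited S3 presigned URL with boto3.
    # "my-bucket" and the key are placeholders, not values from the question.
    s3 = boto3.client("s3")
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": "my-bucket", "Key": "uploads/example.pdf"},
        ExpiresIn=3600,  # seconds, analogous to the 'expires' option above
    )
    print(url)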

Amazon S3 static hosting with Namecheap DNS - How to correctly route non-www prefixed url

Submitted by 僤鯓⒐⒋嵵緔 on 2020-05-09 17:50:06
Question: I have been reading other posts to try to get to the bottom of this issue, but I need some clarification. All of my domain requests hit my Amazon S3 bucket perfectly when entering www.FOO.com/MyDirectory, but if I enter FOO.com/MyDirectory without the www it fails. What is the proper way to make URL requests without the www route correctly to the same Amazon S3 bucket? Any tips would help greatly. Thanks. Answer 1: I finally came to the following solution: I am using …
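One common way to solve this (offered as a sketch, since the quoted answer is cut off and may differ) is to create a second bucket named after the bare domain that redirects every request to the www host, then point the bare-domain DNS record at that bucket's S3 website endpoint. With boto3 the redirect configuration looks roughly like this; the bucket and host names are placeholders:

    import boto3

    s3 = boto3.client("s3")

    # Hypothetical names: the "foo.com" bucket redirects everything to www.foo.com.
    s3.put_bucket_website(
        Bucket="foo.com",
        WebsiteConfiguration={
            "RedirectAllRequestsTo": {
                "HostName": "www.foo.com",
                "Protocol": "http",
            }
        },
    )

On the Namecheap side, the bare domain then needs an apex record (ALIAS or URL redirect) pointing at that bucket's S3 website endpoint.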

How to make a ReactJS app active/visible on AWS

Submitted by £可爱£侵袭症+ on 2020-05-09 07:38:06
Question: I developed a ReactJS project (front end) on AWS whose RESTful API comes from Heroku; the front end and back end are completely separate. I have successfully uploaded my files to S3 and activated my CloudFront distribution, but I can't figure out what is wrong because I can't see my React app when I hit the URL generated from the domain name. I have checked this SO answer, but it doesn't help. Any help will be greatly appreciated. Answer 1: Firstly, it is …
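A frequent cause of a blank CloudFront/S3 site (offered here as a guess, since the answer excerpt is cut off) is a missing index document on the bucket or a missing default root object on the distribution. A minimal boto3 sketch of the bucket-side settings, with a placeholder bucket name:

    import boto3

    s3 = boto3.client("s3")

    # Placeholder bucket name; index.html is the React build's entry point.
    s3.put_bucket_website(
        Bucket="my-react-app-bucket",
        WebsiteConfiguration={
            "IndexDocument": {"Suffix": "index.html"},
            # Sending errors back to index.html lets the SPA handle its own routes.
            "ErrorDocument": {"Key": "index.html"},
        },
    )

If CloudFront sits in front of the bucket, its Default Root Object should also be set to index.html.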

using role instead of keys to get signed url in s3 but nothing returned and no errors

Submitted by 柔情痞子 on 2020-05-09 06:03:51
Question: If I use an access key it works fine, but I am trying to get rid of the access key and use a role instead; once I remove the access key, what I get in return is www.aws.amazon.com. const AWS = require('aws-sdk'); const s3 = new AWS.S3(); const params = {Bucket: config.bucket, Expires: config.time, Key}; const url = s3.getSignedUrl('getObject', params); console.log('The URL is', url); I even made sure my role is set up right by going into my EC2 instance and running the CLI command aws s3 presign s3:/ …
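One commonly reported cause with the Node SDK is that getSignedUrl is synchronous, so if the role credentials have not been resolved yet the returned URL is incomplete; the callback form of getSignedUrl waits for credentials. For comparison, boto3 resolves instance-role credentials through its default provider chain before signing, so the equivalent Python sketch looks like this (placeholder bucket and key; note that a URL signed with temporary role credentials also stops working once those credentials expire):

    import boto3

    # No keys in code or environment: boto3 falls back to the EC2 instance
    # profile (the attached IAM role) via its default credential chain.
    s3 = boto3.client("s3")
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": "my-bucket", "Key": "reports/example.csv"},  # placeholders
        ExpiresIn=900,
    )
    print("The URL is", url)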

Codepipeline: Insufficient permissions Unable to access the artifact with Amazon S3 object key

Submitted by 余生颓废 on 2020-05-06 19:55:27
Question: Hello, I created a CodePipeline project with the following configuration: source code in S3, pulled from Bitbucket; build with CodeBuild, generating a Docker image and storing it in an Amazon ECS repository; deployment provider Amazon ECS. The whole process works OK until it tries to deploy; for some reason I am getting the following error during deployment: Insufficient permissions Unable to access the artifact with Amazon S3 object key 'FailedScanSubscriber/MyAppBuild/Wmu5kFy' located in …
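The excerpt ends before any answer, but errors like this usually mean the role used by the deploy stage cannot read the pipeline's artifact bucket (or cannot use the KMS key that encrypts it). A hedged sketch of the kind of inline policy that is typically missing, attached with boto3; the role name, policy name, and bucket are placeholders:

    import json
    import boto3

    iam = boto3.client("iam")

    # Placeholder values: substitute the real service role and artifact bucket.
    artifact_bucket = "codepipeline-us-east-1-123456789012"
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:GetObjectVersion", "s3:GetBucketLocation"],
                "Resource": [
                    f"arn:aws:s3:::{artifact_bucket}",
                    f"arn:aws:s3:::{artifact_bucket}/*",
                ],
            }
        ],
    }

    iam.put_role_policy(
        RoleName="my-ecs-deploy-role",           # placeholder
        PolicyName="AllowPipelineArtifactRead",  # placeholder
        PolicyDocument=json.dumps(policy),
    )

If the artifact bucket is encrypted with a customer-managed KMS key, the same role also needs kms:Decrypt on that key.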

Setting up media file access on AWS S3

Submitted by 五迷三道 on 2020-04-30 11:25:06
Question: I'm using the boto3 and django-storages libraries to upload media files for my Django project. storage_backends.py: class PrivateMediaStorage(S3Boto3Storage): location = settings.AWS_STORAGE_LOCATION default_acl = 'private' file_overwrite = False custom_domain = False class PublicStaticStorage(S3Boto3Storage): location = settings.AWS_PUBLIC_STATIC_LOCATION settings.py: AWS_STORAGE_LOCATION = 'media/private' AWS_LOCATION = 'static' AWS_PUBLIC_STATIC_LOCATION = 'static/' DEFAULT_FILE_STORAGE = 'path.to …
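For context, a hedged sketch of the settings that usually accompany these backends; the names follow django-storages conventions, and the truncated DEFAULT_FILE_STORAGE value is replaced with a hypothetical module path:

    # settings.py -- minimal sketch, assuming django-storages with S3Boto3Storage.
    AWS_STORAGE_BUCKET_NAME = "my-bucket"   # placeholder
    AWS_S3_REGION_NAME = "us-east-1"        # placeholder
    AWS_STORAGE_LOCATION = "media/private"
    AWS_PUBLIC_STATIC_LOCATION = "static/"

    # Private media: URLs are signed and expire.
    AWS_QUERYSTRING_AUTH = True
    AWS_QUERYSTRING_EXPIRE = 3600  # seconds

    # Hypothetical dotted paths to the two classes shown above.
    DEFAULT_FILE_STORAGE = "myproject.storage_backends.PrivateMediaStorage"
    STATICFILES_STORAGE = "myproject.storage_backends.PublicStaticStorage"

With default_acl = 'private' and query-string auth enabled, FileField URLs served by PrivateMediaStorage come back as time-limited signed URLs.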

AWS CLI access to S3 on Linux machine

Submitted by 北战南征 on 2020-04-30 11:19:11
Question: I want to set up a recursive sync from a Linux machine (Fedora) to an AWS S3 bucket. I am logged into Linux as root and have an AWS access key and secret associated with a specific AWS user, "Lisa". I have installed aws-cli and s3cmd and attempted to configure both. I have verified that the ~/.aws/config and ~/.aws/credentials files both have a default user and a "Lisa" user with access key and secret pairs. I receive errors stating that access is denied and that the access key and secret pair were not found. I have …
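The excerpt stops mid-sentence, but a common first debugging step (a sketch, not the asker's exact setup) is to confirm which identity each configured profile actually resolves to:

    import boto3

    # "Lisa" is the profile name from the question; the default profile is
    # checked the same way. This only verifies credential resolution;
    # bucket permissions are a separate IAM question.
    for profile in ("default", "Lisa"):
        session = boto3.Session(profile_name=profile)
        identity = session.client("sts").get_caller_identity()
        print(profile, "->", identity["Arn"])

If this fails, the problem lies in ~/.aws/config and ~/.aws/credentials (when running as root, the files under /root/.aws are the ones consulted) rather than in the bucket policy, and the sync command needs an explicit --profile Lisa if the keys were configured under that profile.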

Save Image generated with Reportlab in my MEDIA folder (in Amazon S3)

Submitted by 你离开我真会死。 on 2020-04-30 07:42:13
Question: I used this library to generate barcode images (http://kennethngedo.wordpress.com/2014/02/07/how-to-generate-barcode-in-django-using-reportlab/). Everything works fine and the image is generated correctly, BUT the image is created in a folder outside the project, and since I'm using Heroku for production, I can't access the image. I'm using this Django structure (http://django-skel.readthedocs.org/en/latest/), specially adapted to work on Heroku with Amazon S3. Do you guys know how …
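The usual fix (sketched here under the assumption that an S3 backend is already Django's default storage, as in the django-skel setup) is to render the barcode into memory and hand it to default_storage instead of writing to a local folder; the function name and target path below are hypothetical:

    from django.core.files.base import ContentFile
    from django.core.files.storage import default_storage
    from reportlab.graphics import renderPM
    from reportlab.graphics.barcode import createBarcodeDrawing


    def save_barcode_to_media(value, filename="barcodes/example.png"):
        """Render a Code128 barcode and save it through the default storage
        (S3 when DEFAULT_FILE_STORAGE points at an S3 backend)."""
        drawing = createBarcodeDrawing("Code128", value=value)
        png_bytes = renderPM.drawToString(drawing, fmt="PNG")
        return default_storage.save(filename, ContentFile(png_bytes))

Because the bytes go through default_storage, the saved file lands under MEDIA on S3 rather than on Heroku's ephemeral local disk.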

Stream data from S3 bucket to redshift periodically

Submitted by 与世无争的帅哥 on 2020-04-30 07:34:22
Question: I have some data stored in S3. I need to clone/copy this data periodically from S3 to a Redshift cluster. For a bulk copy I can use the COPY command to copy from S3 to Redshift. Is there any similarly trivial way to copy data from S3 to Redshift periodically? Thanks. Answer 1: Try using AWS Data Pipeline, which has various templates for moving data from one AWS service to another. The "Load data from S3 into Redshift" template copies data from an Amazon S3 folder into a Redshift table. You can load the …
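Besides Data Pipeline, another common pattern (a sketch, not part of the quoted answer) is to run the same COPY command on a schedule, for example from a cron job or a scheduled Lambda, via the Redshift Data API; the cluster, database, table, S3 prefix, and IAM role ARN below are placeholders:

    import boto3

    client = boto3.client("redshift-data")

    # Placeholders throughout: adjust cluster, database, user, table, prefix, role.
    client.execute_statement(
        ClusterIdentifier="my-cluster",
        Database="dev",
        DbUser="awsuser",
        Sql=(
            "COPY analytics.events "
            "FROM 's3://my-bucket/exports/' "
            "IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole' "
            "FORMAT AS JSON 'auto';"
        ),
    )

An EventBridge schedule (or cron) triggering this call gives a periodic S3-to-Redshift load without extra infrastructure.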