amazon-s3

AWS Bucket Policy Error: Policy has invalid action

Submitted by 梦想与她 on 2020-01-14 11:52:52
Question: I have a very basic goal: to share all the content of my bucket with a list of specific users, read-only. This used to work with a tool called s3cmd. All I needed to do was add a user (identified by email) to the Access Control List with Read permission, and they could list or download data smoothly. But recently this suddenly stopped working: the system denies any attempt to access my bucket. I then started thinking of editing the bucket policy. Here is the draft of my policy,
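For reference, a minimal sketch of the kind of read-only bucket policy this question is driving at, applied with boto3. The bucket name and user ARN are placeholders; note that bucket policies identify principals by IAM/account ARN, whereas granting by email address only works through ACLs.

```python
import json
import boto3

# Hypothetical bucket name and user ARN, shown only to illustrate the shape
# of a read-only policy. "Policy has invalid action" often means an action
# is paired with the wrong resource type, so both ARN forms are listed.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "ReadOnlyForSpecificUsers",
        "Effect": "Allow",
        "Principal": {"AWS": ["arn:aws:iam::111122223333:user/alice"]},
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::my-bucket",    # s3:ListBucket applies to the bucket
            "arn:aws:s3:::my-bucket/*",  # s3:GetObject applies to objects
        ],
    }],
}

boto3.client("s3").put_bucket_policy(Bucket="my-bucket", Policy=json.dumps(policy))
```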

“The AWS Access Key Id you provided does not exist in our records.” when trying to use AWS CLI

Submitted by 随声附和 on 2020-01-14 10:24:09
Question: I'm trying to access my S3 bucket through the CLI. I have everything set up: a credentials file in the directory where I'm doing this CLI work, the $AWS_SECRET_ACCESS_KEY and $AWS_ACCESS_KEY_ID environment variables with the right values in them, and I've also set them manually using aws configure. The credentials file contains the following (all values blanked out):

[sts]
aws_access_key_id = ASIAXXXXXXXXXXXXXXXX
aws_secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXX
aws_security_token =
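A minimal sketch of one likely fix, assuming the credentials file really does hold a profile named [sts]: a non-default profile is only picked up when selected explicitly (profile_name in boto3, or --profile sts / the AWS_PROFILE variable for the CLI). Temporary STS credentials also need the token stored under aws_session_token; the older aws_security_token key is not read by every tool.

```python
import boto3

# Explicitly select the [sts] profile instead of relying on [default],
# which is what the CLI uses when no profile is specified.
session = boto3.Session(profile_name="sts")

# List buckets to verify the credentials resolve correctly.
for bucket in session.client("s3").list_buckets()["Buckets"]:
    print(bucket["Name"])
```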

Django amazon s3 SuspiciousOperation

Submitted by 徘徊边缘 on 2020-01-14 09:52:09
Question: When I try accessing a certain image on S3 from my browser, everything works fine. But when Python does it, I get a SuspiciousOperation error. My static folder on S3 is public, so I really have no idea where this is coming from.

Publication.objects.get(id=4039).cover.url
Traceback (most recent call last):
  File "<console>", line 1, in <module>
  File "/home/vagrant/.pyenv/versions/blook/lib/python2.7/site-packages/django/db/models/fields/files.py", line 64, in _get_url
    return self.storage
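A minimal sketch of one common cause, as an assumption rather than a confirmed diagnosis: SuspiciousOperation from FieldFile.url usually means the field is still served by Django's default FileSystemStorage, which rejects stored names that escape MEDIA_ROOT (for example a full S3 URL saved in the database). Pointing the default storage at django-storages (the boto flavour, matching the Python 2.7 era of the traceback) avoids that path check; the credential values are placeholders.

```python
# settings.py -- hypothetical configuration sketch, assuming django-storages
# is installed. With this in place, FieldFile.url is built by the S3 backend
# instead of FileSystemStorage's MEDIA_ROOT-relative path logic.
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = '...'            # placeholder
AWS_SECRET_ACCESS_KEY = '...'        # placeholder
AWS_STORAGE_BUCKET_NAME = 'my-bucket'
```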

Configuring environment variables for static web site on AWS S3

Submitted by 谁说胖子不能爱 on 2020-01-14 08:30:10
Question: I am trying to set up a simple static Angular website on S3 per the info: http://docs.aws.amazon.com/gettingstarted/latest/swh/website-hosting-intro.html I want to send email via a form that needs SendGrid API keys. Obviously, I want to use environment variables for this to avoid having keys in code. How do you set up environment variables in S3? I looked into the aws-cli tool but it only shows examples of what appear to be AWS-specific environment variables. Is there somewhere in AWS/S3
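For context: S3 static hosting only serves files; there is no server process to read environment variables at request time, so a purely static site cannot hide a SendGrid key at all. A common pattern is to inject non-secret config at build/deploy time and keep secrets behind a small backend (e.g. API Gateway plus Lambda). A hypothetical deploy-time sketch, with the variable and file names invented for illustration:

```python
import os

# Read a value from the build machine's environment and write it into a
# file the static site loads. Anything written this way ships to the
# browser, so it must not contain secrets -- a SendGrid API key belongs
# behind a backend endpoint, not in an S3-hosted file.
config_js = "window.APP_CONFIG = { apiBase: '%s' };" % os.environ["API_BASE_URL"]
with open("dist/config.js", "w") as f:
    f.write(config_js)
```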

How to limit the consuming rate from a topic?

Submitted by ♀尐吖头ヾ on 2020-01-14 08:09:03
Question: Has anyone else solved the following problem? I have an SNS topic filled with events from S3, and a Lambda function subscribed to this topic. When thousands of events are put on the topic, the Lambda function is throttled because it exceeds the concurrency limit. I don't want to request a limit increase for concurrent executions; I would rather decrease concurrent consumption from the topic, but I couldn't find information on how to do that. Thanks.

Answer 1: A couple of options regarding SNS:
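A minimal sketch of one such option: cap the function's reserved concurrency so it drains the topic more slowly (the function name is a placeholder). Deliveries rejected by the throttle are retried by SNS only for a limited time, so putting an SQS queue between SNS and Lambda is the more robust way to absorb bursts.

```python
import boto3

# Reserve at most 10 concurrent executions for this function; further
# SNS deliveries are throttled and retried rather than run in parallel.
boto3.client("lambda").put_function_concurrency(
    FunctionName="my-s3-event-handler",  # hypothetical function name
    ReservedConcurrentExecutions=10,
)
```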

What's the best way to check if an S3 object exists?

Submitted by 懵懂的女人 on 2020-01-14 07:32:27
Question: Currently I make a GetObjectMetaDataRequest, and if the GetObjectMetaDataResponse throws an exception it means the object doesn't exist. Is there a better way to check whether the file exists without downloading it?

Answer 1: You can use the S3FileInfo class and its Exists method; it will help you check whether a file exists without downloading it. See the example below; I used AWSSDK 3.1.6 .NET (3.5):

public static bool ExistsFile() { BasicAWSCredentials basicCredentials = new
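The question's code is .NET; the same idea in Python/boto3, for comparison: a HEAD request fetches only metadata, so nothing is downloaded. The bucket and key names are placeholders.

```python
import boto3
from botocore.exceptions import ClientError

def object_exists(bucket, key):
    s3 = boto3.client("s3")
    try:
        s3.head_object(Bucket=bucket, Key=key)  # metadata only, no body
        return True
    except ClientError as e:
        if e.response["Error"]["Code"] == "404":
            return False
        raise  # other errors (e.g. 403) should not be read as "missing"

print(object_exists("my-bucket", "path/to/file.csv"))
```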

How to assign the access control list (ACL) when writing a CSV file to AWS in pyspark (2.2.0)?

Submitted by 余生颓废 on 2020-01-14 06:27:27
Question: I know I can output my Spark dataframe to AWS S3 as a CSV file with

df.repartition(1).write.csv('s3://my-bucket-name/df_name')

My question is: is there an easy way to set the Access Control List (ACL) of this file to 'bucket-owner-full-control' when writing it to S3 using pyspark?

Answer 1: Don't know about the EMR S3 connector; in the ASF S3A connector you set the option fs.s3a.acl.default when you open the connection. You can't set it on a file-by-file basis.

Answer 2: Access Control List (ACL) can
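A minimal sketch of Answer 1's suggestion, assuming the S3A connector (s3a:// paths) rather than EMR's own connector; the ACL applies connection-wide, not per file, and the bucket name is a placeholder.

```python
from pyspark.sql import SparkSession

# Pass the S3A canned-ACL option through to the Hadoop configuration
# via the spark.hadoop. prefix when building the session.
spark = (
    SparkSession.builder
    .config("spark.hadoop.fs.s3a.acl.default", "BucketOwnerFullControl")
    .getOrCreate()
)

df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])
df.repartition(1).write.csv("s3a://my-bucket-name/df_name")
```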

How to make an AWS S3 file publicly accessible using the AWS S3 SDK's createPresignedPost method?

Submitted by 杀马特。学长 韩版系。学妹 on 2020-01-14 06:17:07
Question: I have a use case where the AWS S3 bucket should stay private by default, but certain objects should be made public when uploading to S3. I am using the following code to sign the S3 URL with an ACL setting of public-read:

module.exports.generateS3PostSignedUrl = async (bucketName, bucketKey, objectExpiry) => {
  let s3Client = new AWS.S3({
    region: 'some-region'
  });
  let signingParams = {
    Expires: objectExpiry,
    Bucket: bucketName,
    Fields: {
      key: bucketKey,
    },
    Conditions: [
      ['acl', 'public-read']
    ],
    ACL:
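The question uses the JavaScript SDK's createPresignedPost; the equivalent idea in boto3, for comparison (bucket and key are placeholders). The acl must appear both as a field and as a condition in the signed policy, and the bucket must permit public ACLs for public-read to take effect.

```python
import boto3

# Generate a presigned POST whose policy pins the acl field, so the
# uploader can (and must) submit acl=public-read with the form.
presigned = boto3.client("s3").generate_presigned_post(
    Bucket="my-bucket",
    Key="uploads/photo.png",
    Fields={"acl": "public-read"},
    Conditions=[{"acl": "public-read"}],
    ExpiresIn=3600,
)
print(presigned["url"], presigned["fields"])
```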