amazon-s3

How to get the size of a folder consisting of files in AWS using Java

╄→尐↘猪︶ㄣ submitted on 2020-01-30 13:03:32
Question: Suppose we have a bucket named "bucket1" that contains a folder named 'new folder', with these files inside:

    new folder/a1.pdf              --> 2 MB
    new folder/a2.pdf              --> 2 MB
    new folder/new folder2/b.pdf   --> 3 MB

When we call amazonS3Client.listObjects("bucket1", "new folder"), it returns the list of files and folders under that prefix, and each S3 object summary carries a 'size' parameter. I can loop through all those objects and sum their sizes to get the folder size, but it is a heavy operation. Will you please …
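The usual answer is that S3 has no real folders and no API call that returns a prefix's total size directly, so paging through the listing and summing object sizes is the standard approach; ListObjectsV2 returns up to 1,000 keys per request, which keeps the loop manageable. Below is a minimal sketch using the AWS SDK for Java v1, with the bucket and prefix taken from the question; for very large buckets, S3 Inventory or the CloudWatch BucketSizeBytes metric (bucket-level only) avoid listing altogether.

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.model.ListObjectsV2Request;
    import com.amazonaws.services.s3.model.ListObjectsV2Result;
    import com.amazonaws.services.s3.model.S3ObjectSummary;

    public class FolderSize {
        public static void main(String[] args) {
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
            // Note the trailing slash in the prefix: without it, sibling keys
            // that merely start with "new folder" would be counted as well.
            ListObjectsV2Request req = new ListObjectsV2Request()
                    .withBucketName("bucket1")
                    .withPrefix("new folder/");
            long totalBytes = 0;
            ListObjectsV2Result result;
            do {
                result = s3.listObjectsV2(req);
                for (S3ObjectSummary summary : result.getObjectSummaries()) {
                    totalBytes += summary.getSize();  // object size in bytes
                }
                // Page through: at most 1,000 keys come back per request.
                req.setContinuationToken(result.getNextContinuationToken());
            } while (result.isTruncated());
            System.out.printf("'new folder' holds %.2f MB%n",
                    totalBytes / (1024.0 * 1024.0));
        }
    }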

Can I restrict access to S3 from a specific subdomain only?

自作多情 submitted on 2020-01-30 10:47:56
Question: I have several subdomains that will each have their own S3 bucket. Is there a way to configure an S3 bucket so that it allows GET requests only from a specific subdomain?

Answer 1: Use an aws:Referer condition. From Restricting Access to a Specific HTTP Referrer:

    {
        "Version": "2012-10-17",
        "Id": "http referer policy example",
        "Statement": [
            {
                "Sid": "Allow get requests originated from www.example.com and example.com",
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                "Resource": "arn:aws:s3:::examplebucket/*",
                "Condition": {
                    "StringLike": { "aws:Referer": ["http://www.example.com/*", "http://example.com/*"] }
                }
            }
        ]
    }
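For the per-subdomain case in the question, the Condition block is where the restriction goes; a hedged fragment, with sub.example.com standing in for your subdomain:

    "Condition": {
        "StringLike": { "aws:Referer": ["https://sub.example.com/*"] }
    }

One caveat worth stating plainly: the Referer header is supplied by the client and trivially spoofed, so this deters casual hotlinking rather than providing real access control.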

AWS service to verify data integrity of file in S3 via checksum?

无人久伴 submitted on 2020-01-30 08:05:07
Question: One method of ensuring a file in S3 is what it claims to be is to download it, compute its checksum, and compare the result against the checksum you were expecting. Does AWS provide any service that allows this to happen without the user first downloading the file? (Ideally a simple request/URL that returns the checksum of an S3 object, so that it can be verified before the file is downloaded.) What I've tried so far: I can think of a DIY solution along the lines of: create an API endpoint …
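One answer that fits the "simple request" wish: a HEAD request on the object returns its ETag without transferring the body, and for objects uploaded in a single part without SSE-KMS the ETag is the hex MD5 of the content (multipart ETags look like "<md5>-<parts>" and are not a plain MD5). A minimal sketch with the AWS SDK for Java v1; the bucket, key, and expected checksum are placeholders:

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.model.ObjectMetadata;

    public class VerifyBeforeDownload {
        public static void main(String[] args) {
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
            // getObjectMetadata issues a HEAD request: metadata only, no body.
            ObjectMetadata meta = s3.getObjectMetadata("my-bucket", "reports/q1.pdf");
            String etag = meta.getETag();
            String expectedMd5 = "9e107d9d372bb6826bd81d3542a419d6"; // placeholder
            if (etag.equalsIgnoreCase(expectedMd5)) {
                System.out.println("ETag matches expected MD5; proceed with download.");
            } else {
                System.out.println("Mismatch or multipart ETag: " + etag);
            }
        }
    }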

Amazon CloudFront Alternate Domain Names

醉酒当歌 submitted on 2020-01-30 05:14:54
Question: I'm totally new to Amazon and all of its services. I have set up Amazon S3 and created a CloudFront distribution, and what I want is to give this CloudFront distribution a custom domain name. I created a subdomain on my server and pointed its CNAME record at the CloudFront distribution's address, but I cannot access my content on S3. Can anyone walk me through setting up alternate domain names? Answer 1: To give a custom domain name to an Amazon CloudFront distribution: provide …
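The symptom described (DNS points at the distribution but requests fail) is typically what happens when the subdomain has not also been added to the distribution's Alternate Domain Names (CNAMEs) setting; CloudFront rejects requests for Host names it has not been told about, and HTTPS additionally needs an ACM certificate covering the name. Assuming that setting is in place, the DNS half is a single record; a hedged zone-file sketch with placeholder names:

    ; cdn.example.com is the subdomain, d1234abcd5678e.cloudfront.net is the
    ; distribution's assigned domain name (both placeholders)
    cdn.example.com.  300  IN  CNAME  d1234abcd5678e.cloudfront.net.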

S3 cross-account access with the default KMS key

落爺英雄遲暮 submitted on 2020-01-25 11:11:11
Question: I have an S3 bucket in my account with SSE enabled using the default AWS KMS key. I want to give another account read access to this bucket. I followed this guide to grant access: https://aws.amazon.com/premiumsupport/knowledge-center/cross-account-access-denied-error-s3/ I am using aws s3 ls <s3://bucket_name> and aws s3 cp <path to s3 object> . to download the object. I tried granting cross-account access to a bucket without SSE enabled, and I was successfully able to …
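The detail that usually decides this one: the default aws/s3 key is an AWS-managed key whose key policy cannot be edited, so there is no way to let another account call kms:Decrypt with it; cross-account reads of SSE-KMS objects generally require re-encrypting with a customer-managed key and then granting the other account in that key's policy. A hedged statement for such a key policy, with the account ID as a placeholder:

    {
        "Sid": "Allow account B to decrypt objects in this bucket",
        "Effect": "Allow",
        "Principal": { "AWS": "arn:aws:iam::222233334444:root" },
        "Action": ["kms:Decrypt", "kms:DescribeKey"],
        "Resource": "*"
    }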

fs.s3 configuration with two S3 accounts on EMR

若如初见. submitted on 2020-01-25 10:10:23
Question: I have a pipeline using Lambda and EMR in which I read a CSV from an S3 bucket in account A and write Parquet to an S3 bucket in account B. I created the EMR cluster in account B, and it has access to S3 in account B. I cannot add access to account A's S3 bucket to EMR_EC2_DefaultRole (that account is the enterprise-wide data store), so I use an access key and secret key to reach account A's bucket; these are obtained through a Cognito token. METHOD 1: I am using the fs.s3 protocol to read the CSV from S3 in account A and to write to S3 in account B. …
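One pattern that keeps the two credential sets from colliding: the s3a connector (Hadoop 2.8+) supports per-bucket configuration, so account A's keys can be scoped to account A's bucket while everything else keeps using the instance profile from account B. A hedged Spark-on-EMR sketch with placeholder bucket names and keys; if the Cognito-issued credentials are temporary, a session token (fs.s3a.bucket.<name>.session.token) and the temporary-credentials provider would be needed as well:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.spark.sql.SparkSession;

    public class CrossAccountCopy {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("cross-account-copy").getOrCreate();
            Configuration conf = spark.sparkContext().hadoopConfiguration();

            // Scoped credentials: these apply ONLY to account-a-bucket; all
            // other buckets fall back to the EMR instance profile (account B).
            conf.set("fs.s3a.bucket.account-a-bucket.access.key", "AKIA...placeholder");
            conf.set("fs.s3a.bucket.account-a-bucket.secret.key", "placeholder-secret");

            spark.read().option("header", "true")
                 .csv("s3a://account-a-bucket/input/data.csv")
                 .write()
                 .parquet("s3a://account-b-bucket/output/");
        }
    }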

String/INT96 to Datetime - Amazon Athena/SQL - DDL/DML

喜欢而已 submitted on 2020-01-25 09:06:08
Question: I have my data hosted in an S3 bucket in Parquet format, and I am trying to access it using Athena. I can successfully query the hosted table, but I noticed something odd when accessing the column "createdon". createdon is a timestamp column, and it shows up as one in the Athena table, but when I query it with the SQL below:

    SELECT createdon FROM "uat-raw"."opportunity" LIMIT 10;

I get unexpected output:

    +51140-02-21 19:00:00.000
    +51140-02-21 21:46:40.000
    +51140-02-22 00…
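Those dates are a recognizable signature: a 2019-era epoch-millisecond value (about 1.55e12) read as epoch seconds lands near year 51140, which suggests the column's numbers are milliseconds being interpreted as seconds. A hedged fix at query time, assuming that diagnosis holds (to_unixtime and from_unixtime are standard Presto/Athena functions):

    -- undo the seconds interpretation, then re-apply it at millisecond scale
    SELECT from_unixtime(to_unixtime(createdon) / 1000) AS createdon_fixed
    FROM "uat-raw"."opportunity"
    LIMIT 10;

A more permanent fix would be to declare the column as bigint in the table DDL and convert with from_unixtime(createdon / 1000) instead.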

AWS Batch job getting Access Denied on S3 despite user role

陌路散爱 submitted on 2020-01-25 08:44:10
Question: I am deploying my first batch job on AWS. When I run my Docker image on an EC2 instance, the script called by the job runs fine; I have assigned an IAM role to that instance to allow S3 access. But when I run the same script as a job on AWS Batch, it fails with Access Denied errors on S3. This is despite the fact that in the job definition I assign an IAM role (created for an Elastic Container Service task) that has full S3 access. If I launch my batch job with a command that does not …
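Two things commonly break this exact setup: the role attached as jobRoleArn must have a trust policy allowing ecs-tasks.amazonaws.com to assume it, and the AWS SDK/CLI inside the container must be recent enough to fetch task-role credentials from the container metadata endpoint (a very old CLI baked into the image will not know how to use them). A hedged sketch of where the role goes in a Batch job definition; all names and ARNs are placeholders:

    {
        "jobDefinitionName": "my-s3-job",
        "type": "container",
        "containerProperties": {
            "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-image:latest",
            "vcpus": 1,
            "memory": 2048,
            "jobRoleArn": "arn:aws:iam::123456789012:role/MyBatchS3TaskRole",
            "command": ["python", "my_script.py"]
        }
    }

Registered with something like: aws batch register-job-definition --cli-input-json file://jobdef.json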