amazon-s3

DevPay and Mfa are mutually exclusive authorization methods

Posted by ↘锁芯ラ on 2020-01-24 10:24:25
Question: I'm trying to enable MFA Delete on my S3 bucket with the AWS CLI, using the following command: aws s3api put-bucket-versioning --bucket <my-bucket-name> --versioning-configuration '{"MFADelete":"Enabled","Status":"Enabled"}' --mfa 'arn:aws:iam::<code-found-at-iam-page>:mfa/root-account-mfa-device <my-google-authenticator-code>' but the response I get is this: An error occurred (InvalidRequest) when calling the PutBucketVersioning operation: DevPay and Mfa are mutually exclusive authorization methods
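
No answer is attached to this entry. For what it's worth, a common cause of this error is running the command with temporary (session-token) credentials: the x-amz-security-token header is shared between DevPay and temporary credentials, so combining it with the x-amz-mfa header makes S3 reject the request, and MFA Delete can in any case only be configured by the bucket owner's root account with its long-term access keys. Below is a minimal sketch of the same call using the AWS SDK for Java v1; the device serial number, token code, and bucket name are placeholders:

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.BucketVersioningConfiguration;
import com.amazonaws.services.s3.model.MultiFactorAuthentication;
import com.amazonaws.services.s3.model.SetBucketVersioningConfigurationRequest;

public class EnableMfaDelete {
    public static void main(String[] args) {
        // Must run with the root account's long-term keys; temporary
        // (session-token) credentials conflict with the MFA header.
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        BucketVersioningConfiguration config = new BucketVersioningConfiguration()
                .withStatus(BucketVersioningConfiguration.ENABLED);
        config.setMfaDeleteEnabled(true);

        // Placeholder device serial number and six-digit authenticator code.
        MultiFactorAuthentication mfa = new MultiFactorAuthentication(
                "arn:aws:iam::123456789012:mfa/root-account-mfa-device", "123456");

        s3.setBucketVersioningConfiguration(
                new SetBucketVersioningConfigurationRequest("my-bucket-name", config, mfa));
    }
}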

How to use S3DistCp in java code

Posted by 别说谁变了你拦得住时间么 on 2020-01-24 09:33:50
Question: I want to copy the output of a job from an EMR cluster to Amazon S3 programmatically. How can I use S3DistCp in Java code to do this?
Answer 1: Hadoop's ToolRunner can run it, since S3DistCp extends Tool. Below is a usage example:

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.util.ToolRunner;
import com.amazon.external.elasticmapreduce.s3distcp.S3DistCp;

public class CustomS3DistCP {
    private static final Log log = LogFactory.getLog(CustomS3DistCP.class);

    public static void main(String[] args) throws Exception {
        log.info("Running S3DistCp with args: " + String.join(" ", args));
        // Pass the usual S3DistCp options, e.g. --src and --dest.
        int exitCode = ToolRunner.run(new S3DistCp(), args);
        System.exit(exitCode);
    }
}

large file from ec2 to s3

Posted by 此生再无相见时 on 2020-01-24 03:30:12
Question: I have a 27 GB file that I am trying to move from an AWS Linux EC2 instance to S3. I've tried both the 's3put' command and the 's3cmd put' command. Both work with a test file, but neither works with the large file. No errors are given; the command returns immediately, but nothing happens. s3cmd put bigfile.tsv s3://bucket/bigfile.tsv
Answer 1: Though you can upload objects to S3 with sizes up to 5 TB, S3 has a size limit of 5 GB for an individual PUT operation. To load files larger than 5 GB (or even files larger than 100 MB), you will want to use S3's multipart upload feature.
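
As a concrete illustration, here is a minimal multipart sketch using the TransferManager from the AWS SDK for Java v1, which splits large files into parts, uploads the parts in parallel, and retries failed parts individually; the bucket name and file path are placeholders:

import java.io.File;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.TransferManagerBuilder;
import com.amazonaws.services.s3.transfer.Upload;

public class BigFileUpload {
    public static void main(String[] args) throws InterruptedException {
        TransferManager tm = TransferManagerBuilder.standard()
                .withS3Client(AmazonS3ClientBuilder.defaultClient())
                .build();
        // Switches to multipart automatically once the file exceeds a
        // configurable size threshold.
        Upload upload = tm.upload("my-bucket", "bigfile.tsv", new File("bigfile.tsv"));
        upload.waitForCompletion();
        tm.shutdownNow();
    }
}

On the command line, the newer aws s3 cp command performs multipart uploads automatically for large files.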

How to copy a file with spaces in its file name from one bucket to the other using AWS CLI (Dos)

Posted by 感情迁移 on 2020-01-24 02:17:08
Question: I am trying to copy a file named My CV 2017.pdf from one AWS bucket to another using the AWS command line, but I am getting an error. I tried both My\ Cv\ 2017.pdf and 'My CV 2017.pdf'; neither worked.
Answer 1: Use double quotes. For example: aws s3 cp "s3://source-bucket/My CV 2017.pdf" "s3://destination-bucket/My CV 2017.pdf"
Source: https://stackoverflow.com/questions/46997354/how-to-copy-a-file-with-spaces-in-its-file-name-from-one-bucket-to-the-other-usi
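
Worth noting: the quoting is purely a shell (DOS) concern, not an S3 one; object keys may contain spaces, and the SDKs take them as ordinary strings. A sketch of the same copy with the AWS SDK for Java v1, where the bucket names are placeholders:

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class CopyWithSpaces {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        // No escaping needed: the SDK URL-encodes the key on the wire.
        s3.copyObject("source-bucket", "My CV 2017.pdf",
                      "destination-bucket", "My CV 2017.pdf");
    }
}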

photos not uploading to s3 bucket using php

Posted by 二次信任 on 2020-01-24 00:32:14
Question: I am trying to upload images to an S3 bucket via an EC2 instance, but the images are not being uploaded to S3. I could verify that they are getting uploaded to the EC2 instance, so there is some problem in uploading to S3. I realised that the upload function is not working, but I am not sure. I have been trying to solve this for two days; any help would be really appreciated. The EC2 instance is a Linux machine with cURL and PHP-cURL installed, and the images are getting saved in the folder "uploads". Amazon SDK are
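
The entry is cut off before the code and no answer is shown. As a general debugging step when an SDK upload fails silently, catch and log the service error rather than ignoring the call's result; here is a sketch of that pattern with the AWS SDK for Java v1 (the question itself uses PHP; bucket, key, and file path are placeholders):

import java.io.File;
import com.amazonaws.AmazonServiceException;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class UploadWithErrorLogging {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        try {
            s3.putObject("my-bucket", "uploads/photo.jpg", new File("uploads/photo.jpg"));
        } catch (AmazonServiceException e) {
            // The error code and message say why S3 rejected the upload
            // (bad credentials, missing permissions, wrong region, ...).
            System.err.println(e.getErrorCode() + ": " + e.getErrorMessage());
        }
    }
}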

Is it safe to return a ResponseEntity<InputStreamResource> that wraps S3Object.getObjectContent() in a REST controller?

Posted by 一曲冷凌霜 on 2020-01-23 17:59:08
Question: I'm developing a Spring Boot application that should allow users to download files indirectly from Amazon S3 via a specified application REST interface. For this purpose I have a REST controller that returns an InputStreamResource to the user, like the following:

@GetMapping(path = "/download/{fileId}")
public ResponseEntity<InputStreamResource> downloadFileById(@PathVariable("fileId") Integer fileId) {
    Optional<LocalizedFile> fileForDownload = fileService.getConcreteFileForDownload(fileId);
    if (!fileForDownload.isPresent()) {
        return ResponseEntity.notFound().build();
    }
    // Hypothetical accessor standing in for the truncated original code.
    S3Object s3Object = fileService.getFileContent(fileForDownload.get());
    return ResponseEntity.ok().body(new InputStreamResource(s3Object.getObjectContent()));
}

Moving data from S3 -> RDS using AWS Glue

Posted by ↘锁芯ラ on 2020-01-23 17:13:06
Question: Does AWS Glue provide the ability to move data from an S3 bucket to an RDS database? I'm trying to set up a serverless app that picks up dynamic data uploaded to S3 and migrates it to RDS. Glue provides a Crawlers service that determines the schema, and it also provides ETL jobs, but there the target seems to be only another S3 bucket. Any ideas?
Answer 1: Yes, Glue can send to an RDS datastore. If you are using the job wizard, it will give you a target option of "JDBC". If you select JDBC, you can set up a connection to your RDS database.

Can't access S3 bucket using IAM Role from an EC2 instance

Posted by 懵懂的女人 on 2020-01-23 13:11:50
Question: I'm trying to download a file from a private S3 bucket using the PHP SDK (on an EC2 instance). I created an IAM role and attached the AmazonS3FullAccess policy to it. I created the S3 bucket, and this is the bucket policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::206193043625:role/MyRoleName" },
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::config-files/*"
    }
  ]
}

Then on the PHP side I make a curl
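
The entry is truncated before the cURL code, and no answer is attached. One relevant general point: hand-rolled cURL requests against a private bucket must be SigV4-signed with the role's temporary credentials (obtained from the instance metadata service), whereas the official SDKs fetch and refresh those credentials automatically. A minimal sketch of the role-based download using the AWS SDK for Java v1, with the object key a placeholder:

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.S3Object;

public class RoleBasedDownload {
    public static void main(String[] args) {
        // On EC2 the default credentials chain falls back to the instance
        // profile, so the attached IAM role is used without stored keys.
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        S3Object object = s3.getObject("config-files", "app/config.json");
        System.out.println("Content-Length: " + object.getObjectMetadata().getContentLength());
    }
}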

Retrieve/List objects using metadata in s3 - aws sdk

Posted by 社会主义新天地 on 2020-01-23 10:04:31
Question: I have used user-defined metadata when storing a file in an S3 bucket. Let's say my metadata looks like: metaData = { "title": "some random user title", "description": "some random user description" } I understand that I can download a file using the object key and the bucket name. I am looking for any way/option to get/retrieve/list files by passing only the bucket name and the user-defined metadata that was used when the object was uploaded to S3, and also to learn the actual usage of user-defined metadata.
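
No answer is attached to this entry. For context: S3 does not index user-defined metadata, so there is no API that lists objects by metadata; the usual options are to list the keys and send a HEAD request per object, or to maintain an external index (in DynamoDB, for example). A sketch of the scan approach with the AWS SDK for Java v1, where the bucket name and metadata values are assumptions:

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ListObjectsV2Result;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.S3ObjectSummary;

public class FindByMetadata {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        // listObjectsV2 returns up to 1,000 keys per call; paginate with
        // the continuation token for larger buckets.
        ListObjectsV2Result listing = s3.listObjectsV2("my-bucket");
        for (S3ObjectSummary summary : listing.getObjectSummaries()) {
            // HEAD returns the object's x-amz-meta-* headers without the body.
            ObjectMetadata meta = s3.getObjectMetadata("my-bucket", summary.getKey());
            if ("some random user title".equals(meta.getUserMetaDataOf("title"))) {
                System.out.println("Match: " + summary.getKey());
            }
        }
    }
}

This costs one HEAD request per object per search, which is why metadata-heavy workloads usually keep a separate queryable index alongside the bucket.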