amazon-s3

Uploading image to S3 using phonegap, how to?

Submitted by 左心房为你撑大大i on 2020-01-05 23:28:01
Question: I'm trying to get an image onto S3 but am not quite succeeding... Here is my work so far:

    $cordovaCamera.getPicture(options).then(function(imageURI) {
        // imageURI will be something like: file:///some_path
        // get the base64 data from the image
        var img = Utils.encodeImageUri(imageURI);
        // get the base64 from a new image, not sure if this is needed
        var image = new Image();
        image.src = img;
        Utils.uploadToS3(image.src);
    }, function(err) {})
    ...
    // boilerplate function to create a Blob
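A common alternative to pushing base64 data straight from the device is to have a small backend hand out a pre-signed PUT URL that the Cordova app can upload the image file to directly. Below is a minimal server-side sketch using boto3; the bucket name, key, content type, and expiry are illustrative assumptions, not values from the question.

    import boto3

    s3 = boto3.client("s3")

    def make_upload_url(bucket="my-app-uploads", key="photos/photo-123.jpg", expires=300):
        """Return a pre-signed PUT URL the mobile app can upload the image to."""
        # Bucket, key, and content type are placeholders for illustration only.
        return s3.generate_presigned_url(
            "put_object",
            Params={"Bucket": bucket, "Key": key, "ContentType": "image/jpeg"},
            ExpiresIn=expires,
        )

    if __name__ == "__main__":
        print(make_upload_url())

The app would then issue a plain HTTP PUT of the image bytes to the returned URL, so no AWS credentials ever live on the device.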

S3 multiple index files

Submitted by 假装没事ソ on 2020-01-05 17:23:27
Question: I'm using S3 with CloudFront. I have an application that has two index files:

    /index
    /admin/index

The /index works fine, but /admin/index requires me to request /admin/index.html explicitly. Without including index.html it throws:

    <Error>
      <Code>AccessDenied</Code>
      <Message>Access Denied</Message>
      <RequestId>D989FEFADF688159</RequestId>
      <HostId>
        GvoytrXvDOLPu26AiYYaq6Zi4ck42xyZy3mdxlSF8q5AZc4WEphayr5o6WVDxNM7+qutIAfn53k=
      </HostId>
    </Error>

I checked the permissions on the file and they are correctly set.
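CloudFront applies the default root object only at the root of the distribution, not in subdirectories, so /admin/ is passed to the S3 REST origin as-is and fails. One hedged way around this is a Lambda@Edge origin-request function that appends index.html to directory-style URIs; the sketch below assumes that setup and that every extension-less path should map to an index.html.

    def handler(event, context):
        # CloudFront origin-request event: rewrite "directory" URIs so the
        # S3 REST origin receives an explicit object key.
        request = event["Records"][0]["cf"]["request"]
        uri = request["uri"]
        if uri.endswith("/"):
            request["uri"] = uri + "index.html"
        elif "." not in uri.split("/")[-1]:
            # No file extension: treat as a directory-style path (assumption).
            request["uri"] = uri + "/index.html"
        return request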

Using GroupBy while copying from HDFS to S3 to merge files within a folder

Submitted by 谁说我不能喝 on 2020-01-05 08:48:09
Question: I have the following folders in HDFS:

    hdfs://x.x.x.x:8020/Air/BOOK/AE/DOM/20171001/2017100101
    hdfs://x.x.x.x:8020/Air/BOOK/AE/INT/20171001/2017100101
    hdfs://x.x.x.x:8020/Air/BOOK/BH/INT/20171001/2017100101
    hdfs://x.x.x.x:8020/Air/BOOK/IN/DOM/20171001/2017100101
    hdfs://x.x.x.x:8020/Air/BOOK/IN/INT/20171001/2017100101
    hdfs://x.x.x.x:8020/Air/BOOK/KW/DOM/20171001/2017100101
    hdfs://x.x.x.x:8020/Air/BOOK/KW/INT/20171001/2017100101
    hdfs://x.x.x.x:8020/Air/BOOK/ME/INT/20171001/2017100101
    hdfs://x.x
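A common way to merge many small HDFS files into larger S3 objects is s3-dist-cp with its --groupBy option, submitted as an EMR step. The boto3 sketch below is only an illustration of that approach; the cluster ID, destination bucket, and grouping regex are assumptions and would need to match the real folder layout.

    import boto3

    emr = boto3.client("emr", region_name="us-east-1")  # region is an assumption

    # Submit an s3-dist-cp step that concatenates the files under each
    # country/market/date folder into one output object per group.
    emr.add_job_flow_steps(
        JobFlowId="j-XXXXXXXXXXXX",  # placeholder cluster id
        Steps=[{
            "Name": "merge-air-book-files",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": [
                    "s3-dist-cp",
                    "--src", "hdfs:///Air/BOOK",
                    "--dest", "s3://my-target-bucket/Air/BOOK",  # placeholder bucket
                    "--groupBy", r".*/Air/BOOK/(\w+)/(\w+)/(\d+)/.*",  # illustrative regex
                ],
            },
        }],
    )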

One IAM Role across multiple AWS accounts

Submitted by 给你一囗甜甜゛ on 2020-01-05 08:30:09
Question: For security reasons, we have a pre-prod and a prod AWS account. We're now beginning to use IAM roles for S3 access to js/css files through django-storage / boto. While this is working correctly on a per-account basis, a need has now arisen for the QA instance to access one S3 bucket in the prod account. Is there a way to have one IAM role that can grant access to both the pre-prod and prod S3 buckets? As I'm writing this it seems impossible, but it never hurts to ask!

Answer 1: Here's the AWS doc
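Cross-account access like this is usually granted by a bucket policy on the prod bucket that names the pre-prod role as the principal, so the one role can read buckets in both accounts. The sketch below applies such a policy with boto3; the bucket name, account ID, and role name are placeholders, not values from the question.

    import json
    import boto3

    s3 = boto3.client("s3")

    # Placeholders: the prod bucket and the QA instance role in the pre-prod account.
    PROD_BUCKET = "prod-static-assets"
    PREPROD_ROLE_ARN = "arn:aws:iam::111111111111:role/qa-instance-role"

    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "AllowPreProdRoleRead",
            "Effect": "Allow",
            "Principal": {"AWS": PREPROD_ROLE_ARN},
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{PROD_BUCKET}",
                f"arn:aws:s3:::{PROD_BUCKET}/*",
            ],
        }],
    }

    # Run with prod-account credentials, since the bucket lives there.
    s3.put_bucket_policy(Bucket=PROD_BUCKET, Policy=json.dumps(policy))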

Encrypting large streams to be sent via Amazon S3

Submitted by ☆樱花仙子☆ on 2020-01-05 06:02:07
Question: I want to encrypt a stream and then send it using Amazon S3. I'm using legacy code and have two important parameters: the non-encrypted InputStream and its length. This matters because AmazonS3Client wants to know the length of the stream before it uploads it. Encrypting a stream is not a very difficult task:

    InputStream in = new FileInputStream("path-to-file");
    KeyGenerator keygen = KeyGenerator.getInstance("AES");
    Key key = keygen.generateKey();
    Cipher cipher = Cipher.getInstance("AES/CBC/NoPadding"
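With a stream-cipher mode such as AES-CTR the ciphertext has exactly the same length as the plaintext, so the known plaintext length can be passed to the upload unchanged. The following is a simplified Python sketch of that idea using the cryptography package and boto3, not the original Java code; it buffers the payload in memory, and key and nonce handling are left as assumptions.

    import io
    import os
    import boto3
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def encrypt_and_upload(plaintext: bytes, bucket: str, key_name: str) -> bytes:
        """Encrypt with AES-CTR (ciphertext length == plaintext length) and upload."""
        aes_key = os.urandom(32)
        nonce = os.urandom(16)
        encryptor = Cipher(algorithms.AES(aes_key), modes.CTR(nonce)).encryptor()
        ciphertext = encryptor.update(plaintext) + encryptor.finalize()

        s3 = boto3.client("s3")
        s3.put_object(
            Bucket=bucket,
            Key=key_name,
            Body=io.BytesIO(ciphertext),
            ContentLength=len(ciphertext),  # equals len(plaintext) with CTR
            Metadata={"x-nonce": nonce.hex()},  # the nonce is needed to decrypt later
        )
        return aes_key  # the caller must store this key securely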

How rename S3 files not HDFS in spark scala

Submitted by 99封情书 on 2020-01-05 05:32:06
Question: I have approximately 1 million text files stored in S3. I want to rename all files based on their folder names. How can I do that in Spark/Scala? I am looking for some sample code. I am using Zeppelin to run my Spark script. Below is the code I have tried, as suggested in an answer:

    import org.apache.hadoop.fs._
    val src = new Path("s3://trfsmallfffile/FinancialLineItem/MAIN")
    val dest = new Path("s3://trfsmallfffile/FinancialLineItem/MAIN/dest")
    val conf = sc.hadoopConfiguration // assuming sc = spark
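S3 has no rename operation, so renaming an object means copying it to a new key and deleting the old one. The boto3 sketch below derives the new name from the parent folder; the bucket and prefix are taken from the question's paths, but the exact naming rule is an assumption.

    import boto3

    s3 = boto3.resource("s3")
    bucket = s3.Bucket("trfsmallfffile")  # bucket name taken from the question

    # "Rename" every object under the prefix by prepending its parent folder
    # name to the file name (the naming rule itself is an assumption).
    for obj in bucket.objects.filter(Prefix="FinancialLineItem/MAIN/"):
        parts = obj.key.split("/")
        folder, file_name = parts[-2], parts[-1]
        if not file_name:
            continue  # skip "directory" placeholder keys
        new_key = "/".join(parts[:-1] + [f"{folder}_{file_name}"])
        if new_key == obj.key:
            continue
        s3.Object(bucket.name, new_key).copy_from(
            CopySource={"Bucket": bucket.name, "Key": obj.key}
        )
        obj.delete()

For a million objects this loop would normally be parallelised (for example from Spark executors or with s3-dist-cp), but the copy-then-delete pattern stays the same.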

Node.js AWS Lambda inconsistent s3.putObject upload of large data object

Submitted by *爱你&永不变心* on 2020-01-05 05:31:16
Question: Here is the Lambda code I am using to read a table and then upload the results to S3:

    'use strict';
    const pg = require('pg');
    const aws = require('aws-sdk');
    const awsParamStore = require( 'aws-param-store' );

    exports.handler = async function (context) {
        function putObjectToS3(bucket, key, data){
            var s3 = new aws.S3();
            var params = { Bucket : bucket, Key : key, Body : data }
            s3.putObject(params, function(err, data) {
                if (err) console.log(err, err.stack); // an error occurred
                else console.log
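One common cause of intermittent uploads in an async handler is that the callback-style s3.putObject call is never awaited, so the runtime can freeze the function before the upload finishes. As a point of comparison, here is a hedged Python sketch of the same read-then-upload flow, where the put_object call blocks until the upload completes; the table name, connection string, bucket, and key are placeholders.

    import json
    import os
    import boto3
    import psycopg2  # assumed driver for the Postgres read

    def handler(event, context):
        # Read the table (connection details come from the environment here).
        conn = psycopg2.connect(os.environ["DATABASE_URL"])
        with conn, conn.cursor() as cur:
            cur.execute("SELECT * FROM my_table")  # table name is a placeholder
            rows = cur.fetchall()

        # put_object is a blocking call, so the handler cannot return
        # before the upload has finished.
        s3 = boto3.client("s3")
        s3.put_object(
            Bucket=os.environ["BUCKET"],
            Key="exports/my_table.json",
            Body=json.dumps(rows, default=str).encode("utf-8"),
        )
        return {"uploaded_rows": len(rows)}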

AWS Import-image User does not have access to the S3 object

Submitted by 有些话、适合烂在心里 on 2020-01-05 05:10:13
Question: When running the AWS (Amazon Web Services) import-image task:

    aws ec2 import-image --description "My OVA" --disk-containers file://c:\TEMP\containers.json

I get the following error:

    An error occurred (InvalidParameter) when calling the ImportImage operation: User does not have access to the S3 object. (mys3bucket/vms/myOVA.ova)

I followed all of the instructions in this AWS document on importing a VM (including Steps 1, 2, and 3). Specifically, I set up a vmimport role and the recommended
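This error usually means the vmimport service role cannot read the bucket holding the OVA (or the bucket is in a different region than the import-image call). The boto3 sketch below attaches S3 read permissions to the role as an inline policy; the statement mirrors the VM Import/Export guide in spirit rather than reproducing it exactly, and the policy name is a placeholder.

    import json
    import boto3

    iam = boto3.client("iam")

    BUCKET = "mys3bucket"  # bucket name taken from the error message

    role_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetBucketLocation", "s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
        }],
    }

    # Attach (or refresh) the inline policy on the vmimport role.
    iam.put_role_policy(
        RoleName="vmimport",
        PolicyName="vmimport-s3-access",  # policy name is a placeholder
        PolicyDocument=json.dumps(role_policy),
    )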

Copy files from S3 to EMR local using Lambda

Submitted by 被刻印的时光 ゝ on 2020-01-05 04:57:28
Question: I need to move files from S3 to EMR's local dir /home/hadoop programmatically using Lambda. S3DistCp copies over to HDFS; I then log in to EMR and run a copyToLocal hdfs command on the command line to get the files to /home/hadoop. Is there a programmatic way, using boto3 in Lambda, to copy from S3 to EMR's local dir?

Answer 1: I wrote a test Lambda function to submit a job step to EMR that copies files from S3 to EMR's local dir. This worked.

    emrclient = boto3.client('emr', region_name='us-west-2')
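The snippet above is cut off, so the following is a hedged guess at how such a step submission could look, not the answer's original code: a command-runner step that runs aws s3 cp --recursive onto the master node's local filesystem. The cluster ID and source bucket are placeholders.

    import boto3

    emrclient = boto3.client("emr", region_name="us-west-2")

    # Submit a step that runs on the cluster and copies the S3 prefix
    # onto the master node's local filesystem.
    response = emrclient.add_job_flow_steps(
        JobFlowId="j-XXXXXXXXXXXX",  # placeholder cluster id
        Steps=[{
            "Name": "copy-s3-to-local",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": [
                    "aws", "s3", "cp",
                    "s3://my-source-bucket/input/",  # placeholder source
                    "/home/hadoop/",
                    "--recursive",
                ],
            },
        }],
    )
    print(response["StepIds"])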