aws-sdk

Calculate S3 object (folder) size in Java

Submitted by 余生颓废 on 2020-01-01 15:09:16
Question: I'm storing all types of files on Amazon S3. In the Amazon S3 bucket, the files are kept in different folders. I know there is no real concept of folders in Amazon S3; objects are only identified by their keys. If I store a file under a key like 'mydocs/personal/profile-pic.jpg', that means two parent folders (a personal folder inside a mydocs folder) will appear to be created. I want to calculate the size of any such folder, for example 'mydocs', in Java. I calculated the bucket's total size using the code given below: public long
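The question is cut off before the code, but the usual way to size a "folder" is to list every object under that key prefix and add up the object sizes. A minimal sketch with the AWS SDK for Java v1, where the bucket name "my-bucket" and the "mydocs/" prefix are placeholders:

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ListObjectsV2Request;
import com.amazonaws.services.s3.model.ListObjectsV2Result;
import com.amazonaws.services.s3.model.S3ObjectSummary;

public class S3FolderSize {

    // Sums the sizes of all objects whose keys start with the given prefix,
    // e.g. "mydocs/" to cover everything "inside" the mydocs folder.
    public static long calculateFolderSize(AmazonS3 s3, String bucket, String prefix) {
        long totalBytes = 0;
        ListObjectsV2Request request = new ListObjectsV2Request()
                .withBucketName(bucket)
                .withPrefix(prefix);
        ListObjectsV2Result result;
        do {
            result = s3.listObjectsV2(request);
            for (S3ObjectSummary summary : result.getObjectSummaries()) {
                totalBytes += summary.getSize();
            }
            // Each response returns at most 1,000 keys; keep paging until done.
            request.setContinuationToken(result.getNextContinuationToken());
        } while (result.isTruncated());
        return totalBytes;
    }

    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        // "my-bucket" and "mydocs/" are placeholders.
        System.out.println(calculateFolderSize(s3, "my-bucket", "mydocs/"));
    }
}
```

The continuation-token loop is what makes the total cover the whole prefix, since a single listing call never returns more than 1,000 object summaries.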

AmazonS3: Getting warning: S3AbortableInputStream:Not all bytes were read from the S3ObjectInputStream, aborting HTTP connection

Submitted by 允我心安 on 2020-01-01 08:21:11
Question: Here's the warning that I am getting: S3AbortableInputStream:Not all bytes were read from the S3ObjectInputStream, aborting HTTP connection. This is likely an error and may result in sub-optimal behavior. Request only the bytes you need via a ranged GET or drain the input stream after use. I tried using try-with-resources, but the S3ObjectInputStream doesn't seem to be closed by this approach. try (S3Object s3object = s3Client.getObject(new GetObjectRequest(bucket, key)); S3ObjectInputStream
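The warning appears when the object's content stream is closed while unread bytes remain on the connection. A hedged sketch of the two usual remedies with the AWS SDK for Java v1, with illustrative method names: read the stream to the end before it is closed so the connection can be reused, or call abort() when only part of the object is wanted.

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.services.s3.model.S3ObjectInputStream;
import com.amazonaws.util.IOUtils;
import java.io.IOException;
import java.util.Arrays;

public class S3StreamHandling {

    // Option 1: consume the whole stream before it is closed, so the
    // underlying HTTP connection can be returned to the pool cleanly.
    static byte[] readWholeObject(AmazonS3 s3Client, String bucket, String key) throws IOException {
        try (S3Object s3object = s3Client.getObject(new GetObjectRequest(bucket, key));
             S3ObjectInputStream in = s3object.getObjectContent()) {
            return IOUtils.toByteArray(in); // reads to the end of the stream
        }
    }

    // Option 2: when only the beginning of the object is needed, abort the
    // stream explicitly instead of letting close() discard the unread bytes.
    static byte[] readFirstBytes(AmazonS3 s3Client, String bucket, String key, int n) throws IOException {
        S3Object s3object = s3Client.getObject(new GetObjectRequest(bucket, key));
        S3ObjectInputStream in = s3object.getObjectContent();
        try {
            byte[] buffer = new byte[n];
            int read = in.read(buffer);
            return read <= 0 ? new byte[0] : Arrays.copyOf(buffer, read);
        } finally {
            in.abort(); // deliberately drop the connection when the rest of the object is not needed
        }
    }
}
```

If only a known byte range is needed, a ranged GET (GetObjectRequest.withRange) sidesteps the issue entirely because the stream then contains exactly the bytes requested.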

SonataMediaBundle - S3 AWS: 'The configured bucket “my-bucket” does not exist'

Submitted by 自闭症网瘾萝莉.ら on 2020-01-01 06:34:52
Question: I'm trying to configure the AWS S3 filesystem in my Sonata project, but I always get the following error: The configured bucket "my-bucket" does not exist. My sonata_media.yml: cdn: server: path: http://%s3_bucket_name%.s3-website-%s3_region%.amazonaws.com providers: image: filesystem: sonata.media.filesystem.s3 file: resizer: false allowed_extensions: ['pdf'] allowed_mime_types: ['application/pdf', 'application/x-pdf'] filesystem: s3: bucket: %s3_bucket_name% accessKey: %s3_access_key%

AWS S3 Generating Signed URLs: 'AccessDenied'

Submitted by 六眼飞鱼酱① on 2020-01-01 06:13:43
Question: I am using Node.js to upload files to AWS S3. I want the client to be able to download the files securely, so I am trying to generate signed URLs that expire after one use. My code looks like this: Uploading: const s3bucket = new AWS.S3({ accessKeyId: 'my-access-key-id', secretAccessKey: 'my-secret-access-key', Bucket: 'my-bucket-name', }) const uploadParams = { Body: file.data, Bucket: 'my-bucket-name', ContentType: file.mimetype, Key: `files/${file.name}`, } s3bucket.upload(uploadParams,
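The question's code uses the AWS SDK for JavaScript and is cut off before the download part; purely for comparison, here is a minimal sketch of issuing a time-limited presigned GET URL with the AWS SDK for Java v1 (a deliberate swap from the Node.js code; bucket name and key are placeholders). Two caveats: a presigned URL expires after a time window rather than after a single use, and the request is evaluated against the permissions of the credentials that signed it, so AccessDenied commonly means the signer lacks s3:GetObject on that key.

```java
import com.amazonaws.HttpMethod;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.GeneratePresignedUrlRequest;
import java.net.URL;
import java.util.Date;

public class PresignedUrlExample {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Presign a GET for a specific object; the URL stops working once the expiration passes.
        Date expiration = new Date(System.currentTimeMillis() + 15 * 60 * 1000); // 15 minutes
        GeneratePresignedUrlRequest request =
                new GeneratePresignedUrlRequest("my-bucket-name", "files/profile-pic.jpg") // placeholders
                        .withMethod(HttpMethod.GET)
                        .withExpiration(expiration);

        URL url = s3.generatePresignedUrl(request);
        System.out.println(url);
    }
}
```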

AWS Cognito user pool identity REST examples

Submitted by 非 Y 不嫁゛ on 2020-01-01 03:13:12
Question: We are looking into using user pools for our application. I would like to try out the API in a REST manner. The documentation at https://docs.aws.amazon.com/cognito-user-identity-pools/latest/APIReference/Welcome.html doesn't have request and response examples like other services do. I'm looking for SignUp, ResendConfirmationCode, ChangePassword, and ConfirmSignUp examples. Answer 1: Currently it is not in the Cognito user pools documentation, but the following example should work for SignUp. Similarly you can formulate it for
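The answer excerpt is truncated, but the general shape of a raw user-pool call is a JSON POST to the regional cognito-idp endpoint with an X-Amz-Target header naming the operation. A hedged sketch in Java for SignUp, where the region, app client ID, username, password, and email are all placeholders; if the app client has a secret, a SecretHash field also has to be included in the body.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class CognitoSignUpRest {
    public static void main(String[] args) throws Exception {
        // Placeholder region and app client ID; SignUp is an unauthenticated
        // operation, so no AWS signature is attached to the request.
        URL endpoint = new URL("https://cognito-idp.us-east-1.amazonaws.com/");
        String body = "{"
                + "\"ClientId\":\"your-app-client-id\","
                + "\"Username\":\"jane@example.com\","
                + "\"Password\":\"SuperSecret123!\","
                + "\"UserAttributes\":[{\"Name\":\"email\",\"Value\":\"jane@example.com\"}]"
                + "}";

        HttpURLConnection conn = (HttpURLConnection) endpoint.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/x-amz-json-1.1");
        conn.setRequestProperty("X-Amz-Target", "AWSCognitoIdentityProviderService.SignUp");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }

        // Print the status code and the JSON response body.
        try (Scanner scanner = new Scanner(conn.getInputStream(), StandardCharsets.UTF_8.name())) {
            System.out.println(conn.getResponseCode());
            System.out.println(scanner.useDelimiter("\\A").next());
        }
    }
}
```

ConfirmSignUp and ResendConfirmationCode follow the same pattern with a different X-Amz-Target value and request body, while ChangePassword additionally carries the user's access token in the request.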

Mock a dependency's constructor in Jest

Submitted by 怎甘沉沦 on 2019-12-31 12:30:14
Question: I'm a newbie to Jest. I've managed to mock my own stuff, but I seem to be stuck mocking a module, specifically its constructors. usage.js: const AWS = require("aws-sdk") cw = new AWS.CloudWatch({apiVersion: "2010-08-01"}) ... function myMetrics(params) { cw.putMetricData(params, function(err, data){}) } I'd like to do something like this in the tests: const AWS = jest.mock("aws-sdk") class FakeMetrics { constructor() {} putMetricData(foo,callback) { callback(null, "yay!") } } AWS.CloudWatch = jest

InvalidParameterValueException: The role defined for the function cannot be assumed by Lambda

Submitted by 为君一笑 on 2019-12-30 04:41:07
Question: I'm using the AWS SDK for JavaScript, and it returns the following error when I try to create a Lambda function: InvalidParameterValueException: The role defined for the function cannot be assumed by Lambda. I've double-checked my role and it is perfectly valid, yet I'm still unable to create the Lambda function. My role's trust relationship is: { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": [ "lambda.amazonaws.com" ] }, "Action": [ "sts

How to get the pure JSON string from a DynamoDB stream's new image?

Submitted by ⅰ亾dé卋堺 on 2019-12-29 05:22:12
Question: I have a DynamoDB table with streaming enabled, and I've created a trigger for this table that calls an AWS Lambda function. Within this Lambda function, I'm trying to read the new image (the DynamoDB item after the modification) from the DynamoDB stream and get the pure JSON string out of it. My question is: how can I get the pure JSON string of the DynamoDB item that's been sent over the stream? I'm using the code snippet given below to get the new image, but I have no clue how to get the
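The question's snippet is cut off, but with the AWS SDK for Java v1 one common route is to convert the stream record's attribute-value map into a Document API Item and serialize that. A hedged sketch of such a handler; it assumes an older aws-lambda-java-events version in which DynamodbEvent exposes the SDK's own AttributeValue type (newer event-library versions use a separate class and need an extra conversion step), and the class name is illustrative.

```java
import com.amazonaws.services.dynamodbv2.document.Item;
import com.amazonaws.services.dynamodbv2.document.ItemUtils;
import com.amazonaws.services.dynamodbv2.model.AttributeValue;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.DynamodbEvent;
import java.util.Map;

public class StreamToJsonHandler implements RequestHandler<DynamodbEvent, Void> {

    @Override
    public Void handleRequest(DynamodbEvent event, Context context) {
        for (DynamodbEvent.DynamodbStreamRecord record : event.getRecords()) {
            Map<String, AttributeValue> newImage = record.getDynamodb().getNewImage();
            if (newImage == null) {
                continue; // REMOVE events carry no new image
            }
            // ItemUtils converts the low-level attribute-value map into a
            // Document API Item, whose toJSON() gives the plain JSON string.
            Item item = ItemUtils.toItem(newImage);
            context.getLogger().log(item.toJSON());
        }
        return null;
    }
}
```

Item.toJSON() yields the plain JSON view of the item (nested maps, lists, and numbers), rather than the DynamoDB-typed wire format with "S"/"N" wrappers that getNewImage() carries.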