aws-sdk

The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256

冷眼眸甩不掉的悲伤 submitted on 2020-01-08 09:15:46
Question: I get the error AWS::S3::Errors::InvalidRequest "The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256." when I try to upload a file to an S3 bucket in the new Frankfurt region. Everything works properly with the US Standard region. Script:

backup_file = '/media/db-backup_for_dev/2014-10-23_02-00-07/slave_dump.sql.gz'
s3 = AWS::S3.new(
  access_key_id: AMAZONS3['access_key_id'],
  secret_access_key: AMAZONS3['secret_access_key']
)
s3_bucket = s3.buckets['test-frankfurt']
# Folder and
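
Frankfurt (eu-central-1) only accepts requests signed with Signature Version 4, so the client has to be pinned to that region and to v4 signing. The question uses the Ruby aws-sdk gem; purely as an illustration of the same idea in the Node.js aws-sdk (not the questioner's code), a minimal sketch could look like this:

// Sketch only: Frankfurt buckets reject older SigV2 requests,
// so pin both the region and the signature version.
const AWS = require('aws-sdk');
const fs = require('fs');

const s3 = new AWS.S3({
  region: 'eu-central-1',
  signatureVersion: 'v4',
});

s3.putObject({
  Bucket: 'test-frankfurt',
  Key: 'slave_dump.sql.gz',
  Body: fs.createReadStream('/media/db-backup_for_dev/2014-10-23_02-00-07/slave_dump.sql.gz'),
}, (err, data) => {
  if (err) console.error('Upload failed:', err);
  else console.log('Uploaded, ETag:', data.ETag);
});

In the Ruby gem the equivalent fix is to give the client the bucket's region (eu-central-1) and, on older gem versions, to force Signature Version 4 when constructing AWS::S3.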

AWS Cognito for API token authentication

百般思念 submitted on 2020-01-07 08:59:10
Question: I am building an HTTP API in Java that uses AWS Cognito and developer authenticated identities to provide tokens that secure the API. I have configured the login system to issue a Cognito token when a user logs in, and the Cognito identity pool contains each user and a developer identity associated with my backend, but I am having a very difficult time finding a straightforward way to retrieve a user's specific developer identifier from the Cognito token. I have attempted to use the
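
One way to recover the developer-provided identifier for a Cognito identity is the Cognito Identity LookupDeveloperIdentity API, called from the backend with developer credentials. The question is in Java, but as a hedged sketch, the equivalent call in the Node.js aws-sdk looks roughly like this (the region and identity pool ID are placeholders):

// Sketch: resolve the developer-provided identifier(s) recorded for a Cognito identity ID.
const AWS = require('aws-sdk');
const cognitoIdentity = new AWS.CognitoIdentity({ region: 'us-east-1' });

async function getDeveloperIdentifier(identityId) {
  const res = await cognitoIdentity.lookupDeveloperIdentity({
    IdentityPoolId: 'us-east-1:00000000-0000-0000-0000-000000000000', // placeholder
    IdentityId: identityId,
    MaxResults: 1,
  }).promise();
  // The list holds the identifier(s) the backend registered for this identity.
  return res.DeveloperUserIdentifierList && res.DeveloperUserIdentifierList[0];
}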

How to pass access credentials in request with AWS API

有些话、适合烂在心里 submitted on 2020-01-07 04:40:05
Question: I'm trying to send a request to the AWS API, but I get the error "AWS was not able to validate the provided access credentials". My request is:

https://ec2.amazonaws.com/?Action=RunInstances&ImageId=ami-6df1e514&KeyName=key1&InstanceType=t2.micro&Placement.AvailabilityZone=us-west-2&AWSSecretAccessKey=**********************&AWSAccessKeyId=******************

My access credentials are right, there is no doubt about that. I found that the problem could be caused by clock skew, but my PC's clock is
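
The secret access key is never sent with an EC2 Query API request at all; EC2 expects a signed request (Signature Version 4), which the SDKs and the CLI produce automatically. A minimal sketch with the Node.js aws-sdk, assuming placeholder credentials and an availability zone such as us-west-2a (the query above passes the region name where an AZ is expected):

// Sketch: launch the same instance through the SDK, which signs the request
// instead of putting raw credentials in the query string. Credentials are placeholders;
// in practice they come from the environment, a shared profile, or an instance role.
const AWS = require('aws-sdk');

const ec2 = new AWS.EC2({
  region: 'us-west-2',
  accessKeyId: 'AKIA................',   // placeholder
  secretAccessKey: '....................', // placeholder
});

ec2.runInstances({
  ImageId: 'ami-6df1e514',
  InstanceType: 't2.micro',
  KeyName: 'key1',
  MinCount: 1,
  MaxCount: 1,
  Placement: { AvailabilityZone: 'us-west-2a' }, // an AZ, not the bare region name
}, (err, data) => {
  if (err) console.error(err);
  else console.log('Launched', data.Instances[0].InstanceId);
});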

Jest Mock Promise with Params

橙三吉。 submitted on 2020-01-06 07:14:42
Question: This is the method I am trying to write a unit test for in Jest:

async function getParameter(parameter: string, withDecryption: boolean = false): Promise<String> {
  const params = {
    Name: parameter,
    WithDecryption: withDecryption,
  };
  try {
    const request = await ssmClient.getParameter(params).promise();
    return request.Parameter.Value;
  } catch (err) {
    logger.error(`Error ${err}`);
    throw Error(err);
  }
}

Test method:

test('getParameterFromSystemManager', async () => {
  const mockedResponseData = {
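
Since the code under test awaits ssmClient.getParameter(params).promise(), the mock has to return an object whose promise() resolves with the fixture. A minimal sketch, assuming ssmClient and getParameter are exported from local modules (the module paths and parameter values below are hypothetical):

// Sketch: stub the chained ssmClient.getParameter(...).promise() call in Jest.
const ssmClient = require('../src/ssmClient');        // hypothetical path
const { getParameter } = require('../src/getParameter'); // hypothetical path

test('getParameterFromSystemManager', async () => {
  const mockedResponseData = { Parameter: { Value: 'secret-value' } };

  // getParameter(params) must return something with a promise() method,
  // because the function under test awaits .promise() on its result.
  ssmClient.getParameter = jest.fn().mockReturnValue({
    promise: () => Promise.resolve(mockedResponseData),
  });

  await expect(getParameter('/my/param', true)).resolves.toBe('secret-value');
  expect(ssmClient.getParameter).toHaveBeenCalledWith({
    Name: '/my/param',
    WithDecryption: true,
  });
});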

AWS SDK - getObject convert returned file from ASCII buffer to json/csv

半世苍凉 submitted on 2020-01-06 06:57:07
Question: I'm using the aws-sdk for Node.js, and I'm getting an object back from an AWS bucket in the form of a buffer, like so: [31, 139, 8, 0, 0, 0 ....] The original object located in the bucket is in CSV format. Can I convert the buffer to a usable format like CSV or JSON?

Answer 1: You can do it easily, here's an example:

s3.getObject(params, function (error, data) {
  if (error) {
    throw error
  } else {
    // Convert the provided array to a string. You can save it as CSV if you want
    const csvString = data.Body
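
In the Node.js SDK, data.Body is a Buffer, so it can be decoded with toString. A sketch along the lines of the answer, with two extra assumptions worth flagging: the bytes 31, 139 shown in the question are the gzip magic number, so the body may need gunzipping first, and the bucket/key names below are placeholders:

const AWS = require('aws-sdk');
const zlib = require('zlib');

const s3 = new AWS.S3();

s3.getObject({ Bucket: 'my-bucket', Key: 'export.csv.gz' }, (error, data) => {
  if (error) throw error;

  // Decompress if the body is gzipped (0x1f 0x8b magic bytes), then decode to text.
  const isGzip = data.Body[0] === 0x1f && data.Body[1] === 0x8b;
  const csvString = (isGzip ? zlib.gunzipSync(data.Body) : data.Body).toString('utf-8');

  // Naive CSV-to-JSON conversion; a real parser (e.g. csv-parse) is safer for quoted fields.
  const [header, ...rows] = csvString.trim().split('\n');
  const keys = header.split(',');
  const json = rows.map(row =>
    Object.fromEntries(row.split(',').map((value, i) => [keys[i], value]))
  );
  console.log(json);
});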

aws-sdk s3 upload not working from mocha test

二次信任 submitted on 2020-01-06 04:20:09
Question: Trying to run an S3 upload from a mocha test:

'use strict';
describe('S3 test', function() {
  it.only('S3 test 1', function*() {
    var AWS = require('aws-sdk');
    //AWS.config.region = 'us-west-2';
    var s3 = new AWS.S3({ params: { Bucket: 'test-1-myBucket', Key: 'myKey' } });
    s3.createBucket(function(err) {
      if (err) {
        console.log("Error:", err);
      } else {
        s3.upload({ Body: 'Hello!' }, function() {
          console.log("Successfully uploaded data to myBucket/myKey");
        });
      }
    });
  });
});

but nothing happens, it is
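
One likely reason nothing is printed is that the test returns before the createBucket and upload callbacks ever fire (and a function*() generator body only runs if something iterates it), so mocha never waits for the asynchronous work. A minimal sketch of the same test using the SDK's .promise() API so mocha can await it (bucket and key names are placeholders; note that S3 bucket names must be lowercase):

'use strict';
const AWS = require('aws-sdk');

describe('S3 test', function () {
  this.timeout(20000); // bucket creation and upload can take a while

  it('uploads to S3', async function () {
    const s3 = new AWS.S3({
      region: 'us-west-2',
      params: { Bucket: 'test-1-mybucket', Key: 'myKey' },
    });

    // Returning/awaiting promises lets mocha wait for the async calls to finish.
    await s3.createBucket({
      CreateBucketConfiguration: { LocationConstraint: 'us-west-2' },
    }).promise();

    await s3.upload({ Body: 'Hello!' }).promise();
    console.log('Successfully uploaded data to test-1-mybucket/myKey');
  });
});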

Keeping AWS CloudSearchDomain in sync with DynamoDB

北城以北 submitted on 2020-01-05 08:18:17
Question: I'm trying to add some flexible search functionality to my DynamoDB table, so I set up the AWS CloudSearchDomain service, which I believed could act as a wrapper around my DynamoDB table and retrieve documents with flexible search options. Since implementing it, I have realised that adding an item to my DynamoDB table does not automatically add the item to the searchable CloudSearchDomain documents. The AWS docs advise syncing the DB items with the CloudSearchDomain periodically, such as at the end of each day. But I want
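
A common pattern for keeping the search domain up to date in near real time, rather than with a nightly batch, is to enable a DynamoDB Stream on the table and have a Lambda push each change to the CloudSearch domain's document endpoint. A rough sketch, assuming the stream is configured with NEW_IMAGE, the table's key attribute is named id, and the document endpoint below is a placeholder:

const AWS = require('aws-sdk');

// CloudSearchDomain talks to the domain's own document endpoint, not a regional API.
const cloudsearch = new AWS.CloudSearchDomain({
  endpoint: 'doc-my-domain-xxxxxxxxxxxx.us-east-1.cloudsearch.amazonaws.com', // placeholder
});

exports.handler = async (event) => {
  // Translate each stream record into a CloudSearch add/delete document.
  const batch = event.Records.map((record) => {
    const keys = AWS.DynamoDB.Converter.unmarshall(record.dynamodb.Keys);
    if (record.eventName === 'REMOVE') {
      return { type: 'delete', id: keys.id };
    }
    const item = AWS.DynamoDB.Converter.unmarshall(record.dynamodb.NewImage);
    return { type: 'add', id: keys.id, fields: item };
  });

  await cloudsearch.uploadDocuments({
    contentType: 'application/json',
    documents: JSON.stringify(batch),
  }).promise();
};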

const char * to std::basic_iostream

吃可爱长大的小学妹 submitted on 2020-01-05 07:20:40
Question: I have a pointer to a const char * buffer as well as its length, and am trying to use an API (in this case, the AWS S3 C++ upload request) that accepts an object of type std::basic_iostream<char, std::char_traits<char>>. Is there a simple, standard C++11 way to convert my buffer into a compatible stream, preferably without actually copying over the memory?

Answer 1: Thanks to Igor's comment, this seems to work:

func(const char * buffer, std::size_t buffersize) {
  auto sstream = std::make_shared

How can a cognito user be assigned to a group in iOS app

↘锁芯ラ submitted on 2020-01-05 05:32:26
Question: I have created a Cognito user group in the AWS console. I am able to create a Cognito user from my iOS app, and I can see the user record in the AWS console. Now I need to add this user to a group. Is it possible to do this from the app without using the AWS console? I.e., when a new user is created, the user should be added to a group. This should be handled in the iOS app.

Answer 1: You can create and manage groups in a user pool from the AWS Management Console, the APIs, and the CLI. As a developer
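
Adding a user to a group uses the AdminAddUserToGroup API, which requires developer (admin) credentials, so it is normally done from a backend or from a Cognito PostConfirmation Lambda trigger rather than directly inside the iOS app. A minimal sketch of such a trigger with the Node.js aws-sdk (the group name is a placeholder):

// Sketch: a PostConfirmation trigger that puts every newly confirmed user into a group,
// so the mobile app never needs admin credentials.
const AWS = require('aws-sdk');
const cognito = new AWS.CognitoIdentityServiceProvider();

exports.handler = async (event) => {
  await cognito.adminAddUserToGroup({
    UserPoolId: event.userPoolId,
    Username: event.userName,
    GroupName: 'standard-users', // placeholder: a group created beforehand in the user pool
  }).promise();
  return event; // Cognito triggers must return the event object
};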