amazon-sqs

Amazon AWS SQS - Apply QueuePolicy to existing Queue

。_饼干妹妹 submitted on 2019-12-12 14:26:33
Question: If I am creating an SQS queue via CloudFormation, can you attach a second QueuePolicy after the SQS queue has been created? If I run the following config:

Resources:
  SQSQueue:
    Properties:
      QueueName: !Ref SQSQueuename
    Type: 'AWS::SQS::Queue'
  QueuePolicy:
    Type: 'AWS::SQS::QueuePolicy'
    Properties:
      PolicyDocument:
        Id: !Ref SQSQueuename
        Statement:
          - Sid: QueuePolicy2-SendMessage-To-Queue-From-SNS-Topic
            Effect: Allow
            Principal:
              AWS: !Ref AccountID
            Action:
              - 'sqs:*'
            Resource: 'arn:aws

Not able to read messages from SQS using @SqsListener

筅森魡賤 submitted on 2019-12-12 12:33:20
Question: I'm trying to develop an SQS listener that runs in the background and reads a message from AWS SQS whenever a new message arrives. It should never delete the message, since there will be a separate process for deleting messages. This is a standalone application I've just started developing, but I'm not able to proceed further since even the basic setup is not working. I'm sure I'm missing something. I am using spring-cloud-aws-messaging (version: 1.2.0.BUILD-SNAPSHOT). It's a very simple standalone

Retrying messages where my code fails with AWS SQS

北城以北 submitted on 2019-12-12 10:45:09
Question: I've been trying to find out more about having a retry queue and an error queue for when my code fails, rather than having in-memory retries in my application. Here's my scenario: I'm sending a message saying something along the lines of: "Process the output for this task - it wants you to update this XML file with the contents of this stream". I have code that writes output to an XML file, but it can occasionally fail and needs to be retried, as it's possible that another part of my application/person

Release a message back to SQS

半腔热情 submitted on 2019-12-12 10:29:13
Question: I have some EC2 servers pulling work off of an SQS queue. Occasionally, they encounter a situation where they can't finish the job. I have the process email me about the condition. As it stands now, the message stays "in flight" until it times out. I would like the process to immediately release the message back to the queue after the email is sent, but I'm not sure how to accomplish this. Is there a way? If so, can you please point me to the call or post a code snippet. I'm using Python 2.7.3 and
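Releasing an in-flight message immediately is done by setting its visibility timeout to 0. A minimal sketch, assuming the boto3 SDK (the question's truncated excerpt doesn't say which library is in use, so the injected `sqs_client` and the `release_message` helper name are illustrative):

```python
def release_message(sqs_client, queue_url, receipt_handle):
    """Make an in-flight SQS message immediately visible to other consumers.

    sqs_client is expected to be an SQS client object, e.g. created with
    boto3.client("sqs"). Setting the visibility timeout to 0 releases the
    message back to the queue right away instead of waiting for the timeout.
    """
    sqs_client.change_message_visibility(
        QueueUrl=queue_url,
        ReceiptHandle=receipt_handle,
        VisibilityTimeout=0,  # 0 seconds: available to other workers at once
    )
```

The worker would call this right after sending the notification email, using the receipt handle it got when it received the message.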

How can I keep an Amazon SQS PHP receiver script running forever?

馋奶兔 submitted on 2019-12-12 10:08:03
Question: I've previously used Gearman along with supervisor to manage jobs. In this case we are using Amazon SQS, which I have spent some time trying to get my head around. I have set up a separate micro instance from our main webserver to use as an image-processing server (purely for testing at the moment; it will be upgraded and become part of a cluster before this implementation goes live). On this micro instance I have installed PHP and ImageMagick in order to perform the image processing. I have

Example of .net application using Amazon SQS

核能气质少年 submitted on 2019-12-12 09:30:47
Question: I am looking for a sample .NET application that continuously checks Amazon SQS for new messages and, when one is found, performs an action and removes it from the queue. My goal is to have an app running on EC2 that watches my SQS queue for new messages. When one is found, a call will be made to one of several web-based APIs and the message will be deleted from the queue. Can someone point me to an example of something similar? Edit: Would this type of application best be created as a windows

Access denied to SQS via AWS SDK

萝らか妹 submitted on 2019-12-12 09:28:34
Question: I'm currently working on a website developed with Symfony2 and I need to send messages to an Amazon SQS queue. In order to do that I added the following to my composer.json: "aws/aws-sdk-php": "2.4.*". Then when I try to create a queue or list queues I get a 403 error saying: Access to the resource https://sqs.us-west-2.amazonaws.com/ is denied. EDIT: added the full error message: AWS Error Code: AccessDenied, Status Code: 403, AWS Request ID: 2fe34c11-7af8-5445-a768-070159a0953e, AWS Error Type: client, AWS
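A 403 AccessDenied from SQS usually means the credentials in use lack an IAM policy granting the SQS actions being called. A minimal policy covering the calls mentioned (creating and listing queues) might look like the following; the region and resource wildcard are placeholders, not taken from the question:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["sqs:CreateQueue", "sqs:ListQueues", "sqs:SendMessage"],
      "Resource": "arn:aws:sqs:us-west-2:*:*"
    }
  ]
}
```

This would be attached to the IAM user or role whose access key the SDK is configured with.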

AWS SQS FIFO - How to get more than 10 messages at a time?

筅森魡賤 submitted on 2019-12-12 08:14:00
Question: Currently we want to pull down an entire FIFO queue, process the contents, and if there are any issues, release messages back into the queue. The problem is that AWS only gives us 10 messages at a time, and won't give us 10 more (which is the way you get bulk messages in SQS: multiple requests of at most 10 messages each) until we delete or release the first 10. We need to get more than 10, though. Is this not possible? We understand we can set the group_id to a random string, and that allows processing more,
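The 10-message cap is a per-call limit on the ReceiveMessage API; fetching more means calling it in a loop. For a FIFO queue, as the question notes, SQS will not hand out further messages from a message group while earlier ones from that group are in flight, so a single-group queue genuinely stalls at 10. A hedged boto3-style sketch of the batching loop (the `drain_queue` helper and injected `sqs_client` are illustrative, not from the question):

```python
def drain_queue(sqs_client, queue_url, max_total=100):
    """Collect up to max_total messages by repeatedly calling ReceiveMessage.

    Each call returns at most 10 messages (the hard API maximum). For FIFO
    queues this only keeps yielding messages while there are message groups
    with no in-flight messages; a single-group queue will stop at 10.
    """
    collected = []
    while len(collected) < max_total:
        response = sqs_client.receive_message(
            QueueUrl=queue_url,
            MaxNumberOfMessages=10,  # API maximum per request
            WaitTimeSeconds=1,
        )
        batch = response.get("Messages", [])
        if not batch:
            break  # queue empty, or all remaining groups have in-flight messages
        collected.extend(batch)
    return collected
```

Received messages stay in flight (invisible) until deleted or until their visibility timeout expires, which is why the loop can keep accumulating without reprocessing the same ones.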

managing AWS SQS and DLQ

為{幸葍}努か submitted on 2019-12-12 07:00:53
Question: Scenario: create a Lambda that is triggered whenever a message comes to an SQS queue (let's assume SQS-A). The Lambda (written in Python) is responsible for sending the incoming payload to another endpoint. The problem is, whenever the target endpoint or server is down, I was trying to place the message into another SQS queue (let's assume SQS-B); if other exceptions come up, it should be placed into the Dead Letter Queue instead. Here I want to do two things. If ConnectionError (it is the Python exception which says
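One way to get the routing the question describes is to catch only the retryable connection failure and forward it to SQS-B, while letting every other exception propagate; with a redrive policy on SQS-A, a message whose handler keeps raising moves to the Dead Letter Queue after `maxReceiveCount` attempts. A sketch under those assumptions; the names `forward_payload` and `sqs_b_url` are illustrative, not from the question:

```python
import json

def handle_payload(sqs_client, payload, forward_payload, sqs_b_url):
    """Route one SQS-A payload.

    forward_payload is the function that POSTs the payload to the target
    endpoint; sqs_client is an SQS client (e.g. boto3.client("sqs")).
    """
    try:
        forward_payload(payload)  # send to the target endpoint
        return "delivered"
    except ConnectionError:
        # Endpoint/server down: park the payload in SQS-B for later retry.
        sqs_client.send_message(
            QueueUrl=sqs_b_url,
            MessageBody=json.dumps(payload),
        )
        return "queued-for-retry"
    # Any other exception propagates out of the Lambda handler; with a
    # redrive policy configured on SQS-A, the message lands in the DLQ
    # after maxReceiveCount failed receives.
```

The key design point is that only the known-transient error is handled in code; everything else is left to the queue's own redrive mechanism.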

sharing SQS across multiple environments

扶醉桌前 submitted on 2019-12-12 05:39:07
Question: We're exploring SQS to improve the reliability of some asynchronous job queues. I'm trying to identify the best SQS deployment strategy to support multiple queues and environments (farm clusters, developer sandboxes, developer laptops, etc.). Our previous job queue used a separate queue server for each environment, which provided nice isolation. Since SQS is a global resource, I don't quite yet see the optimal path for security/isolation and maintenance. In the worst case I think we'd need