amazon-sqs

SQS Messages Not Deleting

我只是一个虾纸丫 submitted on 2019-12-05 02:56:14
I have a small set of messages in an SQS queue that are not deleted, even though the deletion request sent to the AWS endpoint returns a 200 response. The messages are processed fine by my application, and the deletion request is sent fine too. I'm using the Java AWS SDK 1.3.6. Has anyone else experienced this problem?

Whoops - the queue was accidentally set to defaultVisibilityTimeout=0. Changing this to a positive value fixed the problem. This still raises a few questions, though: Why did this only affect some messages? Perhaps some took longer to process? Why did Amazon return a 200 for …
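A minimal sketch of the fix (using boto3 rather than the Java SDK 1.3.6 from the question; the queue name is hypothetical): give the queue a non-zero default visibility timeout so a received message stays hidden until the delete call removes it.

```
import boto3

sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName="my-queue")["QueueUrl"]  # hypothetical queue name

# 0 was the misconfiguration; with a positive timeout a received message is
# hidden from other consumers until it is deleted or the timeout expires.
sqs.set_queue_attributes(
    QueueUrl=queue_url,
    Attributes={"VisibilityTimeout": "30"},  # seconds
)
```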

Amazon CloudWatch alarm not triggered

浪尽此生 submitted on 2019-12-05 00:48:13
I have a CloudWatch alarm configured with Threshold: "GreaterThan 0" for 1 consecutive period, Period: 1 minute, Statistic: Sum. The alarm is configured on the AWS SQS NumberOfMessagesSent metric. The queue was empty and no messages were being published to it. I sent a message manually. I could see the spike in the metric, but the state of the alarm was still OK. I am a bit confused why this alarm is not changing its state even though all the conditions to trigger it are met.

I just overcame this problem with the help of AWS support. You need to set the period on your alarm to ~15 minutes. It's got to do with …
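A hedged sketch of that recommendation (boto3; the alarm name and queue name are assumptions beyond what the answer states): the same NumberOfMessagesSent alarm, but with the roughly 15-minute period AWS support suggested instead of 1 minute.

```
import boto3

cloudwatch = boto3.client("cloudwatch")
cloudwatch.put_metric_alarm(
    AlarmName="sqs-messages-sent",  # hypothetical name
    Namespace="AWS/SQS",
    MetricName="NumberOfMessagesSent",
    Dimensions=[{"Name": "QueueName", "Value": "my-queue"}],  # hypothetical queue
    Statistic="Sum",
    Period=900,            # ~15 minutes instead of 1 minute
    EvaluationPeriods=1,
    Threshold=0,
    ComparisonOperator="GreaterThanThreshold",
)
```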

Example of .NET application using Amazon SQS

删除回忆录丶 submitted on 2019-12-04 23:02:32
I am looking for a sample .NET application that continuously checks Amazon SQS for new messages and, when one is found, performs an action and removes it from the queue. My goal is to have an app running on EC2 that watches my SQS queue for new messages. When one is found, a call will be made to one of several web-based APIs and the message will be deleted from the queue. Can someone point me to an example of something similar?

Edit: Would this type of application best be created as a Windows service?

The AWS SDK for .NET features samples for several Amazon Web Services, including an Amazon SQS …
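For illustration, a minimal sketch of the receive/process/delete loop the question asks for, written with boto3/Python for brevity (the question itself is about the AWS SDK for .NET; call_web_api and the queue name are placeholders):

```
import boto3

sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName="my-queue")["QueueUrl"]  # hypothetical queue

while True:
    resp = sqs.receive_message(
        QueueUrl=queue_url,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,  # long polling instead of a tight busy loop
    )
    for msg in resp.get("Messages", []):
        call_web_api(msg["Body"])  # placeholder for the web-based API call
        sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```

Run as a long-lived worker process; on Windows, a service wrapper would be the natural fit for the question raised in the edit.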

Access denied to SQS via AWS SDK

◇◆丶佛笑我妖孽 submitted on 2019-12-04 22:17:59
I'm currently working on a website developed with Symfony2 and I need to send messages to an Amazon SQS queue. In order to do that, I added the following to my composer.json: "aws/aws-sdk-php": "2.4.*". Then, when I try to create a queue or list queues, I get a 403 error saying: Access to the resource https://sqs.us-west-2.amazonaws.com/ is denied.

EDIT: added the full error message

AWS Error Code: AccessDenied, Status Code: 403, AWS Request ID: 2fe34c11-7af8-5445-a768-070159a0953e, AWS Error Type: client, AWS Error Message: Access to the resource https://sqs.us-west-2.amazonaws.com/ is denied., User-Agent: aws…

How to delete events from an Amazon SQS (Simple Queue Service) queue really fast?

妖精的绣舞 submitted on 2019-12-04 21:46:46
Question: Suppose that I have many millions of events in an SQS queue and I want to get rid of them quickly, but I cannot just delete the queue and make a new one. What is the fastest way to delete/drain those events out of the queue?

Answer 1: I'm assuming that you don't care about the values in the messages, since you appear to want to drain the queue rather than process it. You can set the MessageRetentionPeriod to a very low value and then drain any remaining messages out of the queue. After it's drained, set …
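A minimal sketch of the approach in the answer (boto3; the queue name is hypothetical): shrink the queue's retention period so existing messages expire on their own instead of being deleted one by one.

```
import boto3

sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName="my-queue")["QueueUrl"]  # hypothetical queue

sqs.set_queue_attributes(
    QueueUrl=queue_url,
    Attributes={"MessageRetentionPeriod": "60"},  # minimum allowed value, in seconds
)
# Note: SQS has since added a PurgeQueue API (sqs.purge_queue(QueueUrl=queue_url)),
# which is now the simplest way to empty a queue you cannot recreate.
```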

Celery Consumer SQS Messages

放肆的年华 submitted on 2019-12-04 19:02:27
I am new to Celery and SQS, and would like to use them to periodically check messages stored in SQS and then fire a consumer. The consumer and Celery both live on EC2, while the messages are sent from GAE using the boto library. Currently, I am confused about two things: In the message body of creating_msg_gae.py, what task information should I put? I assume this would be the name of my Celery task? And in the message body of creating_msg_gae.py, is url considered the argument to be processed by my consumer (the function do_something_url(url) in tasks.py)? Currently, I am running celery with …
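As an illustrative sketch (assumptions: Celery is configured with its SQS broker transport, and do_something_url is the task the question refers to), the task side might look like the code below. Note that Celery consumes messages in its own task-message format, so tasks are normally enqueued with do_something_url.delay(url) rather than by hand-crafting a message body with boto.

```
# tasks.py
from celery import Celery

# The "sqs://" broker URL makes Celery use SQS; AWS credentials are taken from
# the environment (an assumption, not stated in the original question).
app = Celery("tasks", broker="sqs://")

@app.task
def do_something_url(url):
    # placeholder for whatever processing the URL needs
    print("processing", url)
```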

Amazon SNS -> SQS message body

三世轮回 submitted on 2019-12-04 18:52:36
Question: I'm sending a message from an SNS topic to an SQS queue. When I check the body of the SQS message on my client, the whole of the message metadata is sent in the SQS body. I.e., if I send the message "Hello World" from the topic, my client receives: { "Type" : "Notification", "MessageId" : "84102bd5-8890-4ed5-aeba-c15fafc926dc", "TopicArn" : "arn:aws:sns:eu-west-1:534706846367:HelloWorld", "Message" : "hello World", "Timestamp" : "2012-06-05T13:44:22.360Z", …
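A minimal sketch, assuming the SQS body is exactly the SNS notification envelope shown above: parse the envelope as JSON and read the original payload from its "Message" field. (SNS subscriptions also have a raw message delivery option that skips the envelope entirely.)

```
import json

def extract_payload(sqs_body: str) -> str:
    envelope = json.loads(sqs_body)   # the SNS notification wrapper
    return envelope["Message"]        # -> "hello World"
```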

AWS SQS Asynchronous Queuing Pattern (Request/Response)

我怕爱的太早我们不能终老 submitted on 2019-12-04 10:27:34
I'm looking for help with an architectural design decision I'm making for a product. We've got multiple producers (initiated by API Gateway calls into Lambda) that put messages on an SQS queue (the request queue). There can be multiple simultaneous calls, so there would be multiple Lambda instances running in parallel. Then we have consumers (let's say twenty EC2 instances) that long-poll SQS for messages and process them. Each message takes about 30-45 seconds to process. I would then ideally like to send the response back to the producer that issued the request - and this is the …
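One common way to close that request/response loop (not taken from the original post; the queue names and attribute names here are hypothetical) is to tag each request with a correlation id and a reply-queue address, so a consumer knows where to send the result and the producer can match it back to the original request:

```
import json
import uuid

import boto3

sqs = boto3.client("sqs")
request_queue = sqs.get_queue_url(QueueName="request-queue")["QueueUrl"]
reply_queue = sqs.get_queue_url(QueueName="reply-queue-producer-1")["QueueUrl"]

correlation_id = str(uuid.uuid4())
sqs.send_message(
    QueueUrl=request_queue,
    MessageBody=json.dumps({"payload": "work to do"}),
    MessageAttributes={
        "CorrelationId": {"DataType": "String", "StringValue": correlation_id},
        "ReplyTo": {"DataType": "String", "StringValue": reply_queue},
    },
)
# The consumer echoes CorrelationId on its response message to the ReplyTo queue,
# and the producer polls that queue and matches responses by the id.
```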

SQS triggers Lambda with multiple records/messages?

穿精又带淫゛_ submitted on 2019-12-04 09:24:05
I've observed an abnormal (well, in my point of view) behaviour: when I set up SQS to trigger a Lambda, and new messages arrive, the Lambda gets triggered with more than one record/message inside its event body. The full setup is S3 (PutObject event) -> SNS topic -> SQS -> Lambda. The abnormal behaviour is that, for example, let's say I put 15 objects into S3, which then forwards an event to SNS per object; I can then observe SQS being populated with 15 messages. However, when the Lambdas start triggering, out of those 15 messages only 11 Lambdas trigger, some of them containing more than one record …
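A minimal sketch of an SQS-triggered Lambda handler for the setup described above: the event can carry a batch of records (up to the trigger's batch size), so the handler has to loop over all of them rather than assume a single message.

```
import json

def handler(event, context):
    for record in event["Records"]:                      # one entry per SQS message in the batch
        sns_envelope = json.loads(record["body"])        # the body is the SNS notification
        s3_event = json.loads(sns_envelope["Message"])   # which wraps the original S3 event
        print(s3_event)
```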

How to use AWS SQS/SNS as a push notification queue for heavy processing tasks via PHP?

对着背影说爱祢 submitted on 2019-12-04 09:19:15
Question: I have a single server running on Rackspace which is hosting a single PHP web app. The PHP web app accepts a form submission that then needs to perform a task based on the form field entries. The task (let's call it the generate-metadata task) requires quite a lot of processing time. I was wondering how to let the form submission do a straightforward save to the database and immediately show a success page to the user, while running the generate-metadata task in the background. I have …
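A sketch of the enqueue-now, process-later pattern the question describes, shown with boto3/Python for brevity (the question itself is about PHP; the queue name and helper functions are hypothetical):

```
import json

import boto3

sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName="generate-metadata")["QueueUrl"]  # hypothetical queue

# 1) In the web request: save the form to the database, enqueue the heavy work,
#    and return the success page immediately.
def on_form_submission(record_id):
    sqs.send_message(QueueUrl=queue_url, MessageBody=json.dumps({"record_id": record_id}))

# 2) In a separate background worker: poll the queue and run the slow task.
def worker():
    while True:
        resp = sqs.receive_message(QueueUrl=queue_url, WaitTimeSeconds=20)
        for msg in resp.get("Messages", []):
            generate_metadata(json.loads(msg["Body"])["record_id"])  # placeholder
            sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```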