amazon-sqs

Accessing local SQS service from another docker container using environment variables

Submitted by 佐手、 on 2019-12-02 09:56:22
I have a Flask application that needs to interact with an SQS service whenever an endpoint is hit. I'm mimicking the SQS service locally using the Docker image sukumarporeddy/sqs:fp, whose base image is https://github.com/vsouza/docker-SQS-local with two more queues added to the configuration. I need to access this service from another app, which runs as app_service. These two services are run with a docker-compose.yml file that defines two services: app_service and sqs_service. While building the app image, I'm setting environment variables to access the sqs_service as QUEUE_ENDPOINT=http://sqs
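A minimal sketch of how the app_service side could talk to that local SQS container with boto3, assuming the endpoint comes from the QUEUE_ENDPOINT variable set in docker-compose (the port, region, credentials, and queue name below are illustrative assumptions, not taken from the question):

```python
import os
import boto3

# Point boto3 at the local SQS container instead of AWS.
# QUEUE_ENDPOINT is set in docker-compose; the port (9324) is an assumption.
sqs = boto3.client(
    "sqs",
    endpoint_url=os.environ.get("QUEUE_ENDPOINT", "http://sqs:9324"),
    region_name="us-east-1",            # any value works for a local emulator
    aws_access_key_id="dummy",          # local SQS does not validate credentials
    aws_secret_access_key="dummy",
)

def queue_url(name: str) -> str:
    return sqs.get_queue_url(QueueName=name)["QueueUrl"]

sqs.send_message(QueueUrl=queue_url("my-queue"), MessageBody="hello")
```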

Elastic Beanstalk SQSD Error on worker start

Submitted by 孤者浪人 on 2019-12-02 03:22:41
I've deployed a Node.js worker. However, whenever I try to start it, it goes red and this error is shown: ERROR Instance: i-6eef007a Module: AWSEBAutoScalingGroup ConfigSet: null Command failed on instance. Return code: 1 Output: Error occurred during build: Command 01-start-sqsd failed . I don't know if it's related, but sometimes I get this error on the screen: IamInstanceProfile: The environment does not have an IAM instance profile associated with it. To improve deployment speed please associate an IAM instance profile with the environment. I've already given permission to SQS and set key and
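For the IamInstanceProfile warning specifically, here is a hedged sketch of associating an instance profile with an existing environment via the API; the environment and role names are placeholders, not taken from the question:

```python
import boto3

# Associate an IAM instance profile with an Elastic Beanstalk environment.
eb = boto3.client("elasticbeanstalk", region_name="us-east-1")
eb.update_environment(
    EnvironmentName="my-worker-env",                      # placeholder
    OptionSettings=[{
        "Namespace": "aws:autoscaling:launchconfiguration",
        "OptionName": "IamInstanceProfile",
        "Value": "aws-elasticbeanstalk-ec2-role",         # placeholder role name
    }],
)
```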

Max AWS SQS Queues

Submitted by 本秂侑毒 on 2019-12-01 15:08:49
Does anyone know what the maximum number of queues I can create is? I've looked around on AWS and can't seem to find the answer. I might have almost 50 different queues by the end of this project and want to make sure I am not running out of runway... There is no limit on the number of queues or on the number of messages in a queue. http://aws.amazon.com/sqs/faqs/#How_big_can_Amazon_SQS_queues_be I don't see any mention of a limit. But 50 is small; you should be fine. To confirm, just create 50 queues. I don't know of any limit. We're currently running about 45 queues and I'm sure we hit the
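A rough sketch of the "just create 50 queues" suggestion with boto3, using throwaway names (note that a deleted queue's name is typically unavailable for reuse for about 60 seconds):

```python
import boto3

sqs = boto3.client("sqs")

# Create short-lived test queues to confirm the account allows at least 50.
urls = []
for i in range(50):
    resp = sqs.create_queue(QueueName=f"limit-test-{i}")   # placeholder names
    urls.append(resp["QueueUrl"])

print(f"created {len(urls)} queues without hitting a limit")

# Clean up afterwards.
for url in urls:
    sqs.delete_queue(QueueUrl=url)
```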

How to determine size of string, and compress it

Submitted by 纵然是瞬间 on 2019-12-01 04:52:20
I'm currently developing an application in C# that uses Amazon SQS. The size limit for a message is 8 KB. I have a method that is something like: public void QueueMessage(string message) Within this method, I'd like to first compress the message (most messages are passed in as JSON, so are already fairly small). If the compressed string is still larger than 8 KB, I'll store it in S3. My question is: how can I easily test the size of a string, and what's the best way to compress it? I'm not looking for massive reductions in size, just something nice and easy - and easy to decompress the
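The question is about C#, but the idea is language-agnostic: measure the encoded byte length (not the character count) and gzip-compress when it exceeds the limit. A hedged sketch in Python; the C# equivalents would be Encoding.UTF8.GetByteCount and GZipStream:

```python
import gzip

MAX_BYTES = 8 * 1024          # the 8 KB limit mentioned in the question

def size_in_bytes(message: str) -> int:
    # Byte length, not character count, is what counts against the limit.
    return len(message.encode("utf-8"))

def maybe_compress(message: str) -> bytes:
    raw = message.encode("utf-8")
    return gzip.compress(raw) if len(raw) > MAX_BYTES else raw

payload = maybe_compress('{"example": "json message"}')
if len(payload) > MAX_BYTES:
    pass  # still too big: fall back to storing the body in S3 and queueing a pointer
```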

AWS SQS message retention period

Submitted by 女生的网名这么多〃 on 2019-11-30 23:47:07
Question: According to the documentation, the maximum AWS SQS message retention period is 14 days. After that time the message is deleted from the queue. Is there any way with SQS to avoid losing these messages after their retention period has expired? For example, it is not clear whether it is possible to use a dead-letter queue for this purpose. Answer 1: Well, 14 days is the maximum time you can keep a message. After 14 days you can move that message to an S3 bucket for backup. There is also a hack you can do with a DLQ. Here is a
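A hedged sketch of the archive-to-S3 idea from the answer: drain the queue before the retention period runs out and copy each body to S3 before deleting. The bucket and queue names are placeholders:

```python
import boto3

sqs = boto3.client("sqs")
s3 = boto3.client("s3")
queue_url = sqs.get_queue_url(QueueName="my-queue")["QueueUrl"]   # placeholder

while True:
    resp = sqs.receive_message(QueueUrl=queue_url,
                               MaxNumberOfMessages=10,
                               WaitTimeSeconds=5)
    messages = resp.get("Messages", [])
    if not messages:
        break
    for msg in messages:
        # Archive the body under the message ID, then remove it from the queue.
        s3.put_object(Bucket="my-archive-bucket",                 # placeholder
                      Key=f"sqs-archive/{msg['MessageId']}",
                      Body=msg["Body"].encode("utf-8"))
        sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```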

How to enumerate all SQS queues in an AWS account

Submitted by 房东的猫 on 2019-11-30 20:25:34
How do I list all SQS queues in an AWS account programmatically via the API and .NET SDK? I am already doing something similar with DynamoDB tables, and that's fairly straightforward - you can page through results using ListTables in a loop until you have them all. However, the equivalent SQS API action, ListQueues, is different and not as useful. It returns up to 1000 queues, with no option for paging. Yes, there can be over 1000 queues in my case; I have had a query return exactly 1000 results. It's all in one region, so it's not the same as this question. You can retrieve SQS queue names from
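One hedged workaround sketch, assuming queue names follow known prefixes so ListQueues can be called once per prefix and the results merged under the 1000-per-call ceiling mentioned in the question (newer SDK versions also expose token-based pagination for this call):

```python
import boto3

sqs = boto3.client("sqs")

prefixes = ["orders-", "emails-", "reports-"]   # hypothetical naming scheme
all_urls = set()
for prefix in prefixes:
    resp = sqs.list_queues(QueueNamePrefix=prefix)
    all_urls.update(resp.get("QueueUrls", []))

print(f"found {len(all_urls)} queues")
```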

Push notifications in PHP using Amazon SNS/SQS?

Submitted by 走远了吗. on 2019-11-30 06:58:36
On my site I'd like to do push notifications of comments like Stack Overflow does. Amazon SNS/SQS seems to provide a framework to do this, but I'm having difficulty finding any code/explanation on the web for anything beyond a "hello world" equivalent. From reading the AWS SNS/SQS documentation it looks like I need the following logic: post the comment/answer to a new question; create a topic (for the first comment/answer only); publish a message; subscribe to the topic. PHP on the page where comments are posted (http://mysite.com/postCommentOrAnswer.php): $comment=$_POST['comment']; //posted comment require_once
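The question asks for PHP, but the SNS calls map one-to-one across SDKs; this hedged Python sketch mirrors the create-topic / subscribe / publish flow with placeholder names and endpoints:

```python
import boto3

sns = boto3.client("sns")

# First comment/answer on a question: create (or fetch) a topic for it.
topic_arn = sns.create_topic(Name="question-12345-comments")["TopicArn"]   # placeholder

# Subscribe the interested user (email here; could be HTTP, SQS, etc.).
sns.subscribe(TopicArn=topic_arn, Protocol="email", Endpoint="user@example.com")

# On every new comment, publish it to the topic.
sns.publish(TopicArn=topic_arn,
            Subject="New comment on your question",
            Message="Someone replied: ...")
```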

Elastic Beanstalk Worker's SQS daemon getting 504 gateway timeout after 1 minute

Submitted by 老子叫甜甜 on 2019-11-30 05:47:58
I have an Elastic Beanstalk worker that can only run one task at a time, and each task takes a while (from a few minutes to, hopefully, less than 30 minutes), so I'm queuing my tasks in SQS. On my worker configuration, I have: HTTP connections: 1; Visibility timeout: 3600; Error visibility timeout: 300; (under "Advanced") Inactivity timeout: 1800. The problem is that there seems to be a 1-minute timeout (on nginx?) that overrides the "Inactivity timeout", returning a 504 (Gateway Timeout). This is what I can find in the aws-sqsd.log file: 2016-02-03T16:16:27Z init: initializing aws-sqsd 2.0
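On the older Amazon Linux platform, a commonly suggested workaround is to raise nginx's proxy timeouts with an .ebextensions file so they no longer undercut the sqsd inactivity timeout. This is only a sketch; the file path and values are assumptions and may differ by platform version:

```yaml
# .ebextensions/nginx-timeout.config  (hypothetical; adjust to your platform)
files:
  "/etc/nginx/conf.d/proxy-timeout.conf":
    mode: "000644"
    owner: root
    group: root
    content: |
      proxy_read_timeout 1800s;
      proxy_send_timeout 1800s;
```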

SQS - Delivery Delay of 30 minutes

Submitted by 假装没事ソ on 2019-11-30 03:51:45
Question: From the SQS documentation, the maximum delay we can configure to hide a message from its consumers is 15 minutes - http://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-delay-queues.html Suppose I need to hide messages for a day; what is the pattern? For example, I want to mimic a daily cron for doing some action. Thanks. Answer 1: The visibility timeout can go up to 12 hours. I think you can hack something together where you process a message but don't delete it and next
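A hedged sketch of the pattern the answer hints at: stamp each message with a due time, and when a consumer receives one that isn't due yet, extend its visibility (up to the 12-hour maximum) instead of deleting it. The attribute and queue names are placeholders:

```python
import time
import boto3

sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName="delayed-jobs")["QueueUrl"]   # placeholder

resp = sqs.receive_message(QueueUrl=queue_url,
                           MaxNumberOfMessages=1,
                           MessageAttributeNames=["DueAt"])           # assumed attribute
for msg in resp.get("Messages", []):
    due_at = float(msg["MessageAttributes"]["DueAt"]["StringValue"])
    if time.time() >= due_at:
        print("processing", msg["Body"])
        sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
    else:
        # Not due yet: hide it again for up to 12 hours (43200 s) and retry later.
        wait = min(int(due_at - time.time()), 43200)
        sqs.change_message_visibility(QueueUrl=queue_url,
                                      ReceiptHandle=msg["ReceiptHandle"],
                                      VisibilityTimeout=wait)
```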

Finding certain messages in SQS

Submitted by ≯℡__Kan透↙ on 2019-11-29 05:35:11
I know SQS ain't built for that, but I'm curious: is it possible to find messages in a queue that meet some criteria? I can pull messages in a loop, search the message bodies for some pattern (without even deserializing them), and filter the messages I need. But then it is possible to end up with an infinite loop - the first messages I read will be back in the queue by the time I reach the end of the queue... Extending the visibility of the messages is possible, but how do I know exactly how long it will take to scan the entire queue, and for how long should I extend the visibility? What if I
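One hedged way to sketch such a scan: receive with a long visibility timeout so already-scanned messages stay hidden, track message IDs, and stop when nothing new comes back. The queue name and filter criterion are placeholders:

```python
import boto3

sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName="my-queue")["QueueUrl"]   # placeholder

seen, matches = set(), []
while True:
    resp = sqs.receive_message(QueueUrl=queue_url,
                               MaxNumberOfMessages=10,
                               VisibilityTimeout=900,    # keep scanned messages hidden
                               WaitTimeSeconds=2)
    msgs = [m for m in resp.get("Messages", []) if m["MessageId"] not in seen]
    if not msgs:
        break   # nothing new: the queue is drained or everything is in flight
    for m in msgs:
        seen.add(m["MessageId"])
        if "some pattern" in m["Body"]:          # the filter criterion from the question
            matches.append(m)

print(f"scanned {len(seen)} messages, {len(matches)} matched")
```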