amazon-sqs

How to determine size of string, and compress it

雨燕双飞 submitted on 2019-12-19 06:59:42
Question: I'm currently developing an application in C# that uses Amazon SQS. The size limit for a message is 8 KB. I have a method that looks something like: public void QueueMessage(string message). Within this method I'd like to first compress the message (most messages are passed in as JSON, so they are already fairly small). If the compressed string is still larger than 8 KB, I'll store it in S3. My question is: how can I easily test the size of a string, and what's the best way to compress it? …
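The answer excerpt is cut off above, but both halves of the question have well-known answers: SQS counts encoded bytes, not characters, and gzip is the usual first choice for compressing JSON. In C# the relevant pieces are Encoding.UTF8.GetByteCount and GZipStream; the sketch below shows the same idea in Node.js/TypeScript for illustration, with all names my own:

```typescript
import { gzipSync } from "zlib";

const LIMIT_BYTES = 8 * 1024; // the 8 KB limit from the question

// What the queue cares about is encoded bytes, not characters:
// UTF-8 uses 1-4 bytes per character, so measure with byteLength.
function utf8Size(message: string): number {
  return Buffer.byteLength(message, "utf8");
}

// gzip the UTF-8 bytes of the message.
function compress(message: string): Buffer {
  return gzipSync(Buffer.from(message, "utf8"));
}

// Usage: queue directly if small enough, otherwise fall back to S3.
const message = JSON.stringify({ orderId: 42, items: ["a", "b"] });
console.log(utf8Size(message), "bytes uncompressed");
const payload = compress(message);
if (payload.length > LIMIT_BYTES) {
  // still too big even compressed: store the payload in S3
  // and queue a small pointer message instead
}
```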

Can't trigger Lambdas on SQS FIFO

China☆狼群 submitted on 2019-12-18 14:00:55
Question: I am trying to trigger a Lambda execution to process an item on a FIFO queue. Other than polling, what options do we have to accomplish that? We just learned that we cannot directly trigger a Lambda execution from a FIFO queue; that is only supported for standard queues at this time. I also learned that we cannot subscribe an SNS topic to a FIFO queue, which is likewise only supported on standard queues. Has anybody found a workaround for this until Amazon releases an update? Answer 1: …
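The cut-off answer evidently points at polling. A minimal sketch of that workaround, assuming a Node.js runtime invoked on a CloudWatch Events/EventBridge schedule, the v2 aws-sdk, and a hypothetical QUEUE_URL environment variable:

```typescript
// Hypothetical scheduled poller: a rate rule invokes this handler,
// which drains one batch from the FIFO queue per run.
import { SQS } from "aws-sdk";

const sqs = new SQS();
const queueUrl = process.env.QUEUE_URL!; // assumed env var

export const handler = async (): Promise<void> => {
  const res = await sqs
    .receiveMessage({
      QueueUrl: queueUrl,
      MaxNumberOfMessages: 10,
      WaitTimeSeconds: 20, // long poll so empty runs stay cheap
    })
    .promise();

  for (const msg of res.Messages ?? []) {
    // ... process the message ...
    await sqs
      .deleteMessage({ QueueUrl: queueUrl, ReceiptHandle: msg.ReceiptHandle! })
      .promise();
  }
};
```

Because each run receives and deletes in order, ordering within a message group is preserved as long as only one poller runs at a time.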

How to process SQS queue with lambda function (not via scheduled events)?

佐手、 submitted on 2019-12-18 10:03:09
Question: Here is the simplified scheme I am trying to make work: http requests --> (API Gateway + lambda A) --> SQS --> (lambda B ?????) --> DynamoDB. It should work as shown: data coming from many HTTP requests (up to 500 per second, for example) is placed into an SQS queue by my Lambda function A. Then the other function, B, processes the queue: it reads up to 10 items (on some periodic basis) and writes them to DynamoDB with BatchWriteItem. The problem is that I can't figure out how to trigger the …
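One way to realize "lambda B" at the time was a scheduled function that drains a batch per run. A hedged sketch, with QUEUE_URL and TABLE_NAME as assumed environment variables and the message bodies assumed to be DynamoDB-ready JSON:

```typescript
import { SQS, DynamoDB } from "aws-sdk";

const sqs = new SQS();
const ddb = new DynamoDB.DocumentClient();
const queueUrl = process.env.QUEUE_URL!;    // assumed
const tableName = process.env.TABLE_NAME!;  // assumed

export const handler = async (): Promise<void> => {
  const res = await sqs
    .receiveMessage({ QueueUrl: queueUrl, MaxNumberOfMessages: 10 })
    .promise();
  const messages = res.Messages ?? [];
  if (messages.length === 0) return;

  // BatchWriteItem accepts at most 25 puts, so 10 messages fit in one call.
  // (A production version would retry any UnprocessedItems in the response.)
  await ddb
    .batchWrite({
      RequestItems: {
        [tableName]: messages.map((m) => ({
          PutRequest: { Item: JSON.parse(m.Body!) },
        })),
      },
    })
    .promise();

  // Only delete after the write succeeded.
  await sqs
    .deleteMessageBatch({
      QueueUrl: queueUrl,
      Entries: messages.map((m, i) => ({
        Id: String(i),
        ReceiptHandle: m.ReceiptHandle!,
      })),
    })
    .promise();
};
```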

AWS Lambda for objects moved to glacier

那年仲夏 submitted on 2019-12-14 04:19:35
Question: I am working on a POC where I have set up a lifecycle rule on S3 to move objects to Glacier after a certain number of days (if the objects have a specified tag). The rule is working fine for me: objects are moved to Glacier by the lifecycle rule and the storage type changes from Standard to Glacier (so far so good). As I need to restrict users from using that file (the archived file) from my application, I am looking for a way to get a notification (for example through SQS) or invoke a Lambda function (to call my …
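The question is cut short, but for the "restrict the user" half there is a simple application-side check that needs no notification at all: ask S3 for the object's storage class before serving it. A sketch (my own suggestion, not something from the thread), using headObject from the v2 aws-sdk:

```typescript
import { S3 } from "aws-sdk";

const s3 = new S3();

// Returns true if the object can still be read directly, i.e. it has not
// been transitioned to Glacier. Note that STANDARD objects report no
// StorageClass at all, and a restored copy still reports GLACIER (the
// Restore response header would indicate temporary availability).
async function isDirectlyReadable(bucket: string, key: string): Promise<boolean> {
  const head = await s3.headObject({ Bucket: bucket, Key: key }).promise();
  return head.StorageClass !== "GLACIER" && head.StorageClass !== "DEEP_ARCHIVE";
}
```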

Laravel email queue infinite processing?

为君一笑 submitted on 2019-12-14 02:21:08
Question: I am using Laravel 5.4 with AWS SQS. I have tested it with jobs doing all sorts of things and it works fine; however, when I try to queue an email I get:
[2017-08-18 09:21:48] Processing: App\Mail\WelcomeEmail
[2017-08-18 09:21:48] Processing: App\Mail\WelcomeEmail
[2017-08-18 09:21:48] Processing: App\Mail\WelcomeEmail
[2017-08-18 09:21:48] Processing: App\Mail\WelcomeEmail
[2017-08-18 09:21:49] Processing: App\Mail\WelcomeEmail
[2017-08-18 09:21:49] Processing: App\Mail\WelcomeEmail
…

Processing AWS Lambda messages in Batches

陌路散爱 submitted on 2019-12-14 01:13:21
Question: I am wondering about something, and I really can't find information about it. Maybe it is not the way to go, but I would just like to know. It is about Lambda working in batches. I know I can set up Lambda to consume messages in batches. In my Lambda function I iterate over each message, and if one fails, Lambda exits and the cycle starts again. I am wondering about a slightly different approach. Let's assume I have three messages: A, B and C. I also take them in batches. Now if message B fails (e.g. an API …
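Before per-item failure reporting existed, the usual workaround for exactly this A/B/C scenario was to delete each successful message yourself and only then throw, so a retry redelivers just the failures. A hedged sketch, with QUEUE_URL assumed and processMessage standing in for the real work:

```typescript
import { SQSEvent } from "aws-lambda";
import { SQS } from "aws-sdk";

const sqs = new SQS();
const queueUrl = process.env.QUEUE_URL!; // assumed env var

export const handler = async (event: SQSEvent): Promise<void> => {
  let failed = 0;
  for (const record of event.Records) {
    try {
      await processMessage(record.body);
      // Delete A and C ourselves so a retry only redelivers B.
      await sqs
        .deleteMessage({ QueueUrl: queueUrl, ReceiptHandle: record.receiptHandle })
        .promise();
    } catch {
      failed++;
    }
  }
  // Throwing makes Lambda return the remaining (undeleted) messages to the queue.
  if (failed > 0) throw new Error(`${failed} message(s) failed`);
};

async function processMessage(body: string): Promise<void> {
  // placeholder for the real work (e.g. the API call that may fail)
}
```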

VPC-running AWS Lambda sends SQS message only once

為{幸葍}努か submitted on 2019-12-13 20:09:39
Question: I have a NodeJS Lambda function running in a private subnet, with allow-all policies in both the security group and the NACL (not safe, but they do the job). The private subnet has a NAT gateway sitting in a public subnet of the same VPC, so internet connectivity works. My goal is to send messages to an SQS queue. The Lambda code is this: const AWS = require('aws-sdk'); const sqs = new AWS.SQS(); exports.handler = (event, context, callback) => { sqs.sendMessage({ MessageBody: …
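The snippet is truncated, but a classic pitfall with this handler shape, offered as a guess rather than the thread's confirmed diagnosis, is signalling completion before sendMessage has actually finished, which can make sends appear to fire only on some invocations. A reconstruction that waits for the send (QUEUE_URL assumed):

```typescript
// Hedged reconstruction: wait for sendMessage to complete before telling
// Lambda the invocation is done, instead of calling the callback immediately.
import { SQS } from "aws-sdk";

const sqs = new SQS();

export const handler = async (event: unknown): Promise<void> => {
  await sqs
    .sendMessage({
      MessageBody: JSON.stringify(event),
      QueueUrl: process.env.QUEUE_URL!, // assumed env var
    })
    .promise(); // resolving here guarantees the request actually went out
};
```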

Check for an incoming message in aws sqs

你。 submitted on 2019-12-13 17:22:47
Question: How does my function continuously check for an incoming message? The following function exits after receiving a message. Considering that long polling has been enabled for the queue, how do I continuously check for a new message? function checkMessage(){ var params = { QueueUrl: Constant.QUEUE_URL, VisibilityTimeout: 0, WaitTimeSeconds: 0 }; sqs.receiveMessage(params, (err, data) => { if(data){ console.log("%o", data); } }); } Answer 1: Your function would need to continually poll Amazon SQS. Long …
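A minimal continuous-poll sketch along the lines the answer starts to describe. Note that the question's WaitTimeSeconds: 0 actually disables long polling on that call; it must be greater than zero (up to 20) for the receive to wait. The queue URL is a placeholder:

```typescript
import { SQS } from "aws-sdk";

const sqs = new SQS();
const queueUrl = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"; // placeholder

async function pollForever(): Promise<void> {
  while (true) {
    const res = await sqs
      .receiveMessage({
        QueueUrl: queueUrl,
        WaitTimeSeconds: 20, // long poll: this call itself waits up to 20 s
        MaxNumberOfMessages: 10,
      })
      .promise();

    for (const msg of res.Messages ?? []) {
      console.log("%o", msg);
      await sqs
        .deleteMessage({ QueueUrl: queueUrl, ReceiptHandle: msg.ReceiptHandle! })
        .promise();
    }
    // Loop immediately; the long poll above provides the back-off.
  }
}

pollForever().catch(console.error);
```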

SQS Lambda - retry logic?

百般思念 submitted on 2019-12-13 12:57:22
Question: When a message has been added to an SQS queue that is configured to trigger a Lambda function (Node.js), and the Lambda function is triggered, I may want to retry the same message after 5 minutes without deleting it from the queue. The reason I want to do this is that if the Lambda could not connect to an external host (e.g. an API), I'd like to try again after 5 minutes, for 3 attempts only. How can that be written in Node.js? For example, in Laravel we can specify max job attempts.
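One hedged way to approximate Laravel's max-attempts behaviour here: configure the queue itself with a 5-minute visibility timeout and a redrive policy of maxReceiveCount 3, then simply throw from the handler when the external host is unreachable. SQS redelivers after the timeout and parks the message in the dead-letter queue after the third failure. A sketch of the handler side, with all names illustrative:

```typescript
import { SQSEvent } from "aws-lambda";

// Queue configuration assumed (set on the queue, not in code):
//   VisibilityTimeout: 300            -> retry roughly every 5 minutes
//   RedrivePolicy.maxReceiveCount: 3  -> move to a DLQ after 3 attempts

export const handler = async (event: SQSEvent): Promise<void> => {
  for (const record of event.Records) {
    const attempt = Number(record.attributes.ApproximateReceiveCount);
    console.log(`attempt ${attempt} for message ${record.messageId}`);
    // Throwing here leaves the message in the queue for the next attempt.
    await callExternalApi(record.body);
  }
};

async function callExternalApi(body: string): Promise<void> {
  // placeholder for the HTTP call that may fail
}
```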