How to keep desired amount of AWS Lambda function containers warm

Submitted by 可紊 on 2019-12-03 09:19:28

Question


On my project there is a REST API implemented on AWS API Gateway and AWS Lambda. Since AWS Lambda functions are serverless and stateless, when we call one, AWS starts a container with the Lambda function's code to process the call. According to the AWS documentation, after the Lambda function finishes executing, AWS does not stop the container, and the next call can be processed in the same container. This approach improves the performance of the service: AWS spends time starting a container only on the first call (a cold start of the Lambda function), and all subsequent calls execute faster because they reuse the same container (warm starts).

As a next step to improve performance, we created a cron job which periodically calls our Lambda function (we use CloudWatch rules for that). This approach keeps the Lambda function "warm", avoiding the stopping and restarting of containers, so when a real user calls our REST API, Lambda does not spend time starting a new container.

But we ran into an issue: this approach keeps only one container of the Lambda function warm, while the actual number of parallel calls from different users can be much larger (in our case, hundreds and sometimes even thousands of users). Is there any way to implement warm-up functionality for a Lambda function that warms not just a single container, but some desired number of them?

I understand that this approach can affect the cost of using Lambda, and that it may turn out better overall to use a good old application server, but comparing these approaches and their costs will be a later step. For the moment I would just like to find a way to keep a desired number of Lambda function containers warm.


Answer 1:


This can be long, but bear with me: it should give you a workaround and may also help you understand better how Lambda works.

Alternatively, you can skip to "The Workaround" at the bottom if you are not interested in reading the background.

For folks who are not aware of cold starts, please read this blog post to understand them better. In short:

Cold Starts

  • When a function is executed for the first time, or after its code or resource configuration has been updated, a container is spun up to execute it. All the code and libraries are loaded into the container so that it can execute. The code then runs, starting with the initialisation code: the code written outside the handler, which runs only when the container is created for the first time. Finally, the Lambda handler is executed. This set-up process is what is considered a cold start.
  • For performance, Lambda has the ability to re-use containers created by previous invocations, which avoids initialising a new container and loading the code: only the handler code is executed. However, you cannot depend on a container from a previous invocation being reused. If you haven't changed the code and not too much time has gone by, Lambda may reuse the previous container.
  • If you change the code or resource configuration, or enough time has passed since the previous invocation, a new container will be initialised and you will experience a cold start.
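
The split between initialisation code and handler code can be sketched in Python (a toy example; the invocation counter simply makes container reuse visible):

```python
import time

# Initialisation code: runs once per container, when it is first created.
# This is where connections, configuration, and heavy imports are set up.
CONTAINER_CREATED_AT = time.time()
INVOCATION_COUNT = 0

def lambda_handler(event, context):
    # Handler code: runs on every invocation, cold or warm.
    global INVOCATION_COUNT
    INVOCATION_COUNT += 1
    return {
        "invocation": INVOCATION_COUNT,
        "container_age_seconds": time.time() - CONTAINER_CREATED_AT,
    }
```

On a warm start the module-level code is skipped, so the counter keeps climbing across invocations; on a cold start a fresh container resets it to zero.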

Now consider these scenarios for better understanding:

  • Consider that the Lambda function in the example is invoked for the first time. Lambda will create a container, load the code into it and run the initialisation code; the function handler is then executed. This invocation experiences a cold start. As mentioned in the comments, the function takes 15 seconds to complete. If the function is invoked again a minute later, Lambda will most likely re-use the container from the previous invocation, and that invocation will not experience a cold start.
  • Now consider a second scenario, where the second invocation arrives 5 seconds after the first. Since the first invocation takes 15 seconds and has not finished executing, the new invocation has to create a new container to execute in, and therefore experiences a cold start.

Now, on to the first part of the problem, which you have already solved:

Regarding preventing cold starts: it is possible, but not guaranteed, and the common workaround keeps only one container of the Lambda function warm. To do it, you would create a CloudWatch Events rule with a schedule (a cron expression) that invokes your Lambda function every couple of minutes to keep it warm.


The Workaround:

For your use-case, your Lambda function will be invoked very frequently with a very high concurrency rate. To avoid as many cold starts as possible, you will need to keep warm as many containers as you expect your highest concurrency to reach. To do this, you will need to invoke the function with a delay, to allow the concurrency of the function to build and reach the desired number of concurrent executions. This forces Lambda to spin up the number of containers you desire. As a result, this can drive up costs, and it still will not guarantee that cold starts are avoided.

That being said, here is a breakdown of how you can keep multiple containers warm for your function at one time:

  • You should have a CloudWatch Events rule that is triggered on a schedule. This schedule can be a fixed rate or a cron expression; for example, you can set the rule to trigger every 5 minutes. You then specify a Lambda function (the Controller function) as the target of this rule.

  • Your Controller Lambda function will then invoke the Lambda function (the function that you want to keep warm) once for each concurrently running container you desire.
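
A minimal Python sketch of such a controller (the function name, payload shape, and concurrency value are illustrative assumptions; boto3's `invoke` with `InvocationType="Event"` is the asynchronous call). The invoke call is passed in as a parameter so the fan-out logic can be exercised without AWS:

```python
import json

TARGET_FUNCTION = "my-api-function"   # hypothetical name of the function to keep warm
DESIRED_CONCURRENCY = 10              # highest concurrency you expect to reach

def warm_up(invoke_fn, concurrency=DESIRED_CONCURRENCY):
    """Fire one asynchronous warm-up invocation per desired warm container."""
    for _ in range(concurrency):
        invoke_fn(
            FunctionName=TARGET_FUNCTION,
            InvocationType="Event",  # asynchronous invoke: don't wait for the result
            Payload=json.dumps({"warmup": True, "delay_seconds": 10}),
        )

def lambda_handler(event, context):
    # In the real controller, pass boto3's Lambda client method:
    import boto3  # imported lazily so the fan-out logic is testable offline
    warm_up(boto3.client("lambda").invoke)
```

Note that with `InvocationType="Event"` the controller does not wait for the warm-up calls to finish; the delay that builds concurrency lives in the target function itself.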

There are a few things to consider here:

  1. You will have to build concurrency, because if the first invocation finishes before another invocation starts, that invocation may reuse the previous invocation's container instead of creating a new one. To achieve this, add some sort of delay in the Lambda function when it is invoked by the controller function. This can be done by passing a specific payload with these invocations. The Lambda function that you want to keep warm then checks whether this payload exists: if it does, the function waits (to build concurrent invocations); if it does not, the function executes as expected.

  2. You will also need to ensure you are not throttled on the Invoke Lambda API call when calling it repeatedly. Your Lambda function should be written to handle this throttling if it occurs; consider adding a delay between API calls to avoid it.
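
On the target side, the function being kept warm could inspect the payload and hold its container busy while the remaining warm-up invocations arrive. A sketch, assuming hypothetical `warmup` and `delay_seconds` payload keys:

```python
import time

def lambda_handler(event, context):
    # Warm-up invocations carry a marker payload; sleeping holds this
    # container busy so concurrent warm-up calls are forced onto
    # separate containers instead of reusing this one.
    if event.get("warmup"):
        time.sleep(event.get("delay_seconds", 10))
        return {"warmed": True}

    # ... normal request handling goes here ...
    return {"statusCode": 200, "body": "real work"}
```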

In the end, this solution can reduce cold starts, but it will increase costs and cannot guarantee that cold starts never occur; they are inevitable when working with Lambda. If your application needs faster response times than it gets with Lambda cold starts, I would recommend looking into running your server on an EC2 instance.




Answer 2:


We are using Java (Spring Boot) Lambdas and have come to pretty much an identical solution to Kush Vyas's answer above, which works very well.

During load testing, however, we found that a legitimate user request would often arrive while the "Controller function" was executing, again causing the inevitable cold start...

So now, in our "Controller function", we make our regular number of X concurrent warm-up requests, but on every 5th execution of the function we call our target Lambda an additional 2 times. The theory is that we end up with X+2 Lambdas staying warm, while on 4 out of 5 warm-up runs there are still 2 spare Lambdas that can service user requests.

It reduced our number of cold starts even further (though obviously still not completely), and we are still playing with concurrency / warm-up frequency / sleep-time combinations to find the optimum for us; these values will likely always depend on the load requirements of a specific situation.
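
That scheduling rule is simple enough to sketch in Python (X and the every-5th cadence are tunables, not fixed values):

```python
BASE_CONCURRENCY = 10  # X: the regular number of concurrent warm-up requests

def warmup_count(execution_number, base=BASE_CONCURRENCY):
    """Every 5th controller run warms 2 extra containers, so on the other
    4 runs there are spare warm containers to absorb real user traffic."""
    if execution_number % 5 == 0:
        return base + 2
    return base
```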




Answer 3:


If you use the serverless framework with AWS Lambda, you can use this plugin to keep all your lambdas warm with a certain level of concurrency.
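
The plugin being referred to is presumably serverless-plugin-warmup. A minimal serverless.yml sketch (the exact keys vary between plugin versions, so treat this as an assumption and check the plugin's README):

```yaml
plugins:
  - serverless-plugin-warmup

custom:
  warmup:
    default:                 # the plugin's default warmer
      enabled: true
      events:
        - schedule: rate(5 minutes)
      concurrency: 5         # number of containers to keep warm
```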




Answer 4:


I'd like to share a small but useful tip which we use to reduce the cold-start delay as observed by the user. In our case, the Lambda function handles HTTP requests from the front-end via AWS API Gateway; in particular, it executes search functionality when the user types something in an input field. Users usually start typing with some delay after the UI is rendered, so we have time to execute a ping call to our Lambda function to warm it up. When the user then makes requests to the back-end, the Lambda will most likely be ready for work.

Admittedly, this approach does nothing to fix the cold-start issue on the back-end side, and you will need to look at other options for that, but it can be a user-experience improvement with little effort (something like a hotfix).

One thing to remember: if your service is public and you care about your Google Insights score, you should be careful implementing this approach.



Source: https://stackoverflow.com/questions/51210445/how-to-keep-desired-amount-of-aws-lambda-function-containers-warm
