Shared Python libraries between multiple APIs on AWS


Question


I have several different Python APIs (i.e. Python scripts) that run on AWS Lambda. The standard approach is to generate a zip file containing all the external libraries the Lambda function needs and then upload it to AWS. Now, some functions are shared between different APIs (e.g. custom utility functions such as parsing text files or dates). Currently, I simply duplicate the file utils.py in every zip file. However, this approach is quite inefficient (I don't like duplicating code). I'd like to have an S3 bucket that contains all my shared .py files and have my APIs load them directly. Is this possible? A simple approach would be to download the files to a tmp folder and load them, but I am not sure this is the best/fastest way:

import boto3

client_s3 = boto3.client("s3")
# download the shared module from S3 into the Lambda's writable /tmp directory
client_s3.download_file("mybucket", "utils.py", "/tmp/utils.py")
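
Once the file is in /tmp I could load it at runtime, e.g. with importlib. A rough sketch (parse_dates is just a hypothetical helper assumed to live in utils.py):

import importlib.util

# load /tmp/utils.py as a module named "utils"
spec = importlib.util.spec_from_file_location("utils", "/tmp/utils.py")
utils = importlib.util.module_from_spec(spec)
spec.loader.exec_module(utils)

# utils.parse_dates(...)  # hypothetical helper defined in utils.py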

Can this be done in a more elegant way?


Answer 1:


It's actually not a simple problem to solve. We've been using Lambda Layers for a while; they are designed to solve exactly that issue, so you can share common code across functions. The problem with Lambda Layers is that you have to redeploy twice whenever you change something inside the layer (the layer itself plus your Lambda function). That quickly becomes a pain in the neck, and it can also complicate your CI/CD pipeline.
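
For reference, a layer is just a zip whose python/ directory ends up on the function's sys.path (under /opt/python), so a shared utils.py can be published once and imported normally. A rough sketch; the layer name, function name, region and account in the ARN are placeholders:

# Layer zip layout (the python/ prefix is required for Python runtimes):
#   layer.zip
#     python/
#       utils.py
#
# Publish the layer and attach it to each function, e.g.:
#   aws lambda publish-layer-version --layer-name shared-utils \
#       --zip-file fileb://layer.zip --compatible-runtimes python3.9
#   aws lambda update-function-configuration --function-name my-api \
#       --layers arn:aws:lambda:eu-west-1:123456789012:layer:shared-utils:1

# Inside any function that has the layer attached:
import utils  # resolved from /opt/python/utils.py at runtime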

We tried this for some time; now we're back to packaging the code and including it inside each Lambda. It's not efficient if you want to avoid code duplication, but at least you don't have all the bugs related to forgetting to deploy the shared dependency.
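
Concretely, that just means the build step copies the shared module into each function's package before zipping. A minimal sketch of such a build script (function names and paths are made up):

# build_packages.py -- copy shared/utils.py into every function package, then zip it
import shutil
import zipfile
from pathlib import Path

FUNCTIONS = ["api_users", "api_orders", "api_reports"]  # hypothetical Lambda folders

for fn in FUNCTIONS:
    shutil.copy("shared/utils.py", Path(fn) / "utils.py")
    with zipfile.ZipFile(f"{fn}.zip", "w", zipfile.ZIP_DEFLATED) as zf:
        for path in Path(fn).rglob("*"):
            zf.write(path, path.relative_to(fn))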



Source: https://stackoverflow.com/questions/49730389/shared-python-libraries-between-multiple-apis-on-aws
