How do I add python libraries to an AWS lambda function for Alexa?


Question


I was following the tutorial to create an Alexa app using Python:

Python Alexa Tutorial

I was able to successfully follow all the steps and get the app to work. I now want to modify the Python code to use external libraries such as requests, or any other library that I install using pip. How would I set up my Lambda function to include any pip packages that I install locally on my machine?


Answer 1:


As described in the official Amazon documentation (link here), it is as simple as creating a zip of the entire folder contents after installing the required packages into the same folder that holds your Python Lambda code.
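
For example, a minimal sketch of that flow might look like the following (requests stands in for whatever libraries you need, and the zip file and function names are just placeholders):

# Run these from the directory that contains your Lambda handler script.
pip install requests -t .        # install dependencies into the current folder
zip -r lambda_deployment.zip .   # zip the folder *contents*, not the folder itself
# Then upload lambda_deployment.zip through the Lambda console or the AWS CLI.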

As Vineeth pointed out above in his comment, the very first step in moving from the inline code editor to a zip file upload is to change your Lambda function's handler name, under the function's configuration settings, to include the name of the Python script file that holds the lambda handler:

lambda_handler => {your-python-script-file-name}.lambda_handler.
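
For instance, if your code lived in a file called alexa_skill.py (the file name here is only an illustration), the handler setting would be alexa_skill.lambda_handler and the file would contain something like:

# alexa_skill.py -- the file name is a placeholder; use your own script name.

def lambda_handler(event, context):
    # Lambda calls this function; the handler setting must be
    # "<file name without .py>.lambda_handler".
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": "Hello from Lambda"},
            "shouldEndSession": True,
        },
    }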

Other solutions, such as python-lambda and lambda-uploader, help simplify the upload process and, most importantly, local testing. They will save a lot of time in development.




Answer 2:


The official documentation is pretty good. In a nutshell, you need to create a zip file of a directory that contains, at its top level, both the code of your Lambda function and all of the external libraries you use.

You can simulate that by deactivating your virtualenv, copying all your required libraries into the working directory (which is always in sys.path if you invoke a script on the command line), and checking whether your script still works.
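
Roughly, that check could look like this (the virtualenv path, Python version, and module name below are assumptions, not something from the documentation):

cp -r venv/lib/python3.8/site-packages/* .   # copy the installed libraries beside your script
deactivate                                   # leave the virtualenv
python -c "import lambda_function"           # confirm that imports still resolve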




Answer 3:


You may want to look into frameworks such as zappa, which will handle packaging up and deploying the Lambda function for you.

You can use it in conjunction with flask-ask to have an easier time building Alexa skills. There's even a video tutorial of this (from the zappa README) here.
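
To give a feel for the combination, a bare-bones flask-ask skill might look something like the sketch below (the names and the spoken text are illustrative); zappa would then package and deploy it for you:

from flask import Flask
from flask_ask import Ask, statement

app = Flask(__name__)
ask = Ask(app, '/')          # route that receives the Alexa requests

@ask.launch
def launch():
    # Spoken response when the user opens the skill
    return statement('Hello from your Alexa skill')

if __name__ == '__main__':
    app.run(debug=True)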




Answer 4:


Echoing @d3ming's answer, a framework is a good way to go at this point. Creating the deployment package manually isn't impossible, but you'll be uploading your packages' compiled code, and if you're compiling that code on a non-Linux system, the chance of running into issues caused by differences between your system and the Lambda function's deployed environment is high.

You can work around that by compiling your code on a Linux machine or in a Docker container, but with all of that complexity you can get much more from adopting a framework.
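
As a rough illustration of the Docker workaround (the community lambci/lambda build image and the Python version below are assumptions, not part of this answer):

# Build the dependencies inside a Lambda-like Linux environment
docker run --rm -v "$PWD":/var/task lambci/lambda:build-python3.8 \
    pip install -r requirements.txt -t .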

Serverless is widely adopted and has support for custom Python packages. It even integrates with Docker to compile your Python dependencies and build the deployment package for you.
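
A sketch of what that configuration could look like, assuming the serverless-python-requirements plugin (the service, function, and handler names here are placeholders):

service: alexa-skill

provider:
  name: aws
  runtime: python3.8

functions:
  skill:
    handler: lambda_function.lambda_handler
    events:
      - alexaSkill: amzn1.ask.skill.your-skill-id   # your Alexa skill ID

plugins:
  - serverless-python-requirements

custom:
  pythonRequirements:
    dockerizePip: true   # build the dependencies inside a Lambda-like Docker image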

If you're looking for a full tutorial on this, I wrote one for Python Lambda functions here.




Answer 5:


To solve this particular problem we're using a library called juniper. In a nutshell, all you need to do is create a very simple manifest file that looks like:

functions:
  # Name the zip file you want juni to create
  router:
    # Where are your dependencies located?
    requirements: ./src/requirements.txt
    # Your source code.
    include:
    - ./src/lambda_function.py

From this manifest file, running juni build will create the zip file artifact for you. The file will include all the dependencies you specify in requirements.txt.

The command will create the file ./dist/router.zip. We're using that file in conjunction with a SAM template; however, you can also take that zip and upload it through the console or with the awscli.
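
For the awscli route, the upload could look roughly like this (the function name is a placeholder):

aws lambda update-function-code \
    --function-name my-alexa-skill \
    --zip-file fileb://dist/router.zip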



Source: https://stackoverflow.com/questions/38877058/how-do-i-add-python-libraries-to-an-aws-lambda-function-for-alexa
