GPU-based algorithm on AWS Lambda

Submitted by 雨燕双飞 on 2020-05-12 11:39:07

Question


I have a function that performs some mathematical operations and needs a system with a 16 GB GPU, but this function will not always be triggered, and the rest of the time my system will not be in use. I came across AWS Lambda. Can I run a GPU-based algorithm on Lambda, so that whenever I need a GPU I get the machine in the cloud? I would appreciate a brief description of how this works.


Answer 1:


You can't specify the runtime environment for AWS Lambda functions, so no, you can't require the presence of a GPU (in fact the physical machines AWS chooses to put into its Lambda pool will almost certainly not have one).

Your best bet would be to run the GPU-requiring function as a Batch job on a compute cluster configured to use p-type instances. The guide here might be helpful.
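
For illustration, here is a minimal sketch of how a (CPU-only) Lambda function could hand the work off to Batch, assuming a job queue and job definition (the names below are hypothetical placeholders) already exist on a compute environment backed by p-type instances:

    import boto3

    # Submit the GPU work as an AWS Batch job from a Lambda function.
    # The queue and job definition names are hypothetical; they must
    # already exist on a compute environment using p-type (GPU) instances.
    batch = boto3.client("batch")

    def lambda_handler(event, context):
        response = batch.submit_job(
            jobName="gpu-math-task",                  # hypothetical name
            jobQueue="gpu-job-queue",                 # hypothetical queue
            jobDefinition="gpu-math-job-definition",  # hypothetical definition
            containerOverrides={
                "command": ["python", "run_math.py"],  # hypothetical entrypoint
            },
        )
        return {"jobId": response["jobId"]}

With a managed compute environment that scales to zero, the GPU instance only runs (and bills) while the Batch job is actually executing, which matches the "rarely triggered" usage pattern in the question.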




Answer 2:


Currently, Lambda doesn't offer GPUs.

However, if you just need to do inference, running it on the CPU works fine on AWS Lambda; here is an article that goes into more detail:

https://aws.amazon.com/blogs/machine-learning/how-to-deploy-deep-learning-models-with-aws-lambda-and-tensorflow/
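
As a rough sketch of that approach, a handler along these lines loads a TensorFlow model shipped with the deployment package and runs inference on the CPU (the model path and input format below are assumptions for illustration, not taken from the article):

    import json
    import numpy as np
    import tensorflow as tf

    # Load the model once, at module scope, so warm invocations reuse it.
    # "model/" is a hypothetical path inside the deployment package.
    model = tf.keras.models.load_model("model/")

    def lambda_handler(event, context):
        # Expect the input features as a JSON-friendly list in the event.
        features = np.array(event["features"], dtype=np.float32)
        # Inference runs on the CPU; no GPU is present in the Lambda pool.
        prediction = model.predict(features[np.newaxis, :])
        return {"statusCode": 200, "body": json.dumps(prediction.tolist())}

Bear in mind that Lambda's deployment-package size limits constrain how large a model can be shipped this way.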




Answer 3:


Batch is a good solution for certain types of workload. Another option is GPU support on ECS, which could be used for frequently run tasks that need a GPU.
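
As a sketch, registering an ECS task definition that requests a GPU looks roughly like this (the cluster, image, and task names are hypothetical placeholders; the cluster's container instances would need to be GPU-equipped, e.g. p3 instances running the ECS GPU-optimized AMI):

    import boto3

    ecs = boto3.client("ecs")

    # Register a task definition that asks ECS for one GPU. This uses the
    # EC2 launch type, since Fargate does not offer GPUs.
    ecs.register_task_definition(
        family="gpu-math-task",
        requiresCompatibilities=["EC2"],
        containerDefinitions=[{
            "name": "gpu-math",
            "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/gpu-math:latest",
            "memory": 16384,  # 16 GB, matching the question's requirement
            "resourceRequirements": [{"type": "GPU", "value": "1"}],
        }],
    )

    # Launch the task on a GPU-enabled cluster.
    ecs.run_task(cluster="gpu-cluster", taskDefinition="gpu-math-task")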



Source: https://stackoverflow.com/questions/52554184/gpu-based-algorithm-on-aws-lambda
