How to get memory and CPU usage by a Spark application?

Submitted by 笑着哭i on 2019-12-11 05:13:55

Question


I want to get the average resource utilization of a Spark job for monitoring purposes. How can I poll the resource usage, i.e. the CPU and memory utilization, of a Spark application?


Answer 1:


You may check the stderr log of a completed Spark application. Go to the YARN Resource Manager, click an application ID, and then click "Logs" on the right side of the appattempt_* line. Scroll to Log Type: stderr and click "Click here for the full log". Look in the log for something like this:

"yarn.YarnAllocator: Will request 256 executor containers, each with 5 cores and 8576 MB memory including 384 MB overhead"



Answer 2:


You have to pull the logs from YARN.

Command line: yarn application -logs {YourAppID}. You can get the application ID from the console output of the Spark job, from the yarn application -list command, or from the UI. More on the yarn commands is here.
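If you want to look the application ID up from a script rather than by eye, a small sketch using the standard `yarn application -list` command is shown below; the helper name and the substring match on the application row are illustrative assumptions:

```python
import subprocess

def find_application_id(name_fragment):
    """Return the ID of the first YARN application whose row in
    `yarn application -list` contains name_fragment, or None.

    By default the command lists submitted/accepted/running
    applications; the first whitespace-separated column of each
    application row is the application ID.
    """
    output = subprocess.run(
        ["yarn", "application", "-list"],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in output.splitlines():
        if line.startswith("application_") and name_fragment in line:
            return line.split()[0]
    return None
```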

From the UI: if you are using Cloudera, you can see the applications at http://${LOCALHOST}:7180/cmf/services/17/applications, and you can get to the DAG at http://${LOCALHOST}:8088/cluster.



Source: https://stackoverflow.com/questions/48168127/how-to-get-memory-and-cpu-usage-by-a-spark-application
