spark executor memory cut to 1/2

Submitted by 扶醉桌前 on 2020-01-01 18:21:05

Question


I am doing a spark-submit like this:

spark-submit --class com.mine.myclass --master yarn-cluster --num-executors 3 --executor-memory 4G spark-examples_2.10-1.0.jar

In the web UI I can indeed see 3 executors, but each shows only 2G of memory. When I set --executor-memory 2G, the UI shows 1G per node.

Why is my setting being cut in half?


Answer 1:


The executors page of the web UI shows the amount of storage memory, which by default is 54% of the Java heap: spark.storage.memoryFraction (0.6) × spark.storage.safetyFraction (0.9) = 0.54. For a 4G heap that works out to roughly 2G, which is what the UI reports; it is not reducing the heap you asked for.
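For concreteness, here is a minimal sketch of that arithmetic in Scala (not part of the original answer); the 4G figure is taken from the question, the variable names are illustrative, and the defaults shown apply to the legacy Spark 1.x storage-memory model:

```scala
// Rough estimate of the "storage memory" figure the Spark 1.x executors page displays.
// It is the executor heap scaled by two configuration fractions (defaults shown).
object StorageMemoryEstimate {
  def main(args: Array[String]): Unit = {
    val executorMemoryGiB = 4.0  // value passed via --executor-memory
    val memoryFraction    = 0.6  // spark.storage.memoryFraction (default)
    val safetyFraction    = 0.9  // spark.storage.safetyFraction (default)

    val storageMemoryGiB = executorMemoryGiB * memoryFraction * safetyFraction
    println(f"Storage memory shown in the UI: ~$storageMemoryGiB%.2f GiB") // ~2.16 GiB
  }
}
```

In practice the UI may report slightly less than this estimate, because the fractions are applied to the heap the JVM actually makes available rather than to the raw --executor-memory value.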



Source: https://stackoverflow.com/questions/29191547/spark-executor-memory-cut-to-1-2
