How to execute Spark programs with Dynamic Resource Allocation?

Backend · 2 answers · 1991 views

Asked by 不知归路, 2020-12-13 07:43

I am using the spark-submit command to run Spark jobs with parameters such as:

spark-submit --master yarn-cluster --driver-cores 2 \
  --driver-memory 2G
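To run with dynamic resource allocation, the allocation properties can be passed with `--conf` flags. A minimal sketch (the executor bounds and the application file `your_app.py` are placeholders to adjust for your job; on YARN, dynamic allocation also requires the external shuffle service, or the decommission-tracking alternative in newer Spark versions):

```shell
spark-submit \
  --master yarn --deploy-mode cluster \
  --driver-cores 2 --driver-memory 2G \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=1 \
  --conf spark.dynamicAllocation.initialExecutors=2 \
  --conf spark.dynamicAllocation.maxExecutors=10 \
  your_app.py
```

With these settings Spark starts with 2 executors and scales between 1 and 10 based on the pending task backlog, instead of holding a fixed `--num-executors` count for the whole job. Note that `--master yarn-cluster` is deprecated; `--master yarn --deploy-mode cluster` is the current form.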


        
2 Answers
  •  余生分开走 · 2020-12-13 08:37

    I just put together a small demo of Spark's dynamic resource allocation. The code is on my GitHub; specifically, the demo is in this release.
