How to execute Spark programs with Dynamic Resource Allocation?

不知归路 · 2020-12-13 07:43

I am using the spark-submit command to execute Spark jobs with parameters such as:

spark-submit --master yarn-cluster --driver-cores 2 \
    --driver-memory 2G

How can these jobs be run with Dynamic Resource Allocation?
2 Answers
  • 2020-12-13 08:22

    For Spark dynamic allocation, spark.dynamicAllocation.enabled needs to be set to true because it is false by default.

    This in turn requires spark.shuffle.service.enabled to be set to true, since the Spark application runs on YARN. Check this link for how to start the external shuffle service on each NodeManager in YARN; a sketch of that setup follows.
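    A minimal sketch, assuming a standard Spark-on-YARN deployment, of the yarn-site.xml entries that register Spark's external shuffle service on each NodeManager (the spark-<version>-yarn-shuffle.jar must also be on the NodeManager classpath, and the NodeManagers restarted afterwards):

    <!-- Register Spark's shuffle service alongside the default MapReduce one -->
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle,spark_shuffle</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
        <value>org.apache.spark.network.yarn.YarnShuffleService</value>
    </property>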

    The following configurations are also relevant:

    spark.dynamicAllocation.minExecutors, 
    spark.dynamicAllocation.maxExecutors, and 
    spark.dynamicAllocation.initialExecutors
    

    These options can be configured for a Spark application in three ways:

    1. From spark-submit with --conf <prop_name>=<prop_value>

    spark-submit --master yarn-cluster \
        --driver-cores 2 \
        --driver-memory 2G \
        --num-executors 10 \
        --executor-cores 5 \
        --executor-memory 2G \
        --conf spark.dynamicAllocation.minExecutors=5 \
        --conf spark.dynamicAllocation.maxExecutors=30 \
        --conf spark.dynamicAllocation.initialExecutors=10 \
        --class com.spark.sql.jdbc.SparkDFtoOracle2 \
        Spark-hive-sql-Dataframe-0.0.1-SNAPSHOT-jar-with-dependencies.jar

    Here spark.dynamicAllocation.initialExecutors=10 has the same effect as --num-executors 10.
    

    2. Inside the Spark program with SparkConf

    Set the properties on a SparkConf, then create the SparkSession or SparkContext with it:

    import org.apache.spark.SparkConf

    val conf: SparkConf = new SparkConf()
    conf.set("spark.dynamicAllocation.minExecutors", "5")
    conf.set("spark.dynamicAllocation.maxExecutors", "30")
    conf.set("spark.dynamicAllocation.initialExecutors", "10")
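    A minimal sketch of that creation step, assuming Spark 2.x and an application name chosen purely for illustration:

    import org.apache.spark.sql.SparkSession

    // The dynamic-allocation properties are read when the application starts,
    // so they must be on the SparkConf before the session is created.
    val spark = SparkSession.builder()
      .appName("DynamicAllocationDemo") // hypothetical name
      .config(conf)
      .getOrCreate()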
    

    3. In spark-defaults.conf, usually located in $SPARK_HOME/conf/

    Place the same configurations in spark-defaults.conf; they then apply to every Spark application when no value is passed on the command line or set in code. (Properties set directly on a SparkConf take precedence over spark-submit flags, which in turn take precedence over spark-defaults.conf.) A sketch is shown below.
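    A minimal sketch of the relevant spark-defaults.conf entries, using the same illustrative values as the examples above:

    spark.dynamicAllocation.enabled           true
    spark.shuffle.service.enabled             true
    spark.dynamicAllocation.minExecutors      5
    spark.dynamicAllocation.maxExecutors      30
    spark.dynamicAllocation.initialExecutors  10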

    Spark - Dynamic Allocation Confs

  • 2020-12-13 08:37

    I just did a small demo of Spark's dynamic resource allocation. The code is on my GitHub; specifically, the demo is in this release.
