How to set the number of Spark executors?

情歌与酒 2021-01-30 05:08

How can I configure the number of executors from Java (or Scala) code, given a SparkConfig and a SparkContext? I constantly see 2 executors. Looks like…
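For reference, a minimal sketch of the kind of SparkConf/SparkContext setup the question is about (the app name and master URL below are placeholders, not from the question):

    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical setup for illustration: app name and master URL are placeholders.
    val conf = new SparkConf()
      .setAppName("my-app")
      .setMaster("spark://master:7077")
      .set("spark.executor.instances", "4")  // request 4 executors
    val sc = new SparkContext(conf)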

4 Answers
  •  没有蜡笔的小新
    2021-01-30 05:48

    In Spark 2.0+

    use the SparkSession variable to set the number of executors dynamically (from within the program):

    spark.conf.set("spark.executor.instances", 4)
    spark.conf.set("spark.executor.cores", 4)
    

    In the above case, at most 16 tasks (4 executors × 4 cores) will be executed at any given time.
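
    A minimal sketch of the same two settings applied when the SparkSession is created, which is another common place to put them (the app name and master URL are placeholders, assumed for illustration):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("executor-config-example")        // placeholder app name
      .master("yarn")                            // placeholder: use your cluster manager
      .config("spark.executor.instances", "4")   // 4 executors
      .config("spark.executor.cores", "4")       // 4 cores each => up to 16 concurrent tasks
      .getOrCreate()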

    The other option is dynamic allocation of executors, as below:

    spark.conf.set("spark.dynamicAllocation.enabled", "true")
    spark.conf.set("spark.executor.cores", 4)
    spark.conf.set("spark.dynamicAllocation.minExecutors","1")
    spark.conf.set("spark.dynamicAllocation.maxExecutors","5")
    

    This way you can let Spark decide the number of executors to allocate, based on the processing and memory requirements of the running job.
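
    A minimal sketch of the same dynamic-allocation settings applied at session-creation time (the app name, master URL, and the shuffle-tracking flag are assumptions for illustration; the shuffle-tracking flag is a Spark 3.x option for running dynamic allocation without an external shuffle service):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("dynamic-allocation-example")                 // placeholder app name
      .master("yarn")                                        // placeholder cluster manager
      .config("spark.dynamicAllocation.enabled", "true")
      .config("spark.dynamicAllocation.minExecutors", "1")
      .config("spark.dynamicAllocation.maxExecutors", "5")
      .config("spark.executor.cores", "4")
      // Assumption: Spark 3.x; this avoids needing an external shuffle service.
      .config("spark.dynamicAllocation.shuffleTracking.enabled", "true")
      .getOrCreate()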

    I feel the second option works better than the first and is widely used.

    Hope this helps.
