How to set amount of Spark executors?

情歌与酒  2021-01-30 05:08

How can I configure the number of executors from Java (or Scala) code, given a SparkConf and SparkContext? I constantly see 2 executors. Looks like

4 Answers
  •  半阙折子戏
    2021-01-30 06:06

    You can also do this programmatically by setting the parameters "spark.executor.instances" and "spark.executor.cores" on the SparkConf object.

    Example:

    SparkConf conf = new SparkConf()
          // request 4 executors for the application
          .set("spark.executor.instances", "4")
          // 5 cores on each executor
          .set("spark.executor.cores", "5");
    

    The second parameter, spark.executor.cores, applies only to YARN and standalone mode. It allows an application to run multiple executors on the same worker, provided that there are enough cores on that worker.
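    For comparison, the same two settings can be supplied on the command line instead of in code. This is a sketch assuming a YARN deployment; the application class and jar name are placeholders:

    ```shell
    # Equivalent spark-submit flags:
    #   --num-executors  maps to spark.executor.instances
    #   --executor-cores maps to spark.executor.cores
    spark-submit \
      --master yarn \
      --num-executors 4 \
      --executor-cores 5 \
      --class com.example.MyApp \
      my-app.jar
    ```

    Note that if dynamic allocation is enabled (spark.dynamicAllocation.enabled=true), a fixed spark.executor.instances setting is overridden and the executor count is managed by Spark.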
