Apache Spark: setting executor instances does not change the executors


I have an Apache Spark application running in cluster mode on a YARN cluster (Spark has 3 nodes on this cluster).

While the application is running, the Spark UI shows:

4 Answers
  •  一个人的身影
     2020-12-01 01:25

    To utilize the Spark cluster to its full capacity, you need to set values for --num-executors, --executor-cores and --executor-memory that match your cluster (see the example after this list):

    • The --num-executors command-line flag or the spark.executor.instances configuration property controls the number of executors requested;
    • The --executor-cores command-line flag or the spark.executor.cores configuration property controls the number of concurrent tasks an executor can run;
    • The --executor-memory command-line flag or the spark.executor.memory configuration property controls the executor heap size.
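
    As a minimal sketch, a spark-submit invocation that requests 3 executors, each with 4 cores and 6g of heap, might look like the following. The application class, JAR name, and resource values here are placeholders chosen for illustration, not values taken from the question:

        # hypothetical class and JAR, shown only to illustrate the flags
        spark-submit \
          --master yarn \
          --deploy-mode cluster \
          --num-executors 3 \
          --executor-cores 4 \
          --executor-memory 6g \
          --class com.example.MyApp \
          my-app.jar

    The same settings can also be passed as configuration properties, e.g. --conf spark.executor.instances=3. With the values above, the application could run at most 3 × 4 = 12 tasks concurrently and would reserve 3 × 6 GB = 18 GB of executor heap across the cluster.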
