Apache Spark: setting executor instances does not change the executors

盖世英雄少女心 2020-12-01 01:09

I have an Apache Spark application running in cluster mode on a YARN cluster (Spark has 3 nodes on this cluster).

While the application is running, the Spark UI shows

4 Answers
  •  再見小時候
    2020-12-01 01:32

    You only have 3 nodes in the cluster, and one will be used as the driver. That leaves only 2 nodes, so how can you create 6 executors?

    I think you confused --num-executors with --executor-cores.

    To increase concurrency you need more cores; the goal is to utilize all the CPUs in your cluster.
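    To illustrate the distinction the answer draws, here is a sketch of a `spark-submit` invocation. The flag names are the standard Spark options; the class name, jar path, and resource sizes are placeholders, not values taken from the question:

    ```shell
    # --num-executors asks YARN for that many executor *processes* (capped by
    # what the cluster can actually allocate); --executor-cores sets how many
    # tasks each executor runs concurrently. Class, jar, and sizes are
    # hypothetical placeholders.
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --num-executors 2 \
      --executor-cores 4 \
      --executor-memory 4g \
      --class com.example.MyApp \
      myapp.jar
    ```

    With 2 worker nodes available, 2 executors with 4 cores each would give up to 2 × 4 = 8 concurrently running tasks, whereas raising `--num-executors` beyond what the nodes can host has no effect.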
