How to allocate more executors per worker in Standalone cluster mode?

Asked by 执念已碎 on 2020-12-08 05:52

I use Spark 1.3.0 in a cluster of 5 worker nodes, each with 36 cores and 58 GB of memory. I'd like to configure Spark's Standalone cluster to run multiple executors per worker.

4 Answers
  •  情歌与酒, 2020-12-08 06:04

    Starting in Spark 1.4, it should be possible to configure this:

    Setting: spark.executor.cores

    Default: 1 in YARN mode, all the available cores on the worker in standalone mode.

    Description: The number of cores to use on each executor. For YARN and standalone mode only. In standalone mode, setting this parameter allows an application to run multiple executors on the same worker, provided that there are enough cores on that worker. Otherwise, only one executor per application will run on each worker.

    http://spark.apache.org/docs/1.4.0/configuration.html#execution-behavior
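
    For example, here is a minimal Scala sketch (assuming the cluster described in the question: 36 cores and 58 GB per worker; the master URL and the 6-core/9 GB split are illustrative choices, not values from the original post):

        import org.apache.spark.{SparkConf, SparkContext}

        // On a 36-core / 58 GB worker, 6 cores and 9 GB per executor allows
        // up to 6 executors per worker (6 x 6 = 36 cores, 6 x 9 = 54 GB),
        // leaving headroom for the worker daemon and the OS.
        val conf = new SparkConf()
          .setMaster("spark://master-host:7077") // hypothetical master URL
          .setAppName("multi-executor-example")
          .set("spark.executor.cores", "6")
          .set("spark.executor.memory", "9g")

        val sc = new SparkContext(conf)

    With settings along these lines, the standalone scheduler can place several executors of the same application on one worker, since each executor claims only 6 of the worker's 36 cores.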
