How to set amount of Spark executors?
Question: How can I configure the number of executors from Java (or Scala) code, given a SparkConf and a SparkContext? I always see exactly 2 executors. spark.default.parallelism does not seem to work, and appears to control something else. I just need the number of executors to equal the cluster size, but there are always only 2 of them. I know my cluster size. I run on YARN, if that matters.

Answer 1: You could also do it programmatically by setting the parameters "spark.executor.instances" and "spark.executor
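A minimal sketch along those lines (Scala), assuming Spark on YARN with dynamic allocation disabled; the app name, the executor count of 4, and the toy job are illustrative values, not from the original answer:

    import org.apache.spark.{SparkConf, SparkContext}

    object ExecutorCountExample {
      def main(args: Array[String]): Unit = {
        // Build the configuration before the SparkContext is created;
        // executor settings are only read when the context starts on YARN.
        val conf = new SparkConf()
          .setAppName("executor-count-example")          // illustrative name
          // Fixed number of executors (example value: 4 -- match your cluster size).
          .set("spark.executor.instances", "4")
          // A static executor count only takes effect if dynamic allocation is off.
          .set("spark.dynamicAllocation.enabled", "false")

        val sc = new SparkContext(conf)
        try {
          // Trivial job just to exercise the executors.
          val sum = sc.parallelize(1 to 1000, numSlices = 8).sum()
          println(s"sum = $sum")
        } finally {
          sc.stop()
        }
      }
    }

Note that these properties must be set on the SparkConf before the SparkContext is constructed (or passed via spark-submit --conf); changing them on an already-running context has no effect.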