Spark on YARN: too few vcores used
Question: I'm using Spark on a YARN cluster (HDP 2.4) with the following setup:

- 1 master node: 64 GB RAM (50 GB usable), 24 cores (19 cores usable)
- 5 slave nodes: 64 GB RAM (50 GB usable) each, 24 cores (19 cores usable) each

YARN settings (per host):

- memory of all containers: 50 GB
- minimum container size: 2 GB
- maximum container size: 50 GB
- vcores: 19
- minimum vcores/container: 1
- maximum vcores/container: 19

When I run my Spark application with the command spark-submit --num-executors 30 -
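For reference, the YARN settings listed above would typically correspond to the following `yarn-site.xml` properties (a sketch based only on the numbers given; property names are standard YARN configuration keys, and the memory values assume 50 GB = 51200 MB):

```xml
<!-- Per-NodeManager resources and scheduler container limits,
     mirroring the YARN settings described in the question. -->
<configuration>
  <!-- memory of all containers (per host): 50 GB -->
  <property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>51200</value>
  </property>
  <!-- minimum container size: 2 GB -->
  <property>
    <name>yarn.scheduler.minimum-allocation-mb</name>
    <value>2048</value>
  </property>
  <!-- maximum container size: 50 GB -->
  <property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>51200</value>
  </property>
  <!-- vcores per host: 19 -->
  <property>
    <name>yarn.nodemanager.resource.cpu-vcores</name>
    <value>19</value>
  </property>
  <!-- minimum vcores per container: 1 -->
  <property>
    <name>yarn.scheduler.minimum-allocation-vcores</name>
    <value>1</value>
  </property>
  <!-- maximum vcores per container: 19 -->
  <property>
    <name>yarn.scheduler.maximum-allocation-vcores</name>
    <value>19</value>
  </property>
</configuration>
```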