I have an Apache Spark application running in cluster mode on a YARN cluster (Spark has 3 nodes in this cluster).
While the application is running, the Spark UI shows:
To utilize the Spark cluster to its full capacity, set values for --num-executors, --executor-cores, and --executor-memory according to your cluster's hardware:
- The --num-executors command-line flag (or the spark.executor.instances configuration property) controls the number of executors requested.
- The --executor-cores command-line flag (or the spark.executor.cores configuration property) controls the number of concurrent tasks an executor can run.
- The --executor-memory command-line flag (or the spark.executor.memory configuration property) controls the executor heap size.
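For example, here is a sketch of a spark-submit invocation sized for a hypothetical 3-node cluster where each worker has 16 cores and 64 GB of RAM (the hardware numbers, the class name com.example.MyApp, and the jar name are assumptions; substitute your own). The sizing reasoning is in the comments:

```sh
# Assumed hardware: 3 worker nodes, 16 cores and 64 GB RAM each.
# Per node: reserve 1 core and ~1 GB for the OS and Hadoop daemons -> 15 cores, ~63 GB usable.
# 5 cores per executor -> 3 executors per node -> 9 executors total;
# subtract 1 executor's worth of resources for the YARN ApplicationMaster -> 8.
# 63 GB / 3 executors = 21 GB; leave ~7% for off-heap overhead -> ~19 GB heap each.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 8 \
  --executor-cores 5 \
  --executor-memory 19G \
  --class com.example.MyApp \
  my-app.jar
```

Keeping executors at about 5 cores tends to preserve good HDFS I/O throughput, while a single fat executor per node or many 1-core executors are generally worse; treat the numbers above as a starting point and tune against what the Spark UI actually reports.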