Question
I know there are several ways to monitor the storage memory utilization of a Spark application, but does anyone know a way to monitor execution memory utilization? I am also looking for a way to monitor the "user memory", i.e. the memory that is used for neither execution nor storage. Going by Spark's documentation on memory management (https://spark.apache.org/docs/latest/tuning.html), this would be the memory that is not allocated to M, i.e. the (1 - spark.memory.fraction) share of the usable heap.
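For reference, the split the tuning guide describes can be sketched as plain arithmetic. This is a minimal illustration of the unified memory model, not a monitoring tool: the 300 MB reservation and the default values of spark.memory.fraction (0.6) and spark.memory.storageFraction (0.5) come from Spark's documentation, while the helper function name is my own.

```python
# Sketch of Spark's unified memory model arithmetic (per the tuning guide).
# RESERVED_MB and the default fractions match Spark's documented defaults;
# memory_regions() is a hypothetical helper, not a Spark API.

RESERVED_MB = 300  # fixed reservation Spark subtracts from the JVM heap


def memory_regions(heap_mb, memory_fraction=0.6, storage_fraction=0.5):
    """Return (unified region M, storage region R, user memory) in MB."""
    usable = heap_mb - RESERVED_MB
    m = usable * memory_fraction            # M: shared execution + storage region
    r = m * storage_fraction                # R: storage portion immune to eviction
    user = usable * (1 - memory_fraction)   # "user memory": everything outside M
    return m, r, user


m, r, user = memory_regions(4096)  # a 4 GB executor heap
print(round(m), round(r), round(user))  # → 2278 1139 1518
```

With a 4 GB heap, roughly 1.5 GB falls outside M; it is this "user memory" region that the standard storage-memory metrics do not cover, which is what the question is asking how to observe.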
Source: https://stackoverflow.com/questions/48670247/monitor-spark-execution-and-storage-memory-utilization