I'm running some operations in PySpark, and recently increased the number of nodes in my configuration (which is on Amazon EMR). However, even though I tripled the number of nodes, I'd like to verify how many executors Spark is actually using.
In PySpark you can still call the Scala `getExecutorMemoryStatus` API through PySpark's py4j bridge:
```python
sc._jsc.sc().getExecutorMemoryStatus().size()
```
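Note that the map returned by `getExecutorMemoryStatus` has one entry per block manager, and that includes the driver, so the executor count is the map size minus one. A minimal sketch of that adjustment (the helper name `executor_count` is my own, and the commented usage assumes a live `SparkContext` named `sc`):

```python
def executor_count(memory_status_size):
    """Number of executors, given the size of the map returned by
    sc._jsc.sc().getExecutorMemoryStatus().

    The map includes the driver's block manager, so subtract one;
    clamp at zero in case the map is empty.
    """
    return max(memory_status_size - 1, 0)

# Usage inside a live PySpark session (hypothetical, not run here):
#   n = executor_count(sc._jsc.sc().getExecutorMemoryStatus().size())
```

On a cluster that is still starting up, executors register asynchronously, so you may want to poll this value until it reaches the number you expect before launching a job.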