getting number of visible nodes in PySpark

长发绾君心 2020-12-24 07:49

I'm running some operations in PySpark, and recently increased the number of nodes in my configuration (which is on Amazon EMR). However, even though I tripled the number of nodes, performance does not seem to have changed, so I'd like to check how many nodes are actually visible to Spark.

5 Answers
  •  自闭症患者 2020-12-24 08:14

    In PySpark you can still call the Scala getExecutorMemoryStatus API through PySpark's Py4J bridge:

    # size of the executor memory status map (the driver is included)
    sc._jsc.sc().getExecutorMemoryStatus().size()
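
    For context, here is a minimal, self-contained sketch of how this check might look in a full PySpark script. The app name is an arbitrary placeholder, `_jsc` is a private attribute that may change between Spark versions, and the driver's own block manager appears in the map, so the worker count is usually the total minus one:

    from pyspark.sql import SparkSession

    # Build (or reuse) a session; the app name is just a placeholder.
    spark = SparkSession.builder.appName("visible-nodes-check").getOrCreate()
    sc = spark.sparkContext

    # getExecutorMemoryStatus() returns a Scala Map keyed by "host:port"
    # for every block manager Spark knows about, reachable via Py4J.
    total = sc._jsc.sc().getExecutorMemoryStatus().size()

    # The driver's block manager is included, so subtract one for workers.
    print(f"Entries (including driver): {total}")
    print(f"Worker executors: {total - 1}")

    spark.stop()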
    
