How does Spark running on YARN account for Python memory usage?


After reading through the documentation, I do not understand how Spark running on YARN accounts for Python memory consumption.

Does it count towards spark.executor.memory, spark.yarn.executor.memoryOverhead, or somewhere else?

1 Answer

  •  粉色の甜心
    2020-12-28 16:07

    I'd try increasing spark.python.worker.memory from its default (512m), since the Python code is memory-heavy and this property's value does not count towards spark.executor.memory.

    "Amount of memory to use per python worker process during aggregation, in the same format as JVM memory strings (e.g. 512m, 2g). If the memory used during aggregation goes above this amount, it will spill the data into disks." (from the Spark configuration docs)
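
    A minimal PySpark sketch (the values are illustrative, not recommendations) of raising these settings for a memory-heavy job; the same keys can also be passed with spark-submit --conf:

        from pyspark.sql import SparkSession

        # Illustrative values only; these must be set before the SparkContext starts
        # (they are often supplied via spark-submit --conf instead).
        spark = (
            SparkSession.builder
            .appName("python-memory-example")
            .config("spark.executor.memory", "4g")                  # JVM heap per executor
            .config("spark.python.worker.memory", "1g")             # per Python worker; spills to disk above this
            .config("spark.yarn.executor.memoryOverhead", "1024")   # MiB of off-heap room, incl. Python workers
            .getOrCreate()
        )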

    ExecutorMemoryOverhead calculation in Spark:

    MEMORY_OVERHEAD_FRACTION = 0.10
    MEMORY_OVERHEAD_MINIMUM = 384  // MiB
    val executorMemoryOverhead =
      max(MEMORY_OVERHEAD_FRACTION * ${spark.executor.memory}, MEMORY_OVERHEAD_MINIMUM)
    

    The property is spark.{yarn|mesos}.executor.memoryOverhead for YARN and Mesos.

    YARN kills processes that use more memory than they requested, and what they requested is the sum of executorMemory and executorMemoryOverhead.
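
    As a worked example (numbers are illustrative), here is what an executor container ends up requesting from YARN under that formula:

        # Worked example of the YARN container request: executor memory plus overhead.
        MEMORY_OVERHEAD_FRACTION = 0.10
        MEMORY_OVERHEAD_MINIMUM = 384  # MiB

        def yarn_container_request(executor_memory_mib: int) -> int:
            """Return the total MiB an executor container requests from YARN."""
            overhead = max(int(MEMORY_OVERHEAD_FRACTION * executor_memory_mib),
                           MEMORY_OVERHEAD_MINIMUM)
            return executor_memory_mib + overhead

        # With spark.executor.memory=4g (4096 MiB):
        #   overhead  = max(409, 384) = 409 MiB
        #   container = 4096 + 409   = 4505 MiB
        # If the JVM and its Python workers together exceed 4505 MiB, YARN kills the container.
        print(yarn_container_request(4096))  # 4505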

    In the image (not reproduced here), the Python processes in the worker use spark.python.worker.memory, while spark.yarn.executor.memoryOverhead + spark.executor.memory is specific to the JVM.

    Image credits
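
    Since the image is not reproduced here, a rough text sketch of the layout it describes:

        YARN container limit (what YARN enforces when killing)
        ├── spark.executor.memory               JVM heap of the executor
        └── spark.yarn.executor.memoryOverhead  off-heap room, including the Python
            worker processes; spark.python.worker.memory is only a per-worker
            spill threshold, not a hard cap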

    Additional resource: Apache mailing-list thread
