Spark java.lang.OutOfMemoryError: Java heap space

Backend · Unresolved · 12 answers · 2141 views
半阙折子戏 2020-11-22 13:55

My cluster: 1 master, 11 slaves, each node has 6 GB memory.

My settings:

spark.executor.memory=4g, -Dspark.akka.frameSize=512
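For context, settings like these are usually passed on the `spark-submit` command line. A minimal sketch under assumptions: the master URL and application JAR are placeholders, and `spark.akka.frameSize` only exists in pre-2.0 Spark versions (later versions removed Akka):

```shell
# Hypothetical submission matching the settings above;
# master URL and JAR name are placeholders.
spark-submit \
  --master spark://master:7077 \
  --executor-memory 4g \
  --conf spark.akka.frameSize=512 \
  my-app.jar
```

Note that `spark.akka.frameSize` is specified in MB, and 512 is its documented maximum in old releases.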

12 Answers
  •  不知归路
    2020-11-22 14:09

I have a few suggestions for the above-mentioned error.

    ● Check the executor memory assigned: an executor may have to process partitions that require more memory than it has been given.

    ● Check whether many shuffles are live; shuffles are expensive operations since they involve disk I/O, data serialization, and network I/O.

    ● Use broadcast joins.

    ● Avoid groupByKey and try to replace it with reduceByKey.

    ● Avoid shuffling huge Java objects wherever possible.
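    To make the groupByKey-vs-reduceByKey point concrete, here is a minimal pure-Python sketch (no Spark required; the partition contents are made up) of the map-side combine that reduceByKey performs and groupByKey does not:

    ```python
    from collections import defaultdict

    # Hypothetical data: two partitions of (key, value) records,
    # standing in for an RDD's partitions.
    partitions = [
        [("a", 1), ("b", 2), ("a", 3)],
        [("a", 4), ("b", 5)],
    ]

    # groupByKey-style: every single record crosses the shuffle boundary.
    group_shuffled = [pair for part in partitions for pair in part]

    def local_combine(part):
        """reduceByKey-style map-side combine: merge values per key
        within a partition before anything is shuffled."""
        acc = defaultdict(int)
        for k, v in part:
            acc[k] += v
        return list(acc.items())

    # Only one record per distinct key per partition is shuffled.
    reduce_shuffled = [pair for part in partitions
                       for pair in local_combine(part)]

    # Final merge on the reducer side.
    final = defaultdict(int)
    for k, v in reduce_shuffled:
        final[k] += v

    print(len(group_shuffled), len(reduce_shuffled))  # 5 4
    print(dict(final))                                # {'a': 8, 'b': 7}
    ```

    Fewer records crossing the shuffle means less serialization, network traffic, and buffered data per executor, which is exactly where heap-space errors tend to appear.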
