Spark java.lang.OutOfMemoryError: Java heap space

Backend · Unresolved · 12 answers · 2135 views
半阙折子戏 2020-11-22 13:55

My cluster: 1 master, 11 slaves, each node has 6 GB memory.

My settings:

spark.executor.memory=4g, -Dspark.akka.frameSize=512
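
For reference, settings like these are usually passed to spark-submit on the command line; a minimal sketch, where the class name and jar are placeholders standing in for the actual application:

    spark-submit \
      --conf spark.executor.memory=4g \
      --conf spark.akka.frameSize=512 \
      --class "MyClass" \
      my-app.jar

Values given via --conf override those in conf/spark-defaults.conf for that submission.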

12 answers
  •  旧时难觅i
    2020-11-22 14:14

    To add a use case that is often not discussed: here is a solution for OOM errors that occur when submitting a Spark application via spark-submit in local mode.

    According to the gitbook Mastering Apache Spark by Jacek Laskowski:

    You can run Spark in local mode. In this non-distributed single-JVM deployment mode, Spark spawns all the execution components - driver, executor, backend, and master - in the same JVM. This is the only mode where a driver is used for execution.

    Thus, if you are experiencing OOM errors with the heap, it suffices to adjust the driver-memory rather than the executor-memory.

    Here is an example:

    spark-1.6.1/bin/spark-submit \
      --class "MyClass" \
      --driver-memory 12g \
      --master local[*] \
      target/scala-2.10/simple-project_2.10-1.0.jar
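
    The same setting can also be made persistent in conf/spark-defaults.conf so every submission picks it up. Note that setting spark.driver.memory via SparkConf inside the application has no effect in local/client mode, because the driver JVM has already started by the time that code runs:

        # conf/spark-defaults.conf
        spark.driver.memory  12g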
    
