Spark java.lang.OutOfMemoryError: Java heap space

半阙折子戏 2020-11-22 13:55

My cluster: 1 master, 11 slaves, each node has 6 GB memory.

My settings:

spark.executor.memory=4g, -Dspark.akka.frameSize=512
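For context, settings like these can also be applied programmatically through SparkConf; a minimal sketch (the app name and master URL are placeholders, and spark.akka.frameSize is only honoured by pre-2.0 Spark, where Akka was still the RPC layer):

    import org.apache.spark.{SparkConf, SparkContext}

    // Equivalent of the settings above. spark.akka.frameSize (in MB)
    // applies only to Spark 1.x; Akka was removed in Spark 2.0.
    val conf = new SparkConf()
      .setAppName("OomExample")           // hypothetical app name
      .setMaster("spark://master:7077")   // hypothetical master URL
      .set("spark.executor.memory", "4g")
      .set("spark.akka.frameSize", "512")
    val sc = new SparkContext(conf)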

12 Answers
  •  旧时难觅i
    2020-11-22 14:16

    You should configure the off-heap memory settings as shown below:

    import org.apache.spark.sql.SparkSession

    // Enable off-heap storage in addition to the JVM heap.
    val spark = SparkSession
      .builder()
      .master("local[*]")
      .config("spark.executor.memory", "70g")
      .config("spark.driver.memory", "50g")
      .config("spark.memory.offHeap.enabled", true)
      .config("spark.memory.offHeap.size", "16g")
      .appName("sampleCodeForReference")
      .getOrCreate()

    Set the driver memory and executor memory according to your machine's available RAM (the 70g/50g values above are illustrative; on 6 GB nodes they would need to be scaled down). You can increase the off-heap size if you are still facing the OutOfMemory issue. Note that spark.memory.offHeap.size is allocated outside the executor heap, so heap plus off-heap together must fit within a node's physical memory.
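    To check which values actually took effect at runtime, you can read the session's configuration back; a small sketch using the spark session created above:

    // Print the memory-related settings the running session resolved to.
    spark.conf.getAll
      .filter { case (k, _) => k.contains("memory") }
      .foreach { case (k, v) => println(s"$k = $v") }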
