Spark 1.4 increase maxResultSize memory

Backend · Unresolved · 7 answers · 2160 views
花落未央 2020-12-13 00:06

I am using Spark 1.4 for my research and struggling with the memory settings. My machine has 16 GB of memory, so that should not be a problem, since my file is only 300 MB. Alth

7 answers
  • 2020-12-13 00:53

    There is also a Spark bug, https://issues.apache.org/jira/browse/SPARK-12837, that produces the same error:

     serialized results of X tasks (Y MB) is bigger than spark.driver.maxResultSize
    

    even though you may not be pulling data to the driver explicitly.

    SPARK-12837 tracks a bug where, prior to Spark 2, accumulators and broadcast variables were pulled to the driver unnecessarily, which triggers this error.
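
    If the error appears even without an explicit collect, a common workaround is to raise the limit itself. A minimal sketch of the usual ways to set `spark.driver.maxResultSize` (the `2g` value is illustrative, not from the answer):

    ```
    # In conf/spark-defaults.conf:
    spark.driver.maxResultSize  2g

    # Or per job on the command line:
    spark-submit --conf spark.driver.maxResultSize=2g your_app.py

    # Setting it to 0 removes the limit entirely, but then a large
    # result can exhaust driver memory instead of failing fast.
    ```

    Raising the limit only hides the symptom of the bug above; upgrading past the affected Spark versions is the real fix.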
