Spark Failure : Caused by: org.apache.spark.shuffle.FetchFailedException: Too large frame: 5454002341

遥遥无期 2021-01-06 07:51

I am generating a hierarchy for a table determining the parent child.

Below is the configuration used; even with it, I am still getting the "Too large frame" error.

5 answers
  •  庸人自扰
    2021-01-06 08:31

    Suresh is right. Here's a better-documented and formatted version of his answer, with some useful background info:

    • bug report (link to the fix is at the very bottom)
    • fix (fixed as of 2.2.0 - already mentioned by Jared)
    • change of config's default value (changed as of 2.4.0)

    If you're on version 2.2.x or 2.3.x, you can achieve the same effect by setting the config to Int.MaxValue - 512, i.e. spark.maxRemoteBlockSizeFetchToMem=2147483135. See here for the default value used as of September 2019.
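    As an illustration (not part of the original answer), here is a minimal sketch of how that config could be passed at submission time. The config key is the one named above; the class name and jar are placeholders for whatever job you are running:

    ```shell
    # Cap remote shuffle blocks fetched into memory at Int.MaxValue - 512 bytes.
    # Blocks above this threshold are streamed to disk instead of being buffered
    # in memory as a single frame, which avoids the "Too large frame" failure.
    # NOTE: com.example.HierarchyJob and my-app.jar are hypothetical placeholders.
    spark-submit \
      --conf spark.maxRemoteBlockSizeFetchToMem=2147483135 \
      --class com.example.HierarchyJob \
      my-app.jar
    ```

    The same value can also be set in spark-defaults.conf or on the SparkConf/SparkSession builder before the session is created; it has no effect if changed after the job has started shuffling.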
