Spark Failure : Caused by: org.apache.spark.shuffle.FetchFailedException: Too large frame: 5454002341

Backend · Unresolved · 5 replies · 1168 views

遥遥无期 2021-01-06 07:51

I am generating a hierarchy for a table determining the parent child.

Below is the configuration used; even with it, I still get the "Too large frame" error.

5 replies
  •  谎友^
    谎友^ (OP)
    2021-01-06 08:38

    Set the Spark config spark.maxRemoteBlockSizeFetchToMem to a value below 2g.

    Partitions larger than 2 GB cause many problems in Spark (they cannot be shuffled and cannot be cached on disk), which is why the shuffle fetch throws FetchFailedException: Too large frame.
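    As a sketch, the suggested property could be set in spark-defaults.conf (the 2000m value is illustrative; any threshold below 2g works):

    ```
    # spark-defaults.conf -- illustrative value, not a tuned recommendation
    # Remote shuffle blocks larger than this threshold are streamed to disk
    # instead of being buffered in memory, avoiding the ~2 GB frame limit.
    spark.maxRemoteBlockSizeFetchToMem   2000m
    ```

    The same property can also be passed on the command line, e.g. spark-submit --conf spark.maxRemoteBlockSizeFetchToMem=2000m, or set on the SparkConf before the session is created.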
