spark cache only keeps a fraction of RDD


cache() is the same as persist(StorageLevel.MEMORY_ONLY), and your data probably exceeds the memory available for storage. Spark then evicts cached partitions in least-recently-used (LRU) order, which is why only a fraction of the RDD stays cached.
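A minimal sketch of the difference, assuming `sc` is an existing SparkContext and the input path is a placeholder:

```scala
import org.apache.spark.storage.StorageLevel

val lines = sc.textFile("data/input.txt")

// cache() is shorthand for persist(StorageLevel.MEMORY_ONLY):
// partitions that do not fit in memory are dropped and recomputed
// from lineage the next time they are needed.
lines.cache()

// To avoid recomputation when memory is tight, spill to disk instead.
// (A storage level can only be changed after unpersist().)
lines.unpersist()
lines.persist(StorageLevel.MEMORY_AND_DISK)
```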

You can tune how much memory is reserved for caching through configuration options. See the Spark configuration documentation for details, and look in particular at spark.driver.memory, spark.executor.memory, and spark.storage.memoryFraction.
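A sketch of setting these options programmatically; the values are illustrative, not recommendations. Note that spark.storage.memoryFraction belongs to the legacy (pre-1.6) memory manager; newer releases use spark.memory.fraction and spark.memory.storageFraction instead:

```scala
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("cache-tuning-example")
  // Driver memory must be set before the driver JVM starts, so in
  // client mode pass --driver-memory to spark-submit instead.
  .set("spark.driver.memory", "4g")
  .set("spark.executor.memory", "8g")            // heap per executor
  .set("spark.storage.memoryFraction", "0.6")    // share reserved for caching (legacy)
val sc = new SparkContext(conf)
```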

Not an expert, but I do not think that textFile() automatically caches anything; the Spark Quick Start explicitly caches a text file RDD: sc.textFile(logFile, 2).cache()
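Following that Quick Start pattern, a small sketch (logFile is a placeholder path): textFile() alone is lazy and stores nothing; the explicit cache() call plus a first action is what materializes the RDD in memory.

```scala
val logFile = "README.md"
val logData = sc.textFile(logFile, 2).cache()  // 2 = minimum partitions
val total = logData.count()                    // first action computes and caches
val sparkLines = logData.filter(_.contains("Spark")).count()  // served from cache
```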
