Save each element of an RDD to a separate text file in HDFS

Submitted by 倖福魔咒の on 2019-12-08 06:50:54

Question


I am using a Spark application. Each element of the RDD contains a good amount of data. I want to save each element of the RDD to its own HDFS file. I tried rdd.saveAsTextFile("foo.txt"), but that writes a single output for the whole RDD. The RDD has 10 elements, and I want 10 files in HDFS. How can I achieve this?


Answer 1:


If I understand your question correctly, you can create a custom output format like this:

import org.apache.hadoop.io.NullWritable
import org.apache.hadoop.mapred.lib.MultipleTextOutputFormat

class RDDMultipleTextOutputFormat extends MultipleTextOutputFormat[Any, Any] {
  // Suppress the key in the file contents; only the value is written.
  override def generateActualKey(key: Any, value: Any): Any = NullWritable.get()
  // Use the key itself as the output file name.
  override def generateFileNameForKeyValue(key: Any, value: Any, name: String): String =
    key.asInstanceOf[String]
}

Then convert your RDD into a key/value RDD where the key is the file path, and use the saveAsHadoopFile function instead of saveAsTextFile, like this:

myRDD.saveAsHadoopFile(OUTPUT_PATH, classOf[String], classOf[String], classOf[RDDMultipleTextOutputFormat])
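Putting the two pieces together, a minimal end-to-end sketch might look like the following. It assumes Spark running in local mode with the Hadoop client jars on the classpath; the output path /tmp/multi-out and the element-N key scheme are illustrative, not part of the original answer:

```scala
import org.apache.hadoop.io.NullWritable
import org.apache.hadoop.mapred.lib.MultipleTextOutputFormat
import org.apache.spark.sql.SparkSession

// Same custom format as above: the key becomes the file name,
// and only the value is written into the file.
class RDDMultipleTextOutputFormat extends MultipleTextOutputFormat[Any, Any] {
  override def generateActualKey(key: Any, value: Any): Any = NullWritable.get()
  override def generateFileNameForKeyValue(key: Any, value: Any, name: String): String =
    key.asInstanceOf[String]
}

object SaveEachElement {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("save-each-element")
      .getOrCreate()
    val sc = spark.sparkContext

    // Ten elements -> ten files: key each element with its target file name.
    val data = sc.parallelize(1 to 10)
      .map(i => (s"element-$i", s"payload for element $i"))

    data.saveAsHadoopFile(
      "/tmp/multi-out",
      classOf[String],
      classOf[String],
      classOf[RDDMultipleTextOutputFormat])

    spark.stop()
  }
}
```

After the job finishes, /tmp/multi-out contains one file per distinct key (element-1 through element-10) rather than the usual part-NNNNN files, because generateFileNameForKeyValue overrides the default naming.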


Source: https://stackoverflow.com/questions/46682540/save-each-element-of-rdd-in-text-file-hdfs
