Writing to a file in Apache Spark


Create an RDD from your data (an Int or a String) using Seq; see the parallelized-collections section of the Spark programming guide for details:

sc.parallelize(Seq(5))              // for writing an Int (5)
sc.parallelize(Seq("Test String"))  // for writing a String

val conf = new SparkConf().setAppName("Writing Int to File").setMaster("local")
val sc = new SparkContext(conf) 
val intRdd= sc.parallelize(Seq(5))   
intRdd.saveAsTextFile("out\\int\\test")

val conf = new SparkConf().setAppName("Writing string to File").setMaster("local")
val sc = new SparkContext(conf)   
val stringRdd = sc.parallelize(Seq("Test String"))
stringRdd.saveAsTextFile("out\\string\\test")
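
To check what was written, here is a minimal read-back sketch (it assumes the same active sc and the int output path from above; saveAsTextFile writes part files under the given directory, and sc.textFile reads them all back):

sc.textFile("out\\int\\test").collect().foreach(println)  // prints: 5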
Ronak Patel

Follow-up example (tested as shown below):

val conf = new SparkConf().setAppName("Total Countries having Icon").setMaster("local")
val sc = new SparkContext(conf)

val headerRDD= sc.parallelize(Seq("HEADER"))

//Replace BODY part with your DF
val bodyRDD= sc.parallelize(Seq("BODY"))

val footerRDD = sc.parallelize(Seq("FOOTER"))

//combine all rdds to final    
val finalRDD = headerRDD ++ bodyRDD ++ footerRDD 

//finalRDD.foreach(line => println(line))

//output to one file
finalRDD.coalesce(1, true).saveAsTextFile("test") 

output:

HEADER
BODY
FOOTER
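
The BODY line is just a placeholder. If your body data lives in a DataFrame, one way to turn it into an RDD[String] that can be unioned with the header and footer is sketched below; the DataFrame name df and the comma separator are assumptions for illustration, not part of the tested example:

// Assumes an existing DataFrame `df`; each Row becomes one comma-joined text line
val bodyRDD = df.rdd.map(row => row.mkString(","))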

More examples here...
