Spark - How to write a single csv file WITHOUT folder?

北恋 2020-12-28 13:44

Suppose that df is a DataFrame in Spark. The way to write df into a single CSV file is

    df.coalesce(1).write.option("header", "true").csv(...)

but the target path ends up as a folder of part files rather than a single CSV file. How can I write it as one CSV file without the folder?
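For reference, here is a minimal runnable sketch of that write (the SparkSession setup, the toy DataFrame, and the out.csv path are only illustrative), showing how the output lands on disk:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("single-csv-demo").getOrCreate()
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

    # coalesce(1) forces a single output partition, so only one part file is
    # written, but Spark still creates a directory at the target path.
    df.coalesce(1).write.option("header", "true").mode("overwrite").csv("out.csv")

    # Result on disk:
    #   out.csv/                 <- a directory, despite the ".csv" suffix
    #       _SUCCESS
    #       part-00000-....csv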

9 Answers
  •  梦毁少年i
    2020-12-28 14:23

    df.write.mode("overwrite").format("com.databricks.spark.csv").option("header", "true").csv("PATH/FOLDER_NAME/x.csv")
    

    You can use this, and if you don't want to give the CSV file name every time, you can write a small helper function (or build an array of CSV file names) and pass the name to it; it will work. See the sketch below.
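    A hedged sketch of the kind of helper hinted at above: coalesce to one partition, write to a temporary folder, then move the lone part file to the desired file name. The function name write_single_csv, the "_tmp" suffix, and the paths are illustrative, and it assumes a local filesystem (on HDFS you would move the part file with the Hadoop FileSystem API instead of shutil):

        import glob
        import shutil

        def write_single_csv(df, target_path):
            """Write df as one CSV file at target_path, not a folder of part files."""
            tmp_dir = target_path + "_tmp"
            (df.coalesce(1)
               .write
               .mode("overwrite")
               .option("header", "true")
               .csv(tmp_dir))
            # coalesce(1) guarantees exactly one part file; find it and move it.
            part_file = glob.glob(tmp_dir + "/part-*.csv")[0]
            shutil.move(part_file, target_path)
            shutil.rmtree(tmp_dir)

        # Usage: produces a real file at PATH/FOLDER_NAME/x.csv instead of a folder.
        # write_single_csv(df, "PATH/FOLDER_NAME/x.csv")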
