How to export data from Spark SQL to CSV

迷失自我 2020-12-04 15:31

This command works with HiveQL:

insert overwrite directory '/data/home.csv' select * from testtable;

But with Spark SQL I'm getting an error.

7 Answers
  •  离开以前
    2020-12-04 16:01

    You can use the statement below to write the contents of a DataFrame in CSV format: df.write.csv("/data/home/csv")

    If you need the whole DataFrame in a single CSV file, use df.coalesce(1).write.csv("/data/home/sample.csv"). Note that this still produces a directory named sample.csv containing a single part file, not a plain file (see the sketch below).
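
    For Spark 2.x or later, a minimal end-to-end sketch might look like this (the table name testtable comes from the question; enableHiveSupport() is only needed when the table lives in the Hive metastore):

        import org.apache.spark.sql.SparkSession

        val spark = SparkSession.builder()
          .appName("CsvExport")
          .enableHiveSupport()   // only needed for tables in the Hive metastore
          .getOrCreate()

        val df = spark.sql("SELECT * FROM testtable")

        // Writes one part file per partition under /data/home/csv;
        // the "header" option adds a column-name row to each file.
        df.write
          .option("header", "true")
          .csv("/data/home/csv")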

    For Spark 1.x, you can use the spark-csv package to write the results to CSV files.

    The Scala snippet below shows how:

        import org.apache.spark.sql.hive.HiveContext
        // sc: the existing SparkContext
        val sqlContext = new HiveContext(sc)
        val df = sqlContext.sql("SELECT * FROM testtable")
        // spark-csv registers the "com.databricks.spark.csv" data source
        df.write.format("com.databricks.spark.csv").save("/data/home/csv")
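
    The spark-csv data source is not bundled with Spark 1.x, so it must be on the application classpath. One common way is the --packages flag (the coordinates below are for Scala 2.11 and are an assumption; match the artifact to your Scala version):

        spark-shell --packages com.databricks:spark-csv_2.11:1.5.0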
    

    To write the contents to a single file:

        import org.apache.spark.sql.hive.HiveContext
        // sc: the existing SparkContext
        val sqlContext = new HiveContext(sc)
        val df = sqlContext.sql("SELECT * FROM testtable")
        // coalesce(1) collapses the output to a single partition,
        // so only one part file is written under sample.csv
        df.coalesce(1).write.format("com.databricks.spark.csv").save("/data/home/sample.csv")
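
    Even with coalesce(1), Spark writes a directory containing one part file rather than a plain file. If an actual file with a chosen name is needed, one option is Hadoop's FileUtil.copyMerge (present in Hadoop 2.x, removed in Hadoop 3). This is a sketch, and the destination name sample_merged.csv is an assumption:

        import org.apache.hadoop.conf.Configuration
        import org.apache.hadoop.fs.{FileSystem, FileUtil, Path}

        val hadoopConf = new Configuration()
        val fs = FileSystem.get(hadoopConf)

        // Merge every part file under the Spark output directory into one file.
        FileUtil.copyMerge(
          fs, new Path("/data/home/sample.csv"),        // source: Spark's output directory
          fs, new Path("/data/home/sample_merged.csv"), // destination: a single file
          false,                                        // keep the source directory
          hadoopConf,
          null)                                         // no separator between merged files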
    
