Save Spark DataFrame as a dynamically partitioned table in Hive


I have a sample application that reads CSV files into a DataFrame. The DataFrame can be saved to a Hive table in parquet format using the method df.saveAsTable(tablename, mode).
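For context, a minimal sketch of that flow, assuming the Spark 2.x SparkSession API (the question's df.saveAsTable(tablename, mode) is the older Spark 1.x call); the file path, CSV options, and table name below are placeholders:

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class CsvToHive {
        public static void main(String[] args) {
            // enableHiveSupport() makes saveAsTable target the Hive metastore
            SparkSession spark = SparkSession.builder()
                    .appName("csv-to-hive")
                    .enableHiveSupport()
                    .getOrCreate();

            // Read the CSV into a DataFrame (placeholder path and options)
            Dataset<Row> df = spark.read()
                    .option("header", "true")
                    .option("inferSchema", "true")
                    .csv("/path/to/input.csv");

            // Save as a Hive table stored as parquet
            df.write().format("parquet").saveAsTable("my_table");
            spark.stop();
        }
    }

The question is how to extend this so the table is partitioned dynamically by one of the DataFrame's columns.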

6 Answers
  •  误落风尘
    2020-12-02 09:45

    I was able to write to a partitioned Hive table using:

    df.write().mode(SaveMode.Append).partitionBy("colname").saveAsTable("Table")

    I had to enable the following properties to make it work.

    // Allow dynamic partitions, and let every partition column be dynamic
    // (nonstrict) instead of requiring at least one static partition column
    hiveContext.setConf("hive.exec.dynamic.partition", "true")
    hiveContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")
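
    Put together, a minimal end-to-end sketch of this answer, assuming the Spark 2.x SparkSession API (spark.conf().set plays the role of hiveContext.setConf here; the input path is a placeholder):

        import org.apache.spark.sql.Dataset;
        import org.apache.spark.sql.Row;
        import org.apache.spark.sql.SaveMode;
        import org.apache.spark.sql.SparkSession;

        public class DynamicPartitionWrite {
            public static void main(String[] args) {
                SparkSession spark = SparkSession.builder()
                        .appName("dynamic-partition-write")
                        .enableHiveSupport()
                        .getOrCreate();

                // The two properties from this answer, set on the runtime conf
                spark.conf().set("hive.exec.dynamic.partition", "true");
                spark.conf().set("hive.exec.dynamic.partition.mode", "nonstrict");

                // Placeholder input path and options
                Dataset<Row> df = spark.read()
                        .option("header", "true")
                        .csv("/path/to/input.csv");

                // Append rows; Hive derives a partition directory per "colname" value
                df.write()
                  .mode(SaveMode.Append)
                  .format("parquet")
                  .partitionBy("colname")
                  .saveAsTable("Table");
            }
        }

    With hive.exec.dynamic.partition.mode set to nonstrict, Hive does not require any static partition column, so every partition value can come from the data itself.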
    
