How to connect Spark SQL to remote Hive metastore (via thrift protocol) with no hive-site.xml?

Asked by 我寻月下人不归, 2020-11-22 12:07

I'm using HiveContext with SparkSQL and I'm trying to connect to a remote Hive metastore. The only way I've found to set the metastore is by including hive-site.xml on the classpath. Is there a way to set it programmatically, without hive-site.xml?

8 Answers
  •  眼角桃花
     2020-11-22 12:39

    The code below worked for me. We can omit the `hive.metastore.uris` config for a local metastore; Spark will then create Hive objects locally in the `spark-warehouse` directory.

    import org.apache.spark.sql.SparkSession

    object spark_hive_support1
    {
      def main(args: Array[String]): Unit =
      {
        val spark = SparkSession
          .builder()
          .master("yarn")
          .appName("Test Hive Support")
          // For a remote metastore, point hive.metastore.uris at its thrift endpoint, e.g.:
          //.config("hive.metastore.uris", "thrift://localhost:9083")
          .enableHiveSupport()
          .getOrCreate()

        import spark.implicits._

        // Write a small DataFrame as a Hive table; with no metastore configured,
        // it lands in the local spark-warehouse directory.
        val testdf = Seq(("Word1", 1), ("Word4", 4), ("Word8", 8)).toDF
        testdf.show()
        testdf.write.mode("overwrite").saveAsTable("WordCount")
      }
    }
    
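    For the remote case the question actually asks about, the same builder can point at the metastore over thrift with no hive-site.xml at all. A minimal sketch; `metastore-host:9083` is a placeholder for your metastore endpoint (9083 is the default Hive metastore thrift port):

    ```scala
    import org.apache.spark.sql.SparkSession

    object RemoteMetastoreExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession
          .builder()
          .master("yarn")
          .appName("Remote Hive Metastore")
          // Connect to the remote metastore over thrift instead of hive-site.xml.
          .config("hive.metastore.uris", "thrift://metastore-host:9083")
          .enableHiveSupport()
          .getOrCreate()

        // Tables defined in the remote metastore are now visible.
        spark.sql("show databases").show()
      }
    }
    ```

    The same key can also be passed on the command line, e.g. `spark-shell --conf spark.hadoop.hive.metastore.uris=thrift://metastore-host:9083`, which avoids hard-coding the host in the application.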
