How to connect Spark SQL to remote Hive metastore (via thrift protocol) with no hive-site.xml?

我寻月下人不归 · asked 2020-11-22 12:07

I'm using HiveContext with SparkSQL and I'm trying to connect to a remote Hive metastore. The only way to set the Hive metastore seems to be through including the hive-site.xml on …

8 Answers
  •  梦谈多话
    2020-11-22 12:39

    Some similar questions are marked as duplicates of this one, so to be clear: this answer connects to Hive from Spark without using hive.metastore.uris or a separate thrift server (port 9083), and without copying hive-site.xml into SPARK_CONF_DIR. Instead, Spark talks directly to the metastore's backing database over JDBC.

    import org.apache.spark.sql.SparkSession

    // Connect Spark straight to the metastore *database* over JDBC
    // (here PostgreSQL), bypassing the thrift metastore service entirely.
    val spark = SparkSession
      .builder()
      .appName("hive-check")
      // JDBC URL of the database backing the Hive metastore
      .config(
        "spark.hadoop.javax.jdo.option.ConnectionURL",
        "JDBC_CONNECT_STRING"
      )
      // JDBC driver for that database (must be on the classpath)
      .config(
        "spark.hadoop.javax.jdo.option.ConnectionDriverName",
        "org.postgresql.Driver"
      )
      // Location of the Hive warehouse directory
      .config("spark.sql.warehouse.dir", "/user/hive/warehouse")
      .config("spark.hadoop.javax.jdo.option.ConnectionUserName", "JDBC_USER")
      .config("spark.hadoop.javax.jdo.option.ConnectionPassword", "JDBC_PASSWORD")
      .enableHiveSupport()
      .getOrCreate()

    // Sanity check: list the databases visible through the metastore
    spark.catalog.listDatabases.show(false)
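
    For completeness, if a metastore thrift service is available (the setup the question title describes), the conventional alternative is to point Spark at it via hive.metastore.uris instead of the JDBC properties above. A minimal sketch, assuming a placeholder host and the default port 9083:

    import org.apache.spark.sql.SparkSession

    // Hypothetical thrift-based variant: METASTORE_HOST is a placeholder.
    // No hive-site.xml is needed; the URI is set programmatically.
    val sparkThrift = SparkSession
      .builder()
      .appName("hive-thrift-check")
      .config("hive.metastore.uris", "thrift://METASTORE_HOST:9083")
      .enableHiveSupport()
      .getOrCreate()

    // Same sanity check against the remote metastore
    sparkThrift.sql("show databases").show(false)

    The trade-off: the thrift variant only needs network access to port 9083, while the JDBC variant above needs direct access to the metastore database plus its driver jar, but works when no thrift service is running.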
    
