I'm using HiveContext with Spark SQL and trying to connect to a remote Hive metastore; the usual way to point Spark at the metastore is by including hive-site.xml on the classpath.
Some similar questions are marked as duplicates, but this one is about connecting to Hive from Spark without using hive.metastore.uris or a separate thrift server (port 9083), and without copying hive-site.xml to SPARK_CONF_DIR.
import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .appName("hive-check")
  // Point the embedded metastore client directly at the metastore database,
  // bypassing the thrift server (hive.metastore.uris) entirely.
  .config(
    "spark.hadoop.javax.jdo.option.ConnectionURL",
    "JDBC_CONNECT_STRING"
  )
  .config(
    "spark.hadoop.javax.jdo.option.ConnectionDriverName",
    "org.postgresql.Driver"
  )
  .config("spark.sql.warehouse.dir", "/user/hive/warehouse")
  .config("spark.hadoop.javax.jdo.option.ConnectionUserName", "JDBC_USER")
  .config("spark.hadoop.javax.jdo.option.ConnectionPassword", "JDBC_PASSWORD")
  .enableHiveSupport()
  .getOrCreate()

// Verify the connection by listing the databases in the remote metastore.
spark.catalog.listDatabases.show(false)
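One caveat: since the session connects to the metastore database directly, the PostgreSQL JDBC driver named in ConnectionDriverName must be on the driver classpath. A minimal spark-shell invocation might look like the sketch below (the driver version is an assumption; pick the one matching your database):

```shell
# Hypothetical invocation: pull the PostgreSQL JDBC driver onto the classpath
# via --packages (the 42.x version here is an assumption, not from the post).
spark-shell --packages org.postgresql:postgresql:42.7.3
```

Alternatively, pass the jar explicitly with --jars if your cluster has no outbound access to Maven Central.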