I have a Spark application that successfully connects to Hive and queries Hive tables using the Spark engine.
To build this, I just added hive-site.xml to the application's classpath.
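For context, a minimal hive-site.xml pointing Spark at a single metastore looks roughly like the sketch below; the host and port are illustrative placeholders, not values from the question.

```xml
<!-- hive-site.xml: minimal sketch; host/port are illustrative -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
  </property>
</configuration>
```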
This doesn't seem to be possible in the current version of Spark. Reading the HiveContext code in the Spark repo, it appears that hive.metastore.uris
can be set to multiple URIs, but those URIs are used only for failover/redundancy across replicas of the same metastore, not for connecting to entirely different metastores.
More information here: https://cwiki.apache.org/confluence/display/Hive/AdminManual+MetastoreAdmin
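To illustrate the distinction: a comma-separated hive.metastore.uris value like the sketch below configures high availability, with the client falling back to the next URI if one is unreachable; both endpoints are expected to front the same metastore database (hostnames here are illustrative).

```xml
<property>
  <name>hive.metastore.uris</name>
  <!-- Comma-separated URIs act as failover replicas of the SAME
       metastore, not as independent metastores -->
  <value>thrift://metastore-1:9083,thrift://metastore-2:9083</value>
</property>
```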
But to work on the data in unison, you will probably have to aggregate it somewhere first. Alternatively, you could create a separate Spark context for each store.
You could try configuring hive.metastore.uris
with several different metastores, but it probably won't work. If you do decide to create a separate Spark context for each store, then make sure you set spark.driver.allowMultipleContexts to true,
but this is generally discouraged and may lead to unexpected results.
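A minimal sketch of the multiple-contexts approach, assuming Spark 1.x-era APIs (SparkContext + HiveContext) and illustrative metastore hostnames and table names; this is not an endorsed pattern, for the reasons above.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Discouraged flag that permits more than one SparkContext per JVM
val confA = new SparkConf()
  .setAppName("store-a-reader")
  .set("spark.driver.allowMultipleContexts", "true")
val scA = new SparkContext(confA)
val hiveA = new HiveContext(scA)
hiveA.setConf("hive.metastore.uris", "thrift://metastore-a:9083")

val confB = new SparkConf()
  .setAppName("store-b-reader")
  .set("spark.driver.allowMultipleContexts", "true")
val scB = new SparkContext(confB)
val hiveB = new HiveContext(scB)
hiveB.setConf("hive.metastore.uris", "thrift://metastore-b:9083")

// DataFrames from different contexts cannot be joined directly, so
// collect the results (or write them to shared storage) to combine them
val rowsA = hiveA.sql("SELECT id, total FROM sales").collect()
val rowsB = hiveB.sql("SELECT id, total FROM sales").collect()
```

In practice, writing each store's tables out to a common location (e.g. HDFS/Parquet) and reading them back in a single context is usually more robust than juggling two live contexts.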