I am trying to run Spark SQL:

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

But the error I am getting is below:
I was getting this error while running test cases in my multi-module Maven Spark setup. I was creating the SparkSession separately in my test classes, since the unit test cases required different Spark parameters each time, which I pass in through a configuration file. To resolve this, I followed the approach below while creating the SparkSession in Spark 2.2.0.
// This is present in my parent trait.
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

def createSparkSession(master: String, appName: String, configList: List[(String, String)]): SparkSession = {
  val sparkConf = new SparkConf().setAll(configList)
  val spark = SparkSession
    .builder()
    .master(master)
    .config(sparkConf)
    .enableHiveSupport()
    .appName(appName)
    .getOrCreate()
  spark
}
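One caveat: getOrCreate() returns any already-active SparkSession, and static settings such as the master or Hive support cannot be changed on an existing session, so per-test parameters only fully take effect if the previous session was stopped. A minimal sketch of the teardown I pair with this, assuming ScalaTest (the suite name ExampleSpec and the trait name SparkSessionBuilder are illustrative):

import org.scalatest.{BeforeAndAfterAll, FunSuite}

class ExampleSpec extends FunSuite with BeforeAndAfterAll with SparkSessionBuilder {
  val spark = createSparkSession("local[*]", "testing", List.empty)

  test("example") {
    assert(spark.range(10).count() == 10)
  }

  override def afterAll(): Unit = {
    // Stop the session so the next suite's getOrCreate() builds a fresh one
    // with its own parameters instead of reusing this one.
    spark.stop()
    super.afterAll()
  }
}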
In my test classes:

// metastore_db_test is a test-class-specific folder inside each module.
val metaStoreConfig = List(("javax.jdo.option.ConnectionURL", "jdbc:derby:;databaseName=hiveMetaStore/metastore_db_test;create=true"))
val configList = configContent.convertToListFromConfig(sparkConfigValue) ++ metaStoreConfig
val spark = createSparkSession("local[*]", "testing", configList)
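For reference, the configuration-file helper is nothing special. A rough sketch of one using Typesafe Config (the object, file layout, and key path below are illustrative, not my exact code):

import com.typesafe.config.ConfigFactory
import scala.collection.JavaConverters._

object configContent {
  // Reads every entry under `path` in application.conf, e.g.
  //   spark {
  //     sql.shuffle.partitions = "4"
  //     serializer = "org.apache.spark.serializer.KryoSerializer"
  //   }
  // and returns full Spark property names paired with their values.
  def convertToListFromConfig(path: String): List[(String, String)] = {
    ConfigFactory.load().getConfig(path).entrySet().asScala
      .map(e => (s"$path.${e.getKey}", e.getValue.unwrapped().toString))
      .toList
  }
}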
After that, I clean this hiveMetaStore directory via the maven-clean-plugin.
Parent POM:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-clean-plugin</artifactId>
    <version>3.1.0</version>
    <configuration>
        <filesets>
            <fileset>
                <directory>metastore_db</directory>
            </fileset>
            <fileset>
                <directory>spark-warehouse</directory>
            </fileset>
        </filesets>
    </configuration>
</plugin>
Child Module POM:

<plugin>
    <artifactId>maven-clean-plugin</artifactId>
    <configuration>
        <filesets>
            <fileset>
                <directory>hiveMetaStore</directory>
                <includes>
                    <include>**</include>
                </includes>
            </fileset>
            <fileset>
                <directory>spark-warehouse</directory>
            </fileset>
        </filesets>
    </configuration>
</plugin>
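With this in place, mvn clean wipes the per-module hiveMetaStore and spark-warehouse directories before every build, so stale Derby metastore state from one run cannot leak into the next.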