The problem is the same as the one described here: Error when starting spark-shell local on Mac
... but I have failed to find a solution. I also used to get the malformed URI error.
I faced the same issue while using SharedSparkContext in my tests. Adding the following two settings in my beforeAll method, as @dennis suggested, solved the problem for me:
override def beforeAll(): Unit = {
  super.beforeAll()
  // Bind the driver to localhost instead of the machine's hostname,
  // which is what triggers the malformed URI error on some setups.
  sc.getConf.setMaster("local").set("spark.driver.host", "localhost")
}
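If you create the SparkContext yourself instead of relying on SharedSparkContext, a minimal sketch of the same idea is to set both properties on the SparkConf before the context is created, since Spark reads spark.driver.host at context startup (the app name "test" below is just a placeholder):

import org.apache.spark.{SparkConf, SparkContext}

// Sketch: configure master and driver host up front, before the
// context exists, rather than mutating the configuration afterwards.
val conf = new SparkConf()
  .setAppName("test") // placeholder app name
  .setMaster("local")
  .set("spark.driver.host", "localhost")

val sc = new SparkContext(conf)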
I hope this will be fixed in a future version of Spark.