I'm having problems with a "ClassNotFoundException" using this simple example:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
I had the same issue. I think --jars is not shipping the jars to the executors. After I added the jars to the SparkConf explicitly, it worked fine.
import org.apache.spark.SparkConf

val conf = new SparkConf().setMaster("...").setJars(Seq("/a/b/x.jar", "/c/d/y.jar"))
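For context, here is a minimal end-to-end sketch of the same fix; the master URL, app name, and jar paths are placeholders I made up, not values from the original post:

import org.apache.spark.{SparkConf, SparkContext}

object JarShippingExample {
  def main(args: Array[String]): Unit = {
    // setJars() lists jars that Spark distributes to every executor, so
    // classes defined in them are on the classpath when tasks deserialize.
    val conf = new SparkConf()
      .setAppName("jar-shipping-example")            // placeholder app name
      .setMaster("spark://host:7077")                // placeholder master URL
      .setJars(Seq("/a/b/x.jar", "/c/d/y.jar"))      // placeholder jar paths

    val sc = new SparkContext(conf)
    try {
      // A closure referencing classes from the listed jars should now run
      // on the executors without a ClassNotFoundException.
      println(sc.parallelize(1 to 10).map(_ * 2).sum())
    } finally {
      sc.stop()
    }
  }
}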
This web page on troubleshooting is useful too.