I'm having problems with a "ClassNotFound" exception using this simple example:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
You should set SPARK_CLASSPATH in the spark-env.sh file like this:
SPARK_LOCAL_IP=<your local IP>
SPARK_CLASSPATH=<path to your external jars>
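For example (the IP and jar paths below are only placeholders; SPARK_CLASSPATH takes a colon-separated list of jar paths available on the nodes):

SPARK_LOCAL_IP=192.168.1.10
SPARK_CLASSPATH=/opt/libs/some-dependency.jar:/opt/libs/another-dependency.jar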
and you should submit the job with spark-submit like this:

spark-submit --class your.runclass --master spark://yourSparkMasterHostname:7077 /your.jar
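For instance, with hypothetical names filled in:

spark-submit --class com.example.MyJob --master spark://master-host:7077 /home/user/my-job.jar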
and your Java driver code like this:

SparkConf sparkConf = new SparkConf().setAppName("sparkOnHbase");
JavaSparkContext sc = new JavaSparkContext(sparkConf);
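For reference, a minimal self-contained driver along those lines might look like this sketch (the class name is just a placeholder):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Minimal driver sketch; "SparkOnHbaseApp" is a placeholder class name.
public class SparkOnHbaseApp {
    public static void main(String[] args) {
        // The master URL is supplied by spark-submit, so only the app name is set here.
        SparkConf sparkConf = new SparkConf().setAppName("sparkOnHbase");
        JavaSparkContext sc = new JavaSparkContext(sparkConf);

        // ... your job logic here ...

        sc.stop();
    }
}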
Then it will work.