I'm having problems with a "ClassNotFoundException" using this simple example:
```scala
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
```
If you are using Maven with the Maven Assembly Plugin to build your jar with `mvn package`, make sure the plugin is configured to point at your Spark application's main class. Something like the following should be added to your pom.xml to avoid a `java.lang.ClassNotFoundException`:
```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.4.1</version>
  <configuration>
    <archive>
      <manifest>
        <mainClass>com.my.package.SparkDriverApp</mainClass>
      </manifest>
    </archive>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <appendAssemblyId>false</appendAssemblyId>
  </configuration>
  <executions>
    <execution>
      <id>package</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```
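With that in place, the assembly goal runs during `mvn package` and writes the driver class into the fat jar's manifest. A minimal build-and-submit sketch follows; the class name `com.my.package.SparkDriverApp` comes from the configuration above, while the jar path and `--master local[*]` are placeholder assumptions you should adapt to your own artifact and cluster:

```shell
# Build the fat jar; the assembly plugin's "single" goal is bound to the package phase
mvn package

# Submit the assembled jar; --class must match the <mainClass> configured in the pom
# (jar name is a placeholder: with appendAssemblyId=false it is artifactId-version.jar)
spark-submit \
  --class com.my.package.SparkDriverApp \
  --master local[*] \
  target/my-app-1.0.jar
```

Because `jar-with-dependencies` bundles your classes together with their dependencies, the driver class is found on the classpath at submit time, which is what resolves the `ClassNotFoundException`.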