According to the Spark on Mesos docs, you need to set spark.executor.uri to point at a Spark distribution archive:
val conf = new SparkConf()
  .set("spark.executor.uri", "<URL of spark-1.4.0.tar.gz>")
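For context, here is what the full snippet from the docs looks like inside an application; the master URL, app name, and tarball location are placeholders to substitute for your cluster:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("mesos://HOST:5050")   // placeholder Mesos master
  .setAppName("MyApp")              // hypothetical app name
  // every executor fetches and unpacks this archive before starting
  .set("spark.executor.uri", "hdfs://namenode/dist/spark-1.4.0-bin-hadoop2.4.tgz")
val sc = new SparkContext(conf)

The URI just has to be fetchable from every Mesos slave, so an hdfs:// or http:// location, or a local path present on all nodes, will work.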
Create a sample Maven project with all your dependencies, then use the maven-shade-plugin. It will create a single shaded (uber) jar in your target folder.

Here is a sample pom:
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com</groupId>
    <artifactId>test</artifactId>
    <version>0.0.1</version>

    <properties>
        <java.version>1.7</java.version>
        <hadoop.version>2.4.1</hadoop.version>
        <spark.version>1.4.0</spark.version>
        <version.spark-csv_2.10>1.1.0</version.spark-csv_2.10>
        <version.spark-avro_2.10>1.0.0</version.spark-avro_2.10>
    </properties>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.1</version>
                <configuration>
                    <source>${java.version}</source>
                    <target>${java.version}</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>2.3</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <filters>
                                <filter>
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                        <exclude>org/bdbizviz/**</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                            <finalName>spark-${project.version}</finalName>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

    <dependencies>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>${hadoop.version}</version>
            <exclusions>
                <exclusion>
                    <artifactId>servlet-api</artifactId>
                    <groupId>javax.servlet</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>guava</artifactId>
                    <groupId>com.google.guava</groupId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>joda-time</groupId>
            <artifactId>joda-time</artifactId>
            <version>2.4</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>com.databricks</groupId>
            <artifactId>spark-csv_2.10</artifactId>
            <version>${version.spark-csv_2.10}</version>
        </dependency>
        <dependency>
            <groupId>com.databricks</groupId>
            <artifactId>spark-avro_2.10</artifactId>
            <version>${version.spark-avro_2.10}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive-thriftserver_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
    </dependencies>
</project>
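With this pom, the shade goal is bound to the package phase, so a plain mvn clean package produces a single self-contained jar at target/spark-0.0.1.jar (the finalName above with ${project.version} resolved). You then submit that one jar; the main class name here is hypothetical:

mvn clean package
spark-submit --class com.test.Main target/spark-0.0.1.jar

Because every dependency (spark-csv, spark-avro, joda-time, etc.) is folded into the shaded jar, there is no need to pass extra jars to the cluster at submit time.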