I wrote a simple program in Spark to write a DataFrame to a table in MySQL.
The program is as follows:
import org.apache.spark.SparkConf
import org.apache.sp
This is because the MySQL JDBC driver isn't present in the uber-jar that you are submitting to the cluster, whether that cluster is standalone, YARN, or Mesos.
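For reference, a DataFrame-to-MySQL write along the lines you describe is sketched below; the URL, table name, and credentials are placeholders, and the jdbc call is the point where the com.mysql.jdbc.Driver class has to be loadable on the cluster:

import java.util.Properties

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{SQLContext, SaveMode}

object WriteToMySql {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WriteToMySql")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // Example DataFrame; in practice it would come from your own source.
    val df = Seq((1, "alice"), (2, "bob")).toDF("id", "name")

    // Placeholder connection details; adjust them to your environment.
    val url = "jdbc:mysql://localhost:3306/mydb"
    val props = new Properties()
    props.setProperty("user", "myuser")
    props.setProperty("password", "mypassword")
    props.setProperty("driver", "com.mysql.jdbc.Driver")

    // This call needs the MySQL driver class on the classpath at runtime,
    // which is exactly what fails when it is missing from the uber-jar.
    df.write.mode(SaveMode.Append).jdbc(url, "people", props)

    sc.stop()
  }
}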
Solution 1: Since you are using Maven, you can use the maven-assembly-plugin to build your uber-jar with all the needed dependencies. You can find more information about the maven-assembly-plugin in its documentation.
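For instance, a minimal configuration for the plugin in the build/plugins section of your pom.xml could look like the sketch below; the jar-with-dependencies descriptor bundles every compile-scope dependency, including the MySQL connector, into the jar you submit:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <!-- predefined descriptor that packs all dependencies into one jar -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <!-- build the uber-jar as part of mvn package -->
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>

Running mvn package then produces an additional *-jar-with-dependencies.jar next to the regular artifact, and that is the jar you pass to spark-submit.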
Solution 2: Provide these dependency libraries at runtime when you submit your application, using the --jars option. I advise you to read more about advanced dependency management and submitting applications in the official documentation.
For example, it can look like this:

./bin/spark-submit \
  --class <main-class> \
  --master <master-url> \
  --jars /path/to/mysql-connector-java*.jar \
  <application-jar>
I hope this helps!