Spark : Exception in thread “main” java.lang.ClassNotFoundException: com.mysql.jdbc.Driver

眼角桃花 2021-01-27 07:47

I wrote a simple program in Spark to write a DataFrame to a table in MySQL.

The program is as follows:

import org.apache.spark.SparkConf
import org.apache.sp         
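
The snippet above is truncated; a minimal sketch of such a program, assuming a Spark 2.x SparkSession and purely illustrative connection details (URL, table name, user, and password are placeholders), might look like this:

import org.apache.spark.sql.{SaveMode, SparkSession}

object WriteToMySql {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("WriteToMySql")
      .getOrCreate()

    import spark.implicits._

    // Example DataFrame to persist.
    val df = Seq((1, "alice"), (2, "bob")).toDF("id", "name")

    // Placeholder connection details -- adjust to your environment.
    val jdbcUrl = "jdbc:mysql://localhost:3306/test"
    val props = new java.util.Properties()
    props.setProperty("user", "root")
    props.setProperty("password", "secret")
    props.setProperty("driver", "com.mysql.jdbc.Driver")

    // Write the DataFrame to the `people` table over JDBC.
    df.write.mode(SaveMode.Append).jdbc(jdbcUrl, "people", props)

    spark.stop()
  }
}

Submitted without the MySQL connector jar on the classpath, a program like this fails with the ClassNotFoundException in the title.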


        
2 Answers
  •  既然无缘
    2021-01-27 08:36

    This is because your driver isn't present in the uber-jar that you are submitting to the cluster, whether it's a standalone cluster, YARN, Mesos, etc.
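
    For context, Spark's JDBC data source loads the driver class by name at runtime, so the connector jar has to be visible to both the driver and the executors. Conceptually (illustrative snippet, not Spark's actual code), what fails is a lookup like:

        // Loading the MySQL driver by name fails when the connector jar
        // is not on the classpath:
        Class.forName("com.mysql.jdbc.Driver")  // throws java.lang.ClassNotFoundException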

    Solution 1: Since you are using Maven, you can use the assembly plugin to build your uber-jar with all the needed dependencies. More information about the Maven assembly plugin is available in its official documentation.

    Solution 2: Provide these dependency libraries at runtime when you submit your application, using the --jars option. I advise you to read more about advanced dependency management and submitting applications in the official documentation.

    For example, it can look like this:

    ./bin/spark-submit \
      --class <main-class> \
      --master <master-url> \
      --jars /path/to/mysql-connector-java*.jar \
      <application-jar>
    

    I hope this helps!
