Apache Spark: JDBC connection not working

渐次进展 2020-12-06 11:45

I have asked this question previously as well but did not get any answer (Not able to connect to postgres using jdbc in pyspark shell).

I have successfully installed Spark ...

6 Answers
  •  余生分开走
    2020-12-06 12:08

    I had this exact problem with mysql/mariadb, and got a big clue from this question.

    So your pyspark command should be:

    pyspark --conf spark.executor.extraClassPath=<jdbc.jar> --driver-class-path <jdbc.jar> --jars <jdbc.jar> --master <master-url>
    

    Also watch for errors when pyspark starts, like "Warning: Local jar ... does not exist, skipping." and "ERROR SparkContext: Jar not found at ..."; these usually mean you spelled the path wrong.
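    Once the shell comes up cleanly with the jar on the classpath, the JDBC read itself looks roughly like the sketch below. Everything in it is a placeholder rather than a value from the question (the URL, database, table, user and password), and it assumes a pyspark shell new enough to provide spark as the prebuilt SparkSession (on older 1.x shells, use sqlContext instead).

        # Minimal sketch of a JDBC read from the pyspark shell.
        # All connection options below are placeholders, not values from the question.
        df = (spark.read.format("jdbc")
              .option("url", "jdbc:postgresql://localhost:5432/mydb")  # placeholder database
              .option("dbtable", "public.my_table")                    # placeholder table
              .option("user", "postgres")
              .option("password", "secret")
              .option("driver", "org.postgresql.Driver")
              .load())  # the schema is fetched here; a missing driver jar typically fails at this point

        df.printSchema()
        df.show(5)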
