How to specify the SQL dialect when creating a Spark DataFrame from JDBC?

日久生厌 2021-01-14 20:50

I'm having an issue reading data via a custom JDBC source with Spark. How would I go about overriding the SQL dialect that Spark infers from the JDBC URL?

The database in question i

2 Answers
  •  感动是毒
    2021-01-14 21:00

    You can do something like this.

    // Spark picks the built-in Postgres dialect here, inferred from
    // the "jdbc:postgresql" URL prefix.
    val jdbcDF = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql:dbserver")
      .option("dbtable", "schema.tablename")
      .option("user", "username")
      .option("password", "password")
      .load()
    

    For more info, see the JDBC data source section of the Spark SQL guide.
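
    If you only need to change the column types that the inferred dialect
    produces, the documented customSchema read option may be enough, without
    writing a dialect at all; the column names and types below are
    placeholders:

    val jdbcDF3 = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql:dbserver")
      .option("dbtable", "schema.tablename")
      .option("user", "username")
      .option("password", "password")
      // Read these columns with the given Spark SQL types instead of
      // the dialect's defaults; unlisted columns keep the inferred types.
      .option("customSchema", "id DECIMAL(38, 0), name STRING")
      .load()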

    Alternatively, you can pass the connection details to DataFrameReader.jdbc with a java.util.Properties object:

    import java.util.Properties

    val connectionProperties = new Properties()
    connectionProperties.put("user", "username")
    connectionProperties.put("password", "password")

    val jdbcDF2 = spark.read
      .jdbc("jdbc:postgresql:dbserver", "schema.tablename", connectionProperties)
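
    To actually override the dialect that Spark infers from the URL, you can
    register a custom JdbcDialect (part of Spark's developer API) before
    reading. A minimal sketch; the "jdbc:mydb" URL prefix and the quoting
    rule are illustrative assumptions, not a real driver:

    import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}

    // Hypothetical dialect for a database whose JDBC URLs start with
    // "jdbc:mydb". A registered dialect takes precedence over the
    // built-in one for any URL its canHandle accepts.
    object MyCustomDialect extends JdbcDialect {
      override def canHandle(url: String): Boolean =
        url.startsWith("jdbc:mydb")

      // Example override: quote identifiers with double quotes instead
      // of the default backticks.
      override def quoteIdentifier(colName: String): String =
        "\"" + colName + "\""
    }

    // Register before spark.read so the dialect is in effect when the
    // DataFrame is created.
    JdbcDialects.registerDialect(MyCustomDialect)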
    
