How to specify the SQL dialect when creating a Spark DataFrame from JDBC?

日久生厌 2021-01-14 20:50

I'm having an issue reading data via a custom JDBC connection with Spark. How would I go about overriding the SQL dialect inferred from the JDBC URL?

The database in question i

2 Answers
  •  南方客
     2021-01-14 21:02

    Maybe it's too late, but here's the answer:

    Create your own custom dialect, as I did for the ClickHouse database (my JDBC connection URL looks like this: jdbc:clickhouse://localhost:8123):

     import org.apache.spark.sql.jdbc.JdbcDialect

     private object ClickHouseDialect extends JdbcDialect {
       // Override the quoting logic here as you wish; this leaves identifiers unquoted
       override def quoteIdentifier(colName: String): String = colName

       // Spark selects this dialect for any JDBC URL with this prefix
       override def canHandle(url: String): Boolean = url.startsWith("jdbc:clickhouse")
     }
    

    And register it somewhere in your code, like this:

    import org.apache.spark.sql.jdbc.JdbcDialects

    JdbcDialects.registerDialect(ClickHouseDialect)
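    Once the dialect is registered, any JDBC read whose URL matches `canHandle` will use it automatically. Here is a minimal end-to-end sketch; the table name `events`, the database name `default`, the driver class, and the SparkSession setup are illustrative assumptions, not from the answer above:

    ```scala
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}

    // Same dialect as in the answer above
    object ClickHouseDialect extends JdbcDialect {
      // Leave identifiers unquoted instead of Spark's default double-quoting
      override def quoteIdentifier(colName: String): String = colName

      // Spark picks this dialect for any URL starting with this prefix
      override def canHandle(url: String): Boolean = url.startsWith("jdbc:clickhouse")
    }

    object ClickHouseExample {
      def main(args: Array[String]): Unit = {
        // Register before any JDBC read/write touches a ClickHouse URL
        JdbcDialects.registerDialect(ClickHouseDialect)

        val spark = SparkSession.builder()
          .appName("clickhouse-dialect-example") // hypothetical app name
          .master("local[*]")
          .getOrCreate()

        // "events" is a placeholder table name; adjust URL/driver for your setup
        val df = spark.read
          .format("jdbc")
          .option("url", "jdbc:clickhouse://localhost:8123/default")
          .option("dbtable", "events")
          .load()

        df.show()
        spark.stop()
      }
    }
    ```

    Registration is global and must happen before the first read, since Spark consults the registered dialects when it resolves the JDBC URL.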
