How to write a JDBC sink for Spark Structured Streaming? [SparkException: Task not serializable]

再見小時候 2020-12-13 21:27

I need a JDBC sink for my Spark Structured Streaming DataFrame. As far as I know, the DataFrame API lacks a writeStream-to-JDBC implementation.
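Since Structured Streaming has no built-in JDBC sink, the standard workaround is a custom `ForeachWriter` passed to `writeStream.foreach(...)`. The sketch below is illustrative, not the asker's code: the table name `events`, its two-string-column layout, and the connection parameters are assumptions. The connection is opened inside `open()` on the executor, so only the plain constructor arguments are captured in the closure; holding a `Connection` as a constructor field on the driver is what typically triggers `Task not serializable`.

```scala
import java.sql.{Connection, DriverManager, PreparedStatement}

import org.apache.spark.sql.{ForeachWriter, Row}

// Hypothetical sink: writes each row into a table `events (key, value)`.
class JDBCSink(url: String, user: String, pwd: String) extends ForeachWriter[Row] {
  // Initialized per partition in open(), never on the driver, so nothing
  // non-serializable is captured in the task closure.
  private var connection: Connection = _
  private var statement: PreparedStatement = _

  override def open(partitionId: Long, epochId: Long): Boolean = {
    connection = DriverManager.getConnection(url, user, pwd)
    statement = connection.prepareStatement(
      "INSERT INTO events (key, value) VALUES (?, ?)")
    true // returning true means this partition/epoch should be processed
  }

  override def process(row: Row): Unit = {
    statement.setString(1, row.getString(0))
    statement.setString(2, row.getString(1))
    statement.executeUpdate()
  }

  override def close(errorOrNull: Throwable): Unit = {
    if (statement != null) statement.close()
    if (connection != null) connection.close()
  }
}
```

It would then be wired up along these lines (URL and credentials are placeholders):

```scala
val query = df.writeStream
  .foreach(new JDBCSink("jdbc:postgresql://host/db", "user", "secret"))
  .outputMode("append")
  .start()
```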

3 Answers
  •  没有蜡笔的小新
    2020-12-13 22:02

    In case somebody encounters this in an interactive workbook, this solution also works:

    Instead of saving the JDBCSink class to a separate file, you can also declare it as a separate package (a "packaged cell") within the same workbook and import that package in the cell where you use it. This is well described here: https://docs.databricks.com/user-guide/notebooks/package-cells.html
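    A sketch of what this looks like in a Databricks notebook, assuming a hypothetical package name `com.example.streaming` and a `JDBCSink` class like the one discussed above. A package cell must contain nothing but the package definition; compiling the class there keeps it out of the REPL wrapper objects whose outer references cause `Task not serializable`:

    ```scala
    // --- Cell 1: a "package cell" (only the package definition goes here) ---
    package com.example.streaming // hypothetical package name

    import java.sql.{Connection, DriverManager, PreparedStatement}
    import org.apache.spark.sql.{ForeachWriter, Row}

    class JDBCSink(url: String, user: String, pwd: String)
        extends ForeachWriter[Row] {
      private var connection: Connection = _
      override def open(partitionId: Long, epochId: Long): Boolean = {
        connection = DriverManager.getConnection(url, user, pwd); true
      }
      override def process(row: Row): Unit = { /* INSERT via PreparedStatement */ }
      override def close(errorOrNull: Throwable): Unit = {
        if (connection != null) connection.close()
      }
    }

    // --- Cell 2: import the packaged class where the stream is started ---
    // import com.example.streaming.JDBCSink
    // df.writeStream.foreach(new JDBCSink(url, user, pwd)).start()
    ```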
