Kafka Source and Sink connector not working for large data

Submitted by 北城以北 on 2021-01-28 01:09:40

Question


I am creating a data pipeline using Kafka source and sink connectors. The source connector reads from a SQL database and publishes to a topic, and the sink connector subscribes to that topic and writes into another SQL database. The table holds 16 GB of data. The problem is that the data is not being transferred from one DB to the other; however, if the table is small, say 1,000 rows, the data transfers successfully.
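For reference, the sink side described above (subscribing to the topic and writing into the second database) would use the JdbcSinkConnector. A minimal sketch of such a config is shown below; the connector name, topic name, and empty connection.url are placeholder assumptions, not values from the original question:

    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "tasks.max": "1",
        "connection.url": "",
        "topics": "migration_mytable",
        "insert.mode": "insert",
        "auto.create": "true",
        "name": "jdbc-sink"
    }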

Source connector config:

"config": {
       "connector.class": 
"io.confluent.connect.jdbc.JdbcSourceConnector",
       "tasks.max": "1",
       "connection.url": "",
       "mode": "incrementing",
       "incrementing.column.name": "ID",
       "topic.prefix": "migration_",
       "name": "jdbc-source",
       "validate.non.null": false,
       "batch.max.rows":5
     }
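Note that batch.max.rows is set to 5, so each poll fetches at most five rows; the Confluent JDBC source connector defaults to 100. A variant of the same config with a larger batch size is sketched below; the values are illustrative assumptions for a 16 GB table, not tested recommendations (poll.interval.ms controls how often the connector polls the table, default 5000 ms):

    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "tasks.max": "1",
        "connection.url": "",
        "mode": "incrementing",
        "incrementing.column.name": "ID",
        "topic.prefix": "migration_",
        "name": "jdbc-source",
        "validate.non.null": "false",
        "batch.max.rows": "1000",
        "poll.interval.ms": "5000"
    }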

Source connector logs:

INFO WorkerSourceTask{id=cmc-migration-source-0} flushing 0 outstanding messages for offset commit 
[2019-03-08 16:48:45,402] INFO WorkerSourceTask{id=cmc-migration-source-0} Committing offsets
[2019-03-08 16:48:45,402] INFO WorkerSourceTask{id=cmc-migration-source-0} flushing 0 outstanding messages for offset commit
[2019-03-08 16:48:55,403] INFO WorkerSourceTask{id=cmc-migration-source-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:397)
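The repeated "flushing 0 outstanding messages" entries indicate that no records are in flight between offset commits. One possibility, assuming individual rows are large enough to exceed the default 1 MB message size limits, is to raise the size limits on the producer and broker sides; the properties below are an illustrative sketch, not a confirmed fix:

    # Connect worker properties: override the producer used by source tasks
    # (10 MB here is an illustrative assumption)
    producer.max.request.size=10485760

    # The broker/topic must accept messages of at least the same size:
    # topic-level max.message.bytes, or broker-level message.max.bytes.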

Can anyone guide me on how to tune my Kafka source connector to transfer this much data?

Source: https://stackoverflow.com/questions/55096342/kafka-source-and-sink-connector-not-working-for-large-data
