Distributed Kafka Connect with multiple tasks not working

Submitted by 梦想与她 on 2020-07-09 17:08:40

Question


I am running Apache Kafka on my Windows machine with two Kafka Connect workers (ports 8083 and 8084) and three partitions (replication factor of one). My issue is that I can see fail-over to the other Kafka Connect worker whenever I shut one of them down, but load balancing is not happening because the number of tasks is always ONE. I am using the official MongoDB Kafka connector as a source (change stream) with tasks.max=6. I tried updating MongoDB with multiple threads so that it would push more data into Kafka Connect and perhaps make Kafka Connect create more tasks, but even under a higher volume of data the task count remains one.
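For reference, the connector registration described above can be submitted to one of the two workers roughly like this. This is a minimal sketch: tasks.max=6, the connector name, and the worker port come from my setup, while the connection URI, database, and collection names are illustrative placeholders.

import json
import urllib.request

# Official MongoDB source connector with tasks.max=6, as described above.
# connection.uri / database / collection are placeholder values.
config = {
    "name": "mongodb-connector",
    "config": {
        "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
        "tasks.max": "6",
        "connection.uri": "mongodb://localhost:27017",
        "database": "mydb",
        "collection": "mycollection",
    },
}

# POST the config to one of the two distributed workers (8083 or 8084).
req = urllib.request.Request(
    "http://localhost:8083/connectors",
    data=json.dumps(config).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read().decode())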

How did I confirm that only one task is running? Through the API http://localhost:8083/connectors/mongodb-connector/status, which returns:

{
  "name": "mongodb-connector",
  "connector": {
    "state": "RUNNING",
    "worker_id": "xx.xx.xx.xx:8083"
  },
  "tasks": [
    {
      "id": 0,
      "state": "RUNNING",
      "worker_id": "xx.xx.xx.xx:8083"
    }
  ],
  "type": "source"
}

Am I missing something here? Why are more tasks not being created?
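For anyone reproducing this, the same check can be scripted. The following sketch (assuming the same connector name and worker port as above) polls the status endpoint and prints how many tasks exist and which worker each one landed on, which is how the single-task behaviour shows up:

import json
import urllib.request

# Query the status endpoint mentioned above and summarise the tasks.
url = "http://localhost:8083/connectors/mongodb-connector/status"
with urllib.request.urlopen(url) as resp:
    status = json.load(resp)

print("connector state:", status["connector"]["state"])
print("number of tasks:", len(status["tasks"]))
for task in status["tasks"]:
    # With real load balancing, tasks would be spread across the workers
    # on ports 8083 and 8084; here only task 0 ever appears.
    print(f"task {task['id']}: {task['state']} on {task['worker_id']}")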

Source: https://stackoverflow.com/questions/62761101/distributed-kafka-connect-with-multiple-tasks-not-working
