changestream

Distributed Kafka Connect with multiple tasks not working

梦想与她 submitted on 2020-07-09 17:08:40
Question: I am running Apache Kafka on my Windows machine with two Kafka Connect workers (ports 8083 and 8084) and three partitions (replication factor of one). My issue is that I can see fail-over to the other Kafka Connect worker whenever I shut one of them down, but load balancing is not happening because the number of tasks is always ONE. I am using the official MongoDB Kafka Connector as a source (change stream) with tasks.max=6. I tried updating MongoDB with multiple threads so that it could push more data into …
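
A hedged note on the likely cause: a MongoDB change stream is a single server-side cursor, and the official MongoDB Kafka source connector creates at most one task no matter what tasks.max requests, so Connect has only one task to balance across the two workers. Below is a minimal Node.js sketch of registering the connector through the Connect REST API; the connector name, connection URI, database, and collection are hypothetical, and it assumes a worker on localhost:8083 and Node 18+ for the built-in fetch. Even with "tasks.max": "6", the connector's status endpoint will report a single task.

// register-mongo-source.js — a sketch, not the poster's setup.
// Assumes Node 18+ (global fetch) and a Connect worker on localhost:8083;
// connector name, URI, database, and collection below are hypothetical.
const body = {
  name: 'mongo-source',
  config: {
    'connector.class': 'com.mongodb.kafka.connect.MongoSourceConnector',
    'connection.uri': 'mongodb://localhost:27017/?replicaSet=rs0',
    'database': 'test',
    'collection': 'tokens',
    'tasks.max': '6' // requested, but the source connector still creates one task
  }
};

fetch('http://localhost:8083/connectors', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(body)
})
  .then(res => res.json())
  .then(out => console.log('connector created:', out))
  .catch(err => console.error('registration failed:', err));

One commonly suggested workaround, if throughput is the goal, is to register several connectors, each watching a different collection or a $match-filtered slice of the stream, rather than raising tasks.max.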

MongoDB change stream times out if the database is down for some time

末鹿安然 submitted on 2020-01-25 00:50:08
Question: I am using a MongoDB change stream in Node.js. Everything works fine, but if the database goes down and takes more than 5-10 seconds to come back up, the change stream throws a timeout error. Here is my change stream watcher code:

Service.prototype.watcher = function (db) {
  let collection = db.collection('tokens');
  let changeStream = collection.watch({ fullDocument: 'updateLookup' });
  let resumeToken, newChangeStream;
  changeStream.on('change', next => {
    resumeToken = next._id;
    console.log('data is ', JSON.stringify(next));
    // … excerpt truncated in the original post
  });
};
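
The usual fix (a minimal sketch under assumptions, not the poster's full code) is to listen for the stream's error event and reopen the stream with resumeAfter set to the last resume token, retrying until the database is reachable again; the unused newChangeStream variable in the excerpt hints at the same intent. The 5-second retry delay below is an arbitrary choice.

// resume-on-error sketch, assuming the same `collection` handle as above
// and a `resumeToken` captured in the 'change' handler.
function watchWithResume(collection, resumeToken) {
  const options = { fullDocument: 'updateLookup' };
  if (resumeToken) options.resumeAfter = resumeToken; // pick up where we left off
  const changeStream = collection.watch([], options);

  changeStream.on('change', next => {
    resumeToken = next._id; // remember the latest token
    console.log('data is ', JSON.stringify(next));
  });

  changeStream.on('error', err => {
    console.error('change stream error, retrying in 5s:', err.message);
    // Close the broken stream and reopen after a delay (hypothetical 5s backoff).
    changeStream.close().catch(() => {});
    setTimeout(() => watchWithResume(collection, resumeToken), 5000);
  });
}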
