apache-kafka-connect

Can we update/upsert a record in MongoDB? The data source is Kafka

北城余情 submitted on 2020-04-06 03:07:22
Question: We can update/upsert the record in MongoDB, but is there any method or function with which we can update or upsert the document directly in MongoDB, where the source system is Kafka and the destination is MongoDB? Answer 1: Yes, we can update/upsert the data. For updates you have to define a parameter in the Kafka connector and whitelist the columns on whose basis you want to update the record. The property is as follows: document.id.strategy=com.mongodb.kafka.connect.sink.processor.id.strategy
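For illustration, a minimal sink-configuration sketch built around that id-strategy setting, assuming the MongoDB Kafka sink connector with its PartialValueStrategy; the topic, database, collection and field names are hypothetical, and the class names should be checked against your connector version:

# Hypothetical MongoDB sink config that upserts on a whitelisted business key
name=mongo-sink-upsert
connector.class=com.mongodb.kafka.connect.sink.MongoSinkConnector
topics=orders
connection.uri=mongodb://localhost:27017
database=shop
collection=orders
# Derive the document _id from whitelisted fields of the record value
document.id.strategy=com.mongodb.kafka.connect.sink.processor.id.strategy.PartialValueStrategy
value.projection.type=whitelist
value.projection.list=order_id
# Replace the document matching that business key instead of inserting a new one
writemodel.strategy=com.mongodb.kafka.connect.sink.writemodel.strategy.ReplaceOneBusinessKeyStrategy

With a configuration along these lines, repeated records carrying the same order_id overwrite the existing document, which is the upsert behaviour asked about.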

Kafka Connect: No tasks created for a connector

大兔子大兔子 submitted on 2020-03-19 05:07:29
Question: We are running Kafka Connect (Confluent Platform 5.4, i.e. Kafka 2.4) in distributed mode using the Debezium (MongoDB) and Confluent S3 connectors. When adding a new connector via the REST API, the connector is created in the RUNNING state, but no tasks are created for it. Pausing and resuming the connector does not help. When we stop all workers and then start them again, the tasks are created and everything runs as it should. The issue is not caused by the connector plugins, because we
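For reference, the connector and task state described above can be inspected, and the connector restarted, through the Connect REST API; a sketch assuming a worker on localhost:8083 and a hypothetical connector name my-connector:

# Show the connector's state and its (empty) task list
curl -s http://localhost:8083/connectors/my-connector/status
curl -s http://localhost:8083/connectors/my-connector/tasks
# Ask the worker to restart the connector instance
curl -s -X POST http://localhost:8083/connectors/my-connector/restart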

When connecting to Snowflake internal stage I am seeing it connect to a different db

徘徊边缘 submitted on 2020-03-05 06:07:25
Question: I am connecting through the newly released Snowflake Kafka connector in standalone mode. The connector connects to my Snowflake account successfully, but when it creates the internal stage it does not use the database given in the config. This is the content of the new connector.properties file:
name=kafkaSnowNow
connector.class=com.snowflake.kafka.connector.SnowflakeSinkConnector
tasks.max=8
topics=kafkaSnow1,kafkaSnow2
snowflake.topic2table.map= kafkaSnow1:kafka_db.kafka_schema
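For context, in the Snowflake sink connector the target database and schema are normally supplied through dedicated properties rather than prefixed onto the table name in snowflake.topic2table.map; a hedged sketch with placeholder account, user and table names (property names taken from the Snowflake connector documentation, so verify them against your connector version):

snowflake.url.name=myaccount.snowflakecomputing.com:443
snowflake.user.name=kafka_connector_user
snowflake.private.key=<private-key>
snowflake.database.name=kafka_db
snowflake.schema.name=kafka_schema
# Map each topic to a bare table name; database and schema come from the properties above
snowflake.topic2table.map=kafkaSnow1:kafka_table1,kafkaSnow2:kafka_table2
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=com.snowflake.kafka.connector.records.SnowflakeJsonConverter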

How to secure kafka connect so connection.url is not logged revealing credentials?

旧巷老猫 submitted on 2020-02-23 06:26:30
Question: The problem I am having is that credentials are being logged when running a connector, and I don't want them logged. Any leads will be appreciated. Answer 1: Try to use the connection.password property; its value should be masked in the logs. Something like this:
curl -X POST http://localhost:8083/connectors -H "Content-Type: application/json" -d \
'{ "name":"jdbc_source_mysql_01", "config":{ "connector.class":"io.confluent.connect.jdbc.JdbcSourceConnector", "connection.url":"jdbc:mysql://mysql
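A fuller sketch of the same idea, with hypothetical host, database and credentials: the JDBC connector declares connection.password as a password-type config, so Connect masks it in logs and REST responses, whereas anything embedded directly in connection.url would be logged verbatim:

curl -X POST http://localhost:8083/connectors -H "Content-Type: application/json" -d '{
  "name": "jdbc_source_mysql_01",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://mysql-host:3306/exampledb",
    "connection.user": "connect_user",
    "connection.password": "connect_password",
    "topic.prefix": "mysql-",
    "mode": "bulk"
  }
}'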

Adding a connector to Kafka Connect

谁说我不能喝 submitted on 2020-01-30 02:34:12
Question: I am using the Confluent Kafka Docker image, specifically this one: https://github.com/confluentinc/cp-docker-images/tree/4.0.x/examples/cp-all-in-one I want to add the MySQL connector by: downloading version 5.1.46 of the connector (https://dev.mysql.com/get/Downloads/Connector-J/mysql-connector-java-5.1.46.tar.gz), mounting a volume with the jar that comes out of the downloaded archive (mysql-connector-java-5.1.46-bin.jar), and adding CONNECT_PLUGIN_PATH to the docker compose file
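A hedged docker-compose sketch of those two steps, showing only the relevant keys of the connect service and assuming a local ./jars directory holding the downloaded driver; the mount point and plugin path are illustrative, so adjust them to your image layout (the Connector/J jar has to end up where the JDBC connector can load it):

  connect:
    image: confluentinc/cp-kafka-connect:4.0.0
    volumes:
      # Mount the downloaded driver jar into the container
      - ./jars/mysql-connector-java-5.1.46-bin.jar:/etc/kafka-connect/jars/mysql-connector-java-5.1.46-bin.jar
    environment:
      # Make the mounted directory visible to the worker alongside the bundled plugins
      CONNECT_PLUGIN_PATH: "/usr/share/java,/etc/kafka-connect/jars"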

kafka.common.KafkaException: Failed to parse the broker info from zookeeper from EC2 to elastic search

我的梦境 submitted on 2020-01-25 10:13:55
Question: I have AWS MSK set up and I am trying to sink records from MSK to Elasticsearch. I am able to push data into MSK in JSON format, and I want to sink it to Elasticsearch. I have done all the setup correctly, as far as I can tell. This is what I have done on the EC2 instance:
wget /usr/local http://packages.confluent.io/archive/3.1/confluent-oss-3.1.2-2.11.tar.gz -P ~/Downloads/
tar -zxvf ~/Downloads/confluent-oss-3.1.2-2.11.tar.gz -C ~/Downloads/
sudo mv ~/Downloads/confluent-3.1.2 /usr/local/confluent
/usr/local
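For reference, a hedged sketch of the kind of Elasticsearch sink configuration this setup would run, with placeholder topic, host and index type (property names from the Confluent Elasticsearch sink connector; adjust the converters to match how the JSON records were produced):

name=elasticsearch-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
topics=msk-json-topic
connection.url=http://localhost:9200
type.name=kafka-connect
# The records are plain JSON without an embedded schema
key.ignore=true
schema.ignore=true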
