apache-kafka-connect

Kafka Connect FileStreamSource ignores appended lines

Submitted by 房东的猫 on 2021-01-29 05:48:13
Question: I'm working on an application to process logs with Spark, and I thought of using Kafka as a way to stream the data from the log file. Basically I have a single log file (on the local file system) which is continuously updated with new logs, and Kafka Connect seems to be the perfect solution to get the data from the file along with the newly appended lines. I'm starting the servers with their default configurations with the following commands: Zookeeper server: zookeeper-server-start.sh config
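A minimal standalone source configuration for this kind of setup might look like the following sketch (the connector name, topic, and file path are illustrative assumptions, not from the question):

```properties
# connect-file-source.properties — hypothetical example
name=local-file-source
connector.class=org.apache.kafka.connect.file.FileStreamSourceConnector
tasks.max=1
# Absolute path to the continuously-updated log file (assumed path)
file=/var/log/myapp/app.log
# Topic the appended lines are published to (assumed name)
topic=app-logs
```

It would be started with something like `connect-standalone.sh config/connect-standalone.properties connect-file-source.properties`. One caveat worth noting: FileStreamSource tracks its position as a byte offset in the file, so if the file is rotated or truncated rather than strictly appended to, newly written lines can be silently skipped, which is a common cause of the behavior described in this question's title.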

Missing Confluent Kafka Connect Metrics using Jmx Exporter for Prometheus

Submitted by 北慕城南 on 2021-01-29 05:29:47
Question: I am not able to export the "type=connector-metrics" metrics for the Confluent Connect service, but other metrics are working fine. I am using the Prometheus JMX exporter Java agent to expose metrics from Confluent Connect, as shown below. Confluent Connect configuration (/usr/bin/connect-distributed): export KAFKA_OPTS='-javaagent:/opt/prometheus/jmx_prometheus_javaagent-0.12.0.jar=8093:/opt/prometheus/kafka-connect.yml' kafka-connect.yml: - pattern: kafka.connect<type=connector-metrics, connector=(.+)><>([a-z-
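A rule along these lines is commonly used to capture connector-level metrics (this is a sketch; the exact regex must match the MBean names of your Kafka version, and the metric name is an assumption):

```yaml
# kafka-connect.yml — hypothetical rule for connector-metrics MBeans
rules:
  - pattern: 'kafka.connect<type=connector-metrics, connector=(.+)><>([a-z-]+)'
    name: kafka_connect_connector_metrics_$2
    labels:
      connector: "$1"
```

One frequent cause of "missing" connector-metrics: several attributes in that MBean group (for example the connector status) are string-valued, and the JMX exporter only emits numeric attributes as gauges, so those attributes are silently dropped even when the pattern matches.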

Is it possible to create a Kafka Connector without REST request?

Submitted by 会有一股神秘感。 on 2021-01-28 19:18:54
Question: Is it possible to create a Kafka connector without a REST request? I have started my worker (distributed) through Java code and want my connector to start along with it. I don't want to use a REST call (not from a browser and not from code) to create my connector. I just want a simple Kafka API to invoke that creates my connector. Any help will be appreciated. Source: https://stackoverflow.com/questions/58849640/is-it-possible-to-create-a-kafka-connector-without-rest-request
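One option that avoids REST entirely is standalone mode, where connector properties files are passed on the worker's command line and the connectors start with the worker (file names here are assumptions):

```shell
# Every connector properties file listed after the worker config
# is created automatically at startup — no REST call involved.
connect-standalone.sh config/worker.properties config/my-connector.properties
```

For a distributed worker embedded in Java code, the worker's Herder can in principle be invoked directly (e.g. `Herder#putConnectorConfig`, which is what the REST layer itself calls internally), but that is internal runtime API with no stability guarantee, so it should be treated as a workaround rather than a supported interface.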

Kafka Source and Sink connector not working for large data

Submitted by 北城以北 on 2021-01-28 01:09:40
Question: I am creating a data pipeline using Kafka source and sink connectors. The source connector consumes from an SQL database and publishes into a topic, and the sink connector subscribes to the topic and writes into another SQL database. The table has 16 GB of data. Now the problem is that data is not getting transferred from one DB to the other. However, if the table is small, say 1000 rows, then the data is transferred successfully. Source connector config: "config": { "connector.class": "io.confluent.connect
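With large tables, throughput and memory behavior of the Confluent JDBC connectors can often be tuned with settings like the following sketch (the values are illustrative, not a recommendation for this specific case):

```properties
# JDBC source: fetch rows in bounded batches instead of
# pulling an entire large result set into memory at once
batch.max.rows=500
poll.interval.ms=5000

# Sink side: cap how many records each put() call receives
# (consumer.override.* requires Connect 2.3 or later)
consumer.override.max.poll.records=500
```

A related point: with `mode=bulk` the source re-reads the whole table on every poll, which does not scale to 16 GB; an incrementing or timestamp mode is usually needed for tables of that size.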

Efficient ways for whitelisting more tables in Debezium Mysql Connector

Submitted by 蓝咒 on 2021-01-27 18:50:19
Question: Are there any best practices for whitelisting a new table in the Debezium MySQL connector? We are using the Debezium MySQL connector for our CDC flows, and a use case has arisen to whitelist more tables in the connector configuration. Here are the version details of the Debezium connector being used: { "class": "io.debezium.connector.mysql.MySqlConnector", "version": "0.8.0.Final", "snapshot.mode": "schema_only" } There is a Debezium ticket https://issues.redhat.com/browse/DBZ-906
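The configuration change itself is small; the hard part is that with `snapshot.mode=schema_only` the newly whitelisted table gets no initial snapshot. A sketch, with hypothetical table names:

```properties
# Hypothetical: append the new table to the existing whitelist
table.whitelist=inventory.orders,inventory.customers,inventory.new_table

# Later Debezium versions (per DBZ-906) added an experimental option
# to snapshot newly whitelisted tables without a full re-snapshot:
snapshot.new.tables=parallel
```

On 0.8.0.Final that option is not available, so the common workarounds were either a full re-snapshot or a temporary second connector that snapshots only the new tables.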

Using Debezium with PostgreSQL 11: Couldn't obtain encoding for database test

Submitted by 非 Y 不嫁゛ on 2021-01-27 11:58:15
Question: I am using Debezium CDC to connect to PostgreSQL. I built PostgreSQL 11 with Docker, and it runs fine. But when I use Debezium in Kafka Connect, it reports: Couldn't obtain encoding for database test. The curl command is: curl -H "Accept: application/json" -H "Content-type: application/json" -X POST http://localhost:8083/connectors/ -d '{ "name": "debezium", "config": { "name": "debezium", "connector.class": "io.debezium.connector.postgresql.PostgresConnector", "tasks.max": "1", "database.hostname": "localhost",
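This error typically means the connector could not actually query the database. A complete request body usually also needs the port, credentials, and `database.dbname`; a hedged sketch with placeholder values (note that `localhost` inside a Connect container refers to the container itself, not the Docker host running PostgreSQL):

```json
{
  "name": "debezium",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "tasks.max": "1",
    "database.hostname": "postgres-host",
    "database.port": "5432",
    "database.user": "postgres",
    "database.password": "postgres",
    "database.dbname": "test",
    "database.server.name": "pgserver1"
  }
}
```

If the hostname, port, or credentials are wrong, Debezium fails on one of its first metadata queries (such as looking up the database encoding), producing exactly this message.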

Can't connect Debezium 0.9.2 to a SQL Server 2008 R2

Submitted by ℡╲_俬逩灬. on 2021-01-27 11:47:54
Question: When I try to connect Debezium to my SQL Server database after enabling the CDC feature, I get this error message: java.lang.RuntimeException: Couldn't obtain database name, at io.debezium.connector.sqlserver.SqlServerConnection.retrieveRealDatabaseName(SqlServerConnection.java:364), at io.debezium.connector.sqlserver.SqlServerConnection.<init>(SqlServerConnection.java:84), at io.debezium.connector.sqlserver.SqlServerConnectorTask.start(SqlServerConnectorTask.java:86), at io.debezium
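The failing call, `retrieveRealDatabaseName`, is one of the first queries the connector runs at startup, so the error usually indicates a connectivity, permission, or version problem rather than a CDC configuration detail. A first diagnostic step is to verify from the connector's login that CDC is really enabled on the target database (database name below is hypothetical):

```sql
-- Check whether CDC is enabled on the target database
SELECT name, is_cdc_enabled FROM sys.databases WHERE name = 'MyDatabase';
```

It is also worth noting that Debezium's SQL Server connector targets SQL Server 2016 SP1 and later (the versions where CDC is broadly available outside Enterprise edition), so SQL Server 2008 R2 may simply not be supported by Debezium 0.9.2.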
