apache-kafka-connect

Kafka Connect SMT ApplyWithSchema requires struct error

一个人想着一个人 submitted on 2020-08-10 19:47:15
Question: I have deployed a sample from Confluent, https://github.com/confluentinc/kafka-connect-insert-uuid, for adding a simple UUID field, but I am getting an error that it requires a Struct. I am applying this within the Debezium MySQLConnector:

Only Struct objects supported for [adding UUID to record], found: java.lang.String
    at org.apache.kafka.connect.transforms.util.Requirements.requireStruct(Requirements.java:52)

What is a minimalist applyWithSchema method that just returns the record as is? I am
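If the goal is literally a transform that passes every record through untouched, the sketch below shows one way to do it against the public Kafka Connect Transformation API; the class name PassThrough is hypothetical. Note that this error usually means the SMT is seeing a plain String (for example, a record key) rather than a Struct value, so checking which side of the record the transform is configured against may be the simpler fix.

import java.util.Map;
import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.connect.connector.ConnectRecord;
import org.apache.kafka.connect.transforms.Transformation;

// A minimal no-op SMT: every record, schemaless or schema-backed, is
// returned exactly as received, so requireStruct(...) is never called.
public class PassThrough<R extends ConnectRecord<R>> implements Transformation<R> {

    @Override
    public R apply(R record) {
        return record; // leave schema, key, and value untouched
    }

    @Override
    public ConfigDef config() {
        return new ConfigDef(); // no configuration options
    }

    @Override
    public void configure(Map<String, ?> configs) {
        // nothing to configure
    }

    @Override
    public void close() {
        // no resources to release
    }
}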

How to include Type MetaData for Deserialization in Spring Kafka

爱⌒轻易说出口 submitted on 2020-08-10 19:13:28
Question: I am doing deserialization at the listener in Spring Kafka, but this assumes that the type information was included or sent by a Spring Kafka producer. In my case the JSON is being sent across by the Debezium MySQLConnector, and it does not add this metadata, so I would like to add it to the records. I understand it is placed in the record headers somewhere in the JsonSerializer, and I looked at the source code but could not figure out exactly how to use this to add the type metadata during
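Since the records come from Debezium rather than a Spring producer, one common alternative to injecting type headers is to tell the consumer-side JsonDeserializer which type to fall back to when no header is present. A minimal sketch, assuming a hypothetical payload class MyEvent and a broker at localhost:9092:

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.JsonDeserializer;

public class DebeziumConsumerConfig {

    // Hypothetical payload type, standing in for whatever the Debezium JSON maps to.
    public static class MyEvent { }

    public DefaultKafkaConsumerFactory<String, MyEvent> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumption
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "debezium-listener");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
        // Debezium does not send Spring's __TypeId__ header, so name the target type directly...
        props.put(JsonDeserializer.VALUE_DEFAULT_TYPE, MyEvent.class.getName());
        // ...and tell the deserializer not to look for type headers at all.
        props.put(JsonDeserializer.USE_TYPE_INFO_HEADERS, false);
        props.put(JsonDeserializer.TRUSTED_PACKAGES, "*");
        return new DefaultKafkaConsumerFactory<>(props);
    }
}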

Using a connector with Helm-installed Kafka/Confluent

你说的曾经没有我的故事 submitted on 2020-08-02 09:44:28
Question: I have installed Kafka on a local Minikube using the Helm charts at https://github.com/confluentinc/cp-helm-charts, following the instructions at https://docs.confluent.io/current/installation/installing_cp/cp-helm-charts/docs/index.html, like so:

helm install -f kafka_config.yaml confluentinc/cp-helm-charts --name kafka-home-delivery --namespace cust360

The kafka_config.yaml is almost identical to the default YAML, with the one exception being that I scaled it down to 1 server/broker instead of
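One way to register a connector against a cp-helm-charts deployment is through the Kafka Connect REST API. A sketch, assuming the chart's usual <release>-cp-kafka-connect service naming and the default REST port 8083; the service name below is inferred from the release name and may differ in your cluster, and the connector JSON is purely illustrative:

kubectl -n cust360 port-forward svc/kafka-home-delivery-cp-kafka-connect 8083:8083 &

curl -X POST -H "Content-Type: application/json" \
     --data '{"name": "my-connector", "config": {"connector.class": "...", "tasks.max": "1"}}' \
     http://localhost:8083/connectors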

Kafka connector to hdfs: java.io.FileNotFoundException: File does not exist

爷,独闯天下 submitted on 2020-07-28 04:57:45
Question: Everything was installed via Ambari (HDP). I've ingested a sample file into Kafka; the topic is testjson, and the data was ingested from a CSV file via Filebeat. The topics were successfully ingested into Kafka:

/bin/kafka-topics.sh --list --zookeeper localhost:2181

result:

test
test060920
test1
test12
testjson

From Kafka I would like to ingest testjson into HDFS. quickstart-hdfs.properties:

name=hdfs-sink
connector.class=io.confluent.connect.hdfs3.Hdfs3SinkConnector
tasks.max=1
topics=testjson
hdfs.url=hdfs://x.x.x.x
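For comparison, a sketch of a complete quickstart-style HDFS 3 sink config: flush.size is a required property for this connector, and hdfs.url generally needs the NameNode RPC port. The port 8020 below is an assumption (a common HDP default), and x.x.x.x stands in for the cluster's actual NameNode host:

name=hdfs-sink
connector.class=io.confluent.connect.hdfs3.Hdfs3SinkConnector
tasks.max=1
topics=testjson
# include the NameNode RPC port (8020 assumed here; adjust to your cluster)
hdfs.url=hdfs://x.x.x.x:8020
# required: number of records written to a file before it is committed to HDFS
flush.size=3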