confluent-schema-registry

How to infer an Avro schema from a Kafka topic in Apache Beam KafkaIO

柔情痞子 submitted on 2020-07-03 12:59:10
Question: I'm using Apache Beam's KafkaIO to read from a topic that has an Avro schema in the Confluent Schema Registry. I'm able to deserialize the messages and write them to files, but ultimately I want to write to BigQuery. My pipeline isn't able to infer the schema. How do I extract/infer the schema and attach it to the data in the pipeline so that my downstream processes (writing to BigQuery) can infer the schema? Here is the code where I use the schema registry URL to set the deserializer and where I read
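
A minimal sketch of one way this is commonly wired up (assuming Beam 2.22+ with the Confluent Avro deserializer on the classpath; the broker address, topic, subject, and registry URL below are placeholders, not the asker's values): let KafkaIO resolve the writer schema from the registry via ConfluentSchemaRegistryDeserializerProvider so the pipeline yields GenericRecord values, then map each record to a TableRow for BigQueryIO.

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.Values;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptor;
    import org.apache.kafka.common.serialization.LongDeserializer;

    // Sketch only: read Avro values whose schema lives in the Confluent Schema Registry.
    String registryUrl = "http://my-registry:8081";       // placeholder
    String subject = "my-topic-value";                    // placeholder value subject

    PCollection<GenericRecord> records = pipeline
        .apply(KafkaIO.<Long, GenericRecord>read()
            .withBootstrapServers("broker:9092")          // placeholder
            .withTopic("my-topic")                        // placeholder
            .withKeyDeserializer(LongDeserializer.class)
            .withValueDeserializer(
                ConfluentSchemaRegistryDeserializerProvider.of(registryUrl, subject))
            .withoutMetadata())
        .apply(Values.create());

    // Convert each GenericRecord to a TableRow so BigQueryIO.writeTableRows() can be used;
    // the field mapping here is illustrative, not the asker's actual schema.
    PCollection<TableRow> rows = records.apply(
        MapElements.into(TypeDescriptor.of(TableRow.class))
            .via(record -> new TableRow().set("id", String.valueOf(record.get("id")))));

With the deserializer provider, the Avro schema is fetched from the registry when the pipeline is constructed, so downstream transforms see typed GenericRecords rather than raw bytes.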

How to pass Basic Authentication to the Confluent Schema Registry?

巧了我就是萌 submitted on 2020-06-26 07:03:50
Question: I want to read data from a Confluent Cloud topic and then write to another topic. On localhost I haven't had any major problems, but the Confluent Cloud Schema Registry requires some authentication settings that I don't know how to pass: basic.auth.credentials.source=USER_INFO schema.registry.basic.auth.user.info=: schema.registry.url=https://xxxxxxxxxx.confluent.cloud Below is the current code: import com.databricks.spark.avro.SchemaConverters import io.confluent
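
Those are the documented Confluent Cloud settings; the part that usually causes trouble is where they have to be passed. A sketch of handing them to the Avro deserializer / registry client from Java (the API key and secret are placeholders; the same map can be merged into a Kafka consumer's configuration):

    import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
    import io.confluent.kafka.serializers.KafkaAvroDeserializer;
    import java.util.HashMap;
    import java.util.Map;

    Map<String, Object> registryConfig = new HashMap<>();
    registryConfig.put("schema.registry.url", "https://xxxxxxxxxx.confluent.cloud");
    registryConfig.put("basic.auth.credentials.source", "USER_INFO");
    // <SR_API_KEY>:<SR_API_SECRET> is a placeholder for the Schema Registry credentials.
    registryConfig.put("schema.registry.basic.auth.user.info", "<SR_API_KEY>:<SR_API_SECRET>");

    // Option A: configure the Confluent Avro deserializer directly with the same map.
    KafkaAvroDeserializer valueDeserializer = new KafkaAvroDeserializer();
    valueDeserializer.configure(registryConfig, false);   // false = value deserializer, not key

    // Option B: build the registry client yourself and pass the map as the client config.
    CachedSchemaRegistryClient registryClient = new CachedSchemaRegistryClient(
        "https://xxxxxxxxxx.confluent.cloud", 100, registryConfig);

If the Scala code builds its own CachedSchemaRegistryClient or KafkaAvroDeserializer, this map is what those constructors and configure() calls should receive.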

Is it possible to deserialize an Avro message (consumed from Kafka) without giving a reader schema in ConfluentRegistryAvroDeserializationSchema?

心已入冬 submitted on 2020-04-16 05:40:12
Question: I am using the Kafka connector in Apache Flink to access streams served by Confluent Kafka. Apart from the schema registry URL, ConfluentRegistryAvroDeserializationSchema.forGeneric(...) expects a 'reader' schema. Instead of providing a reader schema, I want to use the writer's schema (looked up in the registry) for reading the message as well, because the consumer will not have the latest schema. FlinkKafkaConsumer010<GenericRecord> myConsumer = new FlinkKafkaConsumer010<>("topic-name",
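
forGeneric(...) does require a reader schema, but one workaround (a sketch; the subject name, registry URL, and consumer properties are placeholders, and client method names vary slightly across versions) is to look up the latest schema registered for the topic's value subject at job startup and pass that as the reader schema:

    import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
    import io.confluent.kafka.schemaregistry.client.SchemaMetadata;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;

    String registryUrl = "http://my-registry:8081";               // placeholder

    // Fetch the latest writer schema registered under the value subject and reuse it
    // as the reader schema, so nothing is hard-coded in the job.
    // (getLatestSchemaMetadata throws IOException/RestClientException; handle them in real code.)
    CachedSchemaRegistryClient registryClient = new CachedSchemaRegistryClient(registryUrl, 100);
    SchemaMetadata latest = registryClient.getLatestSchemaMetadata("topic-name-value");
    Schema readerSchema = new Schema.Parser().parse(latest.getSchema());

    FlinkKafkaConsumer010<GenericRecord> myConsumer = new FlinkKafkaConsumer010<>(
        "topic-name",
        ConfluentRegistryAvroDeserializationSchema.forGeneric(readerSchema, registryUrl),
        kafkaProperties);                                          // placeholder Properties

The deserializer still resolves each message's actual writer schema from the registry via the id embedded in the record, so older messages remain readable; the lookup above only decides the shape of the GenericRecords the job sees.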

kafka.common.KafkaException: Failed to parse the broker info from zookeeper (sinking from EC2 to Elasticsearch)

我的梦境 submitted on 2020-01-25 10:13:55
Question: I have AWS MSK set up and I am trying to sink records from MSK to Elasticsearch. I am able to push data into MSK in JSON format and want to sink it to Elasticsearch. I believe all of the setup is correct. This is what I have done on the EC2 instance: wget /usr/local http://packages.confluent.io/archive/3.1/confluent-oss-3.1.2-2.11.tar.gz -P ~/Downloads/ tar -zxvf ~/Downloads/confluent-oss-3.1.2-2.11.tar.gz -C ~/Downloads/ sudo mv ~/Downloads/confluent-3.1.2 /usr/local/confluent /usr/local

Kafka - error when producing from command line (character ('<' (code 60)): expected a valid value)

你。 submitted on 2020-01-23 17:01:31
Question: I spun up Kafka in Docker on my laptop (with docker-compose). After that, I created a new Kafka topic with: kafka-topics --zookeeper localhost:2181 --create --topic simple --replication-factor 1 --partitions 1 (I have not created a schema in the Schema Registry yet). Now I am trying to produce (based on this example, step 3: https://docs.confluent.io/4.0.0/quickstart.html): kafka-avro-console-producer \ --broker-list localhost:9092 --topic simple \ --property value.schema='{"type":"record","name":"myrecord"
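
That Jackson complaint ('<' where a value was expected) generally means some JSON parser was handed text that is not JSON, for example an HTML error page or a mangled value.schema string, rather than anything being wrong with the topic itself. For comparison, a sketch of the same produce step done from Java with KafkaAvroSerializer (broker and registry addresses are placeholders; the record uses the quickstart-style single string field f1):

    import io.confluent.kafka.serializers.KafkaAvroSerializer;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import java.util.Properties;

    Schema schema = new Schema.Parser().parse(
        "{\"type\":\"record\",\"name\":\"myrecord\","
        + "\"fields\":[{\"name\":\"f1\",\"type\":\"string\"}]}");

    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");          // placeholder broker
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", KafkaAvroSerializer.class.getName());
    props.put("schema.registry.url", "http://localhost:8081"); // placeholder registry URL

    // The serializer registers the schema under the subject "simple-value" on first send,
    // so the schema does not have to exist in the registry beforehand.
    GenericRecord record = new GenericData.Record(schema);
    record.put("f1", "value1");

    try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
        producer.send(new ProducerRecord<>("simple", record));
    }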

Backward compatibility issue and uncertainty in Schema Registry

我只是一个虾纸丫 submitted on 2020-01-11 11:26:13
Question: I have a use case where I have a JSON document and I want to generate a schema and a record from it and publish the record. I have configured the value serializer and the schema compatibility setting is BACKWARD. First JSON: String json = "{\n" + " \"id\": 1,\n" + " \"name\": \"Headphones\",\n" + " \"price\": 1250.0,\n" + " \"tags\": [\"home\", \"green\"]\n" + "}\n"; Version 1 of the schema was registered and the message was received in the Avro console consumer. Second JSON: String json = "{\n" + " \"id\": 1,\n" + " \"price\":
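
Assuming version 2 simply drops fields that existed in version 1, that is allowed under BACKWARD compatibility (deleting fields is fine; adding a field without a default is what gets rejected). A sketch of checking this against the registry before publishing (the subject name, URL, record name, and the reconstructed version-2 schema are placeholders, since the second JSON is truncated; the exact client method signature varies between versions):

    import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
    import org.apache.avro.Schema;

    CachedSchemaRegistryClient registryClient =
        new CachedSchemaRegistryClient("http://localhost:8081", 100);

    // Hypothetical version 2: only "id" and "price" remain from the original record.
    Schema v2 = new Schema.Parser().parse(
        "{\"type\":\"record\",\"name\":\"Product\",\"fields\":["
        + "{\"name\":\"id\",\"type\":\"int\"},"
        + "{\"name\":\"price\",\"type\":\"double\"}]}");

    // Ask the registry to run the configured (BACKWARD) compatibility check against the
    // latest registered version of the subject before anything is published.
    // (testCompatibility throws IOException/RestClientException; handle them in real code.)
    boolean compatible = registryClient.testCompatibility("product-topic-value", v2);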

Registering an Avro schema with the Confluent Schema Registry

元气小坏坏 submitted on 2019-12-25 04:34:20
Question: Can Avro schemas be registered with the Confluent Schema Registry service? As per the README on GitHub (https://github.com/confluentinc/schema-registry), every example uses a JSON schema with a single field and type, without any name. I am trying to store the following schema in the registry, but with different variants I get different errors. curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{"type": "record","name": "myrecord","fields": [{"name": "serialization",
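
Full record schemas can be registered. With curl the whole Avro schema has to be sent as one escaped JSON string under the "schema" key, so every inner double quote needs a backslash; the unescaped quotes above are usually where the varying errors come from. One way to avoid the escaping entirely (a sketch; the subject name, registry URL, and the field's type are placeholders since the original command is truncated) is to register through the Java client:

    import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
    import org.apache.avro.Schema;

    CachedSchemaRegistryClient registryClient =
        new CachedSchemaRegistryClient("http://localhost:8081", 100);

    // Parsed as ordinary Avro JSON here; the client handles wrapping and escaping it
    // for the registry's REST API.
    Schema schema = new Schema.Parser().parse(
        "{\"type\":\"record\",\"name\":\"myrecord\",\"fields\":["
        + "{\"name\":\"serialization\",\"type\":\"string\"}]}");   // field type is a placeholder

    // Registers the schema under a subject (placeholder name) and returns its id.
    // (register throws IOException/RestClientException; handle them in real code.)
    int id = registryClient.register("myrecord-value", schema);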