Question
I want to use the Kafka integration for ClickHouse. I tried to follow the official tutorial (here!). All the tables have been created. I started the Kafka server, then ran a Kafka producer and typed a JSON object at the command prompt, one per database row, like this:
{"timestamp":1554138000,"level":"first","message":"abc"}
I checked with a Kafka consumer and it received the object. But when I checked the tables in my ClickHouse database, they were empty. Any ideas what I did wrong?
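For context, the tutorial's setup involves three objects: the Kafka engine table (queue, shown in the answer below), a regular MergeTree table that actually stores the data, and a materialized view that moves rows from the Kafka table into it. Here is a minimal sketch of the last two, assuming the daily and consumer names from the tutorial and a plain passthrough SELECT instead of the tutorial's aggregation:

CREATE TABLE daily (
    timestamp UInt64,
    level String,
    message String
) ENGINE = MergeTree()
ORDER BY timestamp;

-- Copies every row consumed from the Kafka table into daily
CREATE MATERIALIZED VIEW consumer TO daily
AS SELECT timestamp, level, message FROM queue;

Without the materialized view, selecting from the Kafka engine table directly consumes the messages, so data can easily look like it has disappeared.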
Answer 1:
UPDATE
To ignore malformed messages, pass the kafka_skip_broken_messages parameter in the table definition.
It looks like a well-known issue that appeared in one of the recent versions of ClickHouse. Try adding the extra parameter kafka_row_delimiter to the engine configuration:
CREATE TABLE queue (
    timestamp UInt64,
    level String,
    message String
)
ENGINE = Kafka SETTINGS
    kafka_broker_list = 'localhost:9092',
    kafka_topic_list = 'topic',
    kafka_group_name = 'group1',
    kafka_format = 'JSONEachRow',
    kafka_row_delimiter = '\n',
    kafka_skip_broken_messages = 1;
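Once the queue table, the target table, and the materialized view are all in place, the quickest check is to query the target table rather than the Kafka table itself (selecting from the Kafka engine table consumes messages). A sketch, assuming the daily table from the setup above:

SELECT count() FROM daily;

SELECT * FROM daily ORDER BY timestamp DESC LIMIT 10;

New rows should show up within a few seconds of the producer sending well-formed JSON.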
Answer 2:
So sorry, this was my mistake. Before starting ClickHouse and Kafka, I had tested the topic by sending plain (non-JSON) messages into it with Kafka, and ClickHouse then tried to parse them. I just created a new topic and now everything works. Thank you!
Source: https://stackoverflow.com/questions/55457726/using-kafka-to-produce-data-for-clickhouse