Using Kafka to produce data for ClickHouse

Submitted by 徘徊边缘 on 2020-07-22 12:48:30

Question


I want to use the Kafka integration for ClickHouse. I followed the official tutorial here! All tables have been created. I ran the Kafka server, then started a Kafka producer and typed a JSON object into the command prompt, one per database row, like this:

{"timestamp":1554138000,"level":"first","message":"abc"}

I checked the Kafka consumer and it received the object. But when I checked the tables in my ClickHouse database, they were empty. Any ideas what I did wrong?
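For context, the pipeline in the official tutorial has three parts: a Kafka engine table that only reads from the topic, a regular MergeTree table that actually stores rows, and a materialized view that moves data from the former into the latter; without the view, nothing is persisted (a direct SELECT on the Kafka table consumes the messages once and discards them). Below is a minimal sketch of that pipeline. The column names follow the message shown above; the table names daily and consumer and the SummingMergeTree aggregation are taken from the tutorial example and may not match your setup.

-- Kafka engine table: reads messages from the topic, does not store them.
CREATE TABLE queue (
  timestamp UInt64,
  level String,
  message String
)
ENGINE = Kafka SETTINGS
  kafka_broker_list = 'localhost:9092',
  kafka_topic_list = 'topic',
  kafka_group_name = 'group1',
  kafka_format = 'JSONEachRow';

-- Storage table: rows must land somewhere persistent.
CREATE TABLE daily (
  day Date,
  level String,
  total UInt64
)
ENGINE = SummingMergeTree()
ORDER BY (day, level);

-- Materialized view: continuously moves rows from the Kafka table into storage.
CREATE MATERIALIZED VIEW consumer TO daily AS
  SELECT toDate(toDateTime(timestamp)) AS day, level, count() AS total
  FROM queue
  GROUP BY day, level;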


Answer 1:


UPDATE

To ignore malformed messages, pass the kafka_skip_broken_messages parameter in the table definition (its value is the number of schema-incompatible messages per block that the parser will tolerate).


This looks like a well-known issue in one of the recent versions of ClickHouse; try adding the extra parameter kafka_row_delimiter to the engine configuration:

CREATE TABLE queue (
  timestamp UInt64,
  level String,
  message String
)
ENGINE = Kafka SETTINGS
  kafka_broker_list = 'localhost:9092',
  kafka_topic_list = 'topic',
  kafka_group_name = 'group1',
  kafka_format = 'JSONEachRow',
  kafka_row_delimiter = '\n',
  kafka_skip_broken_messages = 1;
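Once a materialized view is in place and feeding a storage table (as in the tutorial pipeline sketched above), you can confirm that messages are flowing by querying the target table; daily is the assumed tutorial name, so substitute your own storage table:

SELECT day, level, total
FROM daily
ORDER BY day DESC
LIMIT 10;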



Answer 2:


Sorry, this was my mistake. Before starting ClickHouse and Kafka, I had tested sending plain (non-JSON) messages to the topic, and ClickHouse then tried to parse them. I simply created a new topic and now everything works. Thank you!



Source: https://stackoverflow.com/questions/55457726/using-kafka-to-produce-data-for-clickhouse
