librdkafka

Differentiating between non-existent and unauthorized topics in librdkafka

醉酒当歌 posted on 2021-02-05 11:14:14
Question: How can I tell whether a topic is unauthorized? I need this because my consumer fetches the metadata for all known topics and then calls assign. The metadata call returns neither unauthorized topics nor non-existent topics. If a topic doesn't exist I create it, and if a topic is unauthorized I have to fail, but I have no way to differentiate between a non-existent and an unauthorized topic. Answer 1: You can try listing all the topics; if the topic exists it will be there in
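One way to probe this with librdkafka itself is to request metadata for just the topic in question and inspect the per-topic error code, which distinguishes the two cases. A minimal sketch (broker address and topic name are placeholders; note that depending on broker settings such as automatic topic creation, a non-existent topic may be created rather than reported):

```c
/* Sketch: request metadata for one topic and inspect its error code. */
#include <stdio.h>
#include <librdkafka/rdkafka.h>

int main(void) {
    char errstr[512];
    rd_kafka_conf_t *conf = rd_kafka_conf_new();
    rd_kafka_conf_set(conf, "bootstrap.servers", "localhost:9092",
                      errstr, sizeof(errstr));

    rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_CONSUMER, conf,
                                  errstr, sizeof(errstr));
    if (!rk) { fprintf(stderr, "%s\n", errstr); return 1; }

    /* "my-topic" is a hypothetical topic name */
    rd_kafka_topic_t *rkt = rd_kafka_topic_new(rk, "my-topic", NULL);
    const struct rd_kafka_metadata *md;

    /* all_topics=0: only fetch metadata for the given topic */
    rd_kafka_resp_err_t err = rd_kafka_metadata(rk, 0, rkt, &md, 5000);
    if (err == RD_KAFKA_RESP_ERR_NO_ERROR) {
        for (int i = 0; i < md->topic_cnt; i++) {
            switch (md->topics[i].err) {
            case RD_KAFKA_RESP_ERR_NO_ERROR:
                printf("%s: exists and is visible\n", md->topics[i].topic);
                break;
            case RD_KAFKA_RESP_ERR_TOPIC_AUTHORIZATION_FAILED:
                printf("%s: exists but not authorized\n", md->topics[i].topic);
                break;
            case RD_KAFKA_RESP_ERR_UNKNOWN_TOPIC_OR_PART:
                printf("%s: does not exist\n", md->topics[i].topic);
                break;
            default:
                printf("%s: %s\n", md->topics[i].topic,
                       rd_kafka_err2str(md->topics[i].err));
            }
        }
        rd_kafka_metadata_destroy(md);
    }
    rd_kafka_topic_destroy(rkt);
    rd_kafka_destroy(rk);
    return 0;
}
```

This relies on the broker returning TOPIC_AUTHORIZATION_FAILED in the metadata response when the client lacks the Describe ACL; the exact behavior depends on broker version and ACL configuration.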

Unable to install npm package (kafka-streams)

☆樱花仙子☆ posted on 2021-01-29 11:22:09
Question: I am trying to use the npm package kafka-streams but get the error below: PS D:\Projects\POCs\kstreams-poc> npm install kafka-streams > node-rdkafka@2.7.1 install D:\Projects\POCs\kstreams-poc\node_modules\node-rdkafka > node-gyp rebuild D:\Projects\POCs\kstreams-poc\node_modules\node-rdkafka>if not defined npm_config_node_gyp (node "C:\Users\virtual\AppData\Roaming\npm\node_modules\npm\node_modules\npm-lifecycle\node-gyp-bin\\..\..\node_modules\node-gyp\bin\node-gyp.js" rebuild ) else (node "C:

How to set the max size of a Kafka message using librdkafka

有些话、适合烂在心里 posted on 2021-01-28 05:08:21
Question: I'm trying to use Kafka to send a message of ~10 MB. I know the default limit is 1 MB, but is that a hard limit? Can librdkafka support >10 MB, and how do I set it? Answer 1: You need to configure the topic's max.message.bytes (see https://kafka.apache.org/22/documentation.html#topicconfigs) and configure the producer with message.max.bytes (see https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md). Please allow some extra space (at least ~500 bytes) for protocol overhead. Source: https:/
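Putting the answer together, a hypothetical configuration for ~10 MB payloads might look like this (the value 11000000 is an assumption chosen to leave headroom for the ~500 bytes of protocol overhead mentioned above):

```properties
# Topic-level setting (applied on the broker side): raise the per-topic limit
max.message.bytes=11000000

# librdkafka producer configuration: raise the client-side limit to match
message.max.bytes=11000000
```

Consumers may also need their fetch limits raised (librdkafka exposes fetch.message.max.bytes and receive.message.max.bytes for this), otherwise large messages can fail on the consuming side even when produced successfully.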

Nginx_Kafka_Module

99封情书 posted on 2021-01-11 07:52:42
1. Install git: yum install -y git
2. Change to /usr/local/src and clone the Kafka C client source: cd /usr/local/src && git clone https://github.com/edenhill/librdkafka
3. Enter librdkafka and build it: cd librdkafka && yum install -y gcc gcc-c++ pcre-devel zlib-devel && ./configure && make && make install
4. Install the plugin that integrates nginx with Kafka; in /usr/local/src, clone the module source: cd /usr/local/src && git clone https://github.com/brg-liuwei/ngx_kafka_module
5. Enter the nginx source directory (build nginx with the plugin compiled in): cd /usr/local/src/nginx-1.12.2 && ./configure --add-module=/usr/local/src/ngx_kafka_module/ && make && make install
6. Edit the nginx configuration file; for details, see the nginx.conf in the current directory. 7
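The nginx.conf referenced in step 6 is not reproduced here; a minimal sketch based on the ngx_kafka_module README might look like the following (broker address, listen port, and topic name are all assumptions):

```nginx
http {
    # ngx_kafka_module directives
    kafka;
    kafka_broker_list 127.0.0.1:9092;

    server {
        listen 80;

        # Request bodies POSTed to this location are produced
        # to the Kafka topic "track"
        location = /kafka/track {
            kafka_topic track;
        }
    }
}
```

After `make install`, reload nginx and POST a test payload (e.g. with curl) to verify messages arrive on the topic.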

Kafka Quick Start (12): The Python Client

丶灬走出姿态 posted on 2020-10-05 06:18:49
Kafka Quick Start (12): The Python Client. I. confluent-kafka. 1. Overview: confluent-kafka is a Python module, a lightweight wrapper around librdkafka, supporting Kafka 0.8 and later. This article is based on confluent-kafka 1.3.0. GitHub: https://github.com/confluentinc/confluent-kafka-python 2. Features: (1) Reliability: confluent-kafka wraps librdkafka, which is widely deployed in production; it is tested with the same test suite as the Java client and is supported by Confluent. (2) Performance: performance was a key design consideration; for larger messages, maximum throughput is on par with the Java client (the overhead of the Python interpreter matters less), and latency is comparable to the Java client. (3) Future support: Confluent was founded by the creators of Kafka and is dedicated to building a stream-processing platform with Apache Kafka at its core; keeping in sync with core Apache Kafka and the Confluent platform components is a top priority. 3. Installation: create the Confluent yum repository. In /etc/yum.repos.d, create a confluent.repo file: [Confluent.dist] name

Installing the Kafka PHP extension rdkafka 4.0.3 on CentOS

喜欢而已 posted on 2020-08-13 17:48:11
CentOS version: # cat /etc/redhat-release → Red Hat Enterprise Linux Server release 7.6 (Maipo). PHP version: # php -v → PHP 7.2.19 (cli) (built: Jun 4 2019 17:46:23) ( NTS ) Copyright (c) 1997-2018 The PHP Group, Zend Engine v3.2.0, Copyright (c) 1998-2018 Zend Technologies. Official Kafka PHP client list: https://cwiki.apache.org/confluence/display/KAFKA/Clients#Clients-PHP 1. librdkafka, the library all third-party Kafka clients depend on ("Kafka client based on librdkafka"). 1.1 Download the librdkafka source: # wget https://github.com/edenhill/librdkafka/archive/v1.5.0.tar.gz 1.2 Extract it into the current directory: # tar -zxf v1.5.0.tar.gz 1.3 Enter the source directory: $ cd librdkafka-1.5.0/ 1.4 Run the configure step: $

Kafka Quick Start (9): The C Client

删除回忆录丶 posted on 2020-08-10 12:48:28
Kafka Quick Start (9): The C Client. I. Introduction to librdkafka. 1. Overview: librdkafka is a high-performance Apache Kafka client implemented in C, with a C++ interface as well. Designed for modern hardware, it keeps memory copying to a minimum and lets the user choose between high throughput and low latency; it can currently produce over 1 million messages per second and consume over 3 million messages per second. GitHub: https://github.com/edenhill/librdkafka 2. Installation: yum install librdkafka-devel. To build from source on Linux: git clone https://github.com/edenhill/librdkafka.git && cd librdkafka && ./configure && make -j && sudo make install. After the build, the C static library, shared library, and headers are under the src directory, and the C++ equivalents under src-cpp. After installation, librdkafka's headers live in /usr/local/include/librdkafka and its libraries in /usr/local/lib; run ldconfig to make the library take effect. The librdkafka C API is defined in rdkafka.h. librdkafka C++
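To illustrate the C API defined in rdkafka.h, here is a minimal producer sketch (the broker address and the topic name "test" are placeholders; link with -lrdkafka):

```c
/* Minimal librdkafka producer sketch. */
#include <stdio.h>
#include <string.h>
#include <librdkafka/rdkafka.h>

int main(void) {
    char errstr[512];
    rd_kafka_conf_t *conf = rd_kafka_conf_new();
    if (rd_kafka_conf_set(conf, "bootstrap.servers", "localhost:9092",
                          errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK) {
        fprintf(stderr, "%s\n", errstr);
        return 1;
    }

    /* rd_kafka_new takes ownership of conf on success */
    rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf,
                                  errstr, sizeof(errstr));
    if (!rk) { fprintf(stderr, "%s\n", errstr); return 1; }

    const char *payload = "hello librdkafka";
    rd_kafka_resp_err_t err = rd_kafka_producev(
            rk,
            RD_KAFKA_V_TOPIC("test"),
            RD_KAFKA_V_MSGFLAGS(RD_KAFKA_MSG_F_COPY),
            RD_KAFKA_V_VALUE((void *)payload, strlen(payload)),
            RD_KAFKA_V_END);
    if (err)
        fprintf(stderr, "produce failed: %s\n", rd_kafka_err2str(err));

    /* Wait up to 10 s for outstanding messages to be delivered */
    rd_kafka_flush(rk, 10 * 1000);
    rd_kafka_destroy(rk);
    return 0;
}
```

A real producer would also register a delivery-report callback (rd_kafka_conf_set_dr_msg_cb) and call rd_kafka_poll() regularly to learn whether each message actually reached the broker.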