protocol-buffers

Is it possible to enable gRPC message compression in the server? (Python)

[亡魂溺海] Submitted on 2020-01-06 06:57:48
Question: I have a gRPC client (in Java) sending requests to a server (written in Python), and I need to enable both request compression and response compression. There is good documentation on enabling compression on the client side, and I have managed to compress the request like so: response = blockingStub.withCompression("gzip").method(request). However, I cannot find any documentation on how to also compress the server response. It seems that there is almost no documentation (or examples) on how to…
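A minimal sketch of server-side compression in Python, assuming grpcio 1.23 or later (the release that added the `compression` keyword); the `Greeter` servicer and `helloworld_pb2` names are placeholders borrowed from the standard gRPC examples, not from the question:

```python
from concurrent import futures
import grpc

# Server-wide default: every response is gzip-compressed unless a
# handler overrides it for its own RPC.
server = grpc.server(
    futures.ThreadPoolExecutor(max_workers=10),
    compression=grpc.Compression.Gzip,
)

# Per-RPC override from inside a servicer method (hypothetical service):
class Greeter(helloworld_pb2_grpc.GreeterServicer):
    def SayHello(self, request, context):
        # Compress only this response, regardless of the server default.
        context.set_compression(grpc.Compression.Gzip)
        return helloworld_pb2.HelloReply(message="Hi")
```

The Java client's `withCompression("gzip")` and the Python server's `grpc.Compression.Gzip` should interoperate, since both sides negotiate the standard gzip message encoding.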

Python protobuf can't deserialize message

|▌冷眼眸甩不掉的悲伤 Submitted on 2020-01-06 06:09:23
Question: Getting started with protobuf in Python, I face a strange issue. A simple message proto definition: syntax = "proto3"; package test; message Message { string message = 1; string sender = 2; } It is generated via protoc -I . --python_out=generated message.proto and accessed in Python like: from generated.message_pb2 import Message. I can then construct a message: m = Message(); m.sender = 'foo'; m.message = 'bar'; print(str(m)), but deserializing will not return a result: s_m = m.SerializeToString()…
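The snippet cuts off right after SerializeToString, but a common stumbling block at exactly this point is that ParseFromString parses into the receiving message in place and returns the number of bytes consumed (None in older protobuf releases), not a new message object. A sketch of the round trip, assuming the generated module from the question is importable:

```python
from generated.message_pb2 import Message  # generated as in the question

m = Message()
m.sender = "foo"
m.message = "bar"

data = m.SerializeToString()

# ParseFromString fills the message it is called on; its return value
# is a byte count (or None), NOT a deserialized message. Assigning the
# return value and expecting a Message is the usual mistake here.
m2 = Message()
m2.ParseFromString(data)

assert m2.sender == "foo" and m2.message == "bar"
```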

Apache Beam with Flink backend throws NoSuchMethodError on calls to protobuf-java library methods

纵饮孤独 Submitted on 2020-01-06 03:41:07
Question: I'm trying to run a simple pipeline on a local cluster, using Protocol Buffers to pass data between Beam functions. com.google.protobuf:protobuf-java is included in the fat JAR. Everything works fine if I run it with: java -jar target/dataflow-test-1.0-SNAPSHOT.jar --runner=org.apache.beam.runners.flink.FlinkRunner --input=/tmp/kinglear.txt --output=/tmp/wordcounts.txt But it fails when run on the Flink cluster: flink run target/dataflow-test-1.0-SNAPSHOT.jar --runner=org.apache…
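A NoSuchMethodError on protobuf-java only when submitting through `flink run` usually means the Flink cluster's own classpath ships a different protobuf-java version that wins over the copy in the fat JAR. One common workaround is to relocate protobuf inside the fat JAR so the two copies cannot collide; a sketch of the maven-shade-plugin configuration (the plugin version and shaded package name are assumptions):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <!-- Rewrite com.google.protobuf references in the fat JAR so
               Flink's bundled protobuf-java can no longer shadow them. -->
          <relocation>
            <pattern>com.google.protobuf</pattern>
            <shadedPattern>shaded.com.google.protobuf</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```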

Protobuf-net memcache provider fails on append/set ArraySegment

北战南征 Submitted on 2020-01-05 10:32:08
Question: I'm using the protobuf-net library with the protobuf-net memcache provider, and I'm trying to use the memcache append function: var data = new ArraySegment<byte>(Encoding.UTF8.GetBytes("appendedString")); var result = _memcache.ExecuteStore(StoreMode.Add, key, data); It throws an exception: The runtime has encountered a fatal error. The address of the error was at 0x63765a43, on thread 0xd58. The error code is 0xc0000005. This error may be a bug in the CLR or in the unsafe or non-verifiable portions of user…

conversion BigDecimal Java to c#-like Decimal

北战南征 Submitted on 2020-01-05 09:14:07
Question: In Java, the BigDecimal class holds values as A * pow(10, B), where A is a two's-complement integer of non-fixed bit length and B is a 32-bit integer. In C#, Decimal holds values as pow(-1, s) * c * pow(10, -e), where the sign s is 0 or 1, the coefficient c satisfies 0 ≤ c < pow(2, 96), and the scale e satisfies 0 ≤ e ≤ 28. I want to convert a Java BigDecimal to something like a C# Decimal, in Java. Can you help me? I have something like this: class CS_likeDecimal { private int hi; // for the 32 most…
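The conversion amounts to extracting the (sign, coefficient, scale) triple, folding any positive exponent into the coefficient (since the C# scale must lie in 0..28), and range-checking against 2^96. A sketch of that arithmetic in Python, using the stdlib decimal module as a stand-in for BigDecimal; the function names are my own:

```python
from decimal import Decimal

def to_cs_like_decimal(value: Decimal):
    """Decompose a decimal into the (sign, coefficient, scale) triple of
    C#'s Decimal: (-1)**sign * coeff * 10**(-scale),
    with 0 <= coeff < 2**96 and 0 <= scale <= 28."""
    sign, digits, exponent = value.as_tuple()
    coeff = int("".join(map(str, digits)))
    scale = -exponent
    # C# Decimal has no positive exponent: fold it into the coefficient.
    if scale < 0:
        coeff *= 10 ** (-scale)
        scale = 0
    if scale > 28 or coeff >= 2 ** 96:
        raise OverflowError("value does not fit in a C#-like Decimal")
    return sign, coeff, scale

def to_words(coeff: int):
    """Split the 96-bit coefficient into the hi/mid/lo 32-bit words
    that fields like the asker's CS_likeDecimal.hi would hold."""
    lo = coeff & 0xFFFFFFFF
    mid = (coeff >> 32) & 0xFFFFFFFF
    hi = (coeff >> 64) & 0xFFFFFFFF
    return hi, mid, lo

# Example: -12.34 decomposes to sign=1, coefficient=1234, scale=2.
print(to_cs_like_decimal(Decimal("-12.34")))
```

In Java the same three pieces come from BigDecimal.signum(), BigDecimal.unscaledValue(), and BigDecimal.scale(), with the identical fold-and-range-check logic.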

maven-antrun-plugin: generate sources for protobuf does not generate Java files

允我心安 Submitted on 2020-01-05 05:31:21
Question: I have the following plugins in my pom.xml, which are supposed to generate the Java files required for compiling other projects: <plugin> <artifactId>maven-compiler-plugin</artifactId> <version>3.2</version> <configuration> <source>${jdk.version}</source> <target>${jdk.version}</target> </configuration> </plugin> <plugin> <groupId>org.apache.felix</groupId> <artifactId>maven-bundle-plugin</artifactId> <extensions>true</extensions> <configuration> <archive> <manifestFile>META-INF/MANIFEST…
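The quoted pom fragment shows only the compiler and bundle plugins, with no antrun execution at all; for comparison, a minimal sketch of what a maven-antrun-plugin protoc invocation typically looks like (the plugin version, paths, and proto file name are assumptions, not taken from the question). Note that the generated directory must also be registered as a source root, for example with build-helper-maven-plugin, or Maven will never compile the generated Java files:

```xml
<plugin>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.8</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals><goal>run</goal></goals>
      <configuration>
        <target>
          <mkdir dir="${project.build.directory}/generated-sources/protobuf"/>
          <!-- Invoke protoc; failonerror surfaces generation failures
               instead of silently producing no Java files. -->
          <exec executable="protoc" failonerror="true">
            <arg value="-I${project.basedir}/src/main/proto"/>
            <arg value="--java_out=${project.build.directory}/generated-sources/protobuf"/>
            <arg value="${project.basedir}/src/main/proto/message.proto"/>
          </exec>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>
```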

Deserialize protobuf buffer in Python from C++ with pybind11

此生再无相见时 Submitted on 2020-01-05 04:29:26
Question: I have a char *buffer which I convert to a C++ string (std::string sbuffer(buffer);) because I want to pass it to Python. In C++ this works: protoObj.ParseFromArray(buffer, sbuffer.size()); I pass the buffer to Python via: py::scoped_interpreter python; py::module calc = py::module::import("Calculation"); py::object Calculation = calc.attr("Calculation"); py::object calculation = Calculation(); calculation.attr("funcName")(sbuffer.data(), sbuffer.size()); The Python file looks roughly like this:…
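One likely culprit in the quoted C++ is std::string sbuffer(buffer): that constructor stops at the first NUL byte, and serialized protobuf routinely contains NUL bytes, so the Python side receives a truncated payload. The C++ side should construct the string with an explicit length (std::string sbuffer(buffer, length)) and hand pybind11 a py::bytes of that string. A sketch of the matching Python receiver; the ProtoObj generated class and module path are hypothetical stand-ins for whatever the question's .proto defines:

```python
from generated.proto_obj_pb2 import ProtoObj  # hypothetical generated class

class Calculation:
    def funcName(self, data: bytes):
        # pybind11 should deliver the full length-delimited buffer as a
        # bytes object; never treat a protobuf payload as a C string,
        # since embedded NUL bytes would truncate it.
        obj = ProtoObj()
        obj.ParseFromString(data)
        return obj
```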

“Unable to find Protobuf include directory” when I use “pip install mysql-connector”

≡放荡痞女 Submitted on 2020-01-04 18:58:22
Question: I get this error when I try to install via pip install mysql-connector. I tried pip install Protobuf too, but no luck: # Python architecture: 64-bit # Python ARCH_64BIT: True Unable to find Protobuf include directory. Answer 1: I found this useful: pip install mysql-connector==2.1.4 is obsolete; pip install mysql-connector-python is suggested. Answer 2: The official package name of MySQL Connector for Python is mysql-connector-python: pip install mysql-connector-python And mysql-connector is a fork…