protocol-buffers

What's the preferred way to encode a “nullable” field in protobuf 2?

好久不见. Submitted on 2019-12-04 11:18:40
Question: I am defining a ProtoBuf message where I want to have a "nullable" field -- i.e., I want to distinguish between the field having a value and not having a value. As a concrete example, let's say I have "x" and "y" fields to record the coordinates of some object. But in some cases, the coordinates are not known. The following definition will not work, because if x or y is unspecified, it defaults to zero (which is a valid value): message MyObject { optional float x = 1; optional float y = 2; }
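A note on the usual approaches: in proto2, an optional scalar field already carries presence information, so the generated code can tell "never set" apart from "set to 0.0"; another common pattern is to move the coordinates into a nested message so the whole pair can be absent. The sketch below is illustrative only (field names follow the question, and the nested-message variant is one possible layout, not the asker's final schema):

    // Minimal proto2 sketch: "optional" gives each field a presence bit, so the
    // generated accessors (e.g. has_x()/has_y() in C++) distinguish unset from 0.
    message MyObject {
      optional float x = 1;
      optional float y = 2;
    }

    // Alternative sketch: group the coordinates so "no coordinates" is simply an
    // unset sub-message.
    message MyPoint {
      required float x = 1;
      required float y = 2;
    }
    message MyObjectWithPoint {
      optional MyPoint position = 1;
    }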

Protocol buffer: does changing field name break the message?

时光毁灭记忆、已成空白 Submitted on 2019-12-04 10:17:11
Question: With protocol buffers, does changing the field name of a message keep it backward compatible? I couldn't find any citation about that. E.g.: original message: message Person { required string name = 1; required int32 id = 2; optional string email = 3; } Change to: message Person { required string full_name = 1; required int32 id = 2; optional string email = 3; } Answer 1: Changing a field name will not affect protobuf encoding or compatibility between applications that use proto definitions which
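For context on why this works: the protobuf wire format identifies fields only by tag number and wire type; field names never appear in the encoded bytes. A rough, hand-worked illustration (values chosen arbitrarily):

    // Encoding of Person { name: "Al", id: 5 } under either definition:
    //   0x0A 0x02 0x41 0x6C   -> tag 1, length-delimited, bytes "Al"
    //   0x10 0x05             -> tag 2, varint, value 5
    // Code generated from the full_name version still reads tag 1, so the bytes
    // decode identically; only source-level names (and things like JSON field
    // names) change.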

What's the difference between .proto and .prototxt files

不羁的心 Submitted on 2019-12-04 07:57:39
In the Caffe project, there are both .proto files and .prototxt files. From the Google Protocol Buffer documentation, a .proto file defines the protocol, so what about the .prototxt? Is it defined by Google Protocol Buffers, and what's the difference between them? The .proto file is used to describe the structure (the 'protocol') of the data to be serialized. The protobuf compiler can turn this file into Python, C++, or Java code to serialize and deserialize data with that structure. As for the .prototxt file: looking at the documentation here, we can see that there are two different formats for serialized data
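To make the distinction concrete, here is a hedged pair of examples (field names loosely modeled on Caffe's solver settings, not copied from its actual schema): the .proto declares a message type, while the .prototxt is one instance of that message written in protobuf's human-readable text format.

    // Schema sketch (.proto): defines the shape of the data.
    message SolverConfig {
      optional string net = 1;
      optional float base_lr = 2;
      optional int32 max_iter = 3;
    }

    # Matching .prototxt sketch: a concrete message in protobuf text format,
    # readable and editable by hand, parsed against the schema above.
    net: "train_val.prototxt"
    base_lr: 0.01
    max_iter: 10000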

Can you represent CSV data in Google's Protocol Buffer format?

99封情书 Submitted on 2019-12-04 07:32:07
I've recently found out about protocol buffers and was wondering if they could be applied to my specific problem. Basically I have some CSV data that I need to convert to a more compact format for storage, as some of the files are several gigabytes. Each field in the CSV has a header, and there are only two types, strings and decimals (because sometimes there are a lot of significant digits and I need to handle all numbers the same way). But each file will have different column names for each field. As well as capturing the original CSV data I need to be able to add extra information to the file
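Protocol buffers have no built-in table type, so one possible way to model this (a sketch only; the message and field names are invented here, and storing decimals as strings is just one way to preserve all significant digits) is a header plus repeated rows of tagged cells:

    // Illustrative proto2 sketch for generic CSV-like data.
    message CsvFile {
      repeated string column_names = 1;   // the CSV header, per-file
      repeated Row rows = 2;
      optional string extra_metadata = 3; // room for the additional information
    }
    message Row {
      repeated Cell cells = 1;            // same order as column_names
    }
    message Cell {
      optional string text = 1;           // set when the column holds a string
      optional string decimal = 2;        // set when it holds a decimal, kept as
                                          // text to avoid losing precision
    }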

Integrate Protocol Buffers into Maven2 build

≡放荡痞女 Submitted on 2019-12-04 07:30:40
Question: I'm experimenting with Protocol Buffers in an existing, fairly vanilla Maven 2 project. Currently, I invoke a shell script every time I need to update my generated sources. This is obviously a hassle, as I would like the sources to be generated automatically before each build, hopefully without resorting to shameful hackery. So, my question is two-fold: Long shot: is there a "Protocol Buffers plugin" for Maven 2 that can achieve the above in an automagic way? There's a branch on Google Code
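One generic way to wire this up, independent of any dedicated plugin, is to have Maven invoke protoc itself during generate-sources. The pom.xml fragment below is a hedged sketch: the plugin choice, paths, and the assumption that protoc is on the PATH are all illustrative; the output directory must exist before protoc runs, and the generated directory still needs to be added as a source root (e.g. via build-helper-maven-plugin).

    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>exec-maven-plugin</artifactId>
      <executions>
        <execution>
          <phase>generate-sources</phase>
          <goals><goal>exec</goal></goals>
          <configuration>
            <executable>protoc</executable>
            <arguments>
              <argument>--proto_path=src/main/proto</argument>
              <argument>--java_out=${project.build.directory}/generated-sources/protobuf</argument>
              <argument>src/main/proto/myobject.proto</argument>
            </arguments>
          </configuration>
        </execution>
      </executions>
    </plugin>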

boost serialization vs google protocol buffers? [closed]

£可爱£侵袭症+ Submitted on 2019-12-04 07:25:48
Question (closed as opinion-based, not currently accepting answers): Does anyone with experience with these libraries have any comment on which one they preferred? Were there any performance differences or difficulties in using them? Answer 1: I've played around a little with both systems, nothing serious, just some simple hackish stuff, but I felt that

Linking protobuf library with code (Google protocol buffers)

烈酒焚心 Submitted on 2019-12-04 07:06:48
I am getting linking errors when I try to compile some test code. I'm using Cygwin on Windows 7. The initial steps like ./configure, make, make test, and make install went fine, and I'm also able to generate .pb.cc and .pb.h files with the protoc command. But when I try to compile my test code, I get many linking errors. I'm sure those errors occur because it is unable to link to the library. Cygwin has the protobuf static library and link library in /usr/local/lib, and the include files are present in /usr/local/include. I tried with -lprotobuf, but it returns an error saying -lprotobuf was not found. It is hard to say what the problem is
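For reference, a typical compile-and-link line in this situation looks like the hedged sketch below (file names are illustrative). The -L flag tells the linker where the library lives, and -lprotobuf has to come after the source/object files or the symbols may not be resolved:

    # Compile the generated code together with the test program and link against
    # the protobuf library installed under /usr/local.
    g++ test_main.cc myobject.pb.cc -o test_main \
        -I/usr/local/include -L/usr/local/lib -lprotobuf -pthread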

Data format compatibility between protobuf versions

浪尽此生 Submitted on 2019-12-04 06:53:55
I was wondering if protocol buffer's serialized data format remains constant across protobuf compiler and client library versions. In other words, do I need to use the same compiler version to generate my Python, Java, and C++ classes? And do these clients all need to use the same version of protobuf libraries? This post sort of addresses my question, but its accepted answer is specific to the OP's protobuf version. Yes, that is pretty much the idea. It shouldn't matter which library you use, as long as it follows the spec. Note that the same data can be represented in slightly different ways,

How do I share enum values between my Java code and .proto file

感情迁移 Submitted on 2019-12-04 06:19:40
Question: I have a class that I wish to protobuf. In that class, one of the fields is an enum (in a class of its own). Can I avoid defining an identical enum value in my .proto file? Or will I have to manually make sure the enum definition in the Java code is the same as in the .proto file? Java code: public enum Location { UNDEF(0), HOME(1), WORK(2); ... } Corresponding .proto file code: message Address { enum location { UNDEF = 0; HOME = 1; WORK = 2; } optional location addressLocation; ... } Answer 1: The
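As an illustration of keeping the two in sync (a sketch, not the accepted answer's code): protoc already generates a Java enum for Address.location, so one option is to use that generated type directly. If the hand-written Location enum has to stay, a small mapper keyed on the numeric values works, assuming Location exposes its code via a getValue() accessor and that the .proto field is given a tag number (e.g. optional location addressLocation = 4;).

    // Hedged sketch: names follow the .proto in the question, and Location.getValue()
    // is assumed to return the numeric code passed to the enum constructor.
    public final class LocationMapper {
        private LocationMapper() {}

        // Hand-written enum -> generated protobuf enum (protobuf 2.x valueOf(int)).
        public static Address.location toProto(Location location) {
            return Address.location.valueOf(location.getValue());
        }

        // Generated protobuf enum -> hand-written enum, matched by numeric value.
        public static Location fromProto(Address.location location) {
            for (Location candidate : Location.values()) {
                if (candidate.getValue() == location.getNumber()) {
                    return candidate;
                }
            }
            throw new IllegalArgumentException("Unknown location: " + location);
        }
    }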

.NET/C# Interop to Python

萝らか妹 Submitted on 2019-12-04 05:25:25
My backend is written in .NET/C#, and I have a requirement to execute Python scripts, passing context from the .NET side of the house. These are queued up in a background task engine called Hangfire, running as a Windows service. I did a little digging and found IronPython; however, after implementing it, I found it failed to support many of the PyPI packages that I need to execute inside my script. Secondly, I looked at Python.NET, which is an embedded interpreter that embeds or extends CPython. CPython can run all the scripts, etc. that I needed; however, I found that opening/closing the