protocol-buffers

Protobuf streaming (lazy serialization) API

℡╲_俬逩灬. Submitted on 2019-12-03 16:37:13
We have an Android app that uses Protocol Buffers to store application data. The data format (roughly) is a single protobuf ("container") that contains a list of protobufs ("items") as a repeated field: message Container { repeated Item item = 1; } When we want to save a change to an item, we must recreate the protobuf container, add all the items to it, then serialize it and write it to a file. The problem with this approach is that it potentially triples the memory used when saving, because the data has to first be copied from the model class to the protobuf builder and then to a byte array when
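One common way around rebuilding the whole container is to drop the Container wrapper and write each Item as a varint length-prefixed record, the same framing Java's writeDelimitedTo uses; each item then serializes and appends independently. A minimal sketch of the write side, where item_bytes stands in for the result of item.toByteArray():

```python
def encode_varint(n: int) -> bytes:
    """Encode a non-negative integer as a protobuf base-128 varint."""
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)  # more bytes follow
        else:
            out.append(b)
            return bytes(out)

def append_item(path: str, item_bytes: bytes) -> None:
    """Append one length-prefixed item; no container rebuild needed."""
    with open(path, "ab") as f:
        f.write(encode_varint(len(item_bytes)))
        f.write(item_bytes)
```

Because items are appended one at a time, saving a change only ever holds one serialized item in memory rather than a copy of the whole container.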

Protobuf with NodeJS on Windows

杀马特。学长 韩版系。学妹 Submitted on 2019-12-03 16:35:48
I would like to send a simple TCP message to a device (Karotz) from a NodeJS script on Windows. NodeJS is correctly installed and a TCP connection is working. Here is my .proto file (http://wiki.karotz.com/index.php/Voos-message.proto). I compile it to .desc using Google's protoc. I don't know how to build my message to send it to the device. I read Google's description, and protobuff_for_node and other forks, but I don't understand how to install it on Windows. It seems complicated because of the native libraries. Is there a dead-simple JavaScript library that reads the .desc schema and builds the message? Without

IDL for JSON REST/RPC interface

妖精的绣舞 Submitted on 2019-12-03 16:27:12
Question: We are designing a fairly complex REST API, in which most of the I/O is JSON-encoded objects with a specific structure. One challenge we have found is documenting the API in such a way that it is easier for clients to post correct input and process output. Because both the input and output require fairly complex JSON objects, client developers often introduce bugs related to the structure of the I/O objects. With all of the JSON web APIs these days, I would have hoped for a

Loading protobuf format file into pig script using loadfunc pig UDF

雨燕双飞 Submitted on 2019-12-03 16:25:46
I have very little knowledge of Pig. I have a protobuf-format data file that I need to load into a Pig script, and I need to write a LoadFunc UDF to load it. Say the function is Protobufloader(); my Pig script would be A = LOAD 'abc_protobuf.dat' USING Protobufloader() as (name, phonenumber, email); All I wish to know is how to get the file input stream. Once I get hold of the file input stream, I can parse the data from protobuf format to Pig tuple format. PS: thanks in advance. Twitter's open-source library Elephant Bird has many such loaders: https://github.com/kevinweil/elephant-bird You can use

Protocol buffer and OO design

帅比萌擦擦* Submitted on 2019-12-03 16:08:43
Question: I'm using protocol buffers as the wire data format in a client-server architecture. Domain objects (Java beans) go through the following life cycle: used in client-side business logic, converted to protobuf format, transmitted to the server, converted back to domain objects, and used in server-side business logic. The "Protocol Buffers and O-O Design" section in the protobuf documentation recommends wrapping the generated class inside a proper domain model. I'd like to find out the best approach. For example, I have a
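The pattern the documentation recommends can be sketched as follows: the domain class owns all behavior, and the generated message is touched only at the wire boundary via to_proto/from_proto converters. PersonProto here is a hypothetical stand-in for a protoc-generated class, used only so the shape of the wrapper is visible:

```python
class PersonProto:
    """Stand-in for a protoc-generated message class (hypothetical)."""
    def __init__(self):
        self.name = ""
        self.email = ""

class Person:
    """Domain object: owns business logic; protobuf is only a wire format."""
    def __init__(self, name: str, email: str):
        self.name = name
        self.email = email

    def display_name(self) -> str:
        # Business logic lives on the domain class, not the generated one.
        return f"{self.name} <{self.email}>"

    def to_proto(self) -> PersonProto:
        # Conversion happens only at the serialization boundary.
        msg = PersonProto()
        msg.name = self.name
        msg.email = self.email
        return msg

    @classmethod
    def from_proto(cls, msg: PersonProto) -> "Person":
        return cls(msg.name, msg.email)
```

The payoff is that a schema change touches only the two converter methods, and the rest of the codebase never depends on generated types.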

How to do non blocking socket reads with Protobuf using C#?

大憨熊 Submitted on 2019-12-03 15:56:01
Let's say I want to do non-blocking reads from a network socket. I can async await for the socket to read x bytes and all is fine. But how do I combine this with deserialization via protobuf? Reading objects from a stream must be blocking? That is, if the stream contains too little data for the parser, then there has to be some blocking going on behind the scenes so that the reader can fetch all the bytes it needs. I guess I can use length-prefix delimiters, read the first bytes, and then figure out the minimum number of bytes I have to fetch before I parse. Is this the right way to go about it?
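Yes, length-prefixing is the usual answer: the parser itself never blocks if you only hand it complete frames. The non-blocking part is a decoder that accumulates whatever bytes each read happens to deliver and emits frames only when whole. A minimal sketch (a fixed 4-byte big-endian prefix is assumed here for simplicity; a varint prefix works the same way):

```python
import struct

class FrameDecoder:
    """Accumulates bytes from arbitrary-sized socket reads and returns
    complete length-prefixed frames; it never waits for missing data."""

    def __init__(self):
        self._buf = bytearray()

    def feed(self, chunk: bytes):
        """Add newly read bytes; return a list of complete frame bodies."""
        self._buf.extend(chunk)
        frames = []
        while True:
            if len(self._buf) < 4:
                break  # length prefix not complete yet
            (length,) = struct.unpack_from(">I", self._buf)
            if len(self._buf) < 4 + length:
                break  # frame body not complete yet
            frames.append(bytes(self._buf[4:4 + length]))
            del self._buf[:4 + length]
        return frames
```

Each returned frame is a complete serialized message, so deserialization is then a plain, non-blocking parse of an in-memory buffer.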

Conversion between C structs (C++ POD) and google protobufs?

☆樱花仙子☆ Submitted on 2019-12-03 14:07:01
I have code that currently passes around a lot of (sometimes nested) C (or C++ Plain Old Data) structs and arrays. I would like to convert these to/from Google protobufs. I could manually write code that converts between these two formats, but it would be less error-prone to auto-generate such code. What is the best way to do this? (This would be easy in a language with enough introspection to iterate over the names of member variables, but this is C++ code we're talking about.) One thing I'm considering is writing Python code that parses the C structs and then spits out a .proto file, along
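The "parse the structs, emit a .proto" idea can be sketched very roughly as below. This is only an illustrative regex pass over a subset of C scalar fields; the type table and regex are assumptions, and real struct parsing (nesting, arrays, typedefs, comments) would need a proper C parser such as pycparser:

```python
import re

# Illustrative mapping of a few C scalar types to proto3 types (assumption).
C_TO_PROTO = {
    "int32_t": "int32", "uint32_t": "uint32",
    "int64_t": "int64", "uint64_t": "uint64",
    "float": "float", "double": "double", "bool": "bool",
}

def struct_to_proto(c_src: str) -> str:
    """Emit a proto3 message definition for the first struct in c_src."""
    m = re.search(r"struct\s+(\w+)\s*\{(.*?)\}", c_src, re.S)
    name, body = m.group(1), m.group(2)
    lines = [f"message {name} {{"]
    field_no = 1
    for ctype, ident in re.findall(r"(\w+)\s+(\w+)\s*;", body):
        if ctype in C_TO_PROTO:
            lines.append(f"  {C_TO_PROTO[ctype]} {ident} = {field_no};")
            field_no += 1
    lines.append("}")
    return "\n".join(lines)
```

Generating the .proto is only half the job; the same parsed field list could also drive generation of the C++ conversion functions in both directions.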

Protocol buffer3 and json

房东的猫 Submitted on 2019-12-03 12:45:39
Question: Protocol Buffers v3 claims that the library is JSON friendly (https://developers.google.com/protocol-buffers/docs/proto3#json), but I cannot find how to get that mapping. Should I add some plugin, pass some option to protoc, or call something special instead of SerializeTo/ParseFrom? Has anyone used that feature? Answer 1: I'm using Protobuf 3.3.0, which does have a built-in JSON serializer and parser. You can use two functions from google/protobuf/util/json_util.h called MessageToJsonString(

Missing input file with protoc in protocol buffer

梦想的初衷 Submitted on 2019-12-03 12:39:48
I currently have a file called addressbook.proto next to my protoc.exe. I am having difficulty generating the .h and the .cc files. Here is what I am doing: protoc --cpp_out=c:\addressbook.proto However, I get the following response: Missing input file. Any suggestions on what I might be doing wrong? The --cpp_out flag specifies the output directory for generated C++ source code, not the input file, so the command above names no input at all. I would suggest trying (if the proto is actually stored at c:\addressbook.proto): protoc c:\addressbook.proto --cpp_out=./ or protoc addressbook.proto --cpp_out=./ Source: https://stackoverflow.com/questions

python example for reading multiple protobuf messages from a stream

时光怂恿深爱的人放手 Submitted on 2019-12-03 12:28:09
Question: I'm working with data from Spinn3r, which consists of multiple different protobuf messages serialized into a byte stream: http://code.google.com/p/spinn3r-client/wiki/Protostream "A protostream is a stream of protocol buffer messages, encoded on the wire as length prefixed varints according to the Google protocol buffer specification. The stream has three parts: a header, the payload, and a tail marker." This seems like a pretty standard use case for protobufs. In fact, protobuf core
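The varint length-prefix part of that framing can be read in pure Python as sketched below; Spinn3r's header and tail-marker records add structure on top of this, which the sketch does not cover. Each yielded chunk would then be handed to the appropriate generated class's ParseFromString:

```python
def read_varint(buf: bytes, pos: int):
    """Decode one base-128 varint starting at pos; return (value, new_pos)."""
    value = shift = 0
    while True:
        b = buf[pos]
        pos += 1
        value |= (b & 0x7F) << shift
        if not b & 0x80:  # high bit clear marks the last varint byte
            return value, pos
        shift += 7

def iter_messages(stream_bytes: bytes):
    """Yield the raw bytes of each length-prefixed message in the stream."""
    pos = 0
    while pos < len(stream_bytes):
        length, pos = read_varint(stream_bytes, pos)
        yield stream_bytes[pos:pos + length]
        pos += length
```

For example, a stream containing a 3-byte message followed by a 1-byte message splits cleanly into the two payloads, each ready to parse.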