Protocol Buffers for serializing the several data objects of a post/comment into a single piece of serialized data?

Protocol buffers certainly provides serialization, although the RPC side of things is left to your imagination (often something simple and socket-based works very well). The data types are all well supported by protobuf (although you might want to store the date as something like milliseconds since the Unix epoch). Note, though, that protobuf doesn't include compression (unless you also apply gzip etc. to the stream).
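As a sketch, a schema for the post/comment you describe might look like this (the message and field names are made up for illustration, not taken from your code):

    // Hypothetical .proto schema (proto2 syntax); names are illustrative.
    message Comment {
        required int64 id = 1;        // numeric id, varint-encoded
        required int64 timestamp = 2; // e.g. ms since the Unix epoch
        required string text = 3;     // stored as UTF-8 on the wire
    }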

So the message will be a bit longer than the string (which always uses UTF-8 encoding in protobuf). I say "a bit" because the varint algorithm for integer types can take anywhere between 1 and 10 bytes each for the id and timestamp, depending on their magnitude, plus a few bytes (3, probably) for the field headers.
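As a rough sanity check, assuming the hypothetical Comment message sketched above has been compiled with protoc, you can compare the full message size against the raw string length:

    // Build an instance of the hypothetical Comment message and compare
    // its serialized size to the UTF-8 byte length of the text alone.
    Comment c = Comment.newBuilder()
            .setId(12345L)
            .setTimestamp(System.currentTimeMillis())
            .setText("some comment text")
            .build();

    int messageBytes = c.getSerializedSize();  // whole message
    int textBytes = com.google.protobuf.ByteString
            .copyFromUtf8(c.getText()).size(); // just the string
    System.out.println("overhead: " + (messageBytes - textBytes) + " bytes");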

If that sounds about right, then it should work fine. If you have lots of text data, though, you might want to run the protobuf stream through gzip as well. Java is well supported by protobuf via the main Google trunk.
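If you do want compression, wrapping the stream is straightforward; a sketch, again using the hypothetical Comment message (the file name is made up):

    // Writing: compress while serializing (java.util.zip).
    GZIPOutputStream out = new GZIPOutputStream(new FileOutputStream("comment.pb.gz"));
    c.writeTo(out);   // standard protobuf serialization onto any OutputStream
    out.close();

    // Reading: decompress while parsing.
    GZIPInputStream in = new GZIPInputStream(new FileInputStream("comment.pb.gz"));
    Comment parsed = Comment.parseFrom(in);
    in.close();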

I guess if I gzip, then it would add more performance cost each time I read or write the data. Anyway, does gzip compression usually lead to considerable savings? – Marcos Mar 3 at 23:02

@Marcos - it depends on the data. Worth a test, maybe, especially as most of your payload is a string. – Marc Gravell Mar 3 at 23:47

Thanks Marc! Will give it a try! – Marcos Mar 3 at 23:54

I don't know if this fits your specific case, but I have seen suggestions to store a JSON representation of the data that can be sent directly to the browser. If you don't need any further processing steps involving POJOs, then this or a similar approach might be a (fast) way to go.
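A minimal sketch of that idea, assuming a servlet-style handler (the store lookup and all names here are hypothetical):

    // Hypothetical: serve a pre-rendered JSON string straight to the
    // browser, skipping any POJO deserialization step.
    String cachedJson = jsonStore.get(postId);  // hypothetical lookup
    response.setContentType("application/json");
    response.getWriter().write(cachedJson);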

Overall, it looks like Protocol Buffers is a good fit for what you want to do. Many people use it for exactly what you've described. I have heard of others using plain JSON for this, but it is definitely less efficient.

Protocol Buffers is fast, portable, mature and well-documented. It is developed and maintained by Google. One of the distinctive features of Protocol Buffers is the ability to transparently extend existing records with new fields.

For instance, you can extend your existing record format with new fields without converting your existing data or modifying the software that works with the old fields (it will simply ignore fields it doesn't know about).
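For example, the hypothetical Comment schema from above could gain a field in a later revision; old data and old code keep working because the new field simply has a fresh tag number:

    // Later revision of the hypothetical Comment message.
    message Comment {
        required int64 id = 1;
        required int64 timestamp = 2;
        required string text = 3;
        optional string author = 4;  // new field; old readers just skip it
    }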

Regarding your question about whether a client can work with the serialized format (if I understood the question correctly): if a client supports Protocol Buffers and has the ".proto" files describing the data format, it will be able to work with the data just like you do. If a client can't work with Protocol Buffers, there are some third-party libraries [1] that can convert between the Protobuf, JSON and XML formats (I haven't tried using them myself). You might also want to check out some alternatives to Protocol Buffers, such as MessagePack [2] and Avro.

They claim to be faster, more compact, or to support dynamic typing.

[1] For example, http://code.google.com/p/protobuf-java-format/
[2] http://msgpack.org
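For instance, converting a message to JSON with the protobuf-java-format library from [1] might look like the sketch below; I haven't used the library myself, so treat the JsonFormat call as an assumption about its API rather than verified usage:

    // Sketch using the third-party protobuf-java-format library [1]
    // (com.googlecode.protobuf.format.JsonFormat); the API is assumed
    // from its documentation, not part of core protobuf.
    String json = JsonFormat.printToString(comment);  // Message -> JSON text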
