
I recently started reading about and using gRPC in my work. gRPC uses Protocol Buffers internally as its IDL, and I keep reading everywhere that Protocol Buffers perform much better and faster than JSON and XML.

What I fail to understand is: how do they do that? What in the design of Protocol Buffers actually makes them faster than XML and JSON?

gravetii
  • Binary format. Less wasteful. At the cost of not being human-readable. – Sergio Tulentsev Sep 03 '18 at 09:21
  • Protocol Buffers uses an optimized binary format. Furthermore, the meta information defining what is in the message is not included in the message. E.g. if your message has a property named `foo`, then this name is not part of the message. In XML and JSON you will include `foo` as a literal string for each occurrence of the property `foo` in the message. The result is that Protocol Buffers messages are very compact compared to the same messages in XML or JSON. – Martin Liversage Sep 03 '18 at 09:25
  • They have a really good explanation in their docs, under 'Why not XML?': https://developers.google.com/protocol-buffers/docs/overview – ChristianMurschall Sep 04 '18 at 09:17

3 Answers


String representations of data:

  • require text encode/decode (which can be cheap, but is still an extra step)
  • require complex parsing code, especially if there are human-friendly rules like "must allow whitespace"
  • usually involve more bandwidth - so more actual payload to churn - due to embedding things like names, and (again) having to deal with human-friendly representations (how to tokenize the syntax, for example)
  • often require lots of intermediate string instances that are used for member lookups etc. (the rough sketch just after this list illustrates those steps)
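
To make those costs concrete, here is a rough Scala sketch (purely illustrative, not taken from any real parser; the object and method names are made up) of the work a text format forces on you even for a tiny payload such as {"id":42}, the same example used below: skipping whitespace, reading a quoted member name into an intermediate string, comparing that string, and parsing digits by hand.

    // Illustrative only: a hand-rolled parse of {"id":42} showing the text-layer steps.
    object TextParseSketch {
      def parseIdField(json: String): Int = {
        var i = 0
        def skipWs(): Unit = while (i < json.length && json(i).isWhitespace) i += 1

        skipWs()
        require(json(i) == '{'); i += 1
        skipWs()

        // Read the quoted member name - this allocates an intermediate String
        require(json(i) == '"'); i += 1
        val start = i
        while (json(i) != '"') i += 1
        val name = json.substring(start, i) // "id"
        i += 1

        skipWs()
        require(json(i) == ':'); i += 1
        skipWs()

        // Parse the decimal digits of the value by hand
        var value = 0
        while (i < json.length && json(i).isDigit) {
          value = value * 10 + (json(i) - '0')
          i += 1
        }

        // Member lookup is a string comparison, not an integer switch
        require(name == "id")
        value
      }

      def main(args: Array[String]): Unit =
        println(parseIdField("""{"id":42}""")) // 42
    }

A real parser has to handle far more than this (escapes, nesting, error recovery), which only widens the gap.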

Both text-based and binary-based serializers can be fast and efficient (or slow and horrible)... it's just that binary serializers have the scales tipped in their favour. This means that a "good" binary serializer will usually be faster than a "good" text-based serializer.

Let's compare a basic example of an integer:

json:

{"id":42}

9 bytes if we assume ASCII or UTF-8 encoding and no whitespace.

xml:

<id>42</id>

11 bytes if we assume ASCII or UTF-8 encoding and no whitespace - and no namespace noise.

protobuf:

0x08 0x2a

2 bytes

Now imagine writing a general purpose xml or json parser, and all the ambiguities and scenarios you need to handle just at the text layer, then you need to map the text token "id" to a member, then you need to do an integer parse on "42". In protobuf, the payload is smaller, plus the math is simple, and the member-lookup is an integer (so: suitable for a very fast switch/jump).
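
To make the protobuf side just as concrete, here is a minimal Scala sketch (an illustration of the general varint wire format, assuming the field is declared as `int32 id = 1` in the .proto) of everything decoding those two bytes involves: one shift and mask to get the field number and wire type, and one mask to get the single-byte varint value.

    // Illustrative only: decoding the payload 0x08 0x2a by hand.
    object ProtoDecodeSketch {
      def main(args: Array[String]): Unit = {
        val payload = Array[Byte](0x08, 0x2a)

        val tag      = payload(0) & 0xFF
        val fieldNum = tag >>> 3  // 1 -> the member lookup is an integer (switch/jump)
        val wireType = tag & 0x07 // 0 -> varint

        // A single-byte varint here; longer varints continue while the high bit is set.
        val value = payload(1) & 0x7F // 42

        println(s"field=$fieldNum wireType=$wireType value=$value")
      }
    }

No tokenizing, no string allocation, no name comparison, no decimal-digit parsing - just a couple of integer operations.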

Marc Gravell
    btw: in case anyone is unsure whether `0x08 0x2a` represents the same scenario: https://protogen.marcgravell.com/decode?hex=082a – Marc Gravell Sep 03 '18 at 11:00
  • btw: encoding reference https://developers.google.com/protocol-buffers/docs/encoding – Rick Dec 29 '21 at 09:27

While binary protocols have an advantage in theory, in practice they can lose in performance to JSON or other protocols with a textual representation, depending on the implementation.

Efficient JSON parsers like RapidJSON or jsoniter-scala parse most JSON samples at a speed of 2-8 CPU cycles per byte. They serialize even more efficiently, except in some edge cases, such as floating-point numbers, where serialization speed can drop to 16-32 cycles per byte.

But for most domains that don't use a lot of floats or doubles, their speed is quite competitive with the best binary serializers. See the results of benchmarks where jsoniter-scala parses and serializes on par with Java and Scala libraries for Protobuf:

https://github.com/dkomanov/scala-serialization/pull/8
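
For context, using jsoniter-scala looks roughly like this (a sketch based on its README; exact API details may differ between versions, and the build dependency is omitted). The codec is derived at compile time, which is part of how it avoids intermediate allocations during parsing and serialization.

    import com.github.plokhotnyuk.jsoniter_scala.core._
    import com.github.plokhotnyuk.jsoniter_scala.macros._

    case class Msg(id: Int)

    object JsoniterSketch {
      // Codec generated at compile time by a macro (no runtime reflection)
      implicit val codec: JsonValueCodec[Msg] = JsonCodecMaker.make

      def main(args: Array[String]): Unit = {
        val msg   = readFromArray[Msg]("""{"id":42}""".getBytes("UTF-8"))
        val bytes = writeToArray(msg)
        println(msg)                        // Msg(42)
        println(new String(bytes, "UTF-8")) // {"id":42}
      }
    }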

Andriy Plokhotnyuk

I'd argue that binary protocols will typically win in performance over text-based protocols. Ha, you won't find many (or any) video streaming applications using JSON to represent the frame data. However, any poorly designed data structure will struggle when being parsed. I've worked on many communications projects where text-based protocols were replaced with binary protocols.

BlakeStone