I don't know, protobufs feel like an internal project with a lot of warts that made it outside too quickly.
Take variable-length integer encoding: it compresses nicely for 1, 2, 3, ... but as soon as you give it -1 it takes 10 bytes, because -1 in two's complement is 64 one-bits and varints carry only 7 bits per byte. Instead of fixing this, a new pair of types (sint32/sint64) was added that ZigZag-encodes the value, effectively moving the sign bit from the MSB to the LSB. Fixed-length integers were also added, but only in 4- and 8-byte widths (fixed32/fixed64).
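For the curious, here's a minimal sketch of what's going on, with made-up function names rather than any real protobuf API: a plain base-128 varint over the 64-bit two's-complement value, versus the ZigZag mapping that sint32 uses.

    # Sketch only: base-128 varints vs. ZigZag, assuming 64-bit two's complement.

    def encode_varint(n: int) -> bytes:
        # 7 bits per byte, MSB set on every byte except the last.
        n &= (1 << 64) - 1   # interpret negatives as their 64-bit two's-complement pattern
        out = bytearray()
        while True:
            byte = n & 0x7F
            n >>= 7
            if n:
                out.append(byte | 0x80)
            else:
                out.append(byte)
                return bytes(out)

    def zigzag32(n: int) -> int:
        # Map signed to unsigned so small negatives stay small: 0, -1, 1, -2 -> 0, 1, 2, 3.
        return ((n << 1) ^ (n >> 31)) & 0xFFFFFFFF

    print(len(encode_varint(1)))             # 1 byte
    print(len(encode_varint(-1)))            # 10 bytes: -1 is 64 one-bits
    print(len(encode_varint(zigzag32(-1))))  # 1 byte, which is what sint32 buys you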
People selling gRPC feel a lot like a cargo cult, to be honest. Packing bits is not exactly rocket science, and the amount of control traffic is rarely the bottleneck for most projects.
There are better binary formats, such as CBOR (wiki), which in particular is more or less backwards-compatible with JSON semantics (nested lists, maps and so on), does not require schemas and allows for extended field typing such as expressing that an integer field is in fact a UNIX timestamp. It also packs small integers in the leading byte, so 0..23, -1..-24, true, false, null take just a single byte, which is one byte less than protobufs. And yeah, you can index maps with integers if you'd like. Oh, and it supports streaming (chunking) natively.
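To make the small-value claim concrete, here's a quick check with the cbor2 Python package (assuming you have it installed; the byte values themselves come straight from RFC 8949):

    import cbor2

    print(cbor2.dumps(0).hex())          # '00' -> one byte
    print(cbor2.dumps(23).hex())         # '17' -> still one byte
    print(cbor2.dumps(-1).hex())         # '20' -> small negatives are one byte too
    print(cbor2.dumps(True).hex())       # 'f5'
    print(cbor2.dumps(None).hex())       # 'f6'
    print(cbor2.dumps({1: "x"}).hex())   # 'a1016178' -> integer-keyed map, no schema needed
    print(cbor2.dumps(cbor2.CBORTag(1, 1700000000)).hex())  # tag 1 marks the int as an epoch timestamp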
Protobufs became ubiquitous within Google because they’re pretty good despite the warts. There were several attempts to clean them up by defining new standards, but those didn’t take non-Google use cases into account enough. Considering that it came from an Internet company, I don’t think they even took JavaScript seriously enough; a 54-bit integer type (covering JavaScript’s safe-integer range, ±(2^53 − 1)) would have been a whole lot better for interop with JavaScript, and 54 bits is good enough for many applications. It’s a better default than 64 bits.
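A tiny illustration of why that matters, using Python floats, which are the same IEEE-754 doubles JavaScript uses for all of its numbers:

    SAFE_MAX = 2**53 - 1                     # Number.MAX_SAFE_INTEGER in JS

    print(float(SAFE_MAX) == SAFE_MAX)       # True  -- representable exactly
    print(float(2**53 + 1) == 2**53 + 1)     # False -- rounds to 2**53
    print(float(2**53 + 1) == float(2**53))  # True  -- two different integers, one double

Anything past 53 bits of magnitude can silently collapse once it passes through a JS number or a JSON parser that uses doubles, which is part of why protobuf’s own JSON mapping represents 64-bit integer fields as strings.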
It’s really very C++-centric, too: it assumes you can at least get hold of a protoc binary for parsing schemas, instead of choosing an easy-to-parse schema language for which it would be easier to write multiple independent implementations.
The warts can be worked around, though. JavaScript has warts, too. You aren’t going to get a greenfield design from a company that’s got a huge amount of existing code and data that they need to be able to reuse.