
The article says that one of the advantages of gRPC is streaming and that JSON wouldn’t support streaming.

However, that’s just an implementation detail. JSON can easily be written and read as a stream.
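For example, newline-delimited JSON (NDJSON) streams records one line at a time, so neither side has to buffer a whole document. A minimal Node sketch (names illustrative):

    import { Readable } from "node:stream";
    import * as readline from "node:readline";

    // Write side: emit one JSON document per line instead of one giant array.
    function* toNdjson(items: Iterable<unknown>): Generator<string> {
      for (const item of items) yield JSON.stringify(item) + "\n";
    }

    // Read side: parse each line as it arrives, with no full-document buffering.
    async function readNdjson(source: Readable): Promise<void> {
      const rl = readline.createInterface({ input: source });
      for await (const line of rl) {
        if (line.trim() === "") continue;
        const obj = JSON.parse(line); // each line is a complete JSON value
        console.log(obj);
      }
    }

    // Usage: stream three records without materializing them all at once.
    readNdjson(Readable.from(toNdjson([{ a: 1 }, { a: 2 }, { a: 3 }])));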

Switching your whole architecture, dealing with a binary protocol and the accompanying tooling issues just because of your choice of JSON parser feels like total overkill.

JSON over HTTP is ubiquitous, has amazing tooling and is highly debuggable. Parsers have become so fast that I suspect they could even be faster than a protobuf-based solution.

Finally I don’t buy the argument about validation. You have to validate input and output on the boundaries no matter what.

Even when your interface says “this is a double”, it says nothing about valid ranges, for example (as seen in the article, where the valid ranges were specified only in a comment).
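To make that concrete, here's a minimal boundary-validation sketch in TypeScript (the `latitude` field and its range are hypothetical, not from the article):

    // The wire type says "number", but only runtime validation at the
    // boundary can enforce the valid range (here a latitude in degrees).
    interface Position { latitude: number; }

    function parsePosition(json: string): Position {
      const raw = JSON.parse(json);
      if (typeof raw.latitude !== "number" || raw.latitude < -90 || raw.latitude > 90) {
        throw new RangeError(`latitude out of range: ${raw.latitude}`);
      }
      return { latitude: raw.latitude };
    }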




> Parsers have become so fast that I suspect they could even be faster than a protobuf-based solution.

Not even close. Even new JSON serializers/deserializers aren't magic. Protobuf is a LOT easier to parse, so it's naturally a LOT faster.
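Some rough intuition for why: protobuf's wire format is length- and varint-prefixed binary, so decoding is simple arithmetic rather than text tokenization. A toy varint decoder (illustrative only, not a full parser):

    // Toy decoder for a protobuf varint (the wire format's basic integer):
    // 7 bits per byte, high bit means "more bytes follow". No string
    // scanning, quote handling, or number re-parsing as in JSON text.
    function readVarint(buf: Uint8Array, pos: number): [value: number, next: number] {
      let value = 0, shift = 0;
      while (true) {
        const byte = buf[pos++];
        value |= (byte & 0x7f) << shift;
        if ((byte & 0x80) === 0) return [value, pos];
        shift += 7;
      }
    }

    // 300 encodes as [0xac, 0x02]
    console.log(readVarint(new Uint8Array([0xac, 0x02]), 0)); // [300, 2]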

First two duck results for "json vs protobuf benchmark":

https://auth0.com/blog/beating-json-performance-with-protobu...

https://codeburst.io/json-vs-protocol-buffers-vs-flatbuffers...


The first link shows a mere 4% margin when talking to a JavaScript VM.

Even at a 5x improvement, most projects will never reach the point where the transport encoding is a bottleneck. Protobuf has a lot going for it (I'm currently using it in a project), but it can't be sold on speed alone.


Is the JSON parser implemented natively, or in JS? It may not be apples-to-apples.


> Is the JSON parser implemented natively, or in JS? It may not be apples-to-apples.

True, but if you want an implementation you can use in JavaScript running in the browser, it may accurately reflect reality. You have a high-quality, browser-supplied (presumably native) implementation of JSON available. For a protobuf parser, you've just got JavaScript. (You can call into WebAssembly, but given that, afaik, it can't produce JavaScript objects on its own, it's not clear to me there's any advantage in doing so unless you're moving the calling code into WebAssembly as well.)
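To illustrate the asymmetry: the JSON path is a single call into the engine's native parser, while a typical protobuf path, e.g. with protobufjs (just one common choice), runs as ordinary JavaScript inside the VM. The schema and message names below are hypothetical:

    import * as protobuf from "protobufjs";

    declare const binaryPayload: Uint8Array; // assumed protobuf wire bytes

    async function decodeBoth(): Promise<void> {
      // JSON path: one call into the engine's native, heavily optimized parser.
      const fromJson = JSON.parse('{"name":"Ada","id":1}');

      // Protobuf path: decoding runs as plain JavaScript in the VM.
      const root = await protobuf.load("person.proto");  // hypothetical schema file
      const Person = root.lookupType("example.Person");  // hypothetical message type
      const fromProto = Person.decode(binaryPayload);

      console.log(fromJson, fromProto.toJSON());
    }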

I don't think browser-based parsing speed is important though. It's probably not a major contributor to display/interaction latency, energy use, or any other metric you care about. If it is, maybe you're wasting bandwidth by sending a bunch of data that's discarded immediately after parsing.


My guess would be that most of the cost is creating the JS objects and the parsing is a relatively small part of the cost, so optimizing it would not help much.


Yeah, the V8 JSON parser is implemented natively and optimized alongside the engine, in a way that other serialization methods in JavaScript, and JSON in other languages, generally are not.


As mentioned in other comments, gRPC transport is orthogonal to protobuf serialization. The gRPC runtime library takes no dependency on it. You can use gRPC with JSON; it just happens that the default code generators use the protobuf IDL and serialization. You can use the gRPC library with your own JSON-based stub generator.
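For example, with @grpc/grpc-js you can hand-write a service definition whose (de)serializers are plain JSON; a minimal sketch (the `Echo` service and method are hypothetical):

    import * as grpc from "@grpc/grpc-js";

    // A method definition where the wire bytes are JSON, not protobuf.
    const echoService: grpc.ServiceDefinition = {
      echo: {
        path: "/example.Echo/Echo",          // hypothetical service/method
        requestStream: false,
        responseStream: false,
        requestSerialize: (req: unknown) => Buffer.from(JSON.stringify(req)),
        requestDeserialize: (buf: Buffer) => JSON.parse(buf.toString("utf8")),
        responseSerialize: (res: unknown) => Buffer.from(JSON.stringify(res)),
        responseDeserialize: (buf: Buffer) => JSON.parse(buf.toString("utf8")),
      },
    };

    // The generic client constructor works with any such definition.
    const EchoClient = grpc.makeGenericClientConstructor(echoService, "Echo");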


While that's true, I think protobufs are (correctly) seen as the standard, preferred way to use gRPC. The first point from the main page:

> Simple service definition

> Define your service using Protocol Buffers, a powerful binary serialization toolset and language

It's a little unfair to call them that orthogonal.


You can't do good streaming with REST/JSON; it ends up broken, slow, or badly implemented. And that's just for one direction; bidirectional streaming isn't even possible.


Not all of your API endpoints have to respond with plain JSON, that's all. Create a dedicated endpoint for streaming; it's a simple solution.
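As a sketch of such a dedicated endpoint, a bare Node HTTP server can stream NDJSON records over chunked transfer encoding (endpoint and payload are made up):

    import * as http from "node:http";

    // A dedicated streaming endpoint: each record is flushed as its own
    // JSON line, so clients can consume results before the response ends.
    http.createServer((req, res) => {
      res.writeHead(200, { "Content-Type": "application/x-ndjson" });
      let n = 0;
      const timer = setInterval(() => {
        res.write(JSON.stringify({ tick: n++ }) + "\n");
        if (n === 3) { clearInterval(timer); res.end(); }
      }, 1000);
      req.on("close", () => clearInterval(timer));
    }).listen(8080);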



