Throwing out standards-compliant HTTP (whether 1, 2, or 3) with the bathwater that is JSON decoding was a mistake. JSON + jsonschema + swagger/hyperschema should be good enough for most projects, and for those where it isn't, swap out the content type (but keep the right annotations) and call it a day! Use Avro, use Cap'n Proto, use whatever, without tying yourself into the gRPC ecosystem.
Maybe gRPC's biggest contribution is its more polished and coherent tooling -- by combining three solutions (schema enforcement, binary representation, and convenient client/server code generation), it created something that's just easier to use. I personally would have preferred that effort to go toward tools that work on standards-compliant HTTP/1, /2, and /3.
The counterargument is that you can do all of that directly with the protocol of your choice, and you don't need to bolt HTTP onto whatever you're building.
That said, the HTTP request and response bodies are perfectly fine being binary. It's only the headers that are text-based (in HTTP/1; HTTP/2 encodes the headers in binary as well).
Who told you that? You can specify an arbitrary content type. Not just text.
For people who prefer JSON to protobuf, gRPC is serialization-agnostic. For folks who prefer REST verbs to gRPC methods, proto3 has native support for encoding REST mappings, and tools like Envoy and gRPC-Web can do the REST <-> gRPC proxy translation automatically.
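As a sketch of what that mapping looks like in the IDL (the service, message, and paths here are hypothetical; the `google.api.http` option is the annotation that proxies like Envoy and grpc-gateway consume):

```proto
syntax = "proto3";

import "google/api/annotations.proto";

service InventoryService {
  // Callable as a gRPC method, or as GET /v1/items/{id} through a proxy.
  rpc GetItem (GetItemRequest) returns (Item) {
    option (google.api.http) = {
      get: "/v1/items/{id}"
    };
  }
}

message GetItemRequest {
  string id = 1;
}

message Item {
  string id = 1;
  string name = 2;
}
```

The path template binds the URL segment to the request field, so one definition serves both audiences.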
The biggest advantage of an RPC framework is that it takes most of that out of developers' hands. Developers can focus on business logic and leave the connection and request management to the library.