All you need is one parsing tool ported to the 1,000 platforms in existence now. We already have ASCII tools on all of those platforms, and we are pretty much guaranteed that once a new platform is created it will have basic ASCII tools. However, it is not at all guaranteed that it will have a decoder for every binary protocol out there. That's why ASCII protocols are easier to handle than binary ones. And for 99.999% of protocol users, the savings from converting to binary would not even be measurable. Sure, for the likes of Google and Amazon the economies of scale would be substantial. But 99.999% of web users aren't humongous-scale projects; they are relatively low-tech projects for which simplicity is much more important than squeezing out every last bit of performance.
So long as nobody who needs to speak ASCII deploys a server that doesn't also speak HTTP/1.1, I think switching to a more compact binary protocol for HTTP/2.0 is a good thing. Embedded devices will be able to handle more sessions with less CPU power, for example.
I'm not convinced that, for the average embedded device, parsing HTTP headers represents a significant amount of energy spent. Is there any data suggesting that for an average device - I don't mean Google's specialized routers or other hardware specifically designed to parse HTTP - this change would produce a measurable improvement? In other words, how much longer would the battery on my iPad last? I don't think I'd gain even a single second, but I'd be very interested to see data that suggest otherwise.
All you need is one parsing tool that produces a textual representation of the binary protocol, and you can once again use grep and friends.
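As a rough illustration of that point (not any existing tool), here is a minimal Python sketch that reads raw HTTP/2 frames and prints one grep-able text line per frame. The file name and the assumption that the capture contains bare frames back-to-back are mine; real tools such as Wireshark/tshark already do this far more completely.

```python
# Minimal sketch: turn binary HTTP/2 frame headers into text, one line per
# frame, so the output can be piped to grep. Frame layout per RFC 7540:
# 3-byte length, 1-byte type, 1-byte flags, 4-byte stream id (top bit reserved).
# "capture.bin" is a hypothetical file of raw frames, assumed for illustration.
import struct

FRAME_TYPES = {
    0x0: "DATA", 0x1: "HEADERS", 0x2: "PRIORITY", 0x3: "RST_STREAM",
    0x4: "SETTINGS", 0x5: "PUSH_PROMISE", 0x6: "PING", 0x7: "GOAWAY",
    0x8: "WINDOW_UPDATE", 0x9: "CONTINUATION",
}

def dump_frames(path):
    with open(path, "rb") as f:
        while True:
            header = f.read(9)            # HTTP/2 frame header is 9 bytes
            if len(header) < 9:
                break
            length = int.from_bytes(header[0:3], "big")
            ftype, flags = header[3], header[4]
            stream_id = struct.unpack(">I", header[5:9])[0] & 0x7FFFFFFF
            print(f"stream={stream_id} "
                  f"type={FRAME_TYPES.get(ftype, hex(ftype))} "
                  f"flags={flags:#04x} length={length}")
            f.seek(length, 1)             # skip the payload

if __name__ == "__main__":
    dump_frames("capture.bin")            # e.g.: python dump.py | grep HEADERS
```

The point isn't this particular script; it's that one small decoder restores the "ASCII plus grep" workflow without requiring the wire format itself to stay textual.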