The typing of file / stream data is interesting, but I don't think it's realistic - it would require all tools to change virtually overnight to preserve the headers. And it's not enough just to preserve headers: what if you pump stuff out to a temporary file and read it back in a few milliseconds later? What about compound files, with data chopped out of them? If you want pretty pictures, syntax highlighting, etc., ISTM it would be better to have an enriched terminal communication protocol, along with a couple of viewer programs that can render data formats to it, than to change the whole world to type every data flow everywhere.
Good shells predate object-pipeline-based tools - PoSH is only a few years old. And older tools can still run under it with no degradation in quality from their previous environments.
This is really compelling.
PowerShell - I do indeed wish Unix shells worked a little more like it for jobs like parsing ps output portably across platforms, but it has poor performance for what I use the shell for, because it doesn't run each stage of a pipeline as a separate process or thread - everything is single-threaded. I frequently run sed, awk, grep and sort over multi-GB files and rely on processes and job control (fork-join through mktemp, & and wait, and xargs -P) to make that work quickly.
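The fork-join pattern mentioned above can be sketched roughly like this - file names and the workload here are fabricated stand-ins for the multi-GB case, assuming a POSIX sh plus an xargs that supports -P:

```shell
#!/bin/sh
# Sketch of shell-level fork-join using mktemp, &, wait and xargs -P.
set -e

tmpdir=$(mktemp -d)
trap 'rm -rf "$tmpdir"' EXIT

# Fabricated sample inputs standing in for large data files.
seq 1 1000   > "$tmpdir/a.txt"
seq 500 1500 > "$tmpdir/b.txt"

# Fork: sort each file in its own background process.
for f in "$tmpdir/a.txt" "$tmpdir/b.txt"; do
  sort -n "$f" > "$f.sorted" &
done
wait  # Join: block until every background job has finished.

# Merge the already-sorted partial results in a single pass.
sort -n -m "$tmpdir/a.txt.sorted" "$tmpdir/b.txt.sorted" > "$tmpdir/all.sorted"

# xargs -P variant: up to 4 worker processes, one file per invocation.
printf '%s\n' "$tmpdir/a.txt" "$tmpdir/b.txt" | xargs -P 4 -n 1 wc -l
```

The point is that each stage is a real OS process, so the kernel scheduler spreads the work across cores, which an in-process, single-threaded pipeline model can't do.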
But if, as I mentioned, the terminal supported a richer communication protocol - one smart enough to render full 32-bit images, including alpha - then a version of ls could be written that does what his ls command does, without needing to go as far as he does.
Presumably the "view" for `ls` will be configurable, so this is not an important issue. The worrying part for me is that it's another of layer of separation between me and my data.