Oddly, this is much more like the 'developers' of old. If you sat down at a workstation you needed to know how to be your own system administrator, and you needed to write code.
Automation has enabled a fairly new class of engineer: someone who has no idea how the pieces all fit together, but who can assemble them, with the help of a framework and a toolkit, into useful products. They become experts at debugging the toolkit and the framework, but have little knowledge of how everything underneath actually works.
The problem with this new type of coder is that they can write syntactically correct but impossible programs. I didn't understand that until I taught myself VHDL (a hardware description language). VHDL was the first "language" I knew of where you could write syntactically correct "code" that could not be synthesized into hardware. The language's expressiveness exceeded the hardware's capabilities (and sometimes you would need to have a time machine). Imagine a computer language where 1/0 was a legitimate statement, not caught by the compiler, but always blew up in your executable.
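For a rough analogy in a "software" language (my sketch, not part of the original point): Python's compiler will happily accept a division by zero; nothing complains until the statement actually runs.

```python
# The compile step only checks syntax -- 1/0 passes without a peep.
code = compile("x = 1 / 0", "<example>", "exec")  # no error here

# Only executing the compiled code triggers the failure.
try:
    exec(code)
    survived = True
except ZeroDivisionError:
    survived = False

print("survived:", survived)  # always False -- it blows up every time
```

The synthesis gap in VHDL is the same shape, just one stage later: the code clears every check the toolchain applies, and the impossibility only surfaces when you try to turn it into real hardware.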
So we have folks who can write code that is grossly inefficient or broken on "real" systems.
When I was there, Google had started a program to have developers spend time in SRE (their DevOps organization); the idea was to give them an understanding of what went on in the whole stack so they could write better products. Jeff Dean's famous "numbers every programmer should know" was another such tool. You cannot get too far away from the systems that are going to run your code if you want to write performant code.
In 2009, DevOps seemed like it was finally a reasonable answer to Taylorism. Engineers and Programmers and Hardware Technicians and Support Representatives were not cogs in a machine, but humans who could collaborate outside of rigid boundaries. Even at the lowest levels of the support organization, individual workers along the chain could now design their own tiny factories.
From there, it's just a matter of communicating tolerances properly up and down the chain. I am probably over-romanticising these notions, but it certainly felt exciting at the time. Not at all like the "fire your IT department" worldview it turned into.
... isn't that all languages?
Second, my only point was that the example given was a piss poor example of the difference between hardware and software. Obviously a bad example doesn't disprove the claim it's supposed to support.
There's been an awkward growing phase of the technology industry that has produced technicians who don't have any real understanding of the systems they maintain. Compare and contrast Robert De Niro's character in Brazil with the repairmen he has to clean up after. We could be training those poor duct maintenance guys better.
Chuck points out that abstracting the Developer's work too far away from the system in question means the Developer doesn't really understand the system as a whole. Jeff refers to "purely development roles" and other "pure" roles that aren't necessarily natural boundaries.
The example of VHDL is not about hardware and software, but about learning that you didn't actually know something you thought you knew.
The repairmen in Brazil do not realize (or necessarily care) what they don't know about duct repair. The system allows them to function and even thrive, despite this false confidence in their understanding.
At one point at least, Google was investing in (metaphorically) having DeNiro cross-train those guys, instead of umm... Well, the rest of the movie.
The initial point was that VHDL, unlike "software" languages, has very different failure modes. Can you imagine a language where (1 / 0) wasn't defined away as a DIVERR, but otherwise managed to remain mostly self-consistent? Where something can be logically / syntactically coherent, but not physically possible?
And if that example didn't hit home for you, so it goes, but there was plenty of detail unrelated to the specific example that I thought was more important / interesting to discuss. :shrug: