So, this post isn't attempting to address the usefulness angle; that's going to be in the two follow-ups. But I'll give you a summary of where I'm going with this, because you were actually in the back of my mind when I was writing this, and I'm interested in your thoughts. First though, this post.
The first reason that this is useful is that there are a lot of people out there who believe that C is somehow fundamental to computing. You and I both know that this is false, but I run into a ton of people who don't understand this. So, this first post is setting that stage: everything is always an abstraction, in the big picture sense. That's bullet point one.
That said, interpreting people literally isn't a good way to have a conversation; you need to understand what they mean, which may or may not match the exact words they used. I think when people say this phrase, they don't mean it literally. Part of the reason is that there are some differences that matter when you zoom in from the big picture. I speculate a bit as to why, but regardless, knowing that it may not be literal is point two.
I've heard "all models are wrong; some models are useful" attributed to several people, but it's sort of the counter argument to bullet point one, and so makes up bullet point three. Drawing a distinction between the C abstract machine and the JVM can be a useful mental model, even if it's incorrect in some sense. The C abstract machine is closer to hardware than the JVM is, and even if it's not a perfect mapping, you'll be exposed to stuff that's closer than your high level language. As long as you know you're still working with an abstraction, learning C can be a great way to be exposed to this stuff. Just keep in mind that it's not magic, or particularly inherently special.
I do think that this is useful on its own. If you reduce it to a sound bite, sure, that's not interesting, but the interesting part is in the details. In some sense, this is the fundamental point of the article. You may find that boring, but that's okay; this stuff isn't really for you, in both a literal and a figurative (experienced C developers, generally) sense.
------------------------------------
Part two is going to discuss what happens if you take this to an extreme. I have two small bits of sample code that do fundamentally the same number of operations, and have the same computational complexity, but one runs much, much faster. That's because of how it interacts with caching, which isn't part of the C spec, but is the reality of x86 (at least) hardware. This exposes a fundamental tension in thinking about how the abstract machine relates to the physical machine. It's where C gets really interesting, because it's low-level enough to let you control memory allocation and access patterns, which users of higher-level languages aren't really exposed to. This is the "why is this useful," really. The task is to know which behaviors you can rely on and which you can't, and how they relate to the hardware you actually want to support.
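(The actual samples aren't reproduced here, but as a rough sketch of the kind of pair I mean, assuming a plain 2D array summed in row order versus column order:)

    #include <stdio.h>
    #include <stdlib.h>

    #define N 4096

    /* Both functions read every element exactly once: the same number of
       operations and the same computational complexity. Only the order of
       memory accesses differs. */

    static long sum_rows(int (*a)[N]) {
        long sum = 0;
        for (int i = 0; i < N; i++)         /* walks memory sequentially,  */
            for (int j = 0; j < N; j++)     /* so every cache line fetched */
                sum += a[i][j];             /* gets fully used             */
        return sum;
    }

    static long sum_cols(int (*a)[N]) {
        long sum = 0;
        for (int j = 0; j < N; j++)         /* strides N ints per access,  */
            for (int i = 0; i < N; i++)     /* so nearly every load is a   */
                sum += a[i][j];             /* cache miss                  */
        return sum;
    }

    int main(void) {
        int (*a)[N] = malloc(sizeof(int[N][N]));    /* ~64 MB */
        if (!a) return 1;
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                a[i][j] = 1;
        printf("%ld %ld\n", sum_rows(a), sum_cols(a));
        free(a);
        return 0;
    }

Nothing in the C spec says the second traversal is slower; the gap comes entirely from the cache hierarchy of the machine it happens to run on.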
------------------------------------
Part three is going to show what happens if you make a mistake with the ideas from part two. If you incorrectly assume that C's abstract machine maps directly to the hardware, you can make mistakes, and this is where UB gets dangerous. You can take advantage of some properties of the machine you're relying on, but you have to know which ones fit within C's model and which ones don't.
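(Again, not necessarily the post's example, but the classic illustration, assuming a two's-complement target like x86: an overflow check written against the hardware's wrapping behavior instead of against C's model.)

    #include <limits.h>
    #include <stdio.h>

    /* On x86, signed addition wraps around in two's complement, so this
       check "works" if you think of C as a thin layer over the hardware.
       But signed overflow is undefined behavior in C's abstract machine,
       so the compiler is allowed to assume x + 1 > x and may compile the
       whole function down to "return 0". */
    int about_to_overflow(int x) {
        return x + 1 < x;
    }

    /* The check that stays inside C's model: test before adding. */
    int about_to_overflow_safe(int x) {
        return x == INT_MAX;
    }

    int main(void) {
        /* With optimizations on, the first result is often 0. */
        printf("%d %d\n",
               about_to_overflow(INT_MAX),
               about_to_overflow_safe(INT_MAX));
        return 0;
    }

The hardware property (wraparound) is real; the mistake is treating it as one of the properties C lets you rely on.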
------------------------------------
I split this into three parts because it's an MVP, in a sense: shipping the first part means I don't keep revising the beginning over and over again. They're all related to each other, and could be one whole piece, but it's true that this one is less directly useful than the others, since it's really just setting the stage.

... does that all make sense?
I think if you'd led with the UB thing, the notion of C targeting an abstraction rather than being an abstraction would have been more compelling to me (the perf argument gets back to a place where basically everything in every language comes down to artifacts and leaky abstractions; it's worth visiting, but it doesn't persuade me in either direction).
I guess my issue, even with the whole outline laid out (thanks!), is that while C "isn't how the computer works" in sort of the same sense that a SPICE model isn't really how a circuit works and ns isn't really how a network works, it's close enough to be illuminating in ways most other languages aren't (and it has the virtue that most mainstream hardware is explicitly designed to make it, in particular, fast).
I think more developers should know C (though I think almost nobody should write in it anymore).