Then, unless we are speaking about a non-conformant ISO C implementation for bare-metal deployments, it provides the initialization that runs before main() starts, floating-point emulation, signal handling on non-UNIX OSes, and VLAs.
How Apple, Google, Microsoft, Sony, ARM are now steering the industry regarding their OS SDKs has already proven who is on the right side.
I have been at this since the BBS days, I don't care about Internet brownie points.
The only thing left is actually having some liability in place for business damages caused by exploits. I am fairly confident that it will eventually happen, even if it takes a couple more years or decades to get there.
Claiming that a little process startup code (that isn't really part of a particular language, but is much more part of the OS ABI) was easier to exploit than an entire JRE is just dishonest.
I would never think of "floating point emulation, handling of signals on non-UNIX OSes, VLAs" as anything resembling a "runtime". These are mostly irrelevant anyway, but apart from that they are just little library nuggets or a few assembly instructions inserted as part of regular compilation.
By "runtime", I believe most people mean a runtime system (like the JRE), and that is an entirely different world. It runs your "compiled" bytecode because that can't run on its own.
I care about computer science definitions, not what most people think.
Dishonest is selling C for writing any kind of quality software, especially anything connected to the Internet, unless one's quality metrics are very low.