This illustrates the point that even now, in 2019, there is practically no OS designed for security. I mean, security was never a real goal. Even software written specifically to address security requirements can have gaping holes (see Heartbleed)...
What I painfully learned is that hardly anybody, except occasionally the US government, is willing to compromise on features or pay one cent more for a secure RTOS.
Look at it another way. The market for cybersecurity is projected to hit $300 billion by 2024, according to a recent report. If we only released secure software, that $300 billion market wouldn't exist. In that sense, it is vastly more economically viable to release insecure software than to spend any effort on securing it.
Yeah, but that's a huge opportunity, because they have tons of money and constantly overpay massively for things.
> If we only released secure software, that $300 billion market wouldn't exist
This is the broken window fallacy. In fact, what you are saying is that there is $300 billion of value to be created by making secure software.
The other important difference is that the software vendor, unlike the glazier, doesn't directly benefit from folks breaking its software.
A better analogy is the introduction of the iPhone creating the iPhone accessories market. For lots of reasons, Apple decided not to make the iPhone crush proof, allowing Otter and others to sell protective cases for it.
If you made the software secure in the first place, you would have secure software and $300 billion to spend on something else (guns, butter, or what have you).
That seems like the very definition of the broken window fallacy to me. But hey, if it's not, it's still a fallacy.
Of course we haven't factored in the extra cost of making the software secure in the first place. If that costs vastly more than $300 billion, it is vastly more economically viable to just make broken software, but I don't think that was the intention of the statement.
> The URGENT/11 vulnerabilities affect the noted VxWorks versions since version 6.5, but not the versions of the product designed for safety certification – VxWorks 653 and VxWorks Cert Edition, which are used by selected critical infrastructure industries such as transportation.
Seems to me this means that a "certified" "safe" version exists, but that a lot of companies used the "normal" edition (most probably to save money), and, indirectly, that the differences between the "normal" and "certified" editions were known, at least to the developers/company actually making VxWorks.
It would be odd if the "certified" editions had "different" mechanisms implemented (for completely different reasons) and were only coincidentally more secure.
For example you can get a "medical grade" QNX but the certificate only covers the kernel, so you have to write and verify the entire userspace yourself.
>... which are used by selected critical infrastructure industries such as transportation ...
... as they do not contain the vulnerable TCP/IP stack.
What will happen is you'll purchase a certificate for the RTOS kernel plus a few critical components. Then you can choose to use any other off-the-shelf components that the vendor or third parties provide. Those parts don't have to be safety-critical, but if a defect is found in uncertified software it's not VxWorks's problem.
VxWorks is very clearly and concisely stating that the safety-critical certified components are not affected. But they're not going to make statements about the systems their safety-critical clients built. That's not their responsibility. And Armis is almost certainly reprinting a statement from VxWorks. Both Armis and VxWorks are leaving it up to each VxWorks customer to determine whether their particular configuration of Safety-Critical VxWorks uses a vulnerable stack as an add-on.
There are two possibilities:
1) Armis tested those certified environments and couldn't replicate the bugs
2) Armis did not test those certified environments and simply reprinted the VxWorks statement
If #2, they should instead have written: "we could not test the vulnerabilities on the 'certified' versions, but we believe VxWorks' assurance that they are not vulnerable (because ...)"
If an entire "family" of versions of the OS is vulnerable to these 11 (eleven) bugs whilst the "certified" versions are vulnerable to none of these specific 11 (though not necessarily immune to another 11, or maybe 12, different ones), it means the certified versions are different.
Small volume, and thus less attractive targets, might explain why no one has found the hypothetical "other 12" vulnerabilities in the "certified" versions (yet): no time has been spent looking for them.
> Stack overflow in the parsing of IPv4 options (CVE-2019-12256)
> Four memory corruption vulnerabilities stemming from erroneous handling of TCP's Urgent Pointer field (CVE-2019-12255, CVE-2019-12260, CVE-2019-12261, CVE-2019-12263)
> Heap overflow in DHCP Offer/ACK parsing in ipdhcpc (CVE-2019-12257)
> DoS via NULL dereference in IGMP parsing (CVE-2019-12259)
While a safer language wouldn't make the remaining logic flaws disappear, there would be 7 fewer vulnerabilities.
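To make the point concrete, here is a hedged sketch of what parsing an IPv4 options field looks like in a memory-safe language. This is illustrative code of mine, not VxWorks's actual parser: a malicious option whose claimed length runs past the buffer produces a parse failure instead of an out-of-bounds read or write.

```rust
// Parse IPv4 options encoded as [type, length, data...].
// In a memory-safe language, `slice::get` returns None on an
// out-of-range access, so a lying `length` byte cannot corrupt memory.
fn parse_ipv4_options(buf: &[u8]) -> Option<Vec<(u8, Vec<u8>)>> {
    let mut opts = Vec::new();
    let mut i = 0;
    while i < buf.len() {
        let kind = buf[i];
        if kind == 0 {
            break; // End of Option List
        }
        if kind == 1 {
            i += 1; // No-Operation: single byte, no length field
            continue;
        }
        // Multi-byte option: needs a length byte, and length >= 2
        // (it counts the type and length bytes themselves).
        let len = *buf.get(i + 1)? as usize;
        if len < 2 {
            return None;
        }
        // If the claimed length exceeds the remaining buffer, this is
        // a clean parse error rather than a stack/heap overflow.
        let data = buf.get(i + 2..i + len)?.to_vec();
        opts.push((kind, data));
        i += len;
    }
    Some(opts)
}

fn main() {
    // Well-formed: one NOP, then End of Option List.
    assert!(parse_ipv4_options(&[1, 0]).is_some());
    // Malicious: the option claims 255 bytes but the buffer holds 4.
    assert_eq!(parse_ipv4_options(&[7, 255, 0, 0]), None);
    println!("ok");
}
```

The same lying-length packet handed to a C parser that trusts the length byte is exactly the shape of bug behind the overflow CVEs above.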
OSes and languages that are highly constrained in order to minimize attack surface tend to be challenging to work in. As a result they reduce productivity, which increases time to market, and the folks who got something out the door, even insecure, would "win" the market. It was a sad thing to see happen.
The other aspect was that IPC via message passing is a very natural way to program.
That it has never taken off is more evidence that there's no money in securing software, just cleaning up the mess insecure software leaves behind.
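The message-passing style mentioned above can be sketched with an in-process analogue (a toy of mine using Rust channels, not any particular microkernel's IPC API): a "server" task owns its state and is reachable only through typed messages, so no other task can touch its memory directly.

```rust
use std::sync::mpsc;
use std::thread;

// Messages a client can send to the server task.
enum Request {
    Add(u64),
    Get(mpsc::Sender<u64>), // reply channel for the current total
}

// Spawn a server owning a running total, feed it values over the
// channel, then query the total via a reply channel.
fn sum_via_messages(values: &[u64]) -> u64 {
    let (tx, rx) = mpsc::channel();
    let server = thread::spawn(move || {
        let mut total = 0u64; // state only the server can reach
        for msg in rx {
            match msg {
                Request::Add(n) => total += n,
                Request::Get(reply) => {
                    let _ = reply.send(total);
                }
            }
        }
    });
    for &v in values {
        tx.send(Request::Add(v)).unwrap();
    }
    let (rtx, rrx) = mpsc::channel();
    tx.send(Request::Get(rtx)).unwrap();
    let total = rrx.recv().unwrap();
    drop(tx); // closing the channel lets the server loop end
    server.join().unwrap();
    total
}

fn main() {
    println!("total = {}", sum_via_messages(&[2, 3])); // prints "total = 5"
}
```

The appeal is that the only way to corrupt the server's state is to send it a well-typed message it chose to accept.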
I am under the impression that the people behind seL4 managed to successfully commercialize other, earlier versions of L4 before seL4 was created.
Anyway, even if we grant the premise that seL4 has not taken off, that does not seem to justify saying that there is no money in securing software.
I might hazard to say that (in my opinion) no OS written in a memory unsafe language is secure by design.
EDIT: regarding SYSGO GmbH.
Correctness is a goal of many operating systems.
Common Criteria distinguishes between security functions and assurance (the effort spent verifying the implementation).
You may not always foresee the requirements for 'correctness'.