Interesting that he doesn't say why here. There's no "it simply had better performance", "it did more with the power budget" or other reason.
I suspect Intel were quite eager to have a customer and case study to launch with.
This customer designs industrial HVAC systems, which suck hundreds if not thousands of kilowatt-hours per day — tens to hundreds of kilowatts of continuous draw. And they're excited that the Quark draws a few milliwatts less than a competitor's part? Or that it saves a few dollars in the $100,000 bill of materials?
A better case study would be someone running a wireless sensor off a battery in a cornfield or something. Show us the performance per watt.
(McAfee security software? really?)
It's a name people who wear hard hats recognize. Commercial HVAC people aren't hip like us.
For a couple of years now I've noted that one advantage ARM had that seemed quite durable was that you could put it into an SoC of your own design, but you could not do that with an x86 chip. If Intel is willing to allow that, it is a potent weapon.
(I guess one thing the article mentions is that Quark is potentially more secure than the alternatives. Presumably embedded systems security will be a bigger issue as we approach the future Internet of Things.)
>Presumably embedded systems security will be a bigger issue as we approach the future Internet of Things.
Embedded systems security, or the lack of it, is a big issue now.
Less fun was the associated project of getting the GNU assembler and linker to run under 16-bit DOS. I recall that at the time the linker consisted of one giant source file, and one of the tools (I think the assembler) contained the wonderful declaration: