This story has been upvoted a fair bit without any discussion, so to kick that off: the author is Ross Anderson, a well-known cryptographer at Cambridge and author of the excellent Security Engineering. The article is a particularly good read for several reasons:
1. Anderson begins with an explanation of the fundamentally unique challenge of security engineering (especially in the context of cryptography) - that is to say: it's inherently more difficult to develop software designed to withstand hostile environments and users. This requires a set of skills and analyses that are not necessarily required in many other domains of software engineering.
2. He provides interesting historical examples (from a contemporary perspective!) of real-world cryptographic failures in the context of banking software, especially ATMs. At the time this was written (1993), ATM fraud was in its relative infancy, and an understanding of ATM software compromise was not as pervasive as it is today. These days you can see ATM fraud covered at a typical Black Hat or DEF CON conference every year, but back then Anderson was providing commentary that was not widely known. ATM fraud was starting to become a headlining issue in the United Kingdom at the time, but the author had known about it since the 1980s through his work as a security consultant for various financial institutions.
3. Anderson distinguishes between cryptosystem design failures, cryptosystem implementation failures, and non-cryptographic infrastructure failures. He provides examples of each (complete with technical descriptions of how certain algorithms work) to demonstrate the nuances of different security attacks, and to show that while a great deal of focus can be put into the research and development of a virtually perfect cryptosystem, it can still fall apart in algorithm implementation or software infrastructure. This is a great perspective because of its historical context: the 1990s were a decade in which the government no longer had a significant intellectual advantage over industry and academia in the field of cryptography (unlike, say, the late 70s). It was straightforward (if not "easy") to develop sound, custom cryptosystems from a design perspective, in that you could do so legally, without government involvement and without licensing from one or two quasi-government entities. But the software engineering standards for safely implementing and supporting these designs were not yet as mature; moreover, the focus on rigorous cryptographic implementation (as opposed to rigorous mathematical design in isolation) was not as pervasive then as it is now. This dovetails with Paul Kocher's research a few years later, which essentially founded the field of side-channel analysis and the study of implementation security failures (see the sketch after this list).
4. The article even touches on the political and industrial challenges of making real security improvements in consumer banking software. As mentioned before, ATM fraud was a headlining issue in the United Kingdom, and the consequent erosion of consumer trust in the financial sector was becoming a real problem for politicians. In turn, the government was putting pressure on companies to enact security improvements. Anderson's article is a response to this zeitgeist; he first explores the ineffective ways the public and private sectors interact to enact security changes, then proceeds to make proposals on how the two could cooperate to meaningfully improve security standards in the future.
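To make the design-versus-implementation distinction in point 3 concrete, here is a minimal sketch in Python (my own illustration with hypothetical function names, not anything from Anderson's paper) of the kind of failure Kocher's work later formalized: a MAC check that is sound on paper but leaks information through its running time.

    import hmac

    # A sound design undone by its implementation: this byte-by-byte check
    # returns as soon as it hits a mismatch, so the response time reveals
    # how many leading bytes of the MAC an attacker has guessed correctly.
    def insecure_verify(expected_mac: bytes, submitted_mac: bytes) -> bool:
        if len(expected_mac) != len(submitted_mac):
            return False
        for a, b in zip(expected_mac, submitted_mac):
            if a != b:
                return False  # early exit: timing depends on the guess
        return True

    # The fix inspects every byte no matter where a mismatch occurs, so
    # the running time carries no information about the attacker's input.
    def secure_verify(expected_mac: bytes, submitted_mac: bytes) -> bool:
        return hmac.compare_digest(expected_mac, submitted_mac)

The cryptography itself (the MAC) can be flawless and the system still falls to this class of attack, which is exactly the gap between design rigor and implementation rigor that Anderson highlights.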
Here's what he has to say about so-called "Trusted Computing".
"1. What is TC - this `trusted computing' business?
The Trusted Computing Group (TCG) is an alliance of Microsoft, Intel, IBM, HP and AMD which promotes a standard for a `more secure' PC. Their definition of `security' is controversial; machines built according to their specification will be more trustworthy from the point of view of software vendors and the content industry, but will be less trustworthy from the point of view of their owners. In effect, the TCG specification will transfer the ultimate control of your PC from you to whoever wrote the software it happens to be running. (Yes, even more so than at present.)"