

Charles Stross: Where we went wrong - rsaarelm
http://www.antipope.org/charlie/blog-static/2010/08/where-we-went-wrong.html

======
tptacek
The only point in here that's defensible is (5), and that's because it's so
vague that it doesn't mean much.

(1) Split I and D memory isn't a silver bullet against memory corruption
flaws; "Harvard" architecture machines have had remote code execution flaws.
Attackers aren't writing directly to program text; they're writing to data
structures that effect command & control (most famously the stack) inside the
process. Over the past 10 years, _randomization_ has been more effective than
any explicit control over what can or can't be executed.
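
To make it concrete, here's a contrived sketch (made-up names, but the classic shape): the strcpy() below never writes to program text; it overruns a data buffer, and the saved return address it clobbers is data too, so a split I/D machine is just as owned.

    #include <string.h>

    /* Hypothetical vulnerable routine: the whole attack happens in
       data memory. */
    void greet(const char *attacker_input)
    {
        char name[16];
        strcpy(name, attacker_input);  /* overruns name[], then the saved
                                          return address next to it on
                                          the stack */
    }

    int main(int argc, char **argv)
    {
        if (argc > 1)
            greet(argv[1]);
        return 0;
    }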

(2) Most modern memory corruption flaws don't involve someone's strcpy(). It's
far more common to find math problems in the handling of counted data
structures --- you know, like the ones Charlie Stross praises. Meanwhile, if
you want "safe" string handling, you use a string library.
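
For instance (a contrived sketch, not any particular bug), the counted-length class looks like this: two attacker-supplied 32-bit lengths whose sum wraps, so a copy that looks bounds-respecting smashes the heap:

    #include <stdint.h>
    #include <stdlib.h>
    #include <string.h>

    /* Both lengths come from a counted wire format. If len1 + len2
       wraps modulo 2^32, malloc() returns a tiny buffer and the
       memcpy()s write far past its end. No strcpy() in sight. */
    char *join(const char *a, uint32_t len1, const char *b, uint32_t len2)
    {
        char *buf = malloc(len1 + len2);
        if (buf == NULL)
            return NULL;
        memcpy(buf, a, len1);
        memcpy(buf + len1, b, len2);
        return buf;
    }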

(3) We have pervasive TCP/IP encryption today. It does nothing to address
computer integrity. Encryption only keeps attackers from talking to you if you
can authenticate and authorize every connection. You talk to thousands of
sites daily; authorizing them is untenable. Meanwhile, this business of
"listeners promiscuously logging traffic and cracking it at leisure" (side
note: nobody logs "promiscuously" anymore): it takes an awful lot of leisure
to crack AES128, or even DES-EDE.
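
Back-of-the-envelope on that "leisure", assuming (generously) a trillion key trials per second:

    #include <stdio.h>

    int main(void)
    {
        double keyspace = 3.4028e38;  /* ~2^128 possible AES-128 keys */
        double rate     = 1e12;       /* assumed key trials per second */
        double year     = 3.156e7;    /* seconds per year */
        /* Prints roughly 1e19 years to exhaust the keyspace. */
        printf("%.2e years\n", keyspace / rate / year);
        return 0;
    }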

(4) "The world wide web" has problems, but it's unclear that Stross can
actually point them out. Turning Javascript on is unlikely to make you "some
script kid's bitch" (turning _Java_ on is another story). People turn off
Javascript to avoid XSS, which makes one site a bitch, but not your whole
browser.

(6) _Bloody_ Microsoft took security seriously sooner than any other large
vendor, as we're all discovering to our chagrin today. Nobody took software
security seriously in 2000 except for software security practitioners. It's as
true in 2001 as it was in 1995 that a skilled attacker could have popped any
Sun machine, any sufficiently complex web app, any document viewer, any
graphics format, or any SSL stack. So why blame Microsoft? They at least got a
grip on the problem.

Let me be direct: things would have been just as bad if our CPUs had split I&D
memory. They'd have been just as bad if everyone used Pascal strings. They'd
have been just as bad if we had pervasive IPSEC from day #1. They might be
better if we didn't have the WWW, but then, we wouldn't care how good or bad
things were (like we didn't seem to care so much when people were owning up
DMS500 switches at the phone company in 1991). Things would have been just as
bad --- maybe, believe it or not, worse --- had Solaris been the dominant OS
in 2000.

The problem is, nobody really knows how to make correct software. It's a core
problem in Software Engineering and it's unsolved. Without correct software,
you can't have secure software. Sorry.

~~~
Daniel_Newby
Re. #1, Harvard architecture also does not solve problems with source code
injection attacks, as we see with SQL queries and complex data structures that
direct the flow of execution.
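
A contrived sketch of why: the injected "code" is a string the program assembles at runtime, so it lives entirely in data memory no matter how the CPU treats program text.

    #include <stdio.h>

    int main(void)
    {
        /* Hypothetical attacker-supplied input. */
        const char *name = "x' OR '1'='1";
        char query[256];

        /* The input is spliced into the source text of the query, so
           data becomes code. A Harvard split never sees it. */
        snprintf(query, sizeof query,
                 "SELECT * FROM users WHERE name = '%s'", name);
        printf("%s\n", query);  /* matches every row in the table */
        return 0;
    }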

Harvard architecture is a pain in the ass anyway. When I designed a Harvard
architecture chip (8051 derivative) into a product, I added an external
address space mapper to turn it into a von Neumann architecture. The software
folks had to write a custom file system/linker/loader and were _thankful_. The
alternative was painfully reprogramming the chip with every recompile.

Re. #5, the large system developers I know _love_ them their C# and sing the
praises of Microsoft for making C/C++ avoidable. Their 500 kLOC distributed
system integration project would be nearly untenable if they had to manually
get every container iteration and bounds check exactly right. And Microsoft
does not talk about it much, but I get the impression that they have been
running heavy code reviews and static analyzers on their C/C++ software. At
least there seems to have been a decline in exploitable bounds check errors.
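
The kind of slip they get to stop worrying about is as mundane as this (a made-up example; a checked runtime turns it into an exception instead of silent corruption):

    int main(void)
    {
        int totals[8];
        /* Off by one: <= should be <, so totals[8] gets written, one
           element past the end of the array. C happily allows it. */
        for (int i = 0; i <= 8; i++)
            totals[i] = 0;
        return 0;
    }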

~~~
tptacek
Every piece of Microsoft software that has a customer-visible name --- and
many that don't --- has been run through a gauntlet of static source code
analyzers, has been reviewed by developers trained in secure C/C++
programming, has been "threat modeled" by internal and external teams to find
design alternatives that reduce exposure to threats, and has been subjected
to multiple external pen tests, often on each dot release.

The result is demonstrably imperfect, which just underscores the point that we
don't know how to produce secure software "in the large".

Here's a shorthand: if Daniel J. Bernstein can't get it right, it's absurd to
suggest that any software company's going to do any better.

------
bryanlarsen
Or maybe it's "where we went right". Let's imagine a world where TCP/IP was
encrypted. What consequences would this have? 1: it would have spread a lot
slower. Effective encryption was VERY expensive 30 years ago. 2: The powers in
charge would have been much less likely to let commercial interests and non-
research institutions connect.

The most likely possibility in that environment? It may well have caused AOL
to win. Remember, network effects are hugely important. Sure, the digerati
would have had accounts on CompuServe or the WELL, but they would also have
had an AOL account because everybody was on AOL.

I shudder.

~~~
india
Exactly. Von Neumann architecture, NUL string termination in C, and the lack
of encryption in TCP/IP are all arguably things we got spectacularly right.
All three are efficiency + simplicity vs. features + complexity trade-offs.
Imagine how much extra energy, latency, and how many points of failure would
be introduced into the system if almost every network device were required to
encrypt/decrypt every packet. Naughty tricks like error correction and the
zillion others our everyday game programmer relies on would be impossible.

------
terra_t
Bull. It's not about technology, it's about people.

Back in the 1960s, people had a fear that there was going to be this one big
mainframe, attended to by a bunch of priests, that would rule the world. (See
"Colossus: The Forbin Project".)

Just a decade later, in Donn Parker's 1976 book, "Crime by Computer",

<http://www.amazon.com/Crime-Computer-Donn-B-Parker/dp/0684155761>

we see that the computer crime landscape is substantially the same as it is
today. We see embezzlement, computers being used to create thousands of false
insurance policies, data theft through timesharing terminals, physical attacks
on computers, and concerns about privacy. The only thing that's missing is
phishing... And this is just before microcomputers hit the market.

By the early 1980s the "cyberpunk" genre was established in science fiction
and Neil Young sang "Computer Cowboy" on his album Trans... By then we knew
the threat of computers was anarchy, not total government control.

The fact is, evil is in the heart of man. People are going to use whatever
technology is available to do what they're going to do. Criminals use cars,
air travel, and telephones every day. We can certainly close off certain
avenues of technological attack, but as long as there is a motive, people are
going to find the opportunity to commit crimes.

~~~
jacquesm
> The fact is, evil is in the heart of man.

Only in the heart of those that commit crimes. And we try to design our
society in such a way that we reduce the crime. And the internet is now part
of our society, which is very technological in nature.

So we have to design the internet in such a way that we take into account the
reality that some people are not nice.

And that makes it about technology, not about people.

~~~
jdietrich
> Only in the heart of those that commit crimes.

<http://en.wikipedia.org/wiki/Stanford_prison_experiment>

<http://en.wikipedia.org/wiki/Milgram_experiment>

~~~
bryansum
I don't find anything about Milgram demonstrating evil in man; it
demonstrated a willingness to take orders from a perceived authority.

------
swombat
_According to one estimate pushed by the FBI in 2006, computer crime costs US
businesses $67 billion a year. And identity fraud in the US allegedly hit
$52.6Bn in 2004._

 _Even allowing for self-serving reporting (the FBI would obviously find it
useful to inflate the threat of crime, if only to justify their budget
requests), that's a lot of money being pumped down a rat-hole. Extrapolate it
worldwide and the figures are horrendous — probably nearer to $300Bn a year.
To put it in perspective, it's like the combined revenue (not profits; gross
turnover) of Intel, Microsoft, Apple, and IBM — and probably a few left-overs
like HP and Dell — being lost due to deliberate criminal activity._

I call bullshit. These numbers are, imho, just as made up as the RIAA's
"losses to piracy" numbers. I don't believe them, not even for one second. The
real numbers could be as much as 3 or 4 orders of magnitude smaller.

~~~
tptacek
The real numbers are no doubt N orders of magnitude smaller, but 3-4 sounds
high. Could we lose billions of dollars a year to crime abetted by software
insecurity? Yes.

At a first pass, remember that every piece of hardware and software that every
company buys to address these problems is part of the cost. Antivirus alone
gets us over a billion.

------
jacquesm
C doesn't really have a string type; it just has something called 'pointer to
character' and another thing called 'array of characters'. The standard
library, not the C language, is what implements the string functions, and the
standard library was not written in a way that hides the guts of the
implementation from view, so it is nearly impossible to fix after the fact.
The few string routines that are now 'overwrite safe' notwithstanding, the
'old' stuff is still in use and plenty of new code is still being produced
using these unsafe functions. NUL-terminated strings are a convention, not a
language implementation detail. So the blame does not really go to the C
language but to the library implementation.
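
To illustrate the convention (a trivial made-up example): the language sees only an array of char; it's strlen() and friends that give the trailing zero byte its meaning.

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        /* To the language this is just four chars; '\0' is only special
           because the library functions scan for it. */
        char s[4] = { 'a', 'b', 'c', '\0' };
        printf("%zu\n", strlen(s));  /* 3 */

        /* Fill the array with four non-zero bytes instead, and strlen()
           would read past the end of it. */
        return 0;
    }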

Another thing missing from Stross's list is homoglyphs.

~~~
glymor
You're forgetting string literals, which are part of the language.

~~~
jacquesm
Yes, that's true. They are supposed to be stored in 'immutable' memory
though, and will cause a segfault if you try to write to them.

    
    
    jam@jam-desktop:~$ cat y.c
    main()
    {
        char * s = "abc";
        *s = 'c';
    }
    jam@jam-desktop:~$ ./y
    Segmentation fault
    

But you're right that they are a part of the language.

~~~
kabdib
Platform-dependent behavior; what crashes on your Linux box doesn't
necessarily crash on the embedded system running your flight control
software...

~~~
tptacek
The immutability of string literals (at least, the property Jacques is
demonstrating) isn't platform-dependent.

~~~
sfk
From the standard [6.4.5 String literals]:

"If the program attempts to modify such an array, the behavior is undefined."

Hence there is no requirement to segfault.

~~~
tptacek
This is just message board geekery. Nobody could reasonably argue that the
requirement is to "segfault"; C runs in many places where there's no such
concept. Jacques' point was that C string literals are immutable. Stop
bickering with him; he's right.

~~~
sfk
I appreciate your confidence, but you are wrong:

OpenSolaris/suncc, Jacques' program:

    
    
      $ cat immutable.c
      #include <stdio.h>
    
      int main()
      {
        char * s = "abc";
        *s = 'c';
        printf("%s\n", s);
        return 0;
      }
    
      $ cc -o immutable immutable.c 
      $ ./immutable 
      cbc
    
    

EDIT: I can't reply to you, so I have to take this route. I also think that
this behavior is not desirable. However, I do not see the passage in the
standard that _forbids_ writable strings. On the contrary, in Annex J
writable string literals are explicitly allowed as an extension, and I don't
see any indication that the presence of writable strings makes an
implementation non-conforming.

Perhaps you could point to the passage in the standard that requires string
literals to be immutable (your words, not mine) or that requires a compiler to
issue a warning if a string literal is modified.

If you still wonder why I'm arguing this: It is not for the sake of
nitpicking, but to support kabdib's post.

~~~
tptacek
That Sun's CC does something goofy here has nothing to do with the argument.
Code written to depend on that weird behavior is unreasonable. Why are you
arguing this point? I don't get what you think you're proving here.

There are plenty of platforms that C runs on that are incapable of making
anything immutable, including _program text itself_. That doesn't make C
strings less immutable. Why isn't your compiler yelling at you for doing that?
I think it's broken!

------
nickpinkston
These all seem pointless next to the inevitable social engineering that led
John in accounting to give that Excel spreadsheet of bank info to "Dan" from
an "outside auditing firm". PEBKAC!!

------
endtime
>And Microsoft, by dropping security support for older OSs, aren't helping the
problem.

I disagree with this. The only way to get users off defunct OSes, and onto the
newer, fundamentally more secure ones, is to stop supporting the old ones.

~~~
tptacek
It's also a total red herring.

Microsoft spends serious dollars shoring up the security of Windows XP, a
10-year-old operating system. What kind of security support do you think OS X
10.0 customers get?

Stross' comment here is probably sparked by news stories about Microsoft
dropping support for Win2k and XPSP2. But the solution to the XPSP2 problem is
simply to upgrade to XPSP3. I'm not sure, but I think that if you had auto-
update on (and you're crazy if you don't), you got that automatically.

If you're deployed permanently on Win2K, nothing Microsoft does is going to
make you secure. Win2K lives in places where access to the same Ethernet
collision domain is already game-over for an attacker.

It is simply unreasonable to suggest that Microsoft is making people less
secure by encouraging them to get off operating systems that were first
released to QA in 1998.

 _NB: as always, I have to post the disclaimer that while we've done work for
MSFT in the past, our entire company is standardized on Apple hardware and
Apple operating systems._

------
sprout
>User education, or the lack of it. (Clutches head.) I have seen a computer
that is probably safe for most users; it's called an iPad, and it's the
digital equivalent of a fascist police state: if you try to do anything dodgy,
you'll find that it's either impossible or very difficult.

I'd be interested to see some security studies that compare banking behavior
on iOS/Android vs. desktop. I don't access financial data on my phone, but my
impression is that the way these devices are designed to be used without much
training makes it a lot easier for social engineering to succeed, which is the
primary attack vector anyway.

------
jemfinch
C++ hasn't been compiled to C in any production compiler for a rather long
time.

~~~
tomjen3
Properly not, but the fact is that it can, and has been. Therefore it is still
possible to do so today, which means that his claim still stand.

~~~
tptacek
The point wouldn't make sense even if its premise were accurate. The JVM is
also implemented in C. But you don't see a lot of ASCIIZ overflows in Java
code. The C++ std::string class doesn't suffer from C's wild-west memory
handling, even on cfront-style compilers.

------
will_critchlow
Even the App Store isn't immune to attack. Wasn't there a trojan-type attack
recently where an apparently innocent program had a dual purpose?

I don't have a link to hand I'm afraid...

