
Class Breaks - thegeomaster
https://www.schneier.com/blog/archives/2017/01/class_breaks.html
======
Coincoin
This is exactly why I hate electronic voting. When I argue about it, people
always tell me that it's easier to commit vote fraud with paper ballots.

Sure, it's easier for a single person to tamper with a single paper ballot or
even a poll box, but it is much harder on a large scale. Tampering with every
physical box and tally in a country requires much more effort and
organisation than exploiting a single vulnerability that applies to an endless
series of identical machines.

~~~
taeric
The biggest actual risk to voting remains the party in power. Why go through
the effort of fraud, when you can just throw out the results?

Similarly, propaganda is also more effective.

~~~
abecedarius
Why hide that you're changing the election result? Well, I can imagine cases
where it's not worth the bother, but they hardly seem like the default.

~~~
taeric
My point is look at pretty much every election where democracy actually left.
Not exactly subtle fraud going on.

(Though, I will confess I have not tabulated this, so would be very curious to
see full numbers.)

~~~
jsharpe
This is a tautology. You're saying that in every election where fraud was
obvious ("democracy actually left"), fraud was obvious.

There is probably a lot more subtle election fraud that is never detected.

~~~
taeric
Hmm... I'm trying to say in every election that was manipulated, it was not
subtle fraud that was the problem. I have little doubt that some fraud
happens. I just don't think it typically matters. In most cases where it was
not detected, it had no bearing on the outcome.

Now, overt manipulation of the voting populace through propaganda or other
means is very common and has definitely happened. Ballot fraud, though? Are
there any documented cases of that being a thing that mattered?

------
ChicagoBoy11
I can't help but think this sort of issue far outweighs whatever concern we
may have when it comes to any sort of bad AI actor. Every time I read about
the OTA updates on Tesla cars my spine tingles a bit about the possibility
(which I can only conceive is greater than 0) of some bad actor sending out
code which turns those cars into ungoverned highway missiles.

If there's one lesson that we can take from all the data breaches at top
companies and high-profile government agencies, it's that we are terrible at
securing software. The 21st-century Ted Kaczynski isn't going to need the
post office; he's got an MBP and SSH.

~~~
gist
> of some bad actor sending out code which turns those cars into ungoverned
> highway missiles

Yes, but think of all the conditions that have to hold:

a) Knowledgeable person

b) Motivated person

c) Evil or mentally unstable person

d) Determination and ability to overcome adversity.

And the person or persons who have to decide that Tesla is what they want to
exploit instead of another car manufacturer or another object, and so on.

Compare this with someone who simply wants to take their car and drive it into
a crowd.

~~~
pavel_lishin
> Compare this with someone who simply wants to take their car and drive it
> into a crowd.

You have to multiply the probability by the harm. Yes, most people will just
drive their car into a crowd, killing - let's say - 10 people.

But the man who perseveres and drives EVERY controllable car into a crowd can
kill thousands, if not millions.
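The disagreement in this subthread really comes down to which expected-harm term dominates. A toy sketch, where every number is invented purely for illustration:

```python
# Expected harm = probability of the event x damage when it happens.
# All numbers below are placeholders; the argument hinges entirely on them.

population = 1_000_000          # potential lone attackers with a car
p_individual = 1e-6             # chance any one of them acts
harm_individual = 10            # victims per lone incident

p_class_break = 1e-7            # chance someone pulls off the fleet-wide exploit
vulnerable_cars = 500_000       # cars reachable by one OTA exploit
harm_per_car = 2                # victims per hijacked car

expected_lone = population * p_individual * harm_individual
expected_class = p_class_break * vulnerable_cars * harm_per_car

print(expected_lone, expected_class)
```

With these placeholders the lone-attacker term dominates (gist's point), but the class-break term scales linearly with fleet size (pavel_lishin's point); neither side can be settled without the real probabilities.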

~~~
gist
Yes but it's infinitely easier (which is my point) for multiple people to
drive a car into a crowd killing 10 people at different times. The bar to be
able to deploy on a large scale (by a single actor, not North Korea) is much
higher and the group of people that could potentially pull it off is
infinitely smaller for at least the reasons that I mentioned.

Any one of us could drive into a crowd and kill 10 or many more people. Yet it
happens extremely infrequently, compared to at least how many times it could
happen.

------
yuchi
Very interesting read. My personal highlight is (about the _type_ of security
we are moving into):

> It's a world where driverless cars are much safer than people-driven cars,
> until suddenly they're not.

~~~
zevets
It also means any political figure who "dies in a self driving car accident"
will instantly become part of a conspiracy theory.

~~~
snerbles
The case of Michael Hastings comes to mind.

[http://nymag.com/news/features/michael-hastings-2013-11/](http://nymag.com/news/features/michael-hastings-2013-11/)

------
matt_wulfeck
If we take a cue from nature, it means diversified genetics. This means
different IoT manufacturers becoming as snowflaked as possible to avoid
diseases that affect other systems. When one system gets too large, it only
takes a single virus to bring it down.

But this runs counter to the "don't roll your own crypto" philosophy. Crypto
in this context can be any sort of important aspect of your device that you
want to secure from hackers.

Right now it's a bad idea to do things on your own, but as systems get even
larger and more interconnected, it may be your only defense.

~~~
mi100hael
I think you're missing the fact that a line of devices from a manufacturer is
still a "class." If you have thousands or millions of devices out there with
untested, DIY crypto, then that's still thousands or millions of devices
rendered malicious when a vulnerability is found and exploited. At least
vulnerabilities like Heartbleed were quickly patched and widely publicized.
IoT vendors have historically been horrible at providing updates.

------
golemotron
This seems related to the concept of Software Monoculture that Bruce wrote
about here:
[https://www.schneier.com/blog/archives/2010/12/software_monocu.html](https://www.schneier.com/blog/archives/2010/12/software_monocu.html)

It's a clear case for technological diversity. The hard problem is that
economies of scale work against technological diversity. It's similar to what
happens with monoculture in biology.

~~~
kmicklas
It's a catch-22, because diversity reduces the potential for class breaks but
also increases the probability that any given system is insecure in the first
place. (Fewer resources can be devoted to securing each unique system.)
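This catch-22 can be made concrete with a toy model, assuming (purely for illustration) a fixed total security budget split evenly across N distinct systems, and a breach probability that falls as a system's budget share rises:

```python
# Toy model of the diversity trade-off. All assumptions are invented:
# a fixed total security budget, a simple breach-probability curve,
# and a device fleet split evenly across the distinct systems.

FLEET = 1_000_000       # total devices
TOTAL_BUDGET = 100.0    # arbitrary units of security effort

def diversity_tradeoff(n_systems: int) -> tuple[float, float]:
    budget_each = TOTAL_BUDGET / n_systems
    breach_prob = 1.0 / (1.0 + budget_each)   # toy curve: more budget, fewer breaches
    expected_loss = breach_prob * FLEET        # expected devices compromised overall
    worst_single_break = FLEET / n_systems     # cap on damage from any one class break
    return expected_loss, worst_single_break

for n in (1, 10, 100):
    print(n, diversity_tradeoff(n))
```

In this sketch, more diversity shrinks the worst single class break (1,000,000 down to 10,000 devices) but raises total expected losses, since each thinly funded system is easier to breach; which matters more depends on whether you fear the average case or the tail.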

~~~
golemotron
I don't think that necessarily follows. People deciding to make a new system
with different security would factor that into the cost of the system. It's
just as likely to result in fewer systems due to increased cost.

~~~
kmicklas
That's basically my point. The level of tech diversity is set by some kind of
equilibrium between the cost of writing new code and the value of doing so.

------
rm_-rf_slash
I'm surprised this isn't getting much attention here.

There really isn't a technological solution for class breaks. Go big and you
replace one kind of class break with another (a top-tier security system for
hotel card readers just focuses the attacks on that system). Go small and it
isn't cost-effective or plausible at scale (everybody install Gentoo! Compile
your own kernels!).

Honestly, I'm stumped. We rely on these systems more and more to squeeze out
every bit of productivity we can get, but that only leaves us at the mercy of
our systems.

Is it even possible to fix this kind of situation before it really turns on
us, or are we doomed to suffer a cataclysm we cannot yet imagine?

~~~
wyldfire
The cure is worse than the disease. But it's coming. It will come as a
devastating one-two punch: the end of general purpose computing devices and
the rise of a publicly-identified Internet.

The former will be phased in as countries legislate it. Legislators won't be
concerned with the slippery slope between abacus and iPhone; the courts will
likely draw a distinction somewhere around anything north of "silicon-based
electronics with the right resources for an IP stack". A few countries will be
bastions of Free Computing, but many or most will draw up similar laws.

The latter will be held back by Metcalfe's Law until some near-global momentum
can be built. It would likely take a simultaneous attack on many global
corporations/political parties/beloved individuals, but it seems like that's
just a matter of time. Faced with repeated attacks, the public won't accept
"but we designed it that way intentionally" as valid.

~~~
zevets
Abandoning general-purpose computing is too crazy, as it amounts to throwing
away software. The anonymous internet is a dead man walking, but Silicon
Valley is as complicit in this as the government will be.

However, software engineering will change into a much more regulated
environment. Writing software for the web will eventually become like writing
software for the space shuttle: slow, rigorous and very expensive.

It will also kill the hacker culture, as hacking together a piece of software
will become like hacking together a bridge: illegal.

~~~
rm_-rf_slash
IANAL but I assume that if I purchased a tract of land and built a bridge on
it, I could create it as I please, given that I would be liable for damages
should the bridge break and harm somebody on my property, not unlike a slip-
and-fall.

I for one would prefer most vital software to be developed slowly and
methodically. We insist on strict standards for our bridges and roads - why
don't we do the same for our information infrastructure?

Move fast and break things can break people too.

~~~
zevets
I would prefer software infrastructure be developed rigorously as well, as I
think would most people. But that will lead to regulations or, at best,
expensive insurance-company-mandated standards.

But before we celebrate its demise: the side effects will be the same ones
seen in the civil engineering world: creative ideas take decades to come into
existence. As you mentioned in your other comment, there was a time when
railroad building was exciting, and in that time, people died doing exciting
engineering. Civil engineering tools are now orders of magnitude better than
they were years ago, but there hasn't been an explosion in creative
structures, even at the lowest level. I also think many of the perks (i.e.,
salary) of the software engineering industry are intimately related to the
"move fast and break things" culture. When that leaves the industry, it may be
less of a good riddance than you think.

This may be my perspective as an outsider, though; I don't work in software.
But I have considered making the jump to software engineering, because "move
fast and break things" sounds fun.

Fundamentally we agree, but I'm a pessimist.

~~~
rm_-rf_slash
Also agreed, but I don't think the "boringification" of civil engineering is
all that bad either. I met a civil engineer recently. He is a mundane but
serious professional. He has no patience for the shiny objects that the real
estate developers try to distract other people with. Codes are upheld and
enforced.

It's easy to grumble about regulations until their reasons are forgotten. Then
they get repealed and the problems come back. The mortgage crisis is a prime
example.

I don't think this is such a bad future for software either. Much of my job
writing software for higher education involves adhering to complex policies.
These policies are necessary to remain FERPA/HIPAA compliant, which are
necessary for their own reasons. Playing fast and loose is taking out a debt
for an uncertain future.

~~~
dispose13432
The "boringification" of computers will bring us back to the 70s.

If we have to prove _everything_ on your computer, from your calculator to
Pokemon, how much do you think a license of Windows will cost? $250,000?

Linux/FreeBSD/OpenBSD/Minix will be dead (no one to sponsor certification and
then release it to the public).

Crazy "best practices" (change passwords every couple of days; passwords must
include symbols, letters, numbers, upper-case and lower-case in a random
order, but be no shorter or longer than eight characters).

Must have a full team of lawyers to prove that everything done fits the letter
of the law, and that any hacks are not your responsibility.

"Shinyness" is what allows you to have free VSCode and Atom.

Sure, I miss the 70s, when you could get a text editor measured in bytes (but,
btw, Electron is probably more secure than 70s Unix), but I definitely like
the cost/convenience of modern software.

------
taeric
A better analogy would be crops and the dangers of a monoculture. A class
break is akin to a disease that an entire crop is susceptible to. See, e.g.,
the boll weevil, or the reason the currently popular banana is at risk.

------
tptacek
For what it's worth, the more idiomatic term in computer security is "bug
class". I know "class break" is potentially the better, more general term, but
nobody I talk to ever really uses it.

~~~
wglb
In the insurance industry it is called Category Risk. As in, if you insure
houses instead of cars, you get lots of exposure with a hurricane.

------
jimmytidey
Very similar argument to a lovely CGP Grey video:
[https://www.youtube.com/watch?v=VPBH1eW28mo](https://www.youtube.com/watch?v=VPBH1eW28mo)

