
Seeing Like a Finite State Machine - raleighm
http://crookedtimber.org/2019/11/25/seeing-like-a-finite-state-machine/
======
pjc50
The title is a reference to
[https://en.wikipedia.org/wiki/Seeing_Like_a_State](https://en.wikipedia.org/wiki/Seeing_Like_a_State)
which discusses "legibility", the process of mapping messy reality into
convenient categories for systematic control.

The characteristic item of legibility is "papers": ID cards, passports, and so
on; but also things like title registers. In a transparent rule-of-law society
the control systems enabled by these are generally to the public benefit,
which is why they get instituted in the first place. But at every stage
there are people and situations that don't "fit", and a huge temptation to
hammer the square peg into the round hole, with impositions ranging from "no
you cannot have the right gender on your passport" to wholesale ethnic
cleansing.

And the stuff about the inability to correct mistakes in authoritarianism
leading to scientific and technical error is spot-on. That's how the Soviet
Union got Lysenkoism, and its space program only survived due to the personal
political effectiveness of Korolev (who had already been in a gulag!).

~~~
quotemstr
Distributed thought policing can be just as effective in enforcing error as
the centralized variety. In the west, there are many true things that one
cannot say: who can say how much damage these prohibitions have done to our
decision making?

As for the Soviets: their physics, chemistry, and mathematics education was
very good. It was only in the softer academic fields like history, biology,
and psychology that you saw widespread nonsense. These fields are the ones
with direct political implications. Today, we struggle with these fields in
much the same way for much the same reason: see the replication crisis in
psychology.

~~~
pjc50
> In the west, there are many true things that one cannot say

 _sigh_ such as? And what repercussions do you receive for saying them? I see
a lot of people with books, TV interviews, and academic positions complaining
about being "silenced" despite still being very audible.

> we struggle with these fields in much the same way for much the same reason

Evolutionary psychology may be the worst example of this. Or large areas of
economics. Or the political implications of climate science.

~~~
teddyh
> > _In the west, there are many true things that one cannot say_

> _sigh such as?_

Be aware that you are, in effect, asking for people to actually step in the
bear trap in order to prove to you that it’s dangerous.

Even pg’s essay _What You Can’t Say_ (well known around here) didn’t mention
any specific things, for obvious reasons.

------
Traster
I think there's actually a good point here. There does seem to be this
scaremongering that China is going to become some sort of ultra-efficient
techno-totalitarian regime whose oppression is guided perfectly by technology.
However, it's very difficult to get useful work done with machine learning.
It's very easy to get absolutely rubbish results. The reason there's a massive
arms race amongst the top tech companies for talent is because you need
_really_ smart people to actually apply machine learning well.

So the likelihood is that a state like China is primarily going to get
rubbish results. The data they feed into their systems is going to be
patchy, biased, often doctored. So when it starts locking people up and
sending them to re-education camps they're not actually removing the
disruptive forces in their society, they're just reproducing the great purges
from Russia. Picking people more or less at random and attacking vast swathes
of the population. And whilst they fail to solve their social problems they'll
start creating economic problems as they start destroying their own workforce.
At which point they either have to abandon these techniques entirely
(unlikely) or just go more and more to the extreme, which is what we've seen
time after time for states that engage in these behaviours.
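The garbage-in-garbage-out argument above can be made concrete. The sketch below is purely illustrative (all names and numbers are invented, stdlib Python only): a trivial one-parameter "model", fit by brute force on systematically doctored labels, faithfully learns the doctoring rather than the ground truth, while its own training metrics look excellent.

```python
import random

random.seed(0)

# Toy illustration of "doctored data in, rubbish model out".
# Nothing here models a real system; it is a one-parameter "model"
# (a single decision threshold) fit by exhaustive search.

def fit_threshold(xs, labels):
    """Pick the threshold that best reproduces the *training* labels."""
    best_t, best_acc = 0.0, 0.0
    for t in xs:
        acc = sum((x >= t) == l for x, l in zip(xs, labels)) / len(xs)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

xs = [random.random() for _ in range(500)]

# Ground truth: the behaviour of interest really starts at 0.5.
true_labels = [x >= 0.5 for x in xs]

# Doctored records: imagine officials systematically shifted the cutoff,
# reporting everything above 0.3 as a positive case.
doctored_labels = [x >= 0.3 for x in xs]

t_clean = fit_threshold(xs, true_labels)
t_doctored = fit_threshold(xs, doctored_labels)

def true_accuracy(t):
    """Accuracy against reality, not against the reported labels."""
    return sum((x >= t) == l for x, l in zip(xs, true_labels)) / len(xs)

acc_clean = true_accuracy(t_clean)
acc_doctored = true_accuracy(t_doctored)
print(f"clean: {acc_clean:.2f}  doctored: {acc_doctored:.2f}")
```

The model fit on doctored labels matches its training data perfectly, so nothing in the regime's own dashboards would flag a problem; it simply performs much worse against reality than the model fit on honest data.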

~~~
quickthrower2
China has a lot of smart people and a lot of money. I think they could do a
pretty good job at applying machine learning.

~~~
uoaei
Yeah, over a billion people, all loyal to the cause.

Machine learning is sloppy and imperfect now but if anyone is going to get out
ahead of the pack first, it would seem to be China. Especially since they are
not held to the same standards of accountability as other nations, given the
West's relatively weak influence, and they are more comfortable with locking
up the wrong person just to make sure they catch the right one too.

------
taneq
Machine learning doesn't have to be perfect (or even effective) to do its job
in this context. It's like the Mechanical Hound in Fahrenheit 451 - as long as
the public hears it's been let loose and then sees it catching someone, that's
enough to reinforce its 'infallibility' and its usefulness as a tool of the
state.

------
_pmf_
The technological side does not matter. The modern AI renaissance is purely
driven by the wide availability of training data. The West is in the process
of restricting this; China is putting it front and center.

The academic state of algorithm development (whether China is better or worse
in this regard) is a minor rounding error.

------
Chris2048
> The great advantage of democracy is its openness to contrary opinions and
> divergent perspectives. This opens up democracy to a specific set of
> destabilizing attacks but it also means that there are countervailing
> tendencies to self-reinforcing biases

> These correction tendencies will be weaker in authoritarian societies; in
> extreme versions of authoritarianism, they may barely even exist.

What does authoritarian vs democracy have to do with "openness to contrary
opinions and divergent perspectives"? Do the courts of public opinion perform
better than government?

~~~
lostmyoldone
One of the more interesting results from dynamic systems theory is that all
simultaneously complex and stable systems are stable because they are
stabilized by internal opposing forces.

As authoritarianism implies trying to shape the world through power and
authority, the society must then either become unstable, stop being complex
(?!), or respond with equivalent power, e.g. by overthrowing the government.

One could say that the courts of public opinion perform worse, and are thus
better, as they decrease the total amount of power/energy in the system. That
is, they're ultimately less lethal in most cases. Not to say that the public
can't be terrifying, but on average it seems to be a much better bet.

Allowing large scale disinformation campaigns _is_ a problem for democracies,
and it must be combated vigorously, or one could get into a situation where
the disinformants have too much power, with _much of_ the same systemic
results as an authoritarian leadership.

~~~
FpUser
_" Allowing large scale disinformation campaigns is a problem for
democracies"_

It does not have to be disinformation. Given how easily the majority is
brainwashed by propaganda, it is not that difficult to imagine mass postings
that, while technically true, are presented in a way that helps to form the
needed state of mind, and then just progressing from there.

How do you fight that?

~~~
quotemstr
You don't. If the masses change their minds in a way that you don't like in
response to true facts, either put out your own competing true facts or just
accept that you may have been wrong.

A worldview that can exist only with the support of censorship doesn't deserve
to exist at all.

~~~
damnyou
I've always subscribed to the notion that while there is an objective reality,
in many cases humans have no hope of finding it directly and instead only have
access to it through cultural filters.

What this means is that cultural processes can change what facts are
considered true over time.

~~~
quotemstr
That's postmodernist baloney. There is an objective truth, and we can
asymptotically approach it. Humanity can access facts. It's nonsense to
suggest that we can't distinguish between claims because everything is false.

~~~
damnyou
That's rationalist baloney. There is an objective truth, and in many cases we
can asymptotically approach it, but not nearly in all cases.

~~~
quotemstr
Which specific ideas are important enough to censor the public to enforce but
not amenable to empirical verification? Sounds like religion to me.

~~~
damnyou
Using the right pronouns for people in workplaces and schools. There is
empirical evidence of improved mental health but ultimately it's based on a
specific belief system — you have to believe that trans people are a group to
care about, and that almost anyone who asks you to use a different pronoun for
them is being sincere about it.

------
quotemstr
The central point of this garbage article is that authoritarian regimes that
use machine learning will fail because they'll be misled by biased outputs.
Titledropping Scott's famous book is just for flavor. As one of the comments
says, the article equivocates between "bias" as injustice and "bias" as error.
Authoritarian regimes can use machine learning just fine.

The rest of the comments are just bizarre, with claims like there's no
difference between dictatorships and democracies because "totalitarianism" is
just a "colonialist" category. And this blog wants to make the claim that it's
China that's detached from reality?

Machine learning's real problem is in the west, which has a very difficult
time accepting unpleasant facts even when fair ML systems produce them. It's
the west, not a place like China, that's going to be misled by attempts to
"fix" machine learning.

~~~
whodidntante
Downvoted ? Really ? Must have hit a nerve somewhere.

"Machine learning's real problem is in the west, which has a very difficult
time accepting unpleasant facts even when fair ML systems produce them."

For example, "proof" that bias exists in an institution due to a statistical
difference between its distribution of identity groups and the general
population. It will be interesting when these types of rules become part of
our ML overlords.

True story: I took a science course in school a long time ago in which my lab
partner used the calibration knob on the back of the voltmeter to get exactly
5 volts out of the power source. We found this out only after we kept getting
very odd results.

------
ptah
> Authoritarianism then, can emerge as a more efficient competitor that can
> beat democracy at its home game (some fear this; some welcome it).

This rests on the false assumption that democracies cannot be authoritarian.

