The characteristic item of legibility is "papers": ID cards, passports, and so on; but also things like title registers. In a transparent rule-of-law society the control systems enabled by these are generally to the public benefit, which is why they get instituted in the first place. But at every stage there are people and situations that don't "fit", and a huge temptation to hammer the square peg into the round hole: impositions ranging from "no, you cannot have the right gender on your passport" to wholesale ethnic cleansing.
And the stuff about the inability to correct mistakes in authoritarianism leading to scientific and technical error is spot-on. That's how the Soviet Union got Lysenkoism, and its space program only survived due to the personal political effectiveness of Korolev (who had already been in a gulag!).
As for the Soviets: their physics, chemistry, and mathematics education was very good. It was only in the softer academic fields like history, biology, and psychology that you saw widespread nonsense. These fields are the ones with direct political implications. Today, we struggle with these fields in much the same way for much the same reason: see the replication crisis in psychology.
sigh Such as? And what repercussions do you face for saying them? I see a lot of people with books, TV interviews, and academic positions complaining about being "silenced" despite still being very audible.
> we struggle with these fields in much the same way for much the same reason
Evolutionary psychology may be the worst example of this. Or large areas of economics. Or the political implications of climate science.
> sigh such as?
Be aware that you are, in effect, asking people to actually step in the bear trap in order to prove to you that it's dangerous.
Even pg's essay "What You Can't Say" (well known around here) didn't mention any specific things, for obvious reasons.
Social psychology was specifically singled out by the replication crisis: something like 2/3 of its published papers failed to replicate.
such as? And how can you complain about opinions which you can't find easily just by looking at those which you can?
It could simply be that the softer academic fields are harder to study, regardless of what society we try them in.
So the likelihood is that a state like China is primarily going to be getting rubbish results. The data they feed into their systems is going to be patchy, biased, often doctored. So when they start locking people up and sending them to re-education camps, they're not actually removing the disruptive forces in their society; they're just reproducing the Great Purge in Soviet Russia, picking people more or less at random and attacking vast swathes of the population. And whilst they fail to solve their social problems, they'll start creating economic problems as they destroy their own workforce. At which point they either have to abandon these techniques entirely (unlikely) or go further and further to the extreme, which is what we've seen time after time from states that engage in these behaviours.
After all, dictators throughout history have made their countries horrible with no AI at all.
Machine learning is sloppy and imperfect now, but if anyone is going to get out ahead of the pack, it would seem to be China. Especially since they are not held to the same standards of accountability as other nations, given the West's relatively weak influence there, and they are more comfortable with locking up the wrong person just to make sure they also catch the right one.
Edit: I was just curious, given the claimed widespread applicability of current ML techniques and the apparent demand for human intervention in the process, to what extent there had been any progress on automating the use of ML itself.
The academic state of algorithm development (whether China is better or worse in this regard) is a minor rounding error.
> These correction tendencies will be weaker in authoritarian societies; in extreme versions of authoritarianism, they may barely even exist.
What does authoritarian vs democracy have to do with "openness to contrary opinions and divergent perspectives"? Do the courts of public opinion perform better than government?
As authoritarianism implies trying to shape the world through power and authority, the society must then either become unstable, stop being complex (?!), or respond with equivalent power, e.g. by overthrowing the government.
One could say that the courts of public opinion perform worse, yet are preferable because they decrease the total amount of power/energy in the system. I.e., they're ultimately less lethal in most cases. Not to say that the public can't be terrifying, but on average it seems to be a much better bet.
Allowing large-scale disinformation campaigns is a problem for democracies, and it must be combated vigorously, or one could end up in a situation where the purveyors of disinformation have too much power, with much the same systemic results as an authoritarian leadership.
It does not have to be disinformation. Given how easily the majority is brainwashed by propaganda, it is not difficult to imagine mass postings that, while technically true, are presented in a way that helps form the desired state of mind, and just progress from there.
How do you fight that?
A worldview that can exist only with the support of censorship doesn't deserve to exist at all.
What this means is that cultural processes can change what facts are considered true over time.
We have a process for filtering untruths out of common knowledge: letting ideas compete on a fair basis in public has worked for hundreds of years. We should continue relying on the marketplace of ideas instead of just accepting pjc50's official list of true facts.
If an idea is bad, you can discredit it using argumentation and evidence. If you can't, your alternative probably isn't as correct as you think. Who are you to decide a priori which ideas are good and which ones are bad? Who gave you that authority?
The rest of the comments are just bizarre, with claims like there's no difference between dictatorships and democracies because "totalitarianism" is just a "colonialist" category. And this blog wants to make the claim that it's China that's detached from reality?
Machine learning's real problem is in the west, which has a very difficult time accepting unpleasant facts even when fair ML systems produce them. It's the west, not a place like China, that's going to be misled by attempts to "fix" machine learning.
"Machine learning's real problem is in the west, which has a very difficult time accepting unpleasant facts even when fair ML systems produce them."
For example, "proof" that bias exists in an institution due to a statistical difference between its distribution of identity groups and the general population. It will be interesting when these types of rules become part of our ML overlords.
True story: I took a science course in school a long time ago in which my lab partner used the calibration knob on the back of the voltmeter to get exactly 5 volts out of the power source. We found this out only after we kept getting very odd results.
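The "fix" in that story can be sketched in a few lines. This is a toy model with invented numbers (the 4.8 V figure is hypothetical): turning the calibration knob until a mislabeled source reads exactly 5.0 V introduces a gain error that silently skews every later measurement.

```python
# Toy sketch of the anecdote above (numbers are made up).
# The power source actually delivers 4.8 V, but the meter's
# calibration knob was turned until it displayed exactly 5.0 V.
true_source_voltage = 4.8   # what the supply really delivered
displayed_voltage = 5.0     # what the knob was tuned to show
gain_error = displayed_voltage / true_source_voltage  # ~1.042

def read(meter_input_volts: float) -> float:
    """Every subsequent reading inherits the same multiplicative error."""
    return meter_input_volts * gain_error

# A genuine 1.00 V signal now reads high, and the error propagates
# into any derived quantity (power, resistance, ...).
print(round(read(1.00), 3))  # 1.042
print(round(read(2.50), 3))  # 2.604
```

The point of the sketch: nothing looks broken at the calibration point itself, which is exactly why the odd results only showed up elsewhere.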
False assumption that democracies cannot be authoritarian.