A glitch in the human psyche that equates repetition with truth (wired.com)
277 points by varunvkrishnan 222 days ago | 143 comments



Repetition makes lies seem true, but repetition also makes truth seem true. All persuasion (and indoctrination) happens through repetition, whether in a classroom setting, a forum like this one, or in sales.

This realization is also deeply depressing, because it means you're doomed to repeat yourself over and over if you want to persuade people.

Let's take Noam Chomsky, for instance. He has given more than a hundred talks a year for the past 50 years. He has written dozens (if not a hundred) books and given thousands of interviews. His message is always the same, because after you've figured out what your best and most persuasive arguments are, the only thing left to do is repeat yourself over and over. Every day is Groundhog Day.

Startups also have to learn the value of repetition. Long-form sales text works because of repetition. Long-form video demos work because of repetition. Drip email campaigns work because of repetition. It's often better to give customers one good reason to use your product, repeated three times, than to give three distinct reasons why they should purchase. Counter-intuitive, perhaps, unless you've heard this argument before.


Another example is Ronald Reagan. I can't remember who said it, but one of the people around him was asked how he managed to become president and be known as the Great Communicator, and the reply was along the lines of, "he made the same speech [1] every week for 25 years."

[1] https://m.youtube.com/watch?v=qXBswFfh6AY


Richard Stallman is another example of someone who gives the same handful of speeches over and over (many times in almost exactly the same words). I've heard him describe over and over again the wastefulness of not being able to write software to interact with a university printer because the printer manufacturer chose to keep the necessary details secret.


I've never seen Stallman speak, but that sounds like a similar situation to this anecdote from Nottingham in the 80s:

https://m.youtube.com/watch?v=XvwNKpDUkiE

It's a somewhat tedious story but it's all about the freedom to control your own hardware.


Stallman's version is described in http://www.oreilly.com/openbook/freedom/ch01.html

or with Stallman's edits (I'm not sure if he's modified this chapter) in

https://en.wikisource.org/wiki/Free_as_in_Freedom_2.0/Chapte...


His copyright and community talk has stayed with me since I heard it 12 years ago; it was certainly special (well argued) and left a lasting impression.


> This realization is also deeply depressing, because it means you're doomed to repeat yourself over and over if you want to persuade people.

Yes. Embrace this.

Patience is a virtue, and so is not getting angry when the person you've explained something to still doesn't get it even the third time you've stated it. I actually find it fun to try to come up with different ways to state things so that people might better understand them.


"Doesn't get it" isn't always (or even usually) the problem, though. It's more "wasn't listening" or "didn't have time to comprehend what you meant (or the implications of what you meant) before the conversation moved on."

The biggest component of success in communication comes down to saying things enough times that your message can actually be listened to and digested at least once. You can vary the way you say things each time, but literal repetition works nearly as well, because the problem is almost never "I don't know what those words in that order mean" but rather "I didn't hear half that sentence" or "I was thinking about lunch" or "that might have been important but it just sort of passed by and I forgot."


Repetition of a true message to a variety of audiences may be valuable because one gains a better understanding of the common misconceptions and opposing arguments. Which leads to subtly improved arguments and rhetoric. Also there's always the possibility that the message is in fact wrong and that this may be brought to light by new criticism or fresh evidence.


I take your point, but I think it's a bit of an overstatement to say that all persuasion happens through repetition. Maybe I'm a bit full of myself, but I like to think that with sound enough logic, I could be persuaded from my position by hearing an argument only once.

That would probably be the distinction I would draw between persuasion in general and indoctrination / brainwashing. Of course, generally indoctrination and brainwashing seem to have a much higher rate of success in changing minds, which is indeed depressing.


Learning is repetition. Exercises. Flash cards. Midterms followed by exams. You either hear the argument once and you repeat it to yourself, or you get the argument spoon fed multiple times, but the bottom line is the same. Without repetition there is no memorization or internalization, and no learning can take place.

Phrasing the challenge as "sound enough logic" puts the burden of proof in the wrong place. It implies that whenever you're not persuaded it's the fault of the other person for not being persuasive enough. That's the opposite of open-mindedness. It is exactly because of the presumption that your current beliefs are true that you won't change your mind as easily as you might think. Even Scientologists say they'll leave the church if somebody could just provide them with evidence it's all baloney, but it's a standard of proof nobody can meet.

When I say all persuasion is repetition, it's really not an overstatement. Maybe you're closer to believing me this time ;-)


Um, what about the Elaboration Likelihood Model? It's about persuasion and it contends that there are two ways it happens: through a rational evaluation of the material, or based on the credibility of the source (and other social cues).

Now, sometimes you might come to believe something through repetition, but it's possible to verify things (some of the time) and you shouldn't just claim that repetition is the only route.


I think the model is wrong, because it presumes people are rational agents that change their mind either through careful evaluation of the arguments or by outsourcing this rational evaluation to a trustworthy person. This model doesn't take repetition at all into consideration, so I don't think it holds up empirically.

Back in Aristotle's day it was about Ethos, Pathos, and Logos. I don't think that model has stood the test of time either.

I don't want to open Pandora's Box, but look at Trump's speeches. He persuades through repetition, and it demonstrably works.


Ok, I didn't mean to beg the question by simply claiming that people can rationally evaluate statements.

The things people learn are not noise. They have structure, they have hidden consistencies. Whatever you think of the rationality or otherwise of thought, surely you will agree that trying to learn inconsistent facts is going to be more trouble than learning consistent facts which support each other in a cumulative way.

People might identify with the opinions of a politician because they already hold those opinions, or similar opinions. They are unlikely to ever agree with/be persuaded by complete inconsistent nonsense, however often it is repeated. Like, I mean nonsense that doesn't even have linguistic structure or maybe any meaning.


> People might identify with the opinions of a politician because they already hold those opinions,

Agree, and I think it is even more than that. It is about how genuine they seem. I've listened to Hillary, Trump, Sanders and Obama, and some just naturally sound more genuine. Obama seemed genuine, like he believed what he was saying; Trump and Sanders as well. But not Hillary. She said all the right things, she was very polite, and she seemed personable, with jokes and remarks sprinkled here and there. But overall she sounded fake and scripted.

I posit that hearing someone who seems to truly believe what they say is a solid first step in persuading others to change their opinions. Otherwise it becomes an uphill battle, and it is just pandering to people who already believe and agree with the argument.


I guess we look at the world very differently, because:

> They are unlikely to ever agree with/be persuaded by complete inconsistent nonsense, however often it is repeated. Like, I mean nonsense that doesn't even have linguistic structure or maybe any meaning.

If the US election hasn't persuaded you that complete gibberish can be persuasive when repeated endlessly, there's nothing I can say that will.


I'm aware of the theory of the "big lie", and the fact that neurons that fire together, wire together...

But what about Feynman's injunction: "The first principle is that you must not fool yourself -- and you are the easiest person to fool." Is that kind of empirical principle just out of reach for the standard human?

In terms of repetition, I think David Hume's theory of causal inference is pretty relevant too.

Just because something is repeated regularly, e.g. seems to happen every day, doesn't mean that on the day it doesn't happen our heads explode because we have built a rigid expectation. Instead we can usually discern possible, exceptional reasons why it isn't happening.

I'm totally in agreement with you on the role of repetition in eroding the landscape of memory to create convictions. We might rarely remember a one off event that is never referred to. But you seem to be claiming that there's no higher-level activity, or even possibility of higher-level activity involved in mentally evaluating what to believe.

The famous Star Trek scene where Picard is encouraged/"brainwashed" to lie about the number of lights he sees comes to mind. He resists and continues to report what his senses tell him. This cultural theme of reason resisting the repetition of lies is a big one, and you haven't really presented any convincing evidence to discredit it.


> Is that kind of empirical principle just out of reach for the standard human?

What if it's out of reach even for the best among us? To me it looks like many of our core beliefs cannot be shaped through reason, and are not the product of reason in the first place. Our convictions have simply been copied from our environment through osmosis.

I'm sure you can't help but notice the amount of groupthink that takes place in any community, whether in real life or right here. How does this groupthink come to be? People don't really change their minds in long discussion threads like these; it certainly doesn't feel like they do. And yet minds get changed or there wouldn't be groupthink. By reading, responding, and reading some more, we slowly change over time; like water carves its way through rock. We shape HN by our responses and in turn HN shapes us: the relation is symbiotic.

We can, sometimes, if we try really hard, and then only for a moment, apply real reason. I see no evidence that this has a big impact on what people believe and how they act, though. People who are good at formal reasoning are good at arriving at the correct answer in reasoning puzzles, but there is no matching improvement when it comes to the decisions they make in their life. It's pretty clear that even really smart people can't reason themselves out of their problems. This is a big problem for the view that convictions are the product of reason.

In contrast, the friends you keep and the media you consume do make a tremendous impact on what you believe and how you act. Change who your friends are and, through repeated exposure, your convictions will change subliminally.


Ok. I think most of this sounds reasonable. We do necessarily adopt the attitudes and dispositions of those we associate with. HN is a pretty ambiguous place from that perspective: an internet gold-rush town inhabited by those who feel the need to justify/discuss their position by engaging online.

In a spirit of hopeful insight into what's going on, I'm linking to the Wikipedia page for Skagway, Alaska.

https://en.m.wikipedia.org/wiki/Skagway,_Alaska


It might be gibberish at a factual or logical level, but at least for a certain portion of the populace, it hits all the right emotional notes and checks all the right boxes in their value and belief systems.

Of course, you won't see it yourself if you don't share their worldview and are unable to empathize and put yourself in their shoes.


I think that's right, but in part because it's all gibberish, people with wildly different values can hear him speak and still hear what they want to hear.


> If the US election hasn't persuaded you that complete gibberish can be persuasive when repeated endlessly, there's nothing I can say that will.

It depends. Mass media has proven that false. They almost unanimously backed one candidate and attacked the others, spewed lies and misinformation, and they failed to make a difference, it seems. I would argue they made things worse. Here is a multi-billion-dollar apparatus designed to persuade people and control opinions, and it has failed. So it is not always true. I think how it works is more nuanced than simply "repeat this X times and you're done, you'll get Y% more believers".


The media talks to their own base and you can't persuade people who don't tune in.

The media on the left did persuade their audience that Trump stood no chance of winning the election, even as Trump was drawing larger and larger audiences at every stump speech. Everybody could see the flood of pro-Trump signs, hats, and bumper stickers in swing states, but this was apparently of no consequence. People were expected to disbelieve their own eyes, and so they did. I think there was a stunning amount of groupthink going on in the media this past election. Who consumes the news all day, every day? Media people, politicians, and policy wonks. And so they brainwashed themselves.


Social media was very important in this election, and Trump did a much better job.


The "brainwashing" part is not related to the repetition, but the way you "acquire" the information, "acquire" on the sense of taking it for yourself. If you learn something through repetition, it's a good thing, it is learning. If you're fooled by repetition, then it's brainwashing. Also important to note, that quite often, the one that fools you is yourself. Now to distinct between "being fooled" and "learning". I would say that learning comprises understanding, while being fooled only comprises agreement. Thing is, defining "real understanding" is a pretty hard problem.


I've read geometric/mathematical proofs that have convinced me immediately. Just seeing the logic flow from step to step was enough to prove the assertion was true.


I consider this as an object of argument as well, however, axioms seem to be the underlying structure of repetition for all proofs [as far as I understand mathematics].


What do you mean "structure of repetition"?

Premises which are already accepted are used to conclude further ones.


Somewhat true, but at some point some people look around and, even though something has by then become part of their psyche, realize that they've been had. I am aware that some people won't believe their eyes despite seeing evidence right in front of them, but a few aren't that foolish.

As that old Lincoln saying goes, you can fool some people all the time, all people some of the time, but you can't fool all people all the time.


I forget who said something like this, but "the danger is not in being heard. The reality is in having a good idea and having to repeat yourself over and over."


You are confusing memorization, teaching and brainwashing/indoctrination.

They all work by repetition, but the last one hurts people's ability to think rationally.


Reminds me of the zillion articles that have been published over the years explaining why net neutrality is important. Sadly, this message doesn't even stick very well.


I wish that once people realised this, they had the courage to apply it to their own dearly held beliefs, particularly political or social ones. How many of those are just claims repeated ad nauseam that clearly aren't true?

But I think most people's reaction is just to say, "Yeah! That's totally what those other people do and our side never does, because we're right."


Have you applied it to your own beliefs? Which ones? Do you no longer believe in them?


I have. I'm not going to mention which ones as that would cheapen the general message, which is bigger than politics. And yes, I no longer believe them.


No it wouldn't cheapen the message, it would illustrate it. I'm also curious which ones.


'Cheapen' might not be the right word, but it would certainly tend to hijack it. As soon as you get specific, people start debating the specifics.


The devil is in the details.


I've got two for you, from both sides: "Fake News" and "Muslim Ban".


What about your current beliefs? Why not apply it to them too?


Triangulation. Take in news sources from opposite sides of the political (scientific etc) spectrum and take them all in in equal size/depth. They'll repeat their own message to make you believe you are "on the right side of the issue" but with triangulation both brainwashing attempts cancel out.


My only problem with that is that I get frustrated and contrarian with every side.


I think that's a positive. More Socratic.


Once you do, you no longer look at the world the same way, but you also don't have as much in common with the majority of people. It is a little bit like waking up from the Matrix. :)


>But I think most people's reaction is just to say, "Yeah! That's totally what those other people do and our side never does, because we're right."

to guard against this, as soon as i am aware of a catchphrase or common talking point, i mentally deconstruct it and find the truth. frighteningly, the most commonly repeated verbiages are, should we say, misleading.

but lean a little closer, stranger, and i'll whisper the truth into your ear: the "two sides" are unequal in their abilities for evaluative thought, self righteous zealotry, and dogmatism. an honest attempt at critical evaluation will go nowhere if your evaluating apparatus is garbage.


> but lean a little closer, stranger, and i'll whisper the truth into your ear: the "two sides" are unequal in their abilities for evaluative thought, self righteous zealotry, and dogmatism. an honest attempt at critical evaluation will go nowhere if your evaluating apparatus is garbage.

The very fact that both sides are convinced there are two sides suggests this is a false statement. The "sides", really the false divisions and classifications, are some of the biggest lies ever told. In any case, any inequality between them is not the problem. By way of analogy, if two glasses of water have unequal amounts of fecal matter with neither being even close to zero, wouldn't it be better not to drink either of them rather than suggest one is better than the other?


We have two political parties in the United States. Do you believe the voters for each side are at parity in their ability to understand reality? Do you believe it is helpful to suggest people choose neither?


The idea that having two de facto political parties means there are only two sides and you have to pick one of them is part of the lie. You do understand you can vote for Democrats for one office, Republicans for other offices, write your own name in for a third office, vote Green for a fourth office, and completely abstain on a final office when you either dislike both candidates or feel too ill-informed to cast a vote, right? But "choosing a side" goes way beyond voting, and in this election both 'sides' chose to vote against a side, not for one.

And as for disparity in the understanding of voters of these fictitious sides, the side that likes to claim the other side is uneducated and ignorant is, statistically speaking, less educated. That, too, is part of the lie.


We're closer to having one, since the smart people like to convince themselves not to support the second one through sophistry.


If with the second one you mean the non-ruling party, it actually got more votes in the presidential election than the ruling party.


You need to win all the small elections you can, from senator down to county dogcatcher, and not just win-but-lose the big one. Think of it as promoting internally.


In reality there are a bajillion sides. It's just an easy construct; people will understand at least two of those.


> the "two sides" are unequal in their abilities for evaluative thought, self righteous zealotry, and dogmatism. an honest attempt at critical evaluation will go nowhere if your evaluating apparatus is garbage.

The good news is that you don't even have to worry about the "other side"; just worry about the things that you believe. That was the point.

Also, unlike in the hard sciences and logic, the claim that there are only two sides and one must belong to either of them is itself a thing to evaluate.

So you're right. First there is a need for a meta-evaluation, is my evaluation apparatus even working and how would I test if it is.


> but lean a little closer, stranger, and i'll whisper the truth into your ear: the "two sides" are unequal in their abilities for evaluative thought, self righteous zealotry, and dogmatism. an honest attempt at critical evaluation will go nowhere if your evaluating apparatus is garbage.

Of course everyone will think the two sides are unequal, with the side that isn't theirs getting the worst of it.


What makes some people more vulnerable to accepting things uncritically? Why is it so hard to admit when they or someone they support makes a mistake?

I have an ego, I don't like being wrong, and I think I'm right a lot. Ok, so far I get it. But to constantly ignore or avoid objective evidence? How do people not become ill at the thought?

1) Clinical narcissism or sociopathy? It's all an intentional means to an end.

2) Simple lack of practice in critical thinking? They are acting in good faith but just not seeing the con.

3) Their moral code does not exclude Machiavellian tactics and they just want to win.

Maybe the population has all three types collaborating both knowingly and unknowingly across different roles.


It's not some people. It's most people. Investigating claims rigorously is very difficult, even for highly skilled individuals. And it usually has very minimal direct impact on the mundane aspects of day-to-day life. For many, there isn't an easy answer to the question: "Why should I bother, given everything else going on in my life?"

Honestly, it's pretty great how many people are willing to spend time and effort that doesn't directly benefit them (on a base material level) to care about this stuff.

It's also harder than one may think to do this well. You have to be fairly skeptical 24/7, even of yourself and your own thoughts. I think that doing it halfway likely leads to lots of seemingly well-rationalized ideologies.

I like to think I'm getting better at being generally skeptical and slowly layering together a coherent, mutually supportive set of usefully accurate mental models of how the world works, but I find it very tough to know to what extent I'm fooling myself. One simple maxim I've found useful is to try to keep any idea from becoming "sacred" and above questioning. While not practical for daily life, I think it's a great fundamental background orientation for our thoughts and perception of the world.


> I like to think I'm getting better at being generally skeptical and slowly layering together a coherent, mutually supportive set of usefully accurate mental models of how the world works

You may be interested in the Meaningness project: http://meaningness.org


I remember running across that site a few months ago when it was posted on HN. I really like his basic description of meaning as a nebulous, contextual, and participatory dynamic. I also remember some good points about how the universe is one cohesive thing, but the stuff inside varies immensely in the degree of connectedness, and that these connections are often quite nebulous.

Btw, here is the fixed link: https://meaningness.com/

And HN discussion: https://news.ycombinator.com/item?id=13136458


Page doesn't load.


How deliciously Nihilistic.


Working link: https://meaningness.com/

HN discussion from a few months ago: https://news.ycombinator.com/item?id=13136458


Thanks, was on my mobile phone at the time I wrote my original comment.


> What makes some people more vulnerable to accepting things uncritically?

To add to this, I think an additional problem is academia.

So much of success and even prestige in academia is garnered through skills that do not involve critical thinking.

Who hasn't had a conversation with someone at the top of their field who has strong, factually unsupported, convictions about another?

Entire fields are subject to self-interested ideologies rather than facts (e.g. my undergrad degree in economic science was more propaganda than science): http://www.thecrimson.com/article/2013/12/13/economics-scien...

Just to clear up any notion that I'm being bitter, I graduated at the top of my class.

There needs to be entire coursework in critical thinking, starting in primary school. Unfortunately, most of my primary school education outside of maths and language was just rote memorization of facts.


> Why is it so hard to admit when they or someone they support makes a mistake?

One of the critical aspects is social pressure. If they ever made fun of the thing they are now supposed to be convinced of, or convinced others (especially publicly) against it, they will throw every cognitive, emotional, and other trick against ever believing it.

That is why public ministering and proselytization is a necessary step in participating in many religions -- it is not just to bring others into the faith, but to inoculate those who do it against ever disavowing it. I posit that social media and everyone messaging each other falsehoods is part of this public display of belief. Later on, going against that is very hard, because there is solid evidence of them making fun of it just a week before. Nobody wants to be seen flip-flopping or being a hypocrite.

> 2) Simple lack of practice in critical thinking? They are acting in good faith but just not seeing the con.

Interestingly, critical thinking is often orthogonal to other proxies for what society thinks "intelligence" is. Often it goes the opposite way -- the smarter a person thinks they are (maybe the more diplomas they have hung on their wall), the less likely they are to ever change their positions, because they will:

a) Have to confront the fact that they have chosen or supported an invalid one before. And with 3 diplomas on the wall, that is surely not something they would do

b) They have a greater capability at rationalization. When the CEO has a bad day because they had a fight at home and goes to work and shuts down a project or fires someone publicly, they will rationalize it to themselves in many other ways except "I really was upset for another reason, and made a stupid mistake, I just wasn't thinking straight". They'll use their intelligence to make something up that sounds reasonable.


How important is it for you to know that the world is round? What would change in your life, if you didn't know that seemingly important fact? How about that homeopathic medicine is a fraud, or that global warming is man made or that vaccines don't cause autism? If you were wrong about all of those things, what changes about your life right away? That's why people don't exercise critical thinking. It just doesn't matter if you know the truth about things, for most people, most of the time. (Until it does, of course, and you die of cancer because you tried to pray it away instead of getting chemo).


All of those things actively affect others' wellbeing when put into public policy. And a democracy gives the masses the ability to put those beliefs in public policy.

Would it affect you right away? No, but in a generation our children contract more disease, fewer of them go into STEM fields because of the distrust and dissonance that has been spread, and the air your children breathe becomes more polluted.


The belief that a particular quack medicine is effective may have cost Steve Jobs many years of life.


> But to constantly ignore or avoid objective evidence? How do people not become ill at the thought?

Information bubbles. Too many people only consume information that aligns with their beliefs. They don't feel that they are ignoring or avoiding objective evidence, because they easily dismiss it as lies without any evidence.


For an elaboration on your first answer, look up The Last Psychiatrist. Lots of interesting analysis of American society within the framework of narcissism. And to be honest I was convinced by a lot of his arguments, it's a very applicable framework.


"Just the place for a Snark!" the Bellman cried/ As he landed his crew with care/ Supporting each man on the top of the tide/ By a finger entwined in his hair.

"Just the place for a Snark! I have said it twice/ That alone should encourage the crew/ Just the place for a Snark! I have said it thrice/ What I tell you three times is true."


Did she really mis-spell Hitler's name?

Also, I highly suggest everyone making the Hitler comparison actually read the definitive work "The Rise and Fall of the Third Reich: A History of Nazi Germany," so we can perhaps have some more intelligent comparisons than: "You know who else used repetition? Hitler."

Every article I read seems to have become a fun little exercise in "How can I put both Trump and Hitler into this seemingly innocuous article about cognitive fusion?"

It's obnoxious and old. If you want to actually do a Trump-Hitler comparison, read a book on the topic and write an actual paper about it instead of just flinging it out willy-nilly so you can fear-monger your readership into believing that actual dystopian eugeno-fascism is just on the horizon.


Better than saying it over and over is to imagine the scenario of the lie repeatedly. Memory is malleable, and this actually creates "false" memories.


The title tries to make it seem as if this is completely non-political, yet the entire body of the article is basically anti-Trump rhetoric.

So here is another human psyche detail - when someone removes the actual details (such as what these 3 exec orders are and the specifics of the mentioned crimes), they are doing so to protect the narrative...

I'll give you an example: "Ban of majority Muslim countries" vs. "Iraq, Syria, Iran, Sudan, Libya, Somalia and Yemen".

The reason the media removes the list of countries and replaces them with the quote, is because when people see the names of these countries, they realize that they are an open death-sentence destination for Americans, and have a large public that often chants death threats to America.

And that is a problem for the narrative the media is using (to exploit the situation). So the details have to be removed.

This is the first step of the manipulation of public opinion. The second step is the repetition.


"In a time of universal deceit, telling the truth becomes a revolutionary act." - George Orwell


So how do you propose we pick "the truth"? This is the problem we now face: truth has been labeled relative.


Parse the wording of the statement(s) in question. Look for gaps between the actual language of a message and its intended result. Wordsmiths are expert at saying nothing substantial while allowing (encouraging) the audience to perceive or project their biases onto said message.


I'm curious now, mind dissecting this quote?


Ambiguous, leading statements allow a listener to assume the content in lieu of specifics.

In the spirit of the MSM's "Fake News" lately, I submit a seemingly innocuous example: of all of the "Supermoon" articles back in December (and prior), how many mentioned the fact that the closest full moonrise is indiscernible in size to the naked eye from the furthest full moonrise? Besides Phil Plait, I believe the answer is approaching 0. Yet every MSM news site/channel promoted it as "the biggest in ~100 years", many webizens clicked the links, some even went out to watch it live, and most now believe it appeared physically larger than any other full moonrise in ~100 years.



1. Admission. "I did it" is one way (thinking of being caught on camera). After an admission, the search for truth often stops.

As an aside, one strong component of a couple's ability to coexist is just this: how do they interpret relative truths, and how do they gain input for truth moments.


> after admission often the search for truth stops

There is also a phenomenon of false confessions, which sometimes has to do with psychological factors and sometimes with incentives like people being punished less harshly when they accept a plea bargain.

Since this subthread is talking about Orwell, I can point out that in Orwell's most famous work the authorities were very keen to obtain false confessions.


I propose we base truth on whatever epistemological grounding allows one to assert the truth claim of relative truth.


A bit ironically, I am really surprised by the amount of agreement in this discussion. How is this a glitch in the human psyche or anything like that? Everything you remember is something you believe to be true, and repetition makes you remember things; really no surprise or glitch here.

You tell me Mount Everest is 5742 meters high, seems a reasonable size for the largest mountain on earth. A year later you tell me Mount Everest is 6488 meters high. Seems reasonable and I have long forgotten that you claimed a different height last year. For whatever reason you keep telling me Mount Everest is 6488 meters high every Sunday afternoon, week after week.

I have no reason to doubt that what you are telling me is true and after a couple of weeks I will start to know and remember that Mount Everest is 6488 meters high. But then a couple of years later someone else tells me that Mount Everest is actually 8893 meters high. I object. To settle the issue we decide to look it up on Wikipedia and lo and behold the official height of Mount Everest is indeed 8893 meters.

This may or may not make me remember that Mount Everest is 8893 meters high but it is very likely that I will remember that 6488 meters is not the correct height and it might make me trust you less with regard to mountain related facts. Even without any repetition.

If you want me to accept a statement, the statement must be believable based on what I already know and believe to be true. And I have to have some trust that you are telling me a true statement. Repetition is only secondary, only required if you or I want that I remember the statement in the long term.

And if something is surprising or exciting or whatever, then one might remember something easily without a lot of repetition. The height of Mount Everest was never really interesting to me and learning the wrong height took some repetition. But then learning that Mount Everest is actually 8893 meters high and that I remembered the wrong height for years, that came as a surprise and may not take much if any repetition to remember.


Repetition not only feels like truth but also like beauty.

If you create art, even if it's primitive or ugly, and you can repeat something within one artwork or across multiple works, the works suddenly gain some merit: they feel more like art, less like random doodles, just due to repetition.

To be clear, I'm not claiming that's the only way to invoke perception of truth or perception of beauty. It's just something I observed while trying to appreciate some contemporary art.


This article completely misrepresents why introducing false facts is effective.

Repetition, in itself, does not persuade anyone of anything. Repetition, as others have noted, simply makes the thing being repeated easier to remember. The true "persuasion" -- i.e., the misinterpreting of the false fact as true -- occurs because we forget the /source/ of a statement quicker than we forget the /content/.

So, for example, if you happen to hear from your friend Joe (whom you know to be a compulsive liar) that "Priuses are actually less environmentally friendly than Hummers, because manufacturing the batteries for Priuses actually releases more greenhouse gases than a Hummer releases over its average lifespan," you'll likely remember that statement for far longer than you remember that it came from Joe, the compulsive liar. And if you find yourself in an argument with a pretentious Prius driver two years down the road and you search your memory banks for relevant facts to throw in his face, you may well pull out the "Prius battery" statement, without ever remembering that it is almost certainly bunk. You have, in essence, adopted a false belief due to having an imperfect/poorly configured memory.

To take it a step further: If you then make the "Prius battery" statement to the Prius driver, presenting it as fact, you have repeated it (thus making it more firmly entrenched in your mind) and you have replaced the (previously empty) "Spoken By" metadata field with one that now reads "Me [trust score: 100%]." Speaking the false fact is not necessary to make the false-belief-adoption effect appear, but if you do happen to speak the false fact, it only serves to strengthen the effect and further entrench the false fact.

This effect, of course, only works with facts that are not absurd or plainly wrong on their face. If you hear 2+2=5, you don't need to remember the source to know that's wrong. But there's a whole class of facts out there that exist in a gray area -- where they are plausible on their face and would require some serious digging to validate/disprove -- where this effect can lead to serious confusion. To the extent the repetitive blasting of falsehoods works, it works because of this and this alone. Wired here is doing us (and the fools who paid for the "HeadOn" advertising campaign) a disservice by implying otherwise.
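A purely illustrative toy model of that source-versus-content asymmetry (my own sketch, not from the comment or any particular study; the decay rates are invented) shows how a claim can outlive the memory of who said it:

    # Hypothetical exponential forgetting with a faster decay for the *source*
    # of a claim than for its *content*. All parameters are made up.

    def retention(days: float, half_life_days: float) -> float:
        """Fraction of a memory trace still retained after `days`."""
        return 0.5 ** (days / half_life_days)

    CONTENT_HALF_LIFE = 180.0  # assumed: the claim itself fades slowly
    SOURCE_HALF_LIFE = 30.0    # assumed: "who told me this" fades fast

    for days in (7, 90, 730):  # a week, three months, two years
        print(f"day {days:4d}: content {retention(days, CONTENT_HALF_LIFE):.2f}, "
              f"source {retention(days, SOURCE_HALF_LIFE):.2f}")

Under these made-up numbers, after two years the content trace is still around 6% while the source trace is effectively gone, which is the imperfect/poorly configured memory described above.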


"His primary rules were: never allow the public to cool off; never admit a fault or wrong; never concede that there may be some good in your enemy; never leave room for alternatives; never accept blame; concentrate on one enemy at a time and blame him for everything that goes wrong; people will believe a big lie sooner than a little one; and if you repeat it frequently enough people will sooner or later believe it." Hitler's rules sound vaguely familiar.


Bear in mind that this is contemporary writing. They likely wrote it in such a way with the goal of forming this comparison in your mind.

I think one needs to look at older writings to get a take that isn't designed to reinforce the constructed media narrative du jour.

Even then it's hard to get decent info on this topic because it's always been so morally charged. Nothing obscures reality worse than moral concerns.

Edit: I think you can also apply that description to several parties in the modern media environment.


Indeed, it's important to remember that publications like Wired do have a narrative they want to push. My favourite example of this is still these two articles about the possibility of hacking election machines to rig the election:

https://www.wired.com/2016/10/wireds-totally-legit-guide-rig... - from before the election, claiming it was basically impossible and would require a massive conspiracy

https://www.wired.com/2016/11/hacked-not-audit-election-rest... - from after the election, arguing the safeguards against hacking are ineffective and casting doubt on the security of the election

Both articles are backed up by a convincing-sounding set of facts and expert opinions, yet despite the available evidence not actually changing they come to completely opposite conclusions. All that changed was that before the election "hacking voting machines is impossible" was the better anti-Trump narrative, and after he won there was suddenly a reason to cast doubt on the results. It's all about the narrative. (Which is one reason you should question the endlessly-repeated claim that there's "no such thing as alternative facts". Careful selection of which facts to include and exclude is a great way to create a narrative.)


There are also subtle narratives in the mind that, like water to a fish, are generally not noticeable. Even facts can be co-opted to support those stories:

"I am a good person."

"I am a just person."

"I do the right things."

"I am a hero."

"I am right because the facts support me."

"I am not to blame."

"I am to blame."

"I am successful."

"I am a failure."

It's the lies people tell themselves that make room for the lies other people want to tell. "Here, you repeat the lies I want to hear about me, and I'll repeat the lies you want to hear about you."


And yet they didn't single out a specific figure, nor did the poster above. As you say, there may well be more than one party someone might apply this description to.

Apparently a specific comparison formed in your mind, though, for some mysterious reason.

Probably just "constructed media narrative du jour," right? Certainly not because the person might actually be a quintessential example that stands out from everyone else so well that you already knew who people were likely talking about without anyone being named (yet, strangely, apparently want to deny that person should be considered for such a comparison at all).

> Nothing obscures reality worse than moral concerns.

It's not really clear that it's possible to separate human concerns from moral concerns, and I can only imagine someone arriving at the conclusion that "nothing obscures reality worse" by searching a pretty narrow set of reality-obscuring hazards.


> Apparently a specific comparison formed in your mind, though, for some mysterious reason.

Doesn't seem mysterious. The claim you are responding to is that, intentional or not, the author described Trump but said they were describing Hitler. That Trump came to mind is unsurprising.

Trump is commanding an absurd proportion of the media and social media's attention. Readers and authors are overprepared to see Trump everywhere in a world where it's hard to load a web page without seeing his name and face on it. Sometimes I go to a news site and load each section one by one to see if there is a single section that doesn't feature him prominently -- sometimes the sports or entertainment sections manage to avoid Trump, but not always.

I personally find the Hitler comparisons to be about as absurd as the birther movement was. And yet when I read the word "Hitler", my first thought is "Trump" and not "holocaust" or "tiny moustache". That isn't because I find the comparison apt -- I find it absurd. It's because Trump is so prominent in my attention and because absurd comparisons to Hitler have been made so many times that it's becoming expected.


It's worth reading this article by Ron Rosenbaum, author of Explaining Hitler:

https://lareviewofbooks.org/article/normalization-lesson-mun...

He agrees that "to compare Trump’s feckless racism and compulsive lying was inevitably to trivialize Hitler’s crime and the victims of genocide", but also explains in substantial detail how Trump's playbook closely follows Hitler's.


It's not that great a leap for the person you are replying to, because "Hitler" and his supporters have been called Nazis since the summer of 2015, and every comment connected to him inevitably turns into a discussion about neo-Nazis.

It's pretty clear who was being referred to.

> concentrate on one enemy at a time and blame him for everything that goes wrong; people will believe a big lie sooner than a little one; and if you repeat it frequently enough people will sooner or later believe it

If we really want to get picky it's worth pointing out that the person we aren't mentioning attacks multiple people at the same time and lies about little things all the time.

Said person is a jackass and an authoritarian but to compare him to a regime that killed millions using the loosest historical analogies does them a disservice and muddies the water.


The thing to remember about Hitler was that his outcome happened in stages, each progressively more unlikely than the one preceding it. Comparisons to him should really have a date attached to them. It's easy to focus on the killed-millions-of-people ending, but it's likely that wasn't the inevitable conclusion of his tactics, views or whatever aspect of him is being compared. If someone compared a young chess prodigy to Bobby Fischer, we wouldn't think that it's being suggested that he'll die a paranoid schizophrenic loner. And yet when someone compares someone to an early-stage Hitler, we automatically assume the comparison is hyperbolic and can be dismissed by showing how unlikely the person being compared is to end like Hitler.


Does this sound 100% like the Democratic Party to anyone else?


There was a related article posted a few months ago from the BBC on "How liars create the ‘illusion of truth’".

https://news.ycombinator.com/item?id=12829781

http://www.bbc.com/future/story/20161026-how-liars-create-th...


The structure of discussion threads can be used to achieve repetition:

-- early false statements are seen by many

-- later corrections down-thread are seen by few

-- popular comments are promoted; true but unpopular ones are hidden

https://www.reddit.com/r/rust/comments/5queq5/how_high_perfo...


Why does the article not mention the Russia hysteria as another prime example of baseless, repetitive untruth in the current news cycle? It's also surprisingly light on its facts and logic: in the American left-right debate, is there really excessive repetition from one side more than the other?

Is it really a matter of repetition or a matter of trust? Does it matter that fact-checkers point out the errors in Trump's tweets? He has gained the trust of his followers, while the fact-checkers have a bad reputation and shady relationships with the establishment. Even if they get the facts right, people don't trust them to make decisions right. It's more about what you plan to do with the facts than who has the most facts, and policy decisions are not deterministic, no matter how many facts you throw at them.

The same goes for establishing scientific facts, indeed. Peer review is based on a few reputable reviewers, rather than on a crowd of anonymous but trust-less fact-checkers.


Indeed. It's astounding how many of the Trump-Russia claims simply fell apart: the supposed lifting of sanctions on selling IT equipment to the FSB that in fact specifically forbade that, the secret communications link with Russia that turned out to be a sub-sub-subcontractor sending marketing emails for Trump hotels, the fake Clinton email that Trump could only have got from Russian outlet Sputnik... or from the Trump supporter on Twitter whose misreading of a real email went viral and led to that article, the supposed pro-Russian change to Republican foreign policy that left almost all the anti-Russian parts intact but matched up neatly with Trump's long-stated foreign policy views, and so on ad nauseam. Despite this, the endless repetition turned it into something that everyone simply knows is true.


I find it amusing that this article quotes a single research study, doesn't really describe why you should find it compelling, but just repeats its claims multiple times.


Some religious books are extremely repetitive. So repetitive that it seemed to me to be a form of intellectual violence against the reader.


Buddhist sermons are an excellent example for that.


Mantras, christian chants, and others. Repeat ad infinitum.


I think this is simply survivorship bias for any information transmitted orally. It had to be repetitive, or people couldn't remember it. See here for the Iliad:

http://faculty.gvsu.edu/websterm/Read_Iliad.htm

> For an oral poet, however, such repetition is not a fault, but a vital technique (Lord 3–67). Repetition is a psychological necessity in oral discourse, which vanishes as soon as it is uttered


It's also useful when the text makes no sense, is offensive, outdated and/or just plain wrong: with repetition they can always say "that's how we always did it, so it's right" to brush off the people who grow up, think for themselves and start questioning the texts.


I've heard that many times but I wonder if it's true. The article cites studies but I didn't bother to check the studies themselves. It cites some contemporary anecdotes but I didn't bother to check if they were accurate. It seems to confirm my experience so I'll accept it as reasonable.

The article is clearly trying to put Trump in a bad light and other comments here are applying it to other politicians and corporations. I think the important point is that repetition doesn't discriminate truth but can be dangerous because it can seem like it does. On important issues there is no substitute for research and accurate methods to interpret new data.


How many times have you heard it? If a lot of times, then I guess it isn't true.


All the best salesmen know this innately.


And the best parents


"You love asparagus! It's your favorite!"


I've used this on myself rather effectively.


See also the problem of induction from philosophy: https://plato.stanford.edu/entries/induction-problem/.

Here induction refers to inductive reasoning rather than mathematical induction. For example, if you see a billion white swans you might conclude that "swans are white". It's not "true", but it's not wrong either, from a Bayesian point of view. We really don't have any other way of doing experimental science.
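As a minimal sketch of that Bayesian point of view (my own toy example, not from the comment or the linked entry), a Beta-Binomial update shows belief in "the next swan is white" approaching, but never reaching, certainty:

    # Beta(1, 1) prior over the probability that a swan is white; after observing
    # `white` white swans and `other` non-white swans, the posterior predictive
    # P(next swan is white) is (1 + white) / (2 + white + other).

    def p_next_white(white: int, other: int = 0,
                     prior_a: float = 1.0, prior_b: float = 1.0) -> float:
        a = prior_a + white
        b = prior_b + other
        return a / (a + b)

    for n in (10, 1_000, 1_000_000_000):
        print(f"after {n} white swans: P(next is white) = {p_next_white(n):.12f}")

    # A single black swan after a billion white ones barely moves the estimate,
    # which is the sense in which induction stays reasonable without ever becoming "truth".
    print(f"after 1e9 white and 1 black swan: {p_next_white(1_000_000_000, 1):.12f}")

The estimate climbs toward 1 but never reaches it, and a counter-example shifts it only slightly, which is all induction can honestly promise.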


I didn't believe the premise of this article the first time I heard about it. But now that I've encountered it a number of times on the internet, I'm beginning to think it might be true.


More on the "big lie" propaganda technique at https://en.wikipedia.org/wiki/Big_lie .


I only read the headline, but this seems a very obvious part of the empirical observation and learning that sits at the center of human existence. Perhaps with one layer of indirection, where you are observing a message/interpretation as opposed to direct physical events. (Or you're observing those events, but in concert with an incorrect but repeated interpretation.)

Basically, we learn patterns real well. Doesn't matter the nature of the pattern, if we have no counter-example of significant or equivalent weight.


That's, of course, by design. If you are an animal then repetition is the only way to learn and make sense of the environment. One could learn what usually happened and what to do, but not, obviously, how it works.

Humans are the only beings who have language and hence the capacity to reason about how everything works, unless they engage in producing dogmas and chimeras out of words and abstract concepts, which is what they usually do.


Reminds me of the ancient Indian parable in Panchatantra - http://panchatantra.org/of-crows-and-owls/the-brahmin-and-th...


> Eating carrots improves your eyesight

It doesn't improve your visual acuity, but it can have a positive effect on your overall eye health [1]. There's vastly more to eyesight than sharpness/clarity. My wife is an optometrist.

[1] http://yoursightmatters.com/carrots-really-improve-eyesight/


"See, in my line of work you got to keep repeating things over and over and over again for the truth to sink in, to kind of catapult the propaganda." ― George W. Bush (2005)

https://www.youtube.com/watch?v=VxnegxNEDAc


Marketing 101: bullshit repeated consistently enough, with a large enough reach, wins a broadcast-modality "contest of wills" in which consent/attitudes/preferences can be manufactured to accomplish almost any aim, from the sinister to the public good to the prank.


I'm reminded of the initial "processing" scene in The Master between Philip Seymour Hoffman and Joaquin Phoenix.

Hoffman's character repeatedly asks questions to Phoenix's character, inducing a hypnotic state. Repetition can also be used as a form of mind control.


There are people who understand this, and there are people who think you can just debate everything forever, and that the one with the most supporting points and references to Latin names of argumentative fallacies wins.


You're better off making all the arguments you can, and letting the medium do the repeating for you. The people you reach in person are a very small set of the people you reach in total.


Truth seems more like a matter of frequency and consistency. Repetition is an essential part of that equation. From an abstract point of view, I don't think there's any getting around that.


Having heard this idea many times I remain unpersuaded so far (!) because I haven't heard an explanation for it. Anyone know one?


This works mostly/only for the undecided.


I was hoping for an interesting read. Instead it was another politically-fueled piece on HN.


"It's not a lie if you believe it." --George Costanza


Religion, worship.


[flagged]


We detached this subthread from https://news.ycombinator.com/item?id=13623098 and marked it off-topic.


Can I ask you a question? Are you trolling, or what are you aiming at here? Now it's of course downvoted to oblivion, but I'd like to know why, even here among smart people, we stoop to such cheap comments and even try to defend changes that will eventually make the lives of people like me a living hell.


It's ridiculous to compare Trump to Hitler. It's a childish emotional response from people who have spent a lot of their adult life in left leaning echo chambers.

If you have a policy you want to criticize, by all means do. Don't shut your brain off and try to imply the president is the same as a man responsible for genocide. It puts you in the same camp as the people claiming Obama is a closet Muslim terrorist.

>Now it's of course downvoted to oblivion

Don't talk about downvotes, it makes for boring conversation.


[dead]


Please stop creating accounts to break the HN guidelines with.


> we go down into so cheap comments

Plenty of people are making the exact same "cheap" comparison to Trump, except to you that's likely ok. You're witnessing a difference of opinion.


The other opinion is rubbish if it endangers my own life and my partner's life.


Anything that endangers your life is "wrong"?


He has a different opinion from you, what's so hard to understand?


If the opinion endangers the lives of me and my partner, I don't understand.


Well, they both have names starting with "Hi" and containing "l" and "r".


If you think that this is something limited to what evil Trump is doing, you are a fool and you don't understand yourself or the world around you at all.


Of course it isn't just him, he's just the most visible and powerful such example.


When the media constantly focuses on him as the target, it starts to feel as though they don't really care about the story they're pushing, they only want you to notice how many of these bad things are associated with Trump. It seems dishonest. Personally, I think a better example would have been the wage gap myth; the author could have even noted that Obama was guilty of promoting it. (Disclaimer: I voted for Obama and I did not vote for Trump).


Trump is certainly the example that's repeated again and again, and there's no doubt that does seem to have convinced a lot of people.


Facts don't matter. Facts don't matter. Facts don't matter. Facts don't matter. Facts don't matter. Facts don't matter. Facts don't matter. Facts don't matter. Facts don't matter. Facts don't matter. Facts don't matter. Facts don't matter. Facts don't matter. Facts don't matter.



