Dots Will Be Connected (theconvivialsociety.substack.com)
91 points by longdefeat on Feb 21, 2023 | 76 comments



We may be inundated with data, and more information may not improve our situation, but appointing any person or institution to limit our information inflow will certainly make things worse.

I fully stand behind thezvi's points over at https://thezvi.wordpress.com/2023/02/20/on-investigating-con... , especially 1 to 3:

1. Think for yourself, shmuck!

2. When it seems worthwhile, do your own research.

3. The ones telling you not to ‘do your own research’ are probably the baddies.


Maybe this is my personal bubble, but the group of people who most loudly tell other people to "do your own research" seems to consist mostly of conspiracy theorists and scam artists. Often their own research is of the "go outside and you can see for yourself that the earth is flat" type, too.


Yes, which is exactly the problem. Thinking for yourself, a vital skill, has been surrendered to scam artists. This doesn't just harm misinformed people, it also harms science as a field.


"Has been"???

Thinking for yourself has never been a skill nearly as important as many people think it is. Looking at human endeavors, the field of science and the scientific method are pretty damned new. I mean, maybe if you were the village leader, thinking for yourself worked out, but if you were lower in the organizational structure you might find yourself clubbed to death for questioning the boss too much.


Thinking for yourself doesn't mean blurting out things that'll get you clubbed. Think for yourself, construct the most accurate view of the world and the people in it you can, run little trials and experiments if you have to, and then say the string of words that will get the leader clubbed.


Thinking for yourself also means knowing your limits. It's OK to defer to experts such as the CDC.


Knowing your limits is good advice. Equally good is knowing the limits (and motives) of those you might defer to.

Even if one assumes them to be relatively honest brokers, the guidance issued by a government agency is (at its best) guidance intended to minimize the possibility of societal collapse, which may or may not align with your own attempt to avoid contracting an infectious disease that may kill you.


I think you and GP may be speaking of different types of people and motivations. Somebody who makes an outrageous claim and follows up to criticism or questions with "do your own research" is almost certainly full of poo.

By contrast, people who genuinely "do their own research" on topics tend to be pretty happy to share what they found, where, and why they think it's significant. And they would love to be corrected if provably wrong, because it's more of a lust for knowledge than anything else. And I think this is the sort of people our GP was talking about, whereas it seems you were talking about the aforementioned type.


That reminds me that I have a book from about 2000 by an Australian author, called "Ideas Generation". A few years after it came out, she was in the news for getting caught plagiarising magazine articles. Which seemed ironic. I can't find any trace of that scandal online now. As if it never happened...


That isn't just your bubble, certainly: "Do your own research" is a pretty reliable marker for people with unreliable information.


How does that even work, though?

Ultimately, if two people are at odds over something and I'm coming in blind, the one trying to keep me from hearing the other side has already discredited themselves.

How is that even debatable, barring an appeal to authority or something equally tiresome and/or fallacious?


It really depends on context.

On HN I trust someone who's inviting nit-picking by saying "this is SOP, google it your damn self" a lot more than I trust someone citing their sources, because there are a fair number of prolific commenters who form an opinion and then back it up with cherry-picked links as a debate tactic (it moves the debate from one about their opinion to one about source credibility).

In synchronous communication or another context where I can't reasonably "just google it" the suggestion to do one's own research is a lot more suspect.


> On HN I trust someone who's inviting nit-picking by saying "this is SOP, google it your damn self" a lot more than I trust someone citing their sources [...]

There's the old "trust but verify" saying that's relevant here. The person that cited their sources at least did some of that work for you, whereas the other person might have based their claim on random shit they overheard on the subway.

I don't understand this idea that showing less of your work is somehow more convincing.


What were the steps that got you to this conclusion?


Random searches for this phrase on Twitter seem to back this up somewhat. Of the first hundred or so results I looked at, about two-thirds were attached to some obvious misinformation, and a deeper analysis would probably take that figure higher. However, there is definitely a bias here, because I've only encountered this phrase before from people "backing" obvious COVID, vaccine, or Trump misinformation.


Sounds like you just did your own research, no? And you sort of cited your source.

Now someone can look at what you said and decide how confident they are with what you claim. From my perspective, I can take what you wrote as someone's honest opinion based on stuff that comes across their social feeds. I wouldn't treat it as ground truth, but I can feel confident that at least one person in the world has had that experience.

If someone makes a claim and points to a few studies that I can cross check, even if I'm not qualified to evaluate the studies, I can have some confidence that some domain experts have given evidence for the claim. I can also look for studies that contradict the claim, and in their absence, I can know that some experts believe xyz and I can't find any experts that disagree.

That's about the best I can do as a layman, and I think it's a lot better than nothing.


> Sounds like you just did your own research, no?

People doing their own research isn't what we're talking about.

People saying "Do your research!" and what it correlates with is what we're talking about.


So you've come up with a heuristic that says 1) doing your own research is fine, and 2) people who tell you to do 1) are suspect?

If so, I can't imagine that's very effective. You're probably better off just assuming strangers don't know what they're talking about unless you have some good reason to believe otherwise.


No, I think that people who treat "Do your research!" as an argument are peddling suspect information.

If they had good information, they'd have good sources.


When someone tells you, "Do your own research", where are they coming from? Having done their own research. Because they believe in their conclusions, they have built for themselves an expectation that when you do your own research, you will reach the same conclusions they have.

But this is circular reasoning: the very expectation that a conclusion will be reached through arbitrary research is equivalent to treating that conclusion as given fact.

This pattern leaves us open to define "research" in whatever arbitrary fashion we choose.

It's worth recognizing that research itself, at its core - before you even get to science, logic, academics, or any other strict behavior - is backstory. It's the context that defines a claim.

When we give the title "research" to a backstory, we classify it as a true story. But every story is invented! Very few stories are provably true from beginning to end. The ones that are exist in mathematics, and are called theorems.

So despite expressing confidence in the truthiness of a story, we are still in the weeds.

The way I see it, there are two expectations for how "your own research" is to be done. The first is true scientific and logical study. The second is religious repetition. The admonition itself cannot tell us which one we are dealing with. Ironically, the only way to know is to "do your own research".


In 2023, labeling people as "conspiracy theorists" is a much bigger red flag for the (deliberately or not) misinformed than urging someone to "do your own research".

In fact, I struggle to see how "do your own research" would be a reliable way to spread misinformation.

Can you provide an example of how that works? Even a hypothetical?


I'll give an example that I'm sure the GP did not have in mind, but it's one that I see somewhat often. If you ever ask a question online about trans ideology, such as a basic question about biology like "do trans women menstruate?", a lot of people will say things like "it's not my job to educate you."

I personally hate that response. If you truly believe something is true, and you believe that you have evidence to back it up, you would show that evidence every chance you get.

Don't get me wrong, I think everyone should have some understanding of why they think the things they do, which should require "doing your own research" to some extent. But people who make bold claims and then fall back on telling others to do their own research, without any kind of source for those claims, are probably doing so because said claims don't have great evidence to back them up.

I do agree that the people who tell you not to do your own research and to just appeal to authority are the bad guys. "Do your own research" should be like saying "I am not a lawyer, but": a disclaimer of sorts after presenting an explanation of why you think your claim is true. "Here's what I think, here's why, but I could be wrong, so you should look into it yourself too."


Why are you even asking "basic questions about biology"?

"It's not my job to educate you" is a response borne of extreme frustration with being repeatedly asked the same basic questions (that can easily be answered with Google), to the point of being suspicious that the questioner is acting in good faith. A common pattern is someone "playing innocent" in order to bait a Socratic dialogue out of someone. If you get this line often, it would be worth some self-reflection.

>If you truly believe something is true, and you believe that you have evidence to back it up, you would show that evidence every chance you get.

Meh. After being asked why I think the Earth is round for the hundredth time, with the most carefully considered answers invariably failing to convince my interlocutors, I'm now inclined to tell them to bugger off instead of wasting my energy. It's basically the fail2ban mitigation to DoS attacks.


> If you ever ask a question online about trans ideology, such as a basic question about biology like "do trans women menstruate?", a lot of people will say things like "it's not my job to educate you."

That surely can't be a question asked in good faith though? Both the asker and the answerer will know that it is impossible for transwomen to menstruate, because they are male. So such a hostile response must be a slightly politer version of telling the asker to quit with the bullshit questions and piss off.


> it is impossible for transwomen to menstruate, because they are male

I hate to break it to you, but what you just said is transphobic. You're probably confused and don't believe me, and wish I would produce evidence. I don't want to get too off topic, so I will just say - wouldn't it be really annoying if all I said was "please educate yourself / do your own research"? That was my point.

also nice throwaway lol


My own research tells me that your question cannot be in good faith. Google tells you pretty clearly that "menstruation" here means a different thing with similar aspects. If you had asked in good faith, you would definitely have used the expression "menstruation symptoms".


Males can't have any type of menstruation though, or symptoms of it, as they lack the requisite body parts. It's just not physically possible.

So I think this must be a question used to point out to transwomen that they are not female. Which I expect many would get annoyed about and give a snappy answer to, as they want to be female and probably don't want to be reminded that they never really can be.


Just type it in Google, and after that reread my comment. We agree.


Definitely a bubble, if you think it's limited to a few groups of people. But seeing as how the "conspiracy theorists" have an outstandingly good track record, maybe there's something to it when they say it.


Doing your own research is a great idea and more people should do it!

Reading the existing literature and keeping up to date with new publications; studying the general subject area for, say, 3 years to get to at least an undergraduate level of understanding; studying statistics and experimental design; planning experiments and analysis and obtaining critical feedback from peers to mitigate the chance of errors; attempting to reproduce the results of others working in this area…


See, the problem is, you get people who have studied for a lot more than three years, and then you get the replication crisis. Do you think that could be avoided by requiring more education? And I see people correctly calling out problems with papers after a lot less than three years of effort: sometimes the issues are a lot more basic than that. As usual, verification is cheaper than computation; studying gives you (hopefully!) a better chance of getting things right, but it doesn't immunize you from getting things wrong. And peer review didn't stop blatant photoshopping [1]; though the person who found the fakes is a scientist, one cannot imagine that her coursework included "spotting rotated or mirrored pictures in a paper".

So no, it doesn't even take an undergraduate level of understanding to contribute, and you don't need a degree to beat the experts (sometimes). I think you're envisioning a world where science is "mostly okay" and at any rate mostly inexploitable: chess computers may make mistakes, but you certainly can't catch them in one. Whereas I think people have been espousing this view of science for so long, and marginalizing citizen science for so long, that there are now many opportunities for personally invested individuals to spot things that people with degrees missed.

So sure, some people will fall for conspiracy theories. Some people will fall for fraud. Some people will simply believe wrong things. But "science" as an institution (i.e., excepting the sense that any such study is science) does not deserve any sort of monopoly on finding out true things, and claiming otherwise does both the scientific community and the populace at large a disservice.

[1] https://www.nytimes.com/interactive/2022/10/29/opinion/scien...


I think this is a big part of the problem. It also makes trusting the experts difficult, as it's really hard as a layman to know who is an expert. Most conspiracy theories in fact have their own experts they call upon.

I keep seeing discussions that go like preprint/abstract pokemon battles where everyone's just throwing out papers they don't understand that seem to corroborate their point as though that was enough.

If you can't read the papers, and don't understand their context in the field, then you really shouldn't be drawing conclusions from them. In science, you always start with "I don't know". If you see evidence that you don't understand, then you still don't know.


>I keep seeing discussions that go like preprint/abstract pokemon battles where everyone's just throwing out papers they don't understand that seem to corroborate their point as though that was enough.

It's the same phenomenon among people who learn logical fallacies for the first time. Instead of using that knowledge to improve their own framing and arguments, they instead use it as insta-kill magic spells to "win" arguments without actually bothering to understand the opposition.

Intellectual humility and curiosity are in short supply, especially in online spaces where you're rewarded for combat and engagement.


I think that's part of it, but I think it's mostly an artifact of a generation being taught that "reputable sources" are the be-all and end-all of evidence, and that critical thinking is the same as checking for the existence of those sources.

That was possibly sound advice in the past, but it's definitely severely incomplete advice in the modern information ecosystem.

The pseudoscience people aren't eschewing science in favor of the Emerald Tablet of Hermes Trismegistus and drawing conclusions from seances and divining rods. More often than not, they draw on published articles too.


You make a very important point!

But I interpreted the comment you answered to as saying that doing your own research is time consuming/costly. And requiring people to "do their own research" is often infeasible.


Ditto, all this is exactly what I do when I want to try some new type of chocolate bar from the supermarket, and want to understand the nutritional, economic and moral implications of what I'm doing.


Come now, there are certainly heuristics with which you move about and make decisions in the world every day without investing three years apiece in them. And even when you desire to seek the advice of some trusted expert, you rely on heuristics and biases to inform your decision about which one to trust.


Sure, but I don’t call it research.


Yes, but in common language this is what it is called. And researching the existing literature is proper scientific research, btw, so you can do scientific research without going all in, contrary to what you seemed to imply.


"Do your own meta-analysis"


> do your own research

Sure, yes, do that, but that doesn't address the issues brought up in the article ("3. We are all conspiracy theorists now." and "4. Actually, we're all cultists now."): how do you ensure you're not just being sucked into an info-cult, an echo chamber?

The answer in the article seems to be more "analog" friendship - I'm assuming he means actual in-person interaction. And while that's important, I also like _why the lucky stiff's recommendation: "when you don't create things, you become defined by your tastes rather than ability. your tastes only narrow & exclude people. so create.” Make stuff.

In addition, given the firehose of information we face daily, doing your own research easily could become a full time job.


I wish that people thought for themselves and did their own research. In that case, we'd get countless different, nuanced opinions.

Instead, we mostly get a tiny number of stereotypical opinions that people passively acquire from their favored forms of media. As if listening to Joe Rogan for example were "doing research".


The "do you own research" line is often abused to dismiss experts or send you on a wild goose chase.


The short version, from a reference in the article.

“The primary purpose of narrative,” media scholar Katherine Hayles argued several years ago, “is to search for meaning,” which makes “narrative an essential technology for human beings, who can arguably be defined as meaning-­seeking animals.”

His "narrative vs database" essay talks around the problem. Narratives tend to discuss causation, either explicitly or by implication. Raw data does not contain causation information. That's the real distinction here.

We know that humans are hard-wired to find causation, even when it may not exist.[1] This has survival value. In a hostile environment, an excessive false alarm rate is better for survival than missing a threat.

If you do correlation on enough data, you find what looks like causation. Often it's just noise. This is a well-known phenomenon. Intelligence analysts, investment quants, and people who analyze research data have to be trained to watch for it. Most people don't have that kind of training.
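
To make that concrete, here's a minimal sketch in Python (the series count, length, and threshold are arbitrary choices of mine, not from the article or the essay) of how pure noise yields apparent "connections" once you compare enough series:

    import numpy as np

    # 1,000 unrelated random series, 500 observations each: pure noise.
    rng = np.random.default_rng(0)
    data = rng.normal(size=(1000, 500))

    # Correlate every other series against the first one.
    corrs = [np.corrcoef(data[0], row)[0, 1] for row in data[1:]]

    # Noise alone still produces seemingly notable correlations.
    print(f"max |r| = {max(abs(r) for r in corrs):.2f}")
    print(f"series with |r| > 0.10: {sum(abs(r) > 0.10 for r in corrs)}")

Run it and a couple dozen of the 999 comparisons clear the threshold by chance alone; test enough pairs and some "dots" will always connect.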

Under information overload, this gets worse. Combine this with the human tendency to find causation when it doesn't exist, and you get false narratives. Even without wishful thinking or bias.

It's not mysterious. It's how human brains work.

[1] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4008651/pdf/nih...


You appear to be strawmanning this piece to a large degree.

The necessity for humans to find meaning isn't ignored or argued as anything other than "how human brains work", in the piece.

The Database metaphor isn't a separation of raw data from narrative. It's a recognition that, in the modern zeitgeist, the abundance of data is so vast that a new experience has emerged which supersedes any Narrative: the Database, a super-collection of all raw data as well as the known paths through it.

The existence of the Database then calls into question the validity of any one Narrative, and the rest of the piece follows.

* No need to reference NIH.


Brilliant synthesis:

Globally humans have transitioned almost overnight from information/knowledge scarcity to information flood. Once we had plenty of time to forage information to construct narratives and share them and agree/disagree on them before the next influx. Now we are literally waterboarded with information that has no context, minimal sharing and no real conversation attached. We are left with whatever individual narrative our pattern matching can construct, and it's usually inadequate. Narrative is definitely a superpower of the group not the individual.

We may need to wait a generation until people who have grown up in this world and can filter feed on the information can create/disseminate narrative adapted to the new rate of information flow and yet somehow true to reality.


The situation invites comparisons to the obesity epidemic, or perhaps (more of a stretch) the introduction of alcohol to Native American tribes.

Regardless, it's critical to recognize that we as a people need to develop a greater worldview/metacognition to go any further forward together.


> it's critical to recognize that we as a people need to develop a greater worldview/metacognition to go any further forward together.

Possibly, but the chances of a move towards a cohesive worldview seem fleeting at best. If anything it seems more likely that we'll continue to fragment further and even more rapidly. We had a chance at this during the pandemic - humanity could have come together to fight a common enemy (the virus) but instead we fractured into groups and fought each other.


This is contrary to the article: you're assuming everyone involved scanned the deluge of information and that the only "rational" result would be to come to the same conclusion you did about an insanely complex situation, whereas the article argues that there's so much information and so many dots that people scan the same wealth of data and come to different conclusions.


Ok, but I think my point still stands: Thinking we'll arrive at a cohesive, shared worldview seems like wishful thinking at this point. Maybe there was a golden age in the post-war period, say in the 70s, when a cohesive, shared world view kind of existed in the US, but we're well past that now.


Or, even more pessimistically, the introduction of oxygen into the atmosphere by cyanobacteria in the Great Oxygenation Event.


> We may need to wait a generation until people who have grown up in this world and can filter feed on the information can create/disseminate narrative adapted to the new rate of information flow and yet somehow true to reality.

I'm not so optimistic. Remember when people thought gen z and gen alpha would be "digital natives"? They were supposed to be tech savvy, but a good chunk of them can't use a search engine or a word processor. A teacher I know says that each year the kids just get stupider and stupider; they sit around all day on social media and their brains haven't developed or something.


It chimes with people in the late C19th / early C20th complaining that they were overloaded by news because newspapers were being published on a weekly and then a daily basis.


Somebody like Richard Dawkins is missing from the conversation. I believe he once pointed out that in a 300m-person economy, million-to-one chances might happen 300 times, albeit to 300 somebodies out there, not you, all things being equal.
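
That arithmetic is easy to sanity-check; a minimal sketch (the 300m population and one-in-a-million odds are just the numbers from the claim):

    p = 1e-6           # a "million-to-one" chance per person
    n = 300_000_000    # roughly the population in question

    print(n * p)             # 300.0 expected occurrences
    print(1 - (1 - p) ** n)  # ~1.0: virtually certain to hit *somebody*

So the genuinely surprising outcome would be million-to-one coincidences NOT happening to anyone.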

It's like the observational biases behind "buses come in threes" or "the other lane is always moving faster" aren't being integrated into people's thinking.

Dots will be connected. OMG! Every ancestor I ever had led directly to me! It must be in the plan...


Buses do come in threes, at least according to "Transit vehicles' headway distribution and service irregularity" [1], as cited in [2].

1: https://link.springer.com/article/10.1007/s12469-010-0024-7

2: https://theconversation.com/transport-experts-explain-why-bu...


> rather than saying that we are all conspiracy theorizers now, I should say that we are all, from someone’s perspective, cult members now.

I should start promoting a Church of Objective Reality. The main tenet of faith will be that there is a shared, objective reality, which we experience in individually distinct ways. The main requirement for membership will be regular donations; after all, what's the point of group action anymore if not personal gain?


the point? maybe the elevation of human perspective above selfish bias.


Our glorious leader is trying to help you achieve that as well by having you give him the money you are so selfishly holding on to


money is the root of all Evil. Allow me to Protect you from the Evil by taking the Money from you...


Sometimes you just have to believe in something, as you won't be able to verify it. The quake in Turkey: real or not? Yeah, it's probably real, but you have not seen it or experienced it yourself; you just know there is way too much evidence from different news sources for it to be fake, and no reason to fake it as far as you are concerned.

How about the balloon, where the US says it's a spy device but China says it's for weather and was blown off course? Same idea: you were not there when the balloon was made, and you did not see what they recovered. There are lots of incentives for either country to spin this either way, so I would just not choose a side until more events connect the dots in a way that makes it very unlikely the story isn't real. Again, you will never be able to verify it, but you have to see a mountain of evidence from many sources to gauge the likelihood.


Epistemology is hard, but for some reason it doesn't seem to be addressed directly very often.


You can usually find "truth" relatively easily by seeing what two opposing sides tend to agree on. Sometimes they'll both be wrong, but more often than not they're probably right. Both the US and China claim there was a Chinese-made balloon and that the US shot it down, so it's pretty safe to say that's all true. Was it a weather balloon or a spy balloon? This is where the key point is a willingness to accept uncertainty.

But I do not agree with the "preponderance of sources" argument as a means of finding truth. As a contemporary example, consider the COVID lab-leak stuff. You had large numbers of scientists making declarations in journals that it was literally impossible, fact checkers declaring the concept clearly debunked, government calling it dangerous disinformation, the media running endless stories painting anybody who dared even think about a lab leak as little more than a crazy conspiracy theorist of the flat-earth type, and more.

Until one day, everything changed and the lab leak was perfectly okay to discuss. There was no silver bullet, just a geopolitical recalculation. And it turns out among those scientists were people directly involved in COVID-related gain-of-function research. Those journals somehow couldn't discover such an absurd undeclared conflict of interest, even though a simple search or CV reference would tell all. The media suddenly did a hard 180, and you even had mainstream outlets actively mocking how absurdly "obviously true" it was that the virus came from a lab leak.

We live in an era of disinformation being coordinated and deployed at scale, and now human-like text generation is about to enter the picture. When my logic runs contrary to a million sources, I'm going with myself. If I'm wrong, at least I believed something that seemed to me like the truth, and that I could presumably defend. By contrast, believing in something that seems false (and increasingly often turns out to be) simply because of a mountain of pressure? That would feel just awful to me.


I fed this to ChatGPT and asked for a summary, and got something a lot easier for me to understand than the article itself. Maybe I'm a dummy! But it helped. (edit: the below is not ChatGPT's response!)

The article presupposes there's anxiety, indecision, uncertainty, and frustration as a result of collecting more data, but this isn't a given. Instead, a better solution would be to simply observe; you're not being asked to make decisions, you're not being asked for certainty or clarity, the reality is you don't meaningfully matter to almost any of what's being shown to you in media, and almost nothing being shown matters to you.

Caring and feeling empathy towards those negatively impacted by something is one thing, but your job is not to solve those problems in any real, if-you-don't-then-no-one-will sense. You can do your small, tiny part (e.g. throw $5 at a relief fund, be kind/forgiving to those who do get hurt, use your vote to improve the way your government handles those situations, etc.) but there's no big, sweeping act you're being asked to perform.

The narrative is irrelevant. The "connected dots" are irrelevant. If you want to live a happier life while still being informed, then you must stop trying to understand everything. Just observe, accept you're mostly powerless, and do the tiny amount you can to ease the burden of those who get harmed by the stuff you read about.


Wow, chatgpt really told the author to go touch grass.


I didn't paste the ChatGPT response in here (doesn't feel healthy to HN to flood it with ChatGPT responses), I wrote that myself.


Ah, disappointing!

In all seriousness, I have found the conclusions you drew from your chatgpt summary to be quite useful for making my way in this particular era of history. Being able to just evict any such controlling narrative from my head has prevented me from severely losing touch with actual reality, which I have sadly seen happen to more and more folks in recent times.

This has ultimately made me a much more "reasonable" person, but at the cost of not having many hard convictions that I'm willing to "unreasonably" stand my ground on.

Being able to be stubborn like that can occasionally be a force for good (e.g. civil rights movement), but can also lead to personal ruin, not unlike bashing one's head into a concrete wall trying to move it. This has led me to think really hard about what convictions I do hold dear. I haven't come up with many, but so far I'm leaning a lot on "don't treat people like objects".


This is a great article; it made me really think about my own propensity to connect the dots, and about how, without my considering my biases and limited knowledge, those dots get driven into connections in the only way they can, and often not toward the truth.

Then I thought: thank god we now have search engines with AI, AI that can summarize and connect the dots for us!

Oh, wait...


This guy is just another jackass running cover for the regime. "Let the experts do the data analysis for you!" You think running this program still works after the past 6 years? Why are "high IQ" people so unbelievably daft?


Consider for a moment that "Let the experts do the data analysis for you!" may be a much kinder way of saying, "You're too uneducated to understand the data and will with a high degree of certainty screw up the analysis!"

As George Carlin famously said, "Think of how stupid the average person is, and realize half of them are stupider than that." Half!


The problem is not one of qualifications, but of motivation and trust. Even if you genuinely believe the organization before you is filled with nothing but the brightest individuals in the world, leagues beyond your understanding in their field, it means absolutely nothing if you don't trust them. In fact it would mean less than nothing, as there's nothing scarier than a clever individual with "impaired" ethical values.

Trust is difficult to earn, and easily lost. And we live in a society where, for most, trust is gone. The way to get it back is through extreme transparency and coming from a point of view of understanding this distrust.


Trust isn’t so easily lost when you understand the lack of certainty in this world.

So then, who benefits from offering certainty, if it’s more or less impossible to obtain?


Everyone does. That's why it's such a popular tactic, but trust is so hard. That is the reason for transparency.


> Pointing out ‘the other side are conspiracy theorists’ or ‘the people who believe this also believe these other terrible things’ does not prove the other side is wrong, nor is it going to convince anyone on the other side that they are wrong.

It is completely legitimate, in my opinion, to use "the people who believe this also believe these other terrible things" as an argument about the possible motivations of the people who believe whatever the current thing is.

Imagine a conspiracy theory that Jack and the Beanstalk was a true story, which was suppressed by the medieval Church, in order to restrict public access to the great riches of Heaven and keep them for the Church only. Let's say also that a large number of prominent and long-standing Jack&Beanstalk truth researchers also happen to be extreme right-wing xenophobic nationalists, some of whom have publicly posted really disgusting bigoted things on Pleroma. No matter how compelling the J&B evidence is, in my opinion it's both intellectually rational and morally appropriate to question the entire J&B conspiracy theory. Who benefits by promoting the theory? What worldview does it support? What worldview does the traditional narrative ("it's a fairy tale") contradict?


> “When you give people too much information, they instantly resort to pattern recognition." — Marshall McLuhan

Can anyone provide context for that? The only attributed source I can find for it is Joel Stein's 2019 "In Defense of Elitism: Why I'm Better Than You and You are Better Than Someone Who Didn't Buy This Book", which says it's from a 1968 Canadian TV show. I can't track down which one, much less find a recording of it.

Digging more, I found that Douglas Coupland's 2010 "Marshall McLuhan : you know nothing of my work!" (see https://archive.org/details/marshallmcluhany0000coup/page/n9... ) gives a slightly longer albeit unattributed quote:

"When you give people too much information, they instantly resort to pattern recognition to structure the experience. The work of the artist is to find patterns."

Yet the linked-to essay refers to "conspiracy theorizers" and "cult members", not "artists", which suggests that McLuhan had a different view on the topic.

A text search of McLuhan's 1964 "Understanding Media" finds "pattern recognition" used twice:

1) Now, however, in the electronic age, data classification yields to pattern recognition, the key phrase at IBM. When data move instantly, classification is too fragmentary. In order to cope with data at electric speed in typical situations of “information overload,” men resort to the study of configurations, like the sailor in Edgar Allan Poe’s Maelstrom. The drop-out situation in our schools at present has only begun to develop. The young student today grows up in an electrically configured world. It is a world not of wheels but of circuits, not of fragments but of integral patterns. The student today lives mythically and in depth. At school, however, he encounters a situation organized by means of classified information. The subjects are unrelated. They are visually conceived in terms of a blueprint. The student can find no possible means of involvement for himself, nor can he discover how the educational scene relates to the “mythic” world of electronically processed data and experience that he takes for granted. As one IBM executive puts it, “My children had lived several lifetimes compared to their grandparents when they began grade one.”

2) "We are entering the new age of education that is programmed for discovery rather than instruction. As the means of input increase, so does the need for insight or pattern recognition." - https://archive.org/details/understandingmed0000mars_s3z9/pa...

This further supports my interpretation that McLuhan sees pattern recognition as something useful, not as something for conspiracy theorizers and cult members.

This review of McLuhan's book, https://archive.org/details/everymansmcluhan0000gord/page/80... , seems to agree, saying "The book’s relentless blizzard of ideas illustrates one of its own key points: faced with information overload, the mind must resort to pattern recognition to achieve understanding."

Also, "connecting dots" brings to mind "ley lines" (https://en.wikipedia.org/wiki/Ley_line), and Matt Parker's discovery that "Woolworths stores follow uncanny geometrical patterns." (http://web.archive.org/web/20100812112158/http://www.standup...). Alfred Watkins didn't need digital databases to connect those dots back in the 1920s.


"Marshall McLuhan in Conversation with Norman Mailer" came up from a DDG search.

Canadian Broadcasting Corporation 1968, "The Summer Way"

Transcript: https://marshallmcluhanspeaks.com/media/mcluhan_pdf_4_gOLK6y...

Video (quote around 10:26): https://www.marshallmcluhanspeaks.com/interview/1968-marshal...


Wow! Well done! Oddly, my DDG search for "When you give people too much information, they instantly resort to pattern recognition" only found 6 Tweets and one web page with just the quote and no citation.

Interesting how the quotes I found are different than the transcript, which says: "When you give people too much information, they instantly resort to pattern recognition – in other words, to structuring the experience. And I think this is part of the artist’s world."

You can also see how his comments in 1968 follow the same theme from his book, with "there is in IBM, for example, a phrase that information overload produces pattern recognition."

And, fundamentally, McLuhan appears to be saying that pattern matching is a good thing. "The artist, when he encounters the present, the contemporary artist, is always seeking new patterns, new pattern recognition, which is his task, for heaven’s sake. ... The scientists are going to wake up to this shortly and will be resorting en masse to the artist’s studio in order to discover the forms of the materials they’re dealing with."

That's just about the opposite of how this linked-to essay interprets the phrase.




