Royal Society cautions against censorship of scientific misinformation online (royalsociety.org)
1016 points by steelstraw on Jan 30, 2022 | 926 comments

Based. Nullius in verba [1]. The whitewashing of censorship is accelerating the erosion of trust in the institutions, and it's refreshing to see a long-standing institution take a bold stance.

[1]: http://en.wikipedia.org/wiki/Nullius_in_verba

The problem is that we sometimes have to rely on other people. We can’t all be experts in everything. We can’t all reproduce or witness every experiment.

What we can do is delegate the process.

There’s a difference between saying “whatever X says is true, because X said it”, and “I trust what Y says because Y has explained the process and methods of how it was derived, and I don’t have reasonable doubt to assume Y lied”.

What to do when you have two very different opinions coming from experts:

- Anthony "I am The Science" Fauci


- Dr Malone (mRNA inventor)?

Who to believe?

Difficult to say, but these days, sadly, the first (and prudent) thing to do is to follow the money. While credentials matter, the first thing I check when I read a paper/publication is who funded the work.

Doesn't really work. Social media grifting can be really profitable - it brings in audience, clicks, ad money. Even just audience is enough as long as you can come up with something to sell later.

Even if the original paper was an unfunded honest mistake, grifters can find a way to profit from spreading it, especially if it's novel and sensational.

All claims I've seen that Malone is "the" inventor of mRNA vaccine technology trace back to Malone himself. He possibly has some real claim to be "an" inventor, among the top dozen or so contributors of early advances that ultimately enabled the present vaccines. It's also possible that the contribution he's claiming as his own was basically his supervisor's idea (Felgner's), and Malone is just the one who did the lab work:


Either way, I'm pretty sure Malone's claims about vaccine safety are dangerously wrong. After literally billions of doses of mRNA vaccines, the adverse effects he's been warning about simply haven't materialized. The rate of those effects isn't zero, and surveillance at mass scale has identified adverse effects that the original trials weren't powered to detect; but for now, the risk/benefit ratio for the vaccines looks highly favorable. Malone's baseless assertions otherwise are causing real harm.

That said, please don't take anything above as a defense of Fauci or of censorship. While I'm pretty sure that Malone is dangerously wrong, this pandemic has repeatedly demonstrated that the mainstream consensus can also be dangerously wrong--so like the Royal Society, I'd rather tolerate false information from the contrarians than risk a false mainstream view that can never get corrected. That doesn't mean contrarians are automatically (or even usually) right, though.

"Follow the money" may sometimes be a useful standard, but I don't see the relevance here. Fauci appears to get his money from the government, where he's the single highest-paid employee but still earns less than countless anonymous tech workers. This seems more to me like a matter of prestige, ego and power for both men.

> I'd rather tolerate false information from the contrarians than risk a false mainstream view that can never get corrected. That doesn't mean contrarians are automatically (or even usually) right, though

I don't. I think it's reckless to tolerate contrarian information in public (and in fact, it's contributing to a much higher death toll during this pandemic).

I agree that specialists should still be allowed to discuss contrarian information, via papers, peer review and the overall scientific process.

But should Software Engineers (like most of us here) really be granted a platform, a listening audience, for their ideas about virology?

What exactly is the "false mainstream view that can never get corrected"? I'm likely out of the loop to some extent, but with regard to, e.g., masks, that was retracted and corrected; vaccines preventing transmission/reception, that was retracted and corrected. What else am I missing? Or is it simply that people don't like "flip-flopping", which is strange, because there wasn't massive outrage at the way "cigarettes don't give you cancer" was propagated, or asbestos, etc.

In the two examples that you give, I don't think there was significant censorship of the initially-contrarian viewpoints ("masks are likely enough to protect against SARS-CoV-2 that they're worth a try", "the vaccine is highly effective against death and serious illness but much less so against infection long-term"). Perhaps in part because of that open discussion, those viewpoints indeed quickly became the mainstream.

I was thinking more of stuff like the origin of the pandemic. It's far from proven, but I believe it's entirely possible that (a) the SARS-CoV-2 pandemic originated from reckless virological research; (b) if this research continues, then new pandemics will occur by similar means in future, with similar or worse mortality; and (c) the only force likely to prevent that is public outcry against continued funding of such research. For about a year, discussion of this possibility was grounds for a ban from Facebook. It's only thanks to the huge reputational risks taken by a small number of researchers that this topic has now entered the mainstream. Without those few dozen people, we could easily have landed in a world where that topic remained forbidden indefinitely.

Closer to the Malone case, the BMJ published an article alleging improper conduct by a contractor that conducted some of Pfizer's vaccine trials. (For emphasis, they're not saying that any of these allegations, even if true, would mean the vaccine is unsafe. The point is to expose and correct localized procedural failures before they become patient safety failures.) Facebook marked this as "partly false information", and indicated that users who shared the article would see their posts deprioritized. This has been widely publicized, and Facebook hasn't changed its position.


So if the vaccine were actually dangerous, then would I ever find out? (Again, I strongly believe the vaccine is safe; but I'm asking this hypothetically.) I think I still would for now, because the censorship on the major platforms isn't complete, and I spend enough time on other platforms that they don't have total control anyways. It's not a comforting trend, though.

We know how the pandemic originated; let's stop spreading FUD and propaganda.


A group of authors including Ralph Baric, the father of modern coronavirology, published a letter in Science calling for investigation of all possible origins of the pandemic, explicitly including a research accident:


Last I checked, Baric still thought a natural origin was more likely; but the point is that he considers an unnatural origin sufficiently likely that an open investigation is required. The FBI assessed with "moderate" confidence that the pandemic had unnatural origin, while some other agencies assessed with "low" confidence that it was natural and the rest declined to judge.

We don't know how the pandemic originated. Nothing is proven in either direction, but mainstream consensus now absolutely includes the possibility that it originated from such a research accident. Your dismissal of that as "FUD and propaganda" now puts you in a position as fringe as the opposite would have been eighteen months ago. From your other comment, I guess you think I should have been censored for entertaining it back then. But given that the mainstream consensus has since changed--apparently without you realizing--do you believe that you should be censored for rejecting it now?

Unless you somehow become our dictator, you are unlikely to find that the censors' arbiter of truth always agrees with you. Specifically on this website, I believe that Paul Graham is our dictator; and from his Twitter account, I'm pretty sure he disagrees with you on this matter:



He surely can afford to moderate as strictly as he wants. But he doesn't seem too inclined to censorship, so you got to post your comment. I'm fine with that, among other reasons because it gave me the chance to explain the basis for my beliefs, and possibly change your mind (or that of someone else reading).

As to your link, I assume you're aware that the host and guests on TWiV have advocated for and performed exactly the kind of high-risk research that may have caused this pandemic? That certainly gives them special expertise that deserves attention, but to trust only them on this question is like trusting only Monsanto on herbicide safety.

These are some fair points, but I have some objections:

> mainstream consensus now absolutely includes the possibility...

I don't think this is the case... Your belief is obviously different to Ralph Baric's which is yet different to the belief of those who worked in the Wuhan laboratory.

I.e. the scientific consensus was that it was extremely unlikely, Baric now says that this is less likely than the zoonotic origin, and you say "entirely possible".

You can use the catch-all "includes the possibility" to treat all of them together, but if you had to pick numbers for the probability of the lab leak, you'd likely pick different numbers than Baric or others.

I.e. even if something is not impossible but merely "extremely unlikely", it's still reckless to have our media talk about it in the way it has.

If anything, if the real consensus is that we don't know... The media should not carelessly talk about any hypothesis without also mentioning the others, and why they are more/less likely.

This is not a topic that needs to be hashed and rehashed every few weeks: the consequence of treating it the way it has been treated is that now a bunch of people think that the lab leak is what actually happened, and just today I've seen another article which defends Joe Rogan by saying that there's "consensus on the lab leak hypothesis"...

I.e. if there should be censoring about this, both me and you should be censored, for not being concrete and impartial enough.

> I'm pretty sure he disagrees with you on this matter

I'm not sure what this appeal to authority wants to imply. Of course the decision process for how/when to censor is a delicate one, and ideally left far away from millionaires who think that they are more competent than they actually are.

To clarify, just because a private person owns a platform, it doesn't mean that they should be the only ones to make rules on what contents are allowed. They can make things stricter, but they shouldn't be able to make things laxer by allowing what's otherwise illegal (obviously, that depends on jurisdiction, which is why countries censor websites via DNS or routing)

> I assume you're aware that the host and guests on TWiV have advocated for and performed exactly the kind of high-risk research

The only people who describe this as "high-risk" are also the people who believe in a lab leak being actually what happened

Do you think that EVERY "gain of function" experiment is high risk?

> If anything, if the real consensus is that we don't know... The media should not carelessly talk about any hypothesis without also mentioning the others, and why they are more/less likely.

As I said in my previous comment in exactly those words, "We don't know how the pandemic originated". So I'd certainly agree that the media should make that clear. (I mean that as my personal opinion, not a call for censors to force them to.)

Given that we don't know, regardless of whether one thinks a research accident caused the pandemic with p = 0.01 or p = 0.99, I believe that an investigation of all possible causes is required. Ralph Baric and I probably disagree on the exact probabilities, but we agree on the investigation. There are many significant unexplored paths for that, even without the PRC's cooperation, including subpoenas for the records of the WIV's American collaborators.

With millions dead, such an investigation is inevitably political. You'd probably rather the investigation were left to scientific experts, and I would too; but someone has to choose those experts. In a democracy, that job goes to elected politicians. The performance of those politicians is ultimately judged by the voters. Without open discussion, I don't see how the voters could make an informed choice. (I guess the politicians could decide what information the voters deserve to know, and the voters could judge the politicians according to that filtered information; but I assume you see the flaw in that system.)

> I'm not sure what this appeal to authority wants to imply. Of course the decision process for how/when to censor it's a delicate one, and ideally left far away from millionaires who think that they are more competent than they actually are.

I mentioned Paul Graham's beliefs not because they were specially valuable in themselves, but because under present American law, he's probably the person with authority to decide what is censored on this site. If he were inclined to censor, then I don't think he'd decide in your favor.

It seems like you believe American law should be changed, by amending the constitution to eliminate the First Amendment, and the American government should exercise strong powers of censorship over such forums directly. That seems very unlikely to happen. But even if it did, a majority of Americans (including a majority of Democrats) believe not only that an unnatural origin of COVID is possible, but that it's the most likely explanation:


So if the American government were censoring, then do you really think they'd be censoring in your favor? If you think stronger government censorship early in the pandemic might have changed public opinion now, then remember that Trump was president at that time. If his government had had that power, then I can't imagine you'd have been pleased with how they used it.

It seems like you're hoping for censorship in the abstract, in service of perfect truth. That can't exist. Censors are humans, and censorship is subject to the same mistakes and corruption as any other human endeavor, especially those affecting the flow of political power. All of this requires human judgment; and once the wrong humans get the job, any apparatus designed to suppress falsehood works just as well to suppress truth.

> Do you think that EVERY "gain of function" experiment is high risk?

Almost any biological experiment involving genetic engineering (or even just culture with artificial selective pressure) may be reasonably anticipated to cause some gain of function. Most such experiments present minimal risk.

The research of concern is the search for deadlier and faster-spreading human potential pandemic pathogens, whether by laboratory gain of function or by collection from nature in areas with minimal other human traffic and thus minimal risk of natural spillover. This was a concern even before this pandemic, and is absolutely a concern even to those who believe that's not the origin of this pandemic:

> “That’s screwed up,” the Columbia University virologist Ian Lipkin, who coauthored the seminal paper arguing that covid must have had a natural origin, told the journalist Donald McNeil Jr. “It shouldn’t have happened. People should not be looking at bat viruses in BSL-2 labs. My view has changed.”


> It seems like you believe American law should be changed...

Probably yes, especially since the rest of the world still often takes inspiration from what the US does.

But frankly, I don't live in the country in which I was born, and neither of those is the US. It would be enough for the law to be changed in my relevant jurisdictions,

and then, if news.ycombinator.com, but especially other sites with user-generated content (e.g. Facebook), are not compliant, they could just be blocked (forcing me and others to use a VPN, if we'd still want to interact with these sites)

> Censors are humans, and censorship is subject to the same mistakes and corruption as any other human endeavor, especially those affecting the flow of political power. All of this requires human judgment; and once the wrong humans get the job, any apparatus designed to suppress falsehood works just as well to suppress truth.

Definitely true, but not censoring anything is not a neutral decision (just like deciding to censor is not a neutral decision). There are risks either way.

To tie back to your previous point:

> The performance of those politicians is ultimately judged by the voters. Without open discussion, I don't see how the voters could make an informed choice.

But do they make an informed choice, on aggregate?

Are people like Trump and Biden legitimately the best that the US could muster? Isn't this facet of democracy mostly a popularity contest, in which popularity is hugely affected by which claims are most often repeated in the media (and less by the actual competence, policies espoused, and reliability track record of those political figures)?

I think democracy can be achieved in a different way (but this is getting off topic)

> https://www.technologyreview.com/2021/06/29/1027290/gain-of-...

That's very informative, thank you. I knew about the BSL-{1,2,3,4} rating... But I didn't know that the Wuhan labs were only BSL2

I'll look up more info now, but this is definitely something that should be addressed (and I'd be surprised if something hasn't been done about it already)

Edit: the issue of the BSL level of the laboratories in question seems to have already been addressed:


If you think there's a way to have a democracy without an informed electorate, then it makes sense that you'd be less concerned with censorship. I don't see how that could work, though. I'm not impressed with Biden, and significantly less so with Trump; but I'm also unaware of any system that works better. I didn't grow up in the USA, and I'd prefer a parliamentary system to the USA's republic; but that's a minor question compared to democratic vs. nondemocratic systems, and we're depending--however fragilely--on an informed electorate either way.

> Edit: the issue of the BSL level of the laboratories in question seems to have already been addressed:

I'm not sure what you think is addressed there? In that interview, Dr. Shi confirms that they were working with bat coronaviruses at BSL-2. Various papers published by her group before the pandemic also confirm this. They also had a BSL-4 lab for animal experiments, but experiments on the viruses in cultured cells were continuing at BSL-2. That's what Lipkin thought was "screwed up" (i.e., presented an unacceptable risk, regardless of whether it actually caused this pandemic).

As far as we know, all of Dr. Shi's work was performed in compliance with her institution's safety standards. The question is whether those safety standards were adequate, though--her standards were already a step below Ralph Baric's, and long before this pandemic academics like David Relman thought Baric's experiments were at or beyond the edge of acceptable risk:


Baric and the WIV later submitted a proposal to perform exactly the same kind of research as in Relman's hypothetical, not with SARS-1 and MERS but with novel bat viruses collected by the WIV:


> “We will introduce appropriate human-specific cleavage sites and evaluate growth potential in [a type of mammalian cell commonly used in microbiology] and HAE cultures,” referring to cells found in the lining of the human airway, the proposal states.

That proposal was rejected by the American government for safety reasons, but there's no way to know what work continued in the WIV with other funders.

No, looking at funding is a terrible shortcut. If you can't study the papers yourself, maybe see what other scientists think of their work? For COVID there is plenty of discussion.

If you can't figure it out, hedging your bets is better than choosing a side.

I trust Malone way more. Fauzi has a vested interest in continuing COVID measures. Check out his pay (higher than that of Western world leaders). Check also his stock holdings. Still, Fauzi has way more credibility than the entire WHO, which has been a laughing stock for the past 2 years.

Why are you intentionally spelling his name wrong? It certainly doesn't add credibility to your argument.

You can delegate whatever process you want. Please allow me to make that decision for myself, we have different capabilities and values.

> The problem is that we sometimes have to rely on other people. We can’t all be experts in everything. We can’t all reproduce or witness every experiment.

Implying that anyone has to have expert knowledge.

> I trust what Y says because Y has explained the process and methods of how it was derived, ...

Doesn't that directly contravene what you said before?

> ... and I don’t have reasonable doubt to assume Y lied”.

Completely different issue, same problem though. Reformulated by syllogism, you are basically saying "I trust that I don't have reasonable doubt, because we can’t all be experts in everything. We can’t all reproduce or witness every experiment.", which is really a definition of what you might consider reasonable. It becomes a recursive definition at second order: "What expert X says is true, because they do not have reasonable doubt. They sometimes have to rely on other people. We can’t all be experts in everything. We can’t all reproduce or witness every experiment." Whereby the base case for the recursion is "I don't trust X, because Y said they're an ass" or something like that, where being an arse can come in many colours.

Of course you have to rely on experts, that is not the criticism. The criticism is that people with opposing views were silenced and declared as spreaders of misinformation. It is not hard to craft a counter argument if you have evidence for it. Just stating that you believe that X is wrong is enough. Having X removed hints that you do not have contradicting evidence. If said evidence is not politically communicable, you still should not ban people and remain on your position.

I see the root cause of all this debate as a failure in scientific communication.

When it comes to complex scientific decisions with significant impacts on the public, nuanced and detailed justifications are required. Instead, we often get simple declarations written by PR departments at a juvenile reading level. This spawns chaotic and poorly conducted debate in all walks of life.

What is required is a robust and transparent framework for honest analyses of topics. For example, if the CDC has a specific recommendation, they should provide an outline of the arguments, counter-arguments, assumptions, and supporting data for each part.

It seems that the status quo is to completely ignore any arguments against a given recommendation. This suffocates any honest discussion in the crib. It also suggests to some that the justifications are not robust enough to survive the light of day. If you can't show your work, people will be skeptical that you did it at all.

This is applicable to any science based public policy, but especially obvious to covid policy.

I am pretty skeptical that complex science can be explained to normal people. I just tried to explain to my school that since my child tested PCR positive then negative for COVID, he shouldn't take a PCR test for 90 days (it's highly prone to reporting positive even when he is non-infectious), but instead (as the doctor's note says) should take an antigen test, and if he tests negative on that (and has no symptoms) he can return to school.

I explained this fairly simply but the message didn't get through (to a reasonably intelligent person) and they thought I was saying the complete opposite of what I was saying. Instead, they are insisting on continued PCR tests (for "more data") even though I have a doctor's note saying to use antigen tests.
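The rule being described is mechanical enough to write down, which makes the miscommunication all the more frustrating. Here's a toy sketch of it; the 90-day window and the antigen-plus-no-symptoms condition are taken from the comment above, the function names are mine, and this is an illustration of the logic, not medical guidance:

```python
def recommended_test(days_since_pcr_positive):
    """Which test is informative, per the rule described above.

    After a confirmed infection, PCR can keep reporting positive for
    roughly 90 days even when the person is no longer infectious, so an
    antigen test is the more informative choice in that window.
    """
    return "antigen" if days_since_pcr_positive < 90 else "pcr"

def may_return_to_school(antigen_negative, has_symptoms):
    """Return-to-school check within the 90-day post-PCR-positive window:
    a negative antigen test plus absence of symptoms clears the child."""
    return antigen_negative and not has_symptoms
```

Encoded this way, the school's position ("keep doing PCR for more data") is visibly the opposite of what the rule recommends inside the window.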

This is a symptom of there being no centralized framework. You don't need people to memorize or understand the entire message, but you do need a comprehensive reference source. We ship manuals with cars; we don't expect people to memorize them at the dealer.

In your case, wouldn't it be helpful to have a recognized standard reference you could just point them to and let them read, instead of having to explain it and point to some random web-page FAQ that they may not trust?

> I explained this fairly simply but the message didn't get through (to a reasonably intelligent person) and they thought I was saying the complete opposite of what I was saying.

If I detected you were being condescending I would do exactly this. Mostly to piss you off for thinking you’re better / smarter than me. You may know more than this staff member on this subject but there’s an appropriate way to explain things. Assuming from the get go they’re too stupid / uneducated is offensive.

Is the CDC (1) a scientific institution or (2) a government agency enforcing regulations?

If you think of it as (1), you see failures in scientific communication.

If you think of it as (2), you see an authority expecting to be obeyed.

> I see the root cause of all this debate as a failure in scientific communication.

How can you have scientific communication without nuance? We know nuanced discussions can't happen at scale. Is this even solvable?


You can't discuss a nuanced topic without detail, but it is solvable with parallel mediums. Not every debate fits in a tweet, so you issue long-form communication in parallel. This is common practice for many topics. You wouldn't expect a nuclear power plant or rocket company to conduct all their internal business in self-contained tweets. You expand the medium used to provide the necessary bandwidth. You might have a PowerPoint with a single summary slide, several dozen supporting slides, and then a PDF going deeper.

I would dispute the idea that nuanced /messaging/ can't happen at scale (discussions are a different thing altogether). While I understand that everyone is up in arms about COVID-19 communications these days, there are lots of instances around the world of effective communication that in the end had to be fairly nuanced. Most populations understand that vaccines are useful and effective but not perfect, and most populations do understand that their particular risk is low but population-level risk is quite high.

I guess we can quibble about the definition of "nuance" but the idea that vaccine science and hospital capacity management is very direct doesn't hold much water with me, maybe others disagree, IDK?

>I guess we can quibble about the definition of "nuance" but the idea that vaccine science and hospital capacity management is very direct doesn't hold much water with me, maybe others disagree, IDK?

This is what I am getting at. If we claim these decisions are science-based, communicate the science! E.g. if X% more people are vaccinated, we will have Y more hospital beds and Z more elective procedures. Make an honest calculation, show the work, and stand behind it.

> if X% more people are vaccinated, we will have Y more hospital beds, and Z more elective procedures..

But the problem there is you are not asking for nuance, you are asking for... lies? The answer is that more vaccination means more hospital beds and elective procedures, but no one could credibly argue any specific numbers. It was just not possible to make that calculation without enormous error bars; you can only make the nuanced argument that vaccination produces reduced case counts and hospitalization, and that the public has to understand that this will allow for better services against an unvaccinated baseline. The good news is that a huge majority of people do understand that!

Then give an estimate with error bars.

Alternatively, if you haven't done the math, or it doesn't actually support a robust position, say: "This is not a scientific argument, but we are really hoping this helps. No promises."
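For what it's worth, producing "an estimate with error bars" from uncertain inputs doesn't require anything fancy; a back-of-the-envelope Monte Carlo is enough. In this sketch every number (population size, hospitalization rates, efficacy range) is invented purely for illustration; the point is the shape of the output, a central estimate with an interval rather than one authoritative figure:

```python
import random

def bed_estimate(extra_vaccinated=100_000, trials=10_000, seed=0):
    """Toy Monte Carlo: hospital beds freed if more people vaccinate.

    All parameter ranges below are made up for illustration only.
    Returns (median, 5th percentile, 95th percentile).
    """
    rng = random.Random(seed)
    results = []
    for _ in range(trials):
        # Uncertain inputs, sampled from assumed ranges:
        hosp_rate_unvax = rng.uniform(0.01, 0.04)  # hospitalization risk if unvaccinated
        efficacy = rng.uniform(0.70, 0.95)         # reduction in that risk if vaccinated
        results.append(extra_vaccinated * hosp_rate_unvax * efficacy)
    results.sort()
    # Report the median and a 90% interval as the "error bars".
    return results[trials // 2], results[trials // 20], results[-(trials // 20)]
```

Communicating the interval alongside the point estimate is exactly the honest framing being asked for: "somewhere between the 5th and 95th percentile, most likely near the median, under these stated assumptions."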

Speaking for Ontario, one province over from me, models with error bars are pretty much what the public was presented with every time new restrictions were imposed.

It's simpler than that. It's accountability.

Whether you communicate, or do without communicating, there needs to be accountability.

When there's no accountability, it boils down to power games, which is what we've always had. Because as soon as you start demanding accountability, you have to justify why you have power and I don't, and that makes people in power extremely nervous, because most of them are really not that bright, and any requirement to use rationality exposes that all too clearly.

Totally agree. There's an inherent tension between protecting the freedom of speech and the potential harm that speech might cause. The current incentive structure motivates people to say the most click-baity outlandish things without worrying about any of the consequences. Fact-checking can never catch up with all the crazy sh!t people come up with. That's why censorship feels like a tempting "easy way out" for combating misinformation.

Maybe one mitigation is to make public figures / media accountable for the avoidable damage their speech ends up causing? E.g., if I listen to your anti-vax radio program and consequently decide not to get vaccinated, before catching the disease and dying, then my family can sue you for damages, as long as it can be established that you purposefully misled me / failed to assess the potential harm.

After a few class-action lawsuits like this, public figures & media will probably be more careful when they want to spread misinformation.

"Failure of communication" is a generous way to put it.

The leaked Collins and Fauci emails show there was a deliberate decision to trade truth for control of the narrative. [0]

This isn't incompetence or messaging being too simple. This is a failure of philosophy: that of "noble lies".

Much of our leaders' behavior can be explained through this lens.

Masking, natural immunity, lab leak, etc.

At every turn, the facts were skewed and skeptics were penalized in order to railroad everyone into vaccination.

The failure is that the people in charge don't see this line of action as a failure, but rather a necessary evil. A means to an end.

[0] https://news.yahoo.com/reps-comer-jordan-expose-fauci-160210...

Fair enough, but I would consider a failed communication philosophy a subset of communication failure.

Perhaps. I only see it as a "subset" in the sense of: many things have to go right for good communication to happen -- ethos being one of them.

But it's still "upstream" of communication. Philosophy informs action.

Your original point is that our leaders need to get better at communicating nuanced issues. That's a non-starter if the people in charge see nuance as an obstacle and straight up adopt an anti-nuance attitude.

My point is, they're bad at communicating nuance in the way that a bulldozer is bad at building a house. It just wasn't their mission. And that didn't become obvious until after these emails were leaked.

In history, every time you set up an institution it becomes an entity in its own right. Companies, orgs, religions, federal entities and countries all vie for power at their own level and within their own space. The problem with an 'information-control/censorship/verification' body is… it's exactly that, the age-old 'we are controlling information for the common good'. Every major nation in history eventually evolves to this point. They claim that the internet made this a new thing.

Listen to Lex Fridman, episode 254, with Jay, who heads the medical department at Stanford. It's entitled "The Case Against Lockdowns".

‘Social cohesion is a necessity, and mankind has never yet succeeded in enforcing cohesion by merely rational arguments. Every community is exposed to two opposite dangers: ossification through too much discipline and reverence for tradition, on the one hand; on the other hand, dissolution, or subjection to foreign conquest, through the growth of an individualism and personal independence that makes co-operation impossible.’

-Bertrand Russell

Edit: lol, a little disinformation quip of my own. He’s just a professor, not the head of the department, after a quick Google search.

I've seen that episode but not made that connection, very interesting.

However, the way the quote ends paints a slightly different picture:

‘The doctrine of liberalism is an attempt to escape from this endless oscillation. The essence of liberalism is an attempt to secure a social order not based on irrational dogma, and insuring stability without involving more restraints than are necessary for the preservation of the community. Whether this attempt can succeed only the future can determine.’

Well - all the evidence presented in that episode would be exactly the type of information the Royal Society (and every other federal entity in the world) seems to want to suppress. A lot of the information presented is very 'anti-mask', 'anti-lockdown', 'anti-vaccine'. He makes many points that we have devolved into some kind of mass hysteria when it comes to covid.

Also, the reason I left the rest of the quote out is that it takes away from his argument: despite having written an entire treatise on revolutionary thought, he didn't seem to study most of the 'revolutions' that occurred from those thoughts. You can replace 'The doctrine of __x' with anything in the sentence and it would have rung true historically. When the Tsar of Russia decided that liberalism was needed in the late 1800s, the more people he freed, the more those freed people swung further and further left, until everyone was screaming some kind of collective anarchism forwarded by Bakunin and the Tsar was assassinated.

I agree with pretty much everything you said, but feel compelled to point out that I think the virus origin is only a minor footnote in the communication failures. The topics of lockdowns and vaccination weigh much more heavily in my mind and I think have much greater relevancy for most people. Here the goal of informed consent was largely discarded in favor of manufactured consent. I would love to see the CDC or some authority have a transparent and reputable weighing of the tradeoffs involved.

Science communication is a curriculum and career nowadays, and with that comes a lot of mediocrity.

You would have had higher vaccine compliance if the dialogue had never been about mandates. Just give a health recommendation. You wouldn't have reached everyone, but the science also said that you don't need to.

In some countries we will see mandates with no end in sight. For the third dose? The fourth? The opposition is correct when it says you only get your freedom back if you boot out those responsible for this in the first place. No politician comes out and admits mistakes, not even in democracies.

People talk as if a political and a scientific opinion are equivalent. That is obviously not the case most of the time.

I think that many people have a low opinion of the general public, but I think they are more intelligent than people give them credit for.

If you want someone to take an action, tell them specifically what they have to gain; don't just provide some general platitude that it is good for them.

If you want someone to get boosted, say they have an X% reduced chance of death and a Y% reduced chance of hospitalization. Make an honest calculation and keep it up to date.

If the data doesn't exist, we have much bigger problems.
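For illustration, the "honest calculation" suggested above is just a relative risk reduction. The function and rates below are invented for the example, not real COVID figures:

```python
# Hypothetical illustration of the kind of concrete number suggested
# above. The rates are made up for the example, NOT real COVID data.

def relative_risk_reduction(rate_treated, rate_untreated):
    """1 - relative risk: 0.9 means a 90% reduced chance of the outcome."""
    return 1 - rate_treated / rate_untreated

# Assumed hospitalizations per 100,000 person-weeks (illustrative only):
hosp_unvaccinated = 50.0
hosp_boosted = 5.0

rrr = relative_risk_reduction(hosp_boosted, hosp_unvaccinated)
print(f"Reduced chance of hospitalization: {rrr:.0%}")  # 90%
```

The point is that the statement "X% reduced chance" is a two-line calculation from two observed rates, so keeping it up to date as the rates change is cheap.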

This was done for COVID, and was received as fearmongering. You have “experts” saying you “will die” if you get COVID, yet stories are passing around the entire time of unvaccinated people alive after getting COVID with no long term issues and never having seen a hospital.

So you have one group of people literally scared to death and that’s why they take the vaccine, and another group seeing other data that makes them say “I’ll take my chances”

Really? I'm speaking from a US perspective, but my opinion is the communication out of the CDC has too much nuance and is too concerned about being scientifically accurate. They forget they're speaking to a largely scientifically illiterate crowd, and that crowd sees the constant back-and-forth not as scientific progress but as a reason not to trust the institution. Never mind that it's often delivered with a tone of condescension people have come to despise from their institutions, and some of their policy recommendations are... debatable to say the least (recommending mass testing for high school sports/band/choir, for a recent example).

Honestly I only trust the CDC because I have the knowledge and capacity to verify what they say and ignore the bullshit. A lot of people don't have that. And there is bullshit to filter.

The message to the public, assuming the CDC is interested in actually convincing people, needs a simple path forward individuals can take to get said individuals something they want, even if it taps into a selfish motivation and isn't 1000% technically optimal. (i.e. Get vaccinated and the mask requirements go away). And it should be repeated in every statement, regardless of new data. It should have been consistent for the past 2 years. It should have a catchy slogan. It should be on fucking billboards. People would still bitch and grumble, and some would never be convinced. But we'd probably have more people vaccinated if that approach had been taken. Instead we confused people and handed ammo to the anti-vaxxers by the truckload, relying on peoples' better natures and assuming a higher average level of education than actually exists in the American populace.

In short the CDC should optimize for leadership and "good enough" solutions, not strict scientific accuracy and trying to save each and every life in their public statements. Publish the hard science/raw data in more obscure releases that only scientists will care about. And they need a better front person than Fauci. Fauci just radiates highly-credentialed-beltway-insider arrogance however nice he tries to sound and however right he may be.

Of course you could argue such a role should be the place of the executive, and the CDC was trying to fill the gap left by a largely absent Trump administration, and by the time the Biden admin took over Fauci's position as front-man/leadership was already too solidified. Maybe so, but the CDC and public institutions in general have failed the task of public leadership, although thankfully have had success in logistics/vaccine development.

I believe this would have been an even worse strategy for policy acceptance. We were and still are dealing with unknowns, and these unknowns are able to justify wearing masks as a precautionary measure without a (not so) noble lie being necessary. Society isn't a military unit and has profoundly different dynamics. Some will ignore this advice, but the repercussions of that are very likely manageable.

This type of PR will not work with a modern information infrastructure anymore, or will at least be significantly less effective.

> handed ammo to the anti-vaxxers by the truckload

It is a stupid enemy image, and policy was crafted with a steady eye on these groups. That is a mistake in leadership if there ever was one. You have provided them with legitimacy without anything for you to gain: a complete waste for nothing, and you elevated your opposition without benefit to yourself. That is neither competent leadership nor politics. Figuratively not looking them in the eye would be better. They are beneath you and don't even warrant attention. Few politicians would have even used the term, at least the smart ones.

That said, since I oppose mandatory vaccination I am an anti-vaxxer myself in the eyes of many.

The issue that plagues us is a lack of trust in long-standing societal authorities.

We can't follow the entire scientific proof chain for every piece of information we encounter, because we don't have time.

So we rely on authority to some extent, whether that be peer review, government, independent bodies, etc.

We need to be able to trust that these bodies are telling us the truth and aren't seeking to mislead us. Because when we start to doubt them, we then inevitably elect alternative bodies, simply due to limited thinking capacity/time - as explained above it's impossible to do otherwise, no-one derives from first principles every opinion they hold.

The best way, _overall_, to convince someone to do something, is by clearly explaining to them the positives and negatives and letting them come to their own conclusion.

It doesn't always work, and there are specific situations (e.g. someone is holding a gun to your face) in which the cost/benefit analysis is very different - in such a situation, the short term is all that exists, the long term effect of misplaced trust is irrelevant - you simply have to neutralize them.

But in general, I'm absolutely sure that education over coercion is the correct approach for society.

Because if you force them, sure, you've got a short term win, at the long term cost of trust. What is the long term cost of lost trust?

I'm skeptical that education is a solution to collapsing societal trust.

First, the primary organs that would be capable of this education are the media, who have a profit motive to stoke any and all fears (immigration, radiological warfare, world war, coffee being carcinogenic).

Second, I think this distrust goes beyond "the CDC said something wrong once, I don't trust them as much anymore." That could be solved with education. But we're seeing a more fundamental rise in broad-based distrust of any authority figure, and it's a much more instinctive response.

In a world where authority is more powerful and centralized than ever, some are bound to feel powerless. Someone's actions on the other side of the world have more impact on your life than they ever could (pick depending on your priors: a Wuhan wet market customer's appetite, or a Wuhan Institute of Virology employee's carelessness). With such an interconnected world, how could conspiratorial thinking not be appealing to some people?

Power inherently fosters distrust and cynicism. You can have as much pedagogy and education as you want: rational discourse just can't combat a fundamental instinctive reaction, a growing sense of claustrophobia and powerlessness that some people are bound to feel. And the response to this distrust? More control over discourse, over communication mechanisms[0], and ultimately, over citizens. Governments and corporations are more powerful than they ever were. Surveillance, and enforcement of rules, are easier and more automated than ever. How could our primitive, Dunbar-number brain wiring not react with distrust, or even hostility towards (some) authority figures?

[0]: https://www.reuters.com/world/europe/politician-says-germany...

It's not just that the CDC and other institutions of scientific authority 'said something wrong once.' I think most people understand and accept that science is not perfect.

Instead, those institutions have shown that they are willing to publish noble lies. Specifically, they are willing to bend the truth and publish false information if they believe it is in the best interest of society / the greater good. Given an entity that has shown willingness to bend the truth 'for the good of society', isn't distrust and cynicism rational?


EPA famously did the same during 9/11. It would have been better if they instead trusted us and said something along the lines of "the dust is full of dangerous asbestos, but we need heroes this day".


Instead they prefer messaging to achieve an outcome instead of direct truth.

This kind of failure happens as a consequence of an intellectual environment: the impossibility of carrying a nuanced message. This perhaps can be solved by "more education", in the broader sense. Until then it's hard to blame individual instances of public officials who try to do their best to distill a complex tradeoff in a public announcement, where they either talk with an authoritative tone and are criticized for being too paternalistic, or don't and are criticized for being weak. They are between a rock and a hard place, and various figures are trying to carve their own paths without any rule book; most of them will spectacularly fail.

> otherwise they'll be criticized for being weak

politicians, sure, but public officials such as the EPA?

Ok. Replace weak with "shifty" and "unclear". In any case a public communication that doesn't reassure will quickly be replaced by one that does, by escalating the issue to the nearest political figure who takes on the job of providing such assurance.

I would think lying would make someone seem more shifty than making the facts known.

> that doesn't reassure will quickly be replaced by one that does

sure, just doesn't need to be the EPA that does it. pre-empting politicians and doing their manipulation for them is not justified - politicians are not a source of truth.

don't get me wrong; I don't think anything of that is justified. I just think there is a selective pressure for this behaviour to emerge. High-profile public officials are politicians for all intents and purposes. They shouldn't be, but they are.

> It's not just that the CDC and other institutions of scientific authority 'said something wrong once.' I think most people understand and accept that science is not perfect.

I disagree. I think a big part of the problem is people having unreasonable standards of accuracy or consistency. On COVID, the CDC was never going to have everything perfectly figured out in the beginning, and then consistently repeat the same message from January 2020 to January 2022. There have been too many unknowns and too much of a need to act urgently before everything was figured out. To this day you still have people justifying their distrust by saying things like "I thought we just had to stay home for a few weeks to flatten the curve."

Also, there's an unreasonable expectation that messaging should be all things to all people. If a message is to be widely disseminated and understood, it needs to be simple and clear. However, that will lose nuance, and some people will launch into distrust because the nuance they demand isn't there. The problem is those goals are contradictory, so someone's not going to get what they want.

> Instead, those institutions have shown that they are willing to publish noble lies. Specifically, they are willing to bend the truth and publish false information if they believe it is in the best interest of society / the greater good. Given an entity that has shown willingness to bend the truth 'for the good of society', isn't distrust and cynicism rational?

I don't think distrust and cynicism is rational in that case. It would be if they "lied" [1] for selfish reasons, but if you assess that their motives are pure, then descending into "distrust and cynicism" is throwing the baby out with the bathwater.

And frankly, if "noble lies" bother you, wait until you learn the kind of ignoble lies people start believing when they become distrustful and cynical.

[1] Quotes because frankly most things people call lies aren't actually lies.

If an institution cannot give sufficiently accurate facts to me, why wouldn't it be rational for me to try to figure out the truth (to my satisfaction)?

You may assume that "I" do not have sufficient education and training to interpret state-of-art publications, and/or assume that "I" would end up trusting dubious alternative theories, but you do not know. The "average" person may indeed be incapable of finding out the truth by themselves, but I'm really sick of "science-y" people giving blanket statements on what others should believe without actually checking whether they deserve the condescending remark in the first place.

> You may assume that "I" do not have sufficient education and training to interpret state-of-art publications, and/or assume that "I" would end up trusting dubious alternative theories, but you do not know.

If you're that smart, you should be smart enough to 1) understand the pressures they're under, 2) be empathetic, and 3) seek out information that may not be adapted for mass communication.

> but I'm really sick of "science-y" people giving blanket statements on what others should believe without actually checking whether they deserve the condescending remark in the first place.

Check with who? The public? You personally? Throughout this whole pandemic, precisely zero "science-y" authorities have had conversations with me on this, so I'm speaking exclusively about mass communications.

The lies people tell to the distrustful and cynical don’t matter much. When you take the time to evaluate things from first principles, someone saying something misleading around you just isn’t that impactful.

Institutions being wrong and misleading is something that is immensely harmful because of trust.

Most of the CDC guidance has been verifiably insane from the start of COVID, often contradicting the CDC's own advice from before the pandemic.

We need to stop excusing their behavior.

> I don't think distrust and cynicism is rational in that case. It would be if they "lied" [1] selfish reasons, but if you assess that their motives are pure, then descending into "distrust and cynicism" is throwing the baby out with the bathwater.

Perhaps not purely rational, but predictable and consistent with emotional norms.

> Perhaps not purely rational, but predictable and consistent with emotional norms.

That makes no sense to me. On an emotional level, my assessment of someone's intentions is a far more important factor in determining how I feel about them than any statement or group of statements.

I think the noble lie the parent is talking about has to do with the CDC's lie about masks early on in the pandemic.

> I think the noble lie the parent is talking about has to do with the CDC's lie about masks early on in the pandemic.

Are you talking about the "don't buy masks" thing? Because when I dig up the actual early communications, they're not lies:


> CDC does not currently recommend the use of facemasks to help prevent novel #coronavirus. Take everyday preventive actions, like staying home when you are sick and washing hands with soap and water, to help slow the spread of respiratory illness. #COVID19 https://bit.ly/37Ay6Cm


> “Seriously people — STOP BUYING MASKS!” the surgeon general, Jerome M. Adams, said in a tweet on Saturday morning. “They are NOT effective in preventing general public from catching #Coronavirus, but if health care providers can’t get them to care for sick patients, it puts them and our communities at risk!”

Both of those statements were made at a particular time under a particular set of unknowns, and the only way I could see them as "lies" is if I misinterpreted them by failing to consider that context.

The most egregious false statement is probably the thing about hand washing, but it's not a lie given how little was known at the time; the CDC isn't up in some ivory tower, so they had to say something and just recycled previous advice. In Feb. 2020, there wasn't much COVID period, so running around with an N95 to the grocery store would just waste it, especially if it's worn incorrectly (and I saw lots of that, especially using only one strap).

Also, both those statements are far more nuanced than I remember, which I think kind of proves my point about nuance. People claim they want it, but they still strip it away and remember some highly simplified version.

Here's what Fauci said after the fact about the CDC's early stance on masks [0]:

> So, why weren't we told to wear masks in the beginning?

> "Well, the reason for that is that we were concerned the public health community, and many people were saying this, were concerned that it was at a time when personal protective equipment, including the N95 masks and the surgical masks, were in very short supply. And we wanted to make sure that the people namely, the health care workers, who were brave enough to put themselves in a harm way, to take care of people who you know were infected with the coronavirus and the danger of them getting infected."

I think it's pretty clear that they intentionally downplayed mask wearing for the public with the goal of saving masks for hospitals, even though KN95 masks were on the market[1], and wearing masks would have saved lives.

[0] https://www.thestreet.com/video/dr-fauci-masks-changing-dire...

[1] https://web.archive.org/web/20200407162126/https://wiphone.i...

Uncertainty and especially admitting when something is not yet known is perhaps the most important part of science. The point is to do experiments and actually test things to get closer to the truth. One must admit epistemological uncertainty until the actual experimentation.

The CDC still has not performed or directed any large-scale experiments of any kind. They are a political institution, not practitioners of science.

Saying something definitive like 'Masks are NOT effective in preventing #Coronavirus' before the evidence is in is quite anti-scientific.

> The CDC still has not performed or directed any large-scale experiments of any kind. They are a political institution, not practitioners of science.

> Saying something definitive like 'Masks are NOT effective in preventing #Coronavirus' before the evidence is in is quite anti-scientific.

1. The statement you quote came from the Surgeon General, not the CDC. The Surgeon General is a political appointment, and as far as I know, the CDC does not report through him.

2. IMHO, the CDC's problem was actually the opposite of what you are complaining about. It was too scientific. Science is slooooooow, so if you rely on it too heavily then your problem will get "inside your OODA loop" and you won't be able to act quickly enough to tackle it effectively. The CDC has multiple hats, and it needed to push its science hat back in order to wear its crisis leadership hat in front.

Science is not the right tool for every job. It works great for things that change or move slowly (like physical laws) and very poorly for things that change or move quickly. It's an error to expect/demand every problem be solved with science.

I believe that people at large are better at sniffing out bullshit than the elitists running these institutions realize. Unfortunately most aren't as good at separating out the actual parts being manipulated thereby leading to increased distrust of "science" and to crazy flat earth groups becoming a thing. sigh

I'm also skeptical. An unprecedented percentage of Americans have well over a decade of full-time education, yet here we are. People question its quality or assert that some degree-holding buffoon proves it worthless, but I doubt they have convincing empirical evidence that our population knew more in another era.

I agree with you that this won't happen, simply because (via an evolutionary basis) the maintenance of social order comes before the dissemination of truth. The two might coincide, but where they don't, social order comes first.

This is ultimately what happened during the early days of the coronavirus pandemic. It's what happens during fuel scares, it's what happened during 9/11 as a poster below explains, it's what happens within companies when layoff rumours circulate.

Even within the closest of romantic relationships each partner is not 100% truthful with the other because there are some things that are simply better not said.

Unless the idea is that we descend into some sort of authoritarian nightmare and the general population becomes irrelevant though, I'd argue that social order does require some semblance of trust in power even if only in the basics (e.g. if I don't cross the State, the State won't cross me).

Once that falls apart the whole thing is in ruins.

It wasn't social media that eroded the trust in research and science; it was the introduction of politics.

We need to separate research from politics. It needs to be a separate pillar of the state. You don't want researchers chasing funds continuously and doing short-term research for local sponsors. You want sizeable investments in long-term research projects, high autonomy, and adherence to the scientific method. This way people would trust the results.

It really was social media.

Tell an algorithm to optimize for engagement and it will promote the content causing the most engagement: novelty, conspiracy theory, outrage, surprising "revelations" and all sorts of clickbait.

If we removed Twitter, Facebook and Tik Tok and reverted YouTube to the pre-algorithmic (subscription based) era, we'd still have free speech on the internet but probably a lot less bullshit.

Note: Twitter would still promote bullshit even without a ML algorithm. The rules of engagement:

  - short content
  - individual popularity means content popularity
  - retweeting is instant, requires no critical thought
create an environment where clickbaity emotion-viruses spread at an exponential rate, and things that require engaging the brain are ignored. It's essentially a crowdsourced rating system for tabloid headlines by design.
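A toy sketch of that rating dynamic (the posts, weights, and predicted counts here are all invented; this is not any platform's actual ranking code):

```python
# Toy model of an engagement-optimized feed. The weights and posts are
# made up; the point is only that accuracy never enters the score.

def engagement_score(post):
    """Rank purely by predicted interactions - truth is not a feature."""
    return (2.0 * post["predicted_retweets"]
            + 1.0 * post["predicted_replies"]
            + 0.5 * post["predicted_likes"])

posts = [
    {"title": "Careful, nuanced thread", "predicted_retweets": 5,
     "predicted_replies": 10, "predicted_likes": 50},
    {"title": "Outrageous one-liner", "predicted_retweets": 80,
     "predicted_replies": 200, "predicted_likes": 300},
]

feed = sorted(posts, key=engagement_score, reverse=True)
print(feed[0]["title"])  # "Outrageous one-liner"
```

Whatever reliably provokes retweets and replies floats to the top; a correction that provokes neither never will.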

It wasn't social media that decided the government should straight-up lie to people just because it was (in their opinion) the best way to get the best outcome.

A big part of the anti-vaxx movement comes from the fact that the government has made it very clear it will lie and misrepresent data whenever it is politically advantageous.

If you want a very recent example, just look at this interview with someone very high up in the food chain at Bayer (youtube.com/watch?v=qowDwaYx7vI) and then try googling whether or not mRNA is gene therapy. It was considered dangerous misinformation to say what he just did, and yet he is overseeing the fabrication of covid vaccines.

It's far from the best example, but it's one where I won't have to waste an hour getting proper sources

It's hard to defend this without arguing for allowing the government to lie to people without any defense.

However, even if it delivers to you true information about abuses and lies being told to you, you should still be aware that the system is doing this so that you will be outraged, rather than that you will be informed.

Yeah, this is a classic example where social media spreads falsehoods, including mistakes.

Here the exec is stretching the definition of the term gene therapy to include things that don't alter genes: "mRNA vaccines are an example for that cell and gene therapy". He basically makes a mistake, because the same platform is also used for gene therapy.

To use a software analogy, just because both mRNA vaccines and mRNA gene therapies are software, doesn't mean that mRNA vaccines perform system updates (instead they trip the suspicious behavior detection of the anti-virus and trigger a download for new virus definition files).

Additional subtle distortion happens later on social media when people take this mistake and "shrink" it in the opposite direction "actually mRNA vaccines are NOT vaccines because they're gene therapy".

Joe Rogan then takes that and stretches it further to imply this is why the vaccines don't last long

> This is really gene therapy. It's a different thing. It’s tricking your body into producing spike protein and making these antibodies for COVID. But it’s only good for a few months, they’re finding out now. The efficacy wanes after five or six months. I’m not saying that people shouldn’t take it. But I’m saying, you’re calling it a thing that it’s not. It’s not exactly what you’re saying it is, and you’re mandating people take it

Sinopharm makes a COVID vaccine from dead / inactivated virus. It's used widely in China as well as in the developing world. We've also seen the same reduced antibody neutralisation there, and it isn't necessarily because of the passage of time but to a large extent due to the evolution of the virus to be significantly different from the original.

So yeah, this is a perfect example of how social media spreads engaging novelty - whether it's deliberate disinformation or simply mistakes - and continues to distort it further by continuously applying stretching and shrinking. The info blob continues to evolve to evade people's "immune defences" and cause them to re-share it (e.g. retweet).

This is not something new, traditional media have been doing it for decades ever since the rise of the yellow press, with varying levels of subtlety. The difference is that this here is an "organic", crowdsourced process that yields better, more convincing results, as it has to defeat the "immune systems" of many people to get a good reach via resharing. Only the highest quality misinformation makes it through.

The worst bit about this phenomenon is that corrections don't work well. People who haven't heard the rumor don't care about the correction (its not interesting) so they don't share it. People that have heard the rumor before don't like being wrong, so they don't share it either. So it doesn't matter how many scientists explain the difference, the explanation will never have anywhere close to the reach of the original (likely hundreds of millions of people)

Predictably, got the downvotes. Nobody likes corrections and nobody likes elucidations of truth distortion.

Social media (especially short-form ones like Twitter and TikTok) is just a natural evolution of media trends that were already present in society. TV replacing radio/text and tabloid newspapers are just two examples off the top of my head. If Twitter/Facebook/TikTok weren't there, other alternatives would've been there to fill the void.

Just like we recognized tabloids for what they were, we need to recognize social media for being the same phenomenon disguised in a "grassroots" cloak (i.e. crowdsourced tabloids)

There is nothing organic about the speech that happens on YouTube. Veritasium explains the shift that happened when YouTube switched from the subscriber model to an algorithmic model: https://www.youtube.com/watch?v=fHsa9DqmId8 - you need to carefully manipulate people into clicking your thumbnail based on the picture and the video title.

Making outrageous but believable shit up to sensationalize your content is a well known tactic of tabloids (which does continue today to some extent). YouTube and other social media incentivise exactly that.

>If Twitter/Facebook/TikTok weren't there, other alternatives would've been there to fill the void.

The particular platform that has these characteristics is arbitrary and beside the point. No one is arguing that TikTok's particular implementation of a short-form, engagement-optimizing feed is dangerous.

It really is just people. Look at all the information about HIV in the 80's, long before social media. People believed some truly crazy, racist, and misinformed things about HIV and its spread.

People are multifaceted beings. Social media AI automatically exploits any and all of our failings in the name of engagement. Traditional media used to do that before to some extent but the current situation is unprecedented in its reach and efficiency - custom tailored to individuals.

Technology is supposed to compensate for our failings, not exaggerate them to the extreme.

It really was crazy. Back in 1985 Neil Young apparently thought that he could catch HIV if a homosexual clerk handled his groceries (obviously nonsense and highly offensive).


There's one with Fauci doing the same XD

The Clown Timeline might have been the last Cold-War era weapon, a doomsday weapon.

But Neil Young didn't die of HIV... Wise man.

"You were -too cautious- about that new disease that was killing everybody in 1985" feels more like a dark joke than a serious claim. Everybody was shocked when Rock Hudson died in 1985.

History repeats itself.

I strongly protest the concept that the recent introduction of politics is making things worse than they used to be.

If you ask anyone who was senior working in these institutions in the 1970s, things were far more dodgy back then.

Corruption and bigotry were rampant. But it was far easier to cover up back in the day.

Politicization is mostly driven by transparency.

Politicization is not a new thing; it's a fact that there are a lot of extremely conservative individuals in each country. And technology has brought them closer to their institutions.

Society is being brought very closely together, and we're working through the teething issues.

>I strongly protest the concept that the recent introduction of politics is making things worse than they used to be... Corruption and bigotry were rampant.

It is wild how people think things like institutional bigotry aren't politics. How is a scientific organization that excludes large swaths of the population due purely to the circumstances of their birth apolitical? Science has always been political because politics is how people interact with each other. That is true whether we are discussing how governments behave on the world stage or how that big project is being managed at your job. Wherever people exist, politics exists.

> Politicization is not a new thing, it's a fact that there's a lot of extremely conservative individuals in each country.

How are you using “conservative” here? The right certainly has no monopoly on politicizing science.

Most people accept that science in past generations was, to a certain degree, corrupted by the prevailing prejudices and societal/cultural/political/ideological beliefs of the time.

But, if we acknowledge that was true then – why shouldn't we expect the same to be true now? The prevailing prejudices and ideologies may have changed, but has anything changed about the actual practice of science to make it more immune to those forces? I think a lot of people can't see it, because it is far easier to see when science is being influenced by beliefs you reject than when it is being influenced by beliefs you yourself share.

I think the politicisation of science is inevitable, and science is never going to be entirely politically neutral. Scientists are human beings, and as Aristotle famously said, "man is a political animal". I think science is going to achieve the maximum possible political neutrality when it is either (a) things so clearly established no one can sensibly debate them (which especially happens when scientific theories get translated into widely adopted working technologies–the very fact that the technology works is proof the theories behind it must be mostly correct); (b) things that avoid political entanglements because they have little relevance to everyday human life (e.g. string theory vs loop quantum gravity isn't influenced by broader societal politics because it has no relevance to issues of social/cultural/economic/political power, although even it is surely influenced by the micro-politics of the theoretical physics community, its funding models, etc.)

Take something like human-induced climate change – personally, I accept the mainstream account of that topic as most likely to be correct, but I can't be completely sure that it is right, surely there is some non-zero probability (even if a small one) that it is wrong. But, I expect people in a couple of centuries from now will be able to reach a much firmer consensus – they'll know whose predictions turned out to be right, plus technological and economic developments will likely greatly reduce the number of people with a vested economic interest in the non-mainstream answer. I think the most likely outcome will be that the mainstream account will be vindicated, and almost everyone will acknowledge that the minority were wrong; but it isn't absolutely impossible that it could turn out the other way around. And whichever way it turns out, I think people will see politics as having played a major role – if the mainstream account is confirmed, people will see much of the opposing minority as being led astray by politics; but, conversely, if the sceptical minority turn out to have been right all along, people will reach the same conclusion with respect to today's mainstream.

I think the vast rate at which things are improving every decade is really dramatic.

Compare now to ten years ago, before me too. Things are hugely different.

I fully expect that if you give it a long time, things will be markedly better. In a hundred years for example.

Economic prospects are certainly worse for the younger generations than some generations before, especially in regards to affordability of living. They have more tech gadgets and more access to information though...

Are things "improving every decade" at a "vast rate"? What exactly is improving? Doesn't that depend on one's values as to what counts as "better"? Do you mean technological progress, economic progress, social progress?

Perceptions of progress seem to depend a lot on (a) where in the world you are, (b) what things you value, (c) whether you choose to optimistically focus on the positive developments (from the viewpoint of your own values) or pessimistically focus on the negative.

Even if we agree that certain forms of progress are occurring, do those various forms of progress necessarily entail scientific progress? I think, the only form of progress which necessarily entails scientific progress is technological progress – but even for that, some scientific fields have much greater technological relevance than others, continuing technological progress requires continuing scientific progress in technologically-relevant fields, yet is compatible with going backwards in areas of science which lack direct technological application.

You can be philosophical about "improving", but in the context of the thread and the post, I think we are talking about technological improvement over the last few years as an example of tangible scientific improvement, even if a lot of papers that are published suck.

Okay, but that comes back to the point that some scientific fields are more relevant to technological progress than others.

Getting physics and chemistry right is really essential to technological progress. How can technology advance if your physics and chemistry are wrong?

But, what about "softer" sciences, such as psychology or the social sciences? It seems you could be moving in the complete wrong direction with them, and yet still be advancing technologically – because technology largely doesn't depend on them.

And even for the "hard sciences", technology generally only depends on certain areas of them. How much of the Standard Model of physics is actually technologically relevant? Technological progress is strong evidence you've got the technologically-relevant parts of physics right, but is compatible with serious error in low technological-relevance areas. What is the technological relevance of electroweak unification, for example?

We also don't know what we are missing out on. Neuroscience is an area of biological research in which many promises of clinical translation have thus far failed to deliver. Is that because the brain is very complicated, and we just have to wait longer and invest more before we see more results? Or could it be that we are doing the science wrong, and we'd clinically be in a much better situation now if we were doing it right? How can one tell which of those two possible situations we are actually in?

That won't just happen. We have to make it happen.

> Most people accept that science in past generations was, to a certain degree, corrupted by the prevailing prejudices and societal/cultural/political/ideological beliefs of the time.

Why say "in the past"? Do you think it's not now?

I don't think you've understood me. You sound like you are disagreeing with me, and yet you are saying (more or less) the same thing as I was.

The amount of resources available to spend on research is finite. Distribution of limited resources is politics.

Historically, scientists either were well-off and independent (e.g. Ulugh Beg, Newton, Cavendish), or worked at the expense of very rich personal patrons (e.g. Tycho). This limited the number of potential scientists, and the amount of science done, quite drastically.

In the 18th-20th centuries this was somewhat overcome, as states and industries gave more scientists more resources. By the mid-20th century, the system started to develop cracks, as science became increasingly driven by formal metrics, such as impactful publications. The metrics are actively gamed, but worse: since science is unpredictable, honestly checking everything and building a picture of reality became a worse strategy than going for a flashy if more weakly researched result.

I don't know a good way out of the current trap.

How about giving the institutions very top-level goals, on, say, a 20- or 50-year basis, and then just giving them a bag of money?

Research is a high stakes, high rewards type of game. You need to pursue many potential ways forward, and most likely only one will win.

Rewarding short term wins makes academia no different from industry, and prevents finding anything but the local maxima.

> It wasn't social media that eroded the trust in research and science, it was the introduction of politics.

It's not like the medical system was a highly trusted and respected institution that politicians somehow undermined in the last few months. The loss of trust in the medical system has been literally over 150 years in the making. If you read some books on the history of the rise of homeopathy in the United States in the mid 1800s, the parallels with today aren't exactly subtle.

I'm curious - what is the story of homeopathy in the United States? I know it is oddly popular in Germany.

> We need to separate research from politics.

That absolutely will never happen as long as the funding comes from the government.

I'd add that even if it received funding through private enterprise, it would still be infused with the politics of said backers - whether intentionally or not.

The best you could possibly achieve is funding purely through small time (non business) donations from the public - and I'm still a little skeptical as to the scope of research that could realistically fund.

> it would still be infused with the politics of said backers

Of course it would. Whoever pays the piper always calls the tune.

It's no surprise that NPR and PBS advocate for the government point of view.

Only true to a certain extent. Is the judicial power not separate from politics? Maybe not completely, but it has measures in place to secure some type of autonomy.

What I'm talking about is removing micromanagement and ensuring that the institution is trusted.

I would argue that it wasn't just social media. Before the internet, it was relatively easy for a politician to 'pivot', to say something to one group, but quite a different thing to another. Now it is carefully catalogued and these days carefully tracked by anyone with any interest to care.

Social media just put some gas on a fire with engagement optimization. The rest was just human nature.

Thanks for your comment. I broadly agree with your points and find them insightful.

In particular I appreciate that you model authority as an analogue for trust, and trust as an analogue for independent verification.

I think trust is a prerequisite for successful convincing, because it is hard for human brains to overcome the emotional and relational elements of discourse, and we need to be in a receptive frame of mind. [citation needed]

We should study how authorities have built and lost trust, but I don't think we should be rebuilding those authorities using this knowledge. Instead I think focusing on how members of the public and different experts and scholars can build more diverse networks of trust relationships would create a more resilient system in the Internet age.

For example, instead of "I trust the CDC because they are an authority on disease" or "I trust Fauci on COVID because he is an immunologist", what if the common relationship was "I trust Fauci on COVID because his work on Zika and HIV was very interesting".

A related problem is what I call the 'knowledge gap'. Researchers develop a very comprehensive understanding of their chosen subjects, whereas members of the public have only a lay-person level of understanding. As experts become increasingly experienced and knowledgeable, the gap between lay-person understanding and expert comprehension grows. This means researchers are not always able to understand what a member of the public needs to know or how to explain it: how do you compress years of learning into a few minutes of talking, as part of a conversation? Good science communication skills are essential for experts if they want to develop trust relationships with members of the public, and if we as a society want to move from authority and coercion and towards education.

Thank you.

I've posted elsewhere in the thread about my own opinion that I believe the main issue we have is not with trust in the scientific sense (e.g. I think that most people generally _do_ believe that epidemiologists know what they're doing and that the outliers are exceptions), but with trust in the plan of action e.g. the entire chain of process.

Examples, all primarily via trust (I'm not a biologist):

I believe that excessive consumption of sugar can cause diabetes.

I know that alcohol can (will, given enough time) cause liver disease.

I know that driving my car drastically increases my risk of being in a car accident and therefore meeting a swift end, an agonising end, or perhaps a lifetime of disability.

I still engage in those activities despite the downsides because I believe them to be of net benefit, both to myself and to society as a whole. I'm willing to fully argue through that process.

Where I think the issue lies at present is that we're sorely lacking in convincing full-chain argumentation.

A person can trust that the UK health authorities know their stuff, and they can believe that coronavirus exists and can be dangerous, but that doesn't solve the problem of whether or not they should skip their friend's birthday party. It only makes up a very small part of the jigsaw puzzle; the rest still has to be filled in somehow.

An interior minister who is dutiful will most likely always opt for more surveillance. The safety of every citizen is his responsibility and he would not leave anything to chance. Even if the minister is no friend of authoritarianism, his position will most likely encourage certain policies in a predictable way.

But that safety has to be balanced at some point. Same is true for any ministry of health.

Dying sober might not be the preference of everyone, and optimizing for a trivial metric might cause a lot of harm. In fact, I believe most people would be horrified to lead a completely rational and reasonable life.

Agreed, it can be very difficult for anyone to make useful decisions outside of just trusting the existing consensus.

I mean there is no real way for me to know if a vaccine is safe. I assume it is, and prefer to gamble on it not harming me and potentially helping the current situation.

But there is really no way for me to know for sure.

To put it another way, when the Omicron variant emerged, there were newspaper articles that often showed pictures of the spike protein labelled with different mutations.

As it happens, I did research in protein structural bioinformatics, so I have a good idea what the detail in these images 'mean' in a certain sense. However they are of zero use in making a single meaningful decision about how to alter my behaviour (or not!).

Of course, the journalists add these images as decoration, but to me it symbolises the mass of irrelevant detail provided. Which is not the same as saying 'just believe science!' as that is also wrong and unhelpful.

I'm not sure what the answer is, really.

I actually think we may learn a lot by studying whom someone distrusts and the underlying causes. I imagine we would find that most of the time the distrust is projected distrust: I distrust person X in my life because they did thing A, and person Y is like X, therefore I distrust person Y as well.

For example, I think many distrust Fauci less because of Fauci himself and his behaviors and more because his behaviors may remind them of their primary care physician who treated them condescendingly, saying they were stupid for asking a question. Or because their parent got Alzheimer's or cancer and don't know where it came from and were told by doctors that their family member would be fine, even though their family is not fine.

I guess I just wished we paid more attention to the transgressions that happened to find the presumed cause of distrust so we can repair it, because I think trust can be highly efficient and distrust highly inefficient at times.

And yet few people want to hear "your family is not fine". Many people don't want honesty, they want comfort, and they (wrongly, IMO) look to experts for comfort. This puts doctors in a damned-if-you-do-damned-if-you-dont situation. It's easy to translate "doctor was honest" to "bad bedside manner".

I think people want both—I believe I do. I want people to be honest with me and considerate. I don't think it has to be one or the other, yet we often haven't learned how to communicate with both.

I can say, "I want to be honest with you, your relative may only have a few weeks left. I imagine that might feel really scary to hear or perhaps you might even feel angry, I don't know. I wanted to be honest with you and to let you know that my team and I will be here for you and your family."

I don't disagree, and it's probably the right option - but most people are lazy thinkers. You want the simplest answer that explains the problem in a way you understand.

If someone is trying to take advantage of a situation that's inherently complicated or unpleasant - it's easy to do so with a simple message that makes it digestible, even if it's wrong or harmful. We've seen this time and time again over the last few years. And then worse still, social media perpetuates it by keeping you in a bubble and pushing you further down a path.

You could be right, and as far as I can tell - that's what we've done.

But is it sustainable to lie to people in order to effect a desired (even if positive) change? How long can it go on for before people just turn off?

Maybe the answer is indefinitely. Empirically at least in my country this doesn't seem to be the case.

I've heard this described as "pandemic fatigue", but it seems to revolve more around taking orders than around believing in or being concerned about the pandemic itself.

People are willing to follow counterintuitive rules for a while, but eventually they need to understand some sort of final benefit derived.

The vast majority of people don't get "not drinking bleach fatigue" because they know it's bloody awful and that if they don't drink it, they're gone.

I see that as a failing of the policy setters and communicators. You can communicate a simple message, but you have to show your work too. Pretending counter-points don't exist only leads to "gotchas" and distrust down the road. This has been the theme of the last few years for me. If you want more than a public policy soundbite, you should be able to get a public position paper describing the assumptions, arguments, and counter-arguments.

> because we don’t have time

This part is a huge society-wide problem that can theoretically be fixed so people can become more informed and active participants in the framework of their daily lives. Don’t ask me how, though, because I just got off work and am too exhausted to do anything except let the news tell me what to be afraid of today, then probably binge The Office and slam a few brewskis until I fall asleep on my couch ‘cause I got work again tomorrow

Even if I do have time and energy, my lack of knowledge on the subject matter would probably lead me to waste that time on dead ends, wrong conclusions, or learning things I will never use again outside of fact-checking one study.

> dead ends, wrong conclusions

You have to trust yourself to try anyway. Beats doomscrolling even if both are equally meaningless in the long run. Something that helps me is refusing to accept any narrative or conclusion about any topic where it would require a specific named group of people (e.g. members of any nationality, profession, corporation, ancestry, presentation, etc — anything) to be uniformly dumb or malicious.

That might sound silly but it means there’s no downside to entertaining any consideration, not even the downside of my own bad feelings from negative ideas. It totally removes a layer of mental latency I didn’t even realize I had until recently, even about topics where that wouldn’t ever be a question that’s possible to ask.

Instead of getting bogged down checking one story after another it’s easier to ignore entire storyline arcs at a time once they’re more obviously one set of jangling keys after another all vying for my attention at a higher priority than any self-directed interests :)

We, as a society, bought a dog, and it took a literal shit in our communal kitchen. No one wants to clean it up, and frankly, I can't blame you/me/any one individual for it. It's exhausting, but it's a society-wide problem. It sucks that the only response from platforms has been to ban the discussion, rather than pay for a writers' corps to come up with easy-to-parse FAQs on why some of the garbage misinformation is a pile of lies, and to write systematic takedowns of odious dreck. But here we are.

The CDC recently dropped to 50% public trust, after repeated failures, mixed messages and outright lies in some cases.

They lie and then act incensed when the public stops believing them. Given the massive damage they've done, I'd consider it justified.

The CDC is an incompetent, bloated and fundamentally dishonest organization that was perfectly comfortable throwing grandma to the wolves via lies (masks don't work!) if it meant their unpreparedness wasn't exposed.

Via anti-masking lies, CDC irreparably damaged their reputation and also likely got people killed who believed them. The damage is deserved.

I remember pre-COVID thinking that the CDC was the only federal agency that had its stuff together, but early on in COVID they really came off as inept. They failed to stockpile PPE (masks and such) or even make provisions for procuring the same. Then they told us masks don't work (as you noted) before eventually flipping on that position. Then they bungled the tests (instead of going with the WHO test like the rest of the world, the CDC rolled its own, which didn't work and was much less efficient to process).

Still, I don't share the distrust of a lot of folks regarding their policies and decisions over the last year or so. I think they're probably making reasonable calls for the available evidence even if hindsight eventually proves them wrong. To the extent their critics are correct, I think most will have arrived at their conclusions as a matter of luck (e.g., someone who knows nothing of virology happens to hear some study or other that hints that the CDC may be wrong and bases their entire worldview on that study without any awareness of the counter evidence).

For long term scientific correctness, what the hacker news crowd could help with tremendously is reproducibility and that means versioning and continuous integration with data, for research software. In context of a paper or doing research, you have to write software anyway.

The other one that pops to mind is probably statistical calculation helping services.

Maybe there are lots of other things as well - I haven't been involved with scientific research in a long time.
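To make the versioning-and-data idea a bit more concrete: one small, practical step is to record cryptographic hashes of the exact input files alongside the analysis code, so anyone rerunning it can confirm they have byte-identical data. Here is a minimal sketch (Python standard library only; the manifest format is my own invention, not any established tool):

```python
import hashlib
import json
import sys


def file_sha256(path):
    """Hash an input dataset so the exact bytes used are recorded."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def record_environment(data_files):
    """Snapshot what is needed to rerun the analysis on the same inputs."""
    return {
        "python": sys.version,
        "data_hashes": {p: file_sha256(p) for p in data_files},
        # A real setup would also pin package versions (a lockfile or
        # `pip freeze`) and record the git commit of the analysis code.
    }


if __name__ == "__main__":
    # Usage: python record_env.py data1.csv data2.csv > manifest.json
    json.dump(record_environment(sys.argv[1:]), sys.stdout, indent=2)
```

A CI job could then rerun the analysis and fail if the recorded hashes no longer match the data, which is a crude but effective reproducibility check.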

The main thing I think is lacking is representing data in helpful and actionable ways.

A theoretical climate change example might be:

Car A emits 10% less CO2 than car B.

Driving is 30% of your annual emissions (i.e. it's significant and one of the low hanging fruit).

That 30% of your annual emissions if multiplied by 7 billion people is likely to cause X.

If everyone drives car A instead of car B, Y will happen instead.

Y is better than X, because we end up with a more verdant Earth, cheaper food, better weather, less war (insert reason here).

Therefore, our overall quality of life is higher if we switch to car A.

Without the whole information chain, you just have isolated statistics that don't really mean anything.

Convincing argumentation is not only about mathematics, it's about appealing to the things that matter to people.
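For what it's worth, the arithmetic in the chain above is trivial to lay out explicitly. Every number here is an illustrative placeholder: the 10% and 30% come from the example, and the 4.6 tonnes CO2/year for car B is an assumed figure, not real data.

```python
# Illustrative placeholder figures only, not real data.
CAR_B_ANNUAL_CO2_T = 4.6   # assumed tonnes CO2/year for car B
DRIVING_SHARE = 0.30       # driving as a share of annual personal emissions
POPULATION = 7_000_000_000

# Car A emits 10% less CO2 than car B.
car_a_annual_co2_t = CAR_B_ANNUAL_CO2_T * 0.90

# Per-person saving if everyone switched from car B to car A.
saving_per_person_t = CAR_B_ANNUAL_CO2_T - car_a_annual_co2_t

# Scaled to the whole population.
global_saving_t = saving_per_person_t * POPULATION

# Driving's share implies a total personal-emissions figure.
total_personal_t = CAR_B_ANNUAL_CO2_T / DRIVING_SHARE

print(f"Per-person saving: {saving_per_person_t:.2f} t CO2/year")
print(f"Global saving: {global_saving_t / 1e9:.2f} Gt CO2/year")
```

The hard part of the chain, as the comment says, is not this arithmetic but the last links: arguing convincingly that the computed saving actually leads to outcome Y rather than X.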

> Y is better than X, because we end up with a more verdant Earth, cheaper food, better weather, less war (insert reason here).

The problem is that while it's easy to calculate exactly how much less CO2 car A emits than car B, it is close to impossible to use science to predict to any useful degree of accuracy how much cheaper the food, better the weather, or less of the war we will get. It's often hard to even predict the sign, not even trying to determine the magnitude. This means that when scientists or politicians tell you that "we need to do X, so that we get Y", they're very often fooling you, because they have no good idea how much (if any) Y you get for each unit of X.

I think you missed a step in the chain; if you increase the quality of life for humans, what are the consequences for human population growth and the consequences for the rest of the biosphere as a result?

This is not helped by the lack of scientific and logical literacy in the public at large who will cherry pick to support either position on a topic and call that "homework done".

The scientific process involves testing the accepted and pushing ideas to their extremes. The fact that there is a test of 5G on mice isn't surprising, but when such a test shows a statistically insignificant blip and it's not in the conclusion, that's not "big science hiding the real truths"; it's that the blip in isolation is a true statistical anomaly...

>The issue that plagues us is a lack of trust in long-standing societal authorities.

What caused that trust to be lost? Is it something they did?

Absolutely. At least here in the United States there’s a full on collapse of trust in institutions and supposed expertise and for extremely good reasons.

Over and over again those with authority have squandered it. An opioid epidemic devastated so many communities after doctors, health care executives, and McKinsey and Company said the new class of drugs were non-addicting. Our manufacturing sector evaporated after economists and politicians said free trade would benefit everyone. Thousands of young people were sent to die in the Middle East in a war all the experts said was necessary and would be relatively painless.

Those are just the first three examples off the top of my head. I could keep going. We were promised technology would improve our lives. We were promised borrowing money for education would improve our lives. And on and on and on.

All bullshit. All promulgated by supposed experts with impeccable credentials. It’s a full on disaster and the chickens have come home to roost.

There has also been a huge emphasis on institutional misdeeds, driven by the outrage economy.

Nobody takes note of the hundreds of life-improving drugs developed in the same period as the opioid epidemic. Nobody takes note when cops arrest the right guy and save the day. Nobody takes note when a country doesn't militarily threaten or invade the US.

Interesting point. Do you think that, were it not for the outrage economy, the US would tackle it faster, slower, or at the same rate? Maybe the reason for the outrage economy is that there is a fair amount of things to be outraged about.

I personally think the majority of the outrage economy is counter productive. It is important to identify and communicate areas for improvement, but I think most of the solutions generated from the state of outrage are off target. I think emotional outrage is a natural response to some of the injustice in the world, but rarely helpful in actually solving a problem.

Is the life of one innocent person an acceptable price for success?

What do you mean by success here? Depending on your definition, I think there are times where the answer is yes. There are clear analogs to the trolley problem in public policy every day.

Is the life of one innocent person an acceptable price for the lives of several innocent people?

Note well: I'm not actually a consequentialist. I'm just pointing out what "for success" actually means, at least some of the time.

The answer we have generally come to is based on a pseudo-Kantian interpretation of causality. One innocent can die due to inaction, but you can't actively kill one to save many.

We don't allow nonconsensual body snatchers to harvest one person to save many. Alternatively, we may allocate finite funds to save several lives instead of a single life. We may also choose not to spend the money to save a life.

Sorry, nobody gives a fuck that our global military dick-swinging prevented Senegal and Guatemala from sailing into New York harbor and carting off our treasures.

The outrage economy tends to focus on people and interpersonal drama. Some comedian said this, or maybe some company did something stupid and there’s a walkout.

If anything there’s a massive shortage of outrage. Given the degree of misery the entrenched elite is inflicting on the masses, the fact that we don’t have a general strike or torch wielding mobs swarming over the gates of billionaire’s houses is frankly astonishing.

Outrage is only good in that it generates interest in the topic. There is a surplus of hot-headed, illogical people bogging down any real constructive discussion. There is a shortage of people who understand the realistic challenges involved and work to come up with viable solutions on either side of a given issue.

The problem is that the only viable solution is for an entrenched elite with disproportionate power and money to lose much of that power and influence.

It’s a struggle between the masses and the ruling class about who gets to make decisions in our society, not like lunch meeting where we can find common sense action items to agree on.

How does outrage help overcome this challenge? A good chunk of the country stands with the status quo and the elites on any given topic.

I don't think outrage changes minds, just engages your base. It also encourages the working masses to lash out at each other. Most outrage topics serve mostly to pit the masses against each other, based on skin color, gender, or who has bigger scraps from the economic table.

If you want real change, the solutions are relatively simple IMHO: ranked-choice voting, proportional representation in the Senate and Congress, and less winner-take-all, closer to a parliamentary system.

Outrage economy is about making the opposite decisions from the "ruling class" no matter what those decisions are

> in a war all the experts said was necessary

Just to clarify, many experts from generals to peacenik linguists to gonzo journalists predicted immediately that the wars would be an expensive and unwinnable series of atrocities. They weren't given airtime.

Which is also why the trust in the institutions that give people airtime is virtually nonexistent, and people on the internet, sometimes those censored experts and sometimes people masquerading as censored experts, are getting all the attention.

You've touched one of the real veins of this issue that very few really want to talk too loud about. There are still a handful of good journalists doing good work out there, but they seem to have been shoved over into "crazy man on a blog ranting" corner, making it much easier for propagandized coincidence theorists to shut down all discussion from them as crazy conspiracy theories.

I think a good analysis of the history and changing landscape of media is highly in order if we want to get to the bottom of this.

As for me, I like to remind people of Operation Mockingbird quite a bit lately, and rest assured the name just got changed after the Church Committee and given the explosion of the internet they are certainly extremely pervasive in the new system.

As CIA director William Casey said: "We’ll know our disinformation program is complete when everything the American public believes is false."

Well yes that’s the point. By literal definition the people who were so obviously wrong weren’t experts.

But at the time, if you were observing “expert” opinion that was it.

That's not the way I remember it at all. The government were very obviously going into war because they wanted to and scrabbling around for anyone who would support them. You only have to look at the death of David Kelly.

> Lawmakers knew from the beginning the shakiness of the Bush administration’s case for going to war with Iraq, and Biden not only went along with it, he championed it.

- https://www.vox.com/policy-and-politics/2019/10/15/20849072/...

> [Biden] was able to choose all 18 witnesses in the main Senate hearings on Iraq. And he mainly chose people who supported a pro-war position.

- https://www.theguardian.com/commentisfree/2020/feb/17/joe-bi...


> Iraq in 2002 was devastated by economic sanctions, had no weapons of mass destruction, and was known by even the most pro-war experts to have no missiles that could come close to the United States. The idea that this country on the other side of the world posed a security threat to America was more than far-fetched. The idea that the US could simply invade, topple the government, and take over the country without provoking enormous violence was also implausible. It’s not clear how anyone with foreign policy experience and expertise could have believed these ideas.

> Senator Dick Durbin, who sat on the Senate intelligence committee at the time, was astounded by the difference between what he was hearing there and what was being fed to the public. “The American people were deceived into this war,” he said.

- Also from the Guardian link.

And that's just war. We were lied to about sugar; about margarine, about recycling, about oil spills, about agricultural pollution, about public transport. We were lied to about trade, espionage, healthcare. Repeatedly, barefacedly. Lies are the most bipartisan issue in America.

In the face of all this, we're told the enemy is distrust itself, told to trust the science, told to believe that Biden and the Dems are fighting for us. We're told anti-war news outlets and comedians are secretly communist, Russian funded. It's all such utter wank.

E.g., a doctor at a jail tested Covid treatments on inmates to see if they helped. That's a ground-shaking loss of trust! While that's one crazy doctor and an isolated case, it's hard for the disenfranchised not to generalize to the rest of the medical establishment.


It's the noble lies more than any direct action that erodes trust the most.

> The issue that plagues us is a lack of trust in long-standing societal authorities.

Hmm. I wonder how we got there...[1]

[1] https://i.imgflip.com/63b63m.jpg


You are making some pretty uncharitable assumptions here, and frankly I find this entire post very distasteful. Who decides what is "pseudo-truth" instead of "actual truth"? I'll take a wild guess... you are the one who knows the "actual truth" unlike all the ignorant people who disagree with you...

...and that is exactly why no one trusts institutional actors anymore.

> I think your comic is referencing the political "left" vs "right" divide, you are part of "the right"

That's a big assumption. I could very well be part of the left who simply doesn't share "the left"(tm)(c)(pat. pending)'s current party line because I find it to be nonsensical.

> People who exhibit this pattern are usually stuck in a rut of pseudo-truth.

There it is. That arrogance right there. "You don't share my class interests, therefore you are wrong". That, in essence, is exactly why no one trusts institutional actors -- because people realize they've aligned themselves with the bourgeoisie against the proletariat. There's no disguising it anymore. They will do or say whatever those who hold the purse-strings tell them to say.

> It doesn't help that a lot of institutions agree with the actual truth a lot of the time

Which truth is that? The one that said that masks helped halt the spread of the virus, or the one that said it didn't and we should stop hoarding them? Roll 2d6 and consult your Propaganda Master's Guide on page 666 to find the answer.

> People who exhibit this pattern often reflexively disbelieve whatever any authority says, just because they're an authority, i.e. Oppositional Defiant Disorder.

...or they've correctly identified them as class enemies who will do or say whatever it takes to defend their class interest such that you can't believe anything that comes out of their mouth. It honestly doesn't matter if they're right at that point because the lack of trust cannot be amended.

> The thing that makes it really insidious is the mentality they're right and everyone else is automatically wrong.

Indeed. Calling someone a part of 'the right' and saying they're from a side stuck in a 'rut of pseudo-truth' because they point out that institutional actors are biased to their own interest, have a history of lying, and this is why no one trusts them anymore, is quite insidious.

> I ask you: Hypothetically, if what you believed was not actually true, what would convince you of that?

Institutional actors abdicating their posts in favor of someone of proletarian class interest so proletarian ideals can be put to the test without interference.

Edit to add: I found out what happens when you present a video testimonial strongly calling into question vaccine clinical trials - it gets downvoted.

So what happens when you present a video testimonial like this - https://www.youtube.com/watch?v=L2GKPYzL_JQ - and people just dismiss it? They don't understand that in clinical trials you're meant to keep track of and report all outcomes, and that by limiting reporting to an app with limited options and no free-form input, you're crafting a very narrow scope of outcome - literally doctoring the result by minimizing the potential range and severity of recorded adverse events. (That's if people even watch the video or understand the implications of what Maddie's mother has shared; and if not, then you're not able to educate or inform them further - probably because it's a counter-mainstream narrative point, and arguably the mainstream and government are captured, via regulatory capture by industrial complexes, etc.)

Does the video described build or harm trust in our institutions? And if this happened with this clinical trial, does that not extrapolate to all of the other clinical trials being ignored, not given proper oversight, and lacking integrity?

Anyone can make a video about anything. There are lots of videos explaining why the moon landing was a hoax. Was it?

Are you telling me you watched that video and that you're claiming the mother is lying?

The interesting thing about trust is it takes much longer to build than to spend.

Given the rate at which scientific institutions have been spending trust over the past few years, I'm not sure how this problem can be alleviated within our current framework.

You can't spend trust, you can only violate it.

It can’t be.

Let people draw their own conclusions and don’t censor or otherwise penalize them for questioning one interpretation of data vs another. Encourage people to do their own research and investigate who is funding the sources used for that research. There is no need for conspiracy theories when financial incentives exist.

>There is no need for conspiracy theories when financial incentives exist.

Why do you assume that the financial incentive is for the truth rather than for the conspiracy theory?

I don't. My point is that we should always be aware of the financiers behind any piece of research, prescribed solution, etc. and teach people to think critically about this.

But someone with a vested interest in the research funding that research doesn't inherently mean the conclusion is wrong. If anything, your suggestion would lead to more conspiracy theories, not fewer, as people become more focused on the motives for the research than on the body of that research.

It doesn't mean that but it should always be treated as a possible factor, and analyzed in light of other connections. Collusion is very real when billions of dollars are on the line and it is naive to assume otherwise.

>It doesn't mean that but it should always be treated as a possible factor, and analyzed in light of other connections.

But do you not see how this will result in conspiracy theories? You are no longer engaging with the merits of a study and are instead questioning the motivations and source behind that study. You are shifting from a more objective argument to a more subjective one, allowing more room for conspiracy theories.

>Collusion is very real when billions of dollars are on the line and it is naive to assume otherwise.

Yes, which is why "do your own research" is often misleading because those billions of dollars have greatly impacted what research is available to you.

Ok, let's look at a scenario. If a board member of a powerful pharmaceutical company was also a chief executive at a news organization responsible for "fact-checking" claims about that company's medicines, would it be inappropriate to wonder about financial incentives?

You seem to only be looking at the surface of this issue. I agree that it is appropriate to be concerned about conflict of interest, but what happens when you take it one step further beyond your comment? I decide to "do my own research" and plug the drug's name into Google. The top site that comes up is the same news organization's fact check because no one else has a financial incentive to investigate this issue and report the truth. Me "doing my own research" leads me to the same biased information. The "do your own research" portion doesn't fix anything.

I’d be willing to bet that most people with the gall to do their own research will look beyond the first result that appears on the SERP...

It’s fundamentally an epistemological question: what is the ontological basis for thinking we know something?

>I’d be willing to bet that most people with the gall to do their own research will look beyond the first result that appears on the SERP...

Once again this is thinking that doesn't go beyond the first layer. If this situation is possible for the first link, it is possible for the second and the third and everything on the first 5 pages. However deep this hypothetical researcher is willing to go, if there are enough billions at stake the manipulator can keep pumping out misinformation. They have a financial incentive to lie and no one has a strong financial incentive for truth so how does this researcher find the needle of truth in the haystack of lies?

It goes far beyond the first layer, to the ultimate layer: what is the basis of our presuppositions about knowledge? How do we authenticate what we think we know?

We fundamentally can’t for most of our knowledge given the complexities of the modern world. Which is why the whole “do your own research” suggestion is BS. There simply isn’t enough time in the day to become an expert in everything in order to be able to accurately judge the validity of most research. That is why stopping the spread of misinformation has value because most people do not have the expertise to identify all misinformation.

“It’s too complicated for you to understand, we will mediate truth for you” is the language of priests.

But people should be helped to draw their own conclusions with annotations where every time some crazy idea is repeated it is automatically flagged with "This idea has repeatedly been debunked by 37 national societies of medicine and 19 meta analyses of 2,000 peer-reviewed studies with a quality deemed high or superb," etc.

Here’s one better: show people how to reproduce the same results given the same set of controls. Reproducibility should be the standard, not committees of debunkers.

As long as those fact-checkers are provably biased (I haven't seen any of them fact-checking the early statements that masks were useless), they are useless.

If, however, fact-checkers were a pain in the butt for mainstream media and politicians as well, they would be much easier to listen to.

>every time some crazy idea is repeated it is automatically flagged with [...]

This is basically "fact checking". In theory it's good, but it inevitably degrades into a farce with stuff like [1] or [2].

[1] https://twitter.com/msnbc/status/785299708730339328

[2] https://www.bmj.com/content/376/bmj.o95

This is a commonly suggested solution that fundamentally misunderstands the problem. Most of these people aren't believing falsehoods because they don't have access to better information. It's because they don't trust the institutions that they believe drive science. Additionally, as these annotations only represent an opinion and are subject to change when they aren't correct they likely cause even greater distrust.

Do you mean "Do their own research" like make up conspiracy theories out of thin air and parrot nonsense deep state rhetoric? Completely divorcing yourself from reality? You want more of that?

The world will always have heathens and heretics. Some people will always come to the wrong conclusion some of the time.

Skepticism of any claims is a pretty good heuristic, but should be equally applied to counter claims.

Right. Why do we care so much if everyone agrees?

Thing is, to a large degree, we haven't had to before. If some random person wants to believe the moon landing is a hoax, it doesn't affect my life in the slightest. What's changed is that random person walking around with no mask and no vax. Now, a random person in a Target a thousand miles away from me isn't literally a threat, but when that attitude is supported online and a random someone walks into my local Target, it causes me to need to care. Someone can chime in to question the underlying beliefs on the matter of Covid, but the point is this is an example of an issue where we care. There are extremist views like white supremacy or similar violent ideology that are a bit further outside the scope of discussion, but they're another example of why we care that enough people agree.

Isn't the fact that we (as in, society) don't agree that it does affect your life an indication that it's actually a borderline issue, though?

Everyone would agree that a person wandering around the street firing live rounds into targets is acting in a dangerous manner.

No-one would agree that a person drinking a cup of tea in their own house is acting in a dangerous manner.

But coronavirus isn't one of those things. It's genuinely a borderline issue, and not because of the science, but because it's almost exactly on the line in which some people consider it to be a disaster and others consider it to be a bit of a nothing burger.

As a result, you end up in a situation as described by rgrieselhuber below.

Some people think that we need to close (e.g.) schools to stop coronavirus. Other people think that we need to stop caring so much about coronavirus, so that we can go to (e.g.) school.

The question of whether there's misinformation is almost tangential to the point, because there are huge numbers of people who do not doubt official statistics or modelling but also believe that the data or those models are not severe enough to require us to 'stop the world', so to speak.

I guess my thoughts are that, if we can't even convince each other (and from what I can tell neither of us have really fallen for misinformation), what hope is there for the actual 'vaccines cause 5g' nutjobs?

There were people during every atrocity (e.g. genocide) who argued the atrocity didn't affect their lives. Did that make it a borderline issue? No, it did not.

I like chocolate. Do you like chocolate?

What does this have to do with the price of fish?

Ok, in your threat model, a person who doesn't wear masks or take vaccines represents (even though the vaccine you took should keep you safe?) an existential threat to you, and therefore you should have the right to dictate their beliefs and actions. However, they may view your actions as an existential threat to them. How to resolve?


You have it backwards. Discussing controversial issues in the clear minimizes conspiracy theories (no, it doesn't completely eliminate them). The volume of conspiracy theories was driven up precisely because our epistemological institutions (e.g., media, academy, etc) abandoned the pursuits of objectivity and neutrality. To get back to a semi-rational state of affairs, we need to get the ideology out of our institutions (or at least diversify it).

Do you think there would be fewer moon landing deniers if the media talked about it earnestly? If legitimate scientific journals published "research"? Honest question.

No, because there's some margin of people who won't be convinced no matter what. If you've failed to persuade that 2% (or whatever the floor might be) then there's no need to discuss further--those 2% are inconsequential. But if you're at double-digit percentages you can be sure that you're not near the floor.

Moreover, it's not just about the media's coverage of COVID or the moon landing, but about their overall credibility during the prior decade. For example, the media has reliably and brazenly lied and misled on racial justice protests from ~2014 through 2020, which almost certainly affects its credibility on other issues (e.g., covid). Of course, the same applies to the academy and other institutions.

Moon landing deniers won't read, never mind believe, legitimate scientific journal investigations. What's needed is Buzzfeed-esque writing, at that level: top ten reasons why moon landing hoaxers are wrong.

There are far more mRNA vaccine skeptics than moon landing deniers.

Right, because the media and academic environments following the moon landing were (for all their faults) much more committed to objectivity and neutrality than they have been in the last 10 years.

This is often people not understanding the technical details of the science, so they fall for the logical mistake of "person on YouTube explained it so simply, it must be true"... Obviously this isn't helped when the people who are supposed to be trustworthy and unbiased in presenting the science are repeatedly caught lying, which drives people to that simpler world view.

Should I be penalized if my interpretation of the Israel data is that Jews are dirty and should be exterminated? That would be a pretty outlandish interpretation! I would HOPE I would be censored for that!

I think unfortunately we also need to protect science from idiots drawing very bad conclusions based on not understanding a paper... That's how we end up with 'bacon cures cancer' headlines in the daily fail...

The dailymail writes “bacon cures cancer” because that headline gets clicks and shares, so it makes them money.

I’m not sure that has anything to do with “science”.

This brings up an interesting point I was thinking about but didn't feel like typing until now.

I think the problem could be "science communication" - not bad communication or anything like that, but the existence of it in general. There's a casual interest in scientific stuff among the public, there's an entertainment market for it, and so now we have a scenario where people who publish research have to weigh this impact when publishing, including its benefits. But most people interested are only casually interested (and most vendors of the entertainment have incentives not necessarily aligned with presenting fact, but that's not my point, only a secondary problem). I think it's possible that it might be better if, when someone was interested in something, they had to dig it up online. If we got rid of the whole "science communication" thing, truthful or not, we'd get rid of all these problems.

I know it's impossible to do that, but it's interesting to consider that the problem isn't bad communication, but this drive to communicate everything all the time skewing the incentives and public perception.

How do you protect science from idiots? Is science going to stop working if some people think that bacon cures cancer?

OK so to counter,

Do you think 'scientific research' is in a great position from a funding perspective right now?

Have you looked at the press or out the window at how "science" is portrayed due to covid? The lack of sanity in both political directions is alarming and depressing all at the same time. Both sides of the public on this are equally, damagingly wrong. Yet they pick up and throw around scientific papers as though it gives them the right to be the next pope...

If science is paraded around like nonsense in the press there's a matter of time until we end up with politicians who make the decisions sit there and say "I don't see any public benefit in researching this".

That day will only come sooner if things carry on as we head into a recession.

Edit: My proposal to fix this, as I've suggested elsewhere, is to keep scientific papers behind some (not-for-profit) walled access. Obviously that has damaging impacts when you say X from country Y can't read a paper on Z, but frankly maybe the public isn't grown up enough for completely open discourse when they can't understand the material.

No, but science won't have the opportunity to work when people are passing up vaccines for ivermectin.

Science is a tool for falsifying hypotheses, not dictating policies or individual choices.

And wearing a mask in the same confined space with no air circulation for about 20min is pointless but still enforced by idiotic govt policies...

Unfortunately "bad science" cuts both ways when it comes to miscommunication.

Trust wasn't lost first. Talking heads swooped in for money or power, and they sell whatever they have to in order to achieve those goals. There isn't any sort of fact they are arguing for in good faith.

You can add any number of layers of scientific rigor and it won't convince someone who has a favorite talking head.

The strength of science used to be that it was anti-authoritative - experiments not institutions used to be the ultimate source of truth.

Is it not still that way? There may be a bias towards believing trusted institutions, but that's because they have reputations for delivering truth based on experiments.

> The issue that plagues us is a lack of trust in long-standing societal authorities

Yeah, because there's no rational basis to continue to afford them that trust. You're describing the symptom, not the disease.

> one derives from first principles every opinion they hold

But it's the only correct way: if credible studies contradict an official agency (more often it's the degree of the opinion where there is disagreement, not the opinion itself), my advice would be to trust the studies. For instance, MRI contrast agents: the EMA banned most of them while the FDA did not, and that is not the advice one would form by reading the studies on the matter.

I believe if you take for yourself the right to neutralize or silence an opponent, he gets to do the same in kind. State authority is mostly based on show and relies on people complying, even in autocracies. So if you do silence people, you should be very sure about it and about the costs. There was never such a point in the recent pandemic - or would you disagree?

It is also difficult to come up with an example where it was warranted to suppress information. Can you name one? Educated and intelligent people are often very distrustful, especially of authority.

Society is not a boot camp of grunts.

Even if someone doesn't walk us through the entire chain of proof, I'd settle for clearly delineated junctions and paper trails - basically full transparency, with the ability to augment and shim any gaps people identify along the way. Nothing is implemented by default, but it would be great to have this public body of work where someone can spot-check on the fly to see if things line up. It's a lot to ask for with no incentive to push such a thing through, but maybe I'm not seeing a potential profit motive behind such a project.

What do you do while education lags behind where it needs to be? Because enough education to understand, say, climate change takes about a decade.

(... And that's assuming it can be provided without being shut down as "indoctrination").

That's ... not true? Children can learn a basic model of the greenhouse effect in school, and do so in most civilized nations. Whether this is sufficient is another thing entirely, but I would personally contend that (non-propaganda) education can never be 100% effective at causing someone to believe something?

Right, this hits at the more fundamental issue which is that the purpose of stuff like "misinformation" filters is not to affect what people believe, but the second order effect - how they act.

Loads of people are well aware that wearing a mask would result in fewer coronavirus particles being emitted from their face if they were infected.

But they don't act on that for various other reasons unrelated to the virus.

The same is true of climate change. You can think that it's real and still drive your car for precisely the same reason that you can think that cirrhosis is real and still drink a few beers.

> The issue that plagues us is a lack of trust in long-standing societal authorities.

This misses a lot of the cause. People aren't listening on the vaccine because there were no WMDs in Iraq. All the "experts" said that too.

The fact that huge frauds have been perpetuated at scale against our entire society and nobody has been held to account is a major reason for collapsing trust. It would be very helpful if the parties responsible had at least been censured even if the political class couldn't muster the will to truly apply penalties, but few in the political class have even admitted that anything bad happened.

When trusted authorities perpetuate frauds or display extreme incompetence and then just paper it over (often at others' expense), they are destroying the fundamental trust basis of society. Opportunistic politicians are destroyers of civilization.

What if the scientific proof chain was published as a knowledge graph? This could be easily followed and all questions by skeptics could theoretically be automatically answered.

In theory the references and bibliography sections of published papers would be more than enough to construct a knowledge graph.

Being able to construct a graph does not mean it's possible to check whether the claims are correct, though.

This is how I learned (approximately) how GPT-2 worked. Start with GPT-2 and it's full of nonsense you don't understand. Follow the references, follow more references. Get to basic neural networks with one hidden layer like you learned about in college. Reverse the chain and build upwards.

It's less about checking whether claims are true but about how they are rationalized. E.g. the claim could be a node "masks should be mandatory" with connections to reason, "masks reduce spread of virus" and papers supporting this policy. These would also need to be connected to nodes showing values, e.g. "reducing hospital admissions is more important than freedom of choice".

In this way it would be more obvious why policies are chosen, and it would provide a way to express current knowledge. As more nodes are added to the graph, the policy prescriptions would change, which would prevent the mistrust that is based on "health department said do x, now they say don't do x".
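The claim graph described above (a policy node resting on an empirical claim and a value judgment) can be prototyped in a few lines. This is a minimal sketch; all node names, study titles, and the `supports` schema are hypothetical illustrations, not anything from an actual system:

```python
# Toy "claim graph": nodes are policies, empirical claims, value judgments,
# or studies; each node lists the evidence it rests on. Everything here is
# a hypothetical illustration of the structure, not real data.
CLAIMS = {
    "masks should be mandatory": {
        "kind": "policy",
        "supports": [
            "masks reduce spread of virus",
            "reducing hospital admissions outweighs freedom of choice",
        ],
    },
    "masks reduce spread of virus": {
        "kind": "empirical claim",
        "supports": ["Study A (hypothetical)", "Study B (hypothetical)"],
    },
    "reducing hospital admissions outweighs freedom of choice": {
        "kind": "value judgment",
        "supports": [],
    },
    "Study A (hypothetical)": {"kind": "study", "supports": []},
    "Study B (hypothetical)": {"kind": "study", "supports": []},
}

def evidence_chain(node, depth=0):
    """Walk from a node down to its leaf evidence and value judgments,
    returning one indented line per node visited."""
    lines = ["  " * depth + f"{node} [{CLAIMS[node]['kind']}]"]
    for child in CLAIMS[node]["supports"]:
        lines.extend(evidence_chain(child, depth + 1))
    return lines

if __name__ == "__main__":
    print("\n".join(evidence_chain("masks should be mandatory")))
```

The point of the structure is that a skeptic can see exactly where a policy stops being empirical and starts being a value judgment, and that swapping out a study node mechanically propagates to every policy that rests on it.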

Well, trust has to be earned by a history of proven, visibly good outcomes. Never forget that there is a long history of cases where the established, scientifically accepted ideas were actually harmfully wrong, particularly in the medical arena. In the USA in particular, medical practice is far from a benign charity, so people are right to question. One example: I had a tonsillectomy when I was a kid. There's no actual evidence that this is good medicine.

On the other hand, I don't disagree that the level of misinformation is ridiculous. It's difficult to understand why someone would trust someone with a popular podcast so much more than licensed professionals. Maybe there shouldn't be censorship, but where are the medical malpractice claims against these people? I think it's all tied in with the general acceptance of the "these statements have not been evaluated by the FDA" supplement industry. As long as your supplement does not do too much direct harm, we allow it.

> It's difficult to understand why someone would trust someone with a popular podcast so much more than licensed professionals.

I'm less convinced that this is a matter of trust, I think it's more about value systems and priorities.

I trust my dentist to recommend me the best filling or crown material for my teeth, even in the face of financial incentives.

But his advice does not tell me, in an absolute sense, whether it'd be worth me paying $1K for a crown if that means I miss the rent next month. It's not a tractable problem.

I can't speak to the US situation, but in the UK, in my peer groups, most coronavirus-related argumentation doesn't revolve around whether e.g. wearing a mask reduces the spread of coronavirus, but around whether wearing a mask is a net positive for society and/or the wearer. The former is a matter for epidemiology; the latter is nowhere near clear cut.

Well I agree that there are actual valid arguments about priority, by which I mean the math works out- the costs are significant either way. Excess deaths vs. broader economic impact or even questioning how much economic benefit from allaying fear by an otherwise dubious activity (cloth masks..).

But in the US, we have this extreme irrational vaccine skepticism where the math just does not work out. People believe the vaccine is much worse than its benefit. It's hard to argue with since prominent people are supporting this idea.

I personally think that a lot of vaccine scepticism comes from the perceived force involved. For example, via passports and mandates.

I think once you get into the situation in which people think "I'm being forced to do X", you've created an impossible catch-22 and you've basically ensured that some % double down.

It's hard for me to explain because it just seems intuitively obvious. I really enjoy a beer after work. But if you forced me to drink one, and made me document the fact in order to go on holiday or whatever, I would feel less inclined to do so because I'd instinctively think that something fishy must be going on.

I've personally had the generally recommended number of vaccines at the right times etc, but when it comes to stuff like showing barcodes to get into places, I've faked it on principle, I think it's a dystopian horror and it doesn't surprise me at all that a lot of people have responded by just tuning out and refusing everything.

> I personally think that a lot of vaccine scepticism comes from the perceived force involved. For example, via passports and mandates.

You would think that, and they promote that, but it does not. It is obvious when you consider that all the vaccine skeptics were also vaccine skeptics before there was any hint of force involved.

I guess it depends on what you define as "scepticism".

I know a lot of people who took the first or second vaccine but are refusing further due to the coercion involved.

Or more generally, they behaved "carefully" in the early days of the pandemic but then switched off over time as they tired of being given legal commandments from God rather than sensible advice.

I agree that the same set of people think crackpot stuff like "5g causes corona" or that the vaccines have microchips, but I don't think those people are really relevant to the discussion.

Yeah, it's not nice but what should be done after the "natural" vaccination rate had been reached by rational argument alone?

I mean, the only reason for me to care if someone else is vaccinated is to the extent that I care for their wellbeing.

To me it's analogous to like, telling one of my overweight friends to go on a diet so that I can pay less for the NHS or something. They don't want the advice, it feels rude more than anything else (even explaining this feels a bit ick!)

If it was shown that we could eliminate sars-cov-2 via herd immunity I'd probably be a bit more keen to encourage it but I still don't think I'd support mandates unless we had some crazy ebola mutation and people started dying in massive numbers.

> If it was shown that we could eliminate sars-cov-2 via herd immunity I'd probably be a bit more keen to encourage it but I still don't think I'd support mandates unless we had some crazy ebola mutation and people started dying in massive numbers.

Triple vaccinated and completely in agreement.

An argument can always be made that they fill up hospitals, but that same argument can be made against parachuting, being overweight, racing (or driving in general), all alcohol etc.

If choice is to be meaningful, it includes bad choices.

> there is a long history of cases where the established scientifically accepted ideas were actually harmfully wrong, particularly in the medical arena.

This isn't actually a problem so long as those past errors were caused by the limits of what we knew at the time and efforts are made to help prevent similar issues in the future when it's possible. It's inevitable that as our understanding of science and medicine evolves we're going to discover that what made sense before is no longer a good idea.

The problem comes when we weren't wrong because of what we didn't understand, but because people who knew better just thought they could get more money if they manipulated results or outright lied. We had the tobacco industry pay off scientists to lie about the cancer risks the industry knew to be a problem. The resulting rise in people with lung cancer wasn't a mistake. We had doctors pushing opioids on people at insane doses because they were paid kickbacks if they did. That wasn't a mistake either.

What we need is strict regulation and oversight so that when science and medicine do get it wrong, it's because we couldn't have known better given what data we had at the time. That'd be a huge step up from where we are now.

This was a problem when past scientists stated that there was a consensus when in fact the underlying evidence was limited and allowed for multiple possible interpretations. If you dig into some of those past errors in medical guidelines you often find very shoddy, limited research which doctors uncritically accepted as fact.

I remember when the government and scientific consensus informed me that margarine was good for me. For decades.

Who knows what damage that has done to my internals.

I'm out of the loop here apparently, is margarine bad now? What consequences does long-term use have?

Multiple theories and trains of thought exist about margarine:

- it plastic no food

- it’s vegetable good

- it’s cholesterol bad

- it’s trans fat bad, butter no trans good

- cholesterol ain’t real, margarine good

I don’t care :p

- Accepted fact 1: eating saturated fat causes more coronary heart disease than unsaturated fat (very high correlation, and accepted as causal)

- Accepted fact 2: eating trans fat is bad. A diet high in trans fats can contribute to obesity, high blood pressure, and higher risk for heart disease, because intake of dietary trans fat disrupts the body's ability to metabolize essential fatty acids. Trans fat is also implicated in Type 2 diabetes.

Historically margarine was low in saturated fat but high in trans fat due to partial hydrogenation - hence the questionable "health benefits" of it vs butter.

Currently in most of the developed world vegetable spreads are not allowed to contain significant amounts of partially hydrogenated oils, and thus margarine should be healthier - but there is a powerful dairy lobby so the bad reputation will last for a long while...

> Accepted fact 1: eating saturated fat causes more coronary heart disease than unsaturated fat

Several studies suggest that diets high in saturated fat do not raise the risk of heart disease; one report analyzed the findings of 21 studies that followed 350,000 people for up to 23 years.


It's made of hydrogenated oil, which is bad for your vascular system.

It turns out it’s been a long term indicator of covid as those that eat it have no sense of taste.


The best diet is the one eaten for 1000s of years by your ancestors

You mean the one for people whose life expectancy was less than half the modern day? Who lived completely different lives?

Pro or con the diet, the difference in life expectancy is mostly antibiotics.

Clean water was the single most important advancement in human health of all time. Next was hygiene, especially during birth. Next is antibiotics. Next is anesthesia.

> the difference in life expectancy is mostly antibiotics

Life expectancy made huge gains throughout the century before antibiotics.

If you look at life expectancy graphs, they mostly correlate with GDP per person.

You can't really see the invention of antibiotics clearly in the curves.

The way I think of it is that of course penicillin increased life spans, but the discovery and industrialization of it was highly correlated with rising GDP.

Antibiotics rock, and I think penicillin is rightly praised as being as close to a panacea as it gets. However, we shouldn't underestimate the positive effect of better hygiene, vaccination, and advances in surgery techniques.

I share the feeling that better nutrition is probably not as much of a significant factor in developed countries. For what it's worth, we are more obese than our ancestors.

Our World in Data has this nice chart of life expectancy by age in England and Wales [1]. I have this pet hypothesis that a good chunk of the increase in the life expectancy of the younger people in that chart is due to vaccination and better hygiene, and, for older ages, a good chunk of the increase is due to advances in surgery.

The effect of antibiotics is probably more uniform across the ages. It is noteworthy that penicillin was discovered in 1928, so the increase in life expectancy before then can't be due to antibiotics.

I have no expertise in this area, so you should take all of this with a pinch of salt. The Our World in Data chart is really cool though. That steep fall and rebound due to the Spanish flu, and the contrast with COVID-19 with respect to its effect on different age groups, is also interesting. [2]

[1]: https://ourworldindata.org/life-expectancy#it-is-not-only-ab...

[2]: I think comparing the naming of the diseases also tells us something about how society has changed.

This argument pretends that diet is the only cause of death.

No, it doesn't. It acknowledges that a diet for hunter-gatherers who rarely lived to 50 might not be the best diet for people who expend less energy and live far longer.

I don't buy the paleo stuff, but you're not disproving it. We don't live longer lives purely because of our diet.

Also, low life-expectancy in the past was in large part because of child/infant mortality, not croaking at 50.

Overall life expectancy was low due to massive childhood mortality. If you survived to adulthood your life expectancy was much higher.

Powhatan lived to 70, Opechancanough lived to 92 (when he was murdered), Massasoit lived to 80, Canonicus was 82.


You should make sure you understand what outliers are. The fact that you can name people from outside modern times that lived a long time is meaningless to the discussion.

As if I don't understand what outliers are.

They weren't outliers. I simply looked up famous Indians because I remembered one had lived to a startlingly old age, and skipped those who got killed by other people. If their diet caused health problems, you wouldn't have all these folks living to a decent age. All the Indian leaders would have been these young guys who died off at 45 from eating too much venison.

I definitely don't think modern diets are better than what I imagine Native Americans ate hundreds of years ago. Venison sounds like a very healthy staple in a diet.

If your ancestors have a line that was able to create you, clearly they were doing something right

I have been thinking a lot about this exact point lately. Very interesting but hard to answer.

Do we know what they ate in detail?

They certainly didn't eat margarine. Or refined sugars. Or simple carbs outside fruit. Etc.

Because they all lived long healthy lives I'm sure /s

Of course some of us did a better job than others, but enough lived that Humanity (collectively singular) is still here, and to me it seems the only way for anyone to learn anything is via the negative image formed by all the ideas that didn’t work.

Sorry I know this sounds a little woo-woo fruity, but that’s Humanity’s godlike power to transcend the limitations of our bodies’ material dimension where energy can’t be created or destroyed.

Collectively we’re like a blockchain holding a mirror-image dimension of all the best ideas just by virtue of the other nodes not making it, growing over time as our new embodiments get to start at what from their point of view is “zero” despite being unknowably old and full of patches and bug fixes. Storage and retrieval are the same operation!

Sarcasm notwithstanding, it took over 12,000 years of technological advances for agricultural societies to catch up to their palaeolithic brethren in terms of average life expectancy and even metrics like height.

I'm not a primitivist by any means (I'm using a PC right now, as it so happens), but we continue to have a lot to learn from modern hunter-gatherer societies, both in terms of social organisation and food consumption.
