Attempts to scientifically “rationalize” policy may be damaging democracy (mitpress.mit.edu)
235 points by anarbadalov 3 months ago | 258 comments



Scientists' and doctors' views on questions of "ought" can also be influenced by their background in weird ways. A dermatologist might insist that people always wear long-sleeve shirts and pants outside. Even if moderate sun exposure causes people to statistically live a bit longer on net, they see lots of people who die of skin cancer and no more people dying of heart attacks than anyone else, so the former are a lot more salient for them.

As the famous saying goes, "nothing is as important as you think it is when you're thinking about it".

Every scientific discipline inculcates values that may differ from those held by the general public, ranging from how important it is to give credit to the originators of ideas, to the relative importance of different species, to whether naturalness is valuable or not. This isn't bad; it's probably necessary. But it's something that has to be accounted for.


I see PhDs all the time who, when dealing with matters in their own field, assiduously adhere to reason, facts, the scientific method, evidence, etc.

But step outside that field, and they latch on to something that emotionally appeals to them, and all that reason, facts, etc., flies right out the window.

Of course, I never do that myself.


Critical thinking is domain dependent. For example, just because you can kill it on whiteboard interviews doesn't mean you will be able to navigate relationship issues. The way I imagine many PhDs I've talked to is like the RPG character where you max out one line of development at the expense of all the other lines. The counter-example would be the "T-shaped" development profile, which is the ideal imo.


A research PhD should be able to instantly recognize logical fallacies outside their field, but in my experience they fall for them every time.


Absolutely!

I often wonder whether the epistemology we are "born with" can actually be improved very much (in a general sense), or whether the best we can do is teach domain-specific techniques to override our defaults on those topics.


We all fall for it. Just search for: biases

We also need to deeply care about and understand ourselves.


INT and WIS are separate, orthogonal stats


And it would be great if the public debate was about what outcome is better given two policies produced by honest but differently focused scientists.

It would also be good if all those models were used as a basis to discuss divergent interests, and if politics remained politics: a tool to select policies that maximize the majority's (but not everyone's) best interests.

I feel, instead, that the issue is that it has become unacceptable to disclose one's own best interests and to defend them.

Instead, everyone produces fallacious models showing that their preferred policy is in everyone's best interests.


Politics is a fundamentally alien experience to most STEM people.

Politics is manifestly not about the public good - not even with compromises.

It's mostly about a few unpleasant - sometimes charismatic - individuals pursuing power for their own personal benefit, and covertly that of their funders and sponsors.

Science has nothing to contribute to this, because the entire business is so hopelessly toxic and corrupt that rational debate about policy doesn't even get to first base.

If you want rational policy you don't want politics. It's perfectly possible you don't even want democracy.

What you want is administration - a very different process where mature competent executives and managers who do not have Dark Triad personality traits make intelligent, informed, and compassionate decisions in the public interest based on evidence, expert guidance, and their own good instincts. And their power is strictly limited to the lightest possible enforcement required to do the job.

No country has ever been run like this, but a few have managed to operate like this in selected subfields.


> What you want is administration - a very different process where mature competent executives and managers who do not have Dark Triad personality traits make intelligent, informed, and compassionate decisions in the public interest based on evidence, expert guidance, and their own good instincts.

Which is basically inhuman. For an administration of any sort, you need a selection process and the selection process will then be gamed by exactly the people who seek power.

Which is why we have large bodies of political representatives: people will not stop seeking to acquire power and act in their interests and the interests of their backers, but they will be counteracted by people doing exactly the same but pursuing different ends. That’s a very human thing.


I only have some experience in low-level politics in a parliamentary system based on proportional representation, though I know a number of mid-level politicians and kind of know some top-level politicians in my home country.

Based on what I have seen, most politicians genuinely believe they are working for the common good. Their understanding of what the common good is obviously varies.

In a system like that, politics is mostly about playing the long game. You build a personal brand that gets you elected and re-elected and gives you influence within your party. You build and maintain your networks, you collect political capital by supporting others' goals, and you spend the capital to advance your own goals. You try to find a balance between short-term and long-term goals, because your opponents today may be your allies tomorrow, and you don't want to alienate them by playing too hard.

Unpleasant and toxic people may sometimes survive in politics. They are rarely successful, because politics is all about people skills, at least in the system I know of. (Such people are more common in administrative positions, because their status is more secure in a meritocratic hierarchy.) People focused on specific topics often also have a hard time in politics, because they have a tendency to become unpleasant when things don't go their way in the niche they are interested in.

The public image of politics can be toxic, because the current publicity game requires it. The same politicians often appear quite different behind closed doors, where they are allowed to speak off the record.


The purpose of this particular device was to permit a vote in the National House of Representatives to be taken in a minute or so, complete lists being furnished of all members voting on the two sides of any question. Mr. Edison, in recalling the circumstances, says:

"Roberts was the telegraph operator who was the financial backer to the extent of $100. The invention when completed was taken to Washington. I think it was exhibited before a committee that had something to do with the Capitol. The chairman of the committee, after seeing how quickly and perfectly it worked, said: 'Young man, if there is any invention on earth that we don't want down here, it is this. One of the greatest weapons in the hands of a minority to prevent bad legislation is filibustering on votes, and this instrument would prevent it.' I saw the truth of this, because as press operator I had taken miles of Congressional proceedings, and to this day an enormous amount of time is wasted during each session of the House in foolishly calling the members' names and recording and then adding their votes, when the whole operation could be done in almost a moment by merely pressing a particular button at each desk. For filibustering purposes, however, the present methods are most admirable."

Edison determined from that time forth to devote his inventive faculties only to things for which there was a real, genuine demand, something that subserved the actual necessities of humanity.

https://www.gutenberg.org/cache/epub/820/pg820.txt


I agree completely: much of science now is highly politicized. We have seen what happens to professors or research scientists who take a position that counters popular opinion on a sensitive topic - they have protests against them, they are spat upon and assaulted, and they lose their jobs.

All of the other scientists and researchers see this and adjust their own behavior and areas of study to avoid being mobbed.

I have had to become increasingly cautious and wary of trusting all published science, especially in the social sciences now.


I think the article wants to argue against that.

It isn't rational that you love your wife more than other women. Why constrain yourself to such rules for policies?

Is dying for your freedom rational? Investing in symbolic architecture? Skydiving could be seen as fundamentally irrational. Some still do it.

All depends on how exactly you define rationality.

Overall it might be prudent to value rational arguments; I completely agree with that.

edit:

> It isn't rational that you love your wife more than other women. Why constrain yourself [...]

Reading it again, this can convey something different from the point I was trying to make...


I think you can rationally assert that the feeling of loving your wife or paragliding is more important to you than the risk of missing out or the risk of death.

It is a preference, and that is what forms the basis of interests.

> dying for your freedom

That is somewhat different: it's already the execution of a policy (fighting) for a preference. And I think it can be a perfectly reasonable outcome, as it's not even hard to find chapters in history during which conditions made it the only acceptable choice.


True, but you meet the same difficulties. A preference doesn't really have to be rational. Fund the park or the bath, paint the city hall blue or yellow. Especially in a democracy there often is no purely rational solution.


This is what the article is getting at: there are questions to be answered before you ever get to scientists evaluating evidence for and against different policies aimed at solving a problem.

What problem are we solving, what is or isn't a problem, what constitutes a better outcome, and what types of policies are allowable may be informed by science but are often (and often appropriately or at least inevitably) decided by culture, law, art, force majeure, etc.


> Even if moderate sun exposure causes people to statistically live a bit longer on net

This reminded me that the weight range we term "overweight" has lower all-cause mortality than the weight range called "normal weight". (There is another tier above "overweight", "obese", which has high mortality.)


I'm pretty sure the study you are thinking of [1] only applies to ages > 65 [2], for which there are a lot of possible explanations. Other studies have found that past a certain age, no lifestyle choices matter, including smoking and diet, since they take too long to influence when you die.

If you look at [1], a ton of the studies they looked at adjusted for preexisting illnesses like hypertension and diabetes. It's basically like saying people who fall off skyscrapers don't die of cancer.

It is also important to remember the costs and the decrease in quality of life associated with being overweight.

That said, it is possible the lower end of the overweight BMI range could be moved up a point or two, but that's probably less than the variance between populations anyway.
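To make the adjustment problem concrete, here's a minimal toy simulation (all rates invented for illustration, taken from neither study): if excess weight harmed people only by causing diabetes, a comparison stratified on diabetes would show no weight effect at all, even though the crude comparison does.

  import random

  random.seed(0)
  rows = []
  for _ in range(100_000):
      overweight = random.random() < 0.5
      # Hypothetical causal chain, invented rates: weight raises
      # diabetes risk, and diabetes raises mortality.
      diabetes = random.random() < (0.30 if overweight else 0.10)
      died = random.random() < (0.20 if diabetes else 0.05)
      rows.append((overweight, diabetes, died))

  def mortality(subset):
      subset = list(subset)
      return sum(d for _, _, d in subset) / len(subset)

  # Crude comparison: the overweight group clearly dies more often.
  print("crude:",
        mortality(r for r in rows if r[0]), "vs",
        mortality(r for r in rows if not r[0]))

  # "Adjusting" for diabetes by stratifying erases the gap, because
  # in this toy model all the harm flows through diabetes -- the
  # skyscraper-and-cancer problem in miniature.
  for db in (True, False):
      print(f"diabetes={db}:",
            mortality(r for r in rows if r[1] == db and r[0]), "vs",
            mortality(r for r in rows if r[1] == db and not r[0]))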

[1] https://academic.oup.com/ajcn/article/99/4/875/4637868?fbcli...

[2] https://agsjournals.onlinelibrary.wiley.com/doi/abs/10.1111/...


Reminds me of the stereotype of dentists giving out toothbrushes on Halloween.

Maybe if your life is about preventing cavities you think that's what kids should always have in mind, but life would suck if before every action we asked "will this contribute to tooth decay?" - much more than it would suck with a few cavities.


> As the famous saying goes, "nothing is as important as you think it is when you're thinking about it".

That seems like a very dangerous saying! As a scientist, I am biased, but I think it's also important to remember that nothing is as simple as you think it is when you're not thinking about it.


As a (former) scientist, I should also caution that it is generally in the professional scientist's best interest to prolong the perceived complexity of any given thing to keep the grants coming. Even if they're not influenced by an interest outside of their scope, it's a mental habit that can be hard to break.


Perhaps wisdom starts by accepting the conjunction of both the saying and your retort, which is not so far from a better-known saying, "the devil is in the details".


Scientists often think about one topic at length, and therefore tend to overestimate its importance. This is a bias that needs to be corrected for. Assuming that an unfamiliar topic is simpler than it really is would be another, unrelated bias.


I remember an ad on the radio from the American Podiatrist's Council or some such organization, recommending that everyone get their yearly foot exam.

Sure, it was trying to stir up business; but I think that ad mainly existed because podiatrists really did, in good faith, think everybody should get their foot examined once a year.

You just lose perspective.


Presumably this only applies to scientists and not to politicians, economists, think tanks and PR firms, and so on.


Fun fact: incidence of skin cancer in notoriously cloudy and rainy WA is substantially higher than in almost always sunny HI or CA or TX. I bet the vast majority of "scientists and doctors" don't even know this.

https://gis.cdc.gov/Cancer/USCS/#/AtAGlance/


The rates are extremely similar once you control for race (which the page you linked allows you to do, by the way). White people get skin cancer at much higher rates than non-white people, because melanin acts as permanent mild SPF, and white people have less skin melanin than non-white people: that's what makes them white, after all. Since Washington is more white than Hawaii, California, and Texas, if you don't control for race it looks — counterintuitively — like Washington is more risky than the others in terms of skin cancer. But it's not, it's just that you're selecting different population demographics.

Also notable is that sunniness doesn't really determine UV exposure; partial cloud cover can actually result in more UV-B exposure, counterintuitively: https://www.drgurgen.com/are-the-suns-uv-rays-really-stronge... And even full cloud cover doesn't completely block UV. So WA and CA aren't as different as you might think.

Meanwhile, as one might expect, most states that get freezing cold for months at a time have lower skin cancer rates than those that don't when controlling for race: if you don't go outside for more than a few minutes a day, while bundled up under multiple layers, for months at a time, well — you see less skin cancer. Some heavy farming states seem to have more skin cancer, but again that makes sense since farm workers are outside a lot.

Vermont and New Hampshire seem like odd exceptions to this rule — but I suspect there's also just some other selection bias at play. UV exposure from the sun causes skin cancer at high rates in white people; trying to make assumptions about states isn't necessarily an easy thing to do, since many other factors are at play when considering who lives where and how much they go outside.
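A minimal sketch of the arithmetic behind this, with invented numbers (not the CDC figures): two states can have nearly identical per-group rates yet very different aggregate rates, purely because of population mix.

  # Invented illustration of how demographic mix alone can make one
  # state's aggregate skin-cancer rate look much higher even though
  # the per-group rates are essentially identical.
  states = {
      "mostly-white state":    {"white": (90, 70_000), "nonwhite": (3, 30_000)},
      "mostly-nonwhite state": {"white": (40, 30_000), "nonwhite": (7, 70_000)},
  }  # (cases, population) per group

  for name, groups in states.items():
      cases = sum(c for c, _ in groups.values())
      pop = sum(p for _, p in groups.values())
      print(f"{name}: overall {cases / pop:.5f}")  # 0.00093 vs 0.00047
      for group, (c, p) in groups.items():
          print(f"  {group}: {c / p:.5f}")  # ~0.0013 white, 0.0001 nonwhite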


That could have something to do with it, but as someone who lived in WA for quite some time - you don't go outside without clothing for ~7 months in any given year in WA either because it's raining and unpleasant. So that's where this hypothesis falls apart a little. My hypothesis is that constant relatively low level exposure to UV is more beneficial than acute exposure for just a few days / weeks in a given year. I don't have any data to support this hypothesis though.


> you don't go outside without clothing for ~7 months in any given year in WA either because it's raining and unpleasant

Also because one doesn't want to get arrested.


There was (and maybe still is) a Solstice parade where people ride their bikes through Seattle completely (or partially) naked. So that's only partially true.


I know a lot of the time I'm guilty of quickly skim-reading articles like this shared on HN. I think this one is worth reading more thoroughly than normal. I found doing so rewarding. I was particularly pleased that a conclusion I was forming as I read through was then voiced near the bottom. Namely:

> The science policy scholar Daniel Sarewitz goes so far as to argue that science usually makes public controversies worse. His argument rests on the logical incompatibility of policy expectations and what science actually does — namely, that decision makers expect certainty, whereas science is best at producing new questions. That is, the more scientists study something, the more they uncover additional uncertainties and complexities.

I'm not sure I've ever seen this basic contradiction put so cogently. We want policy (politics) to create certainty and stability. "Science" increases our risk of the unknown by making us more aware of it.

Loved the article. (and still an avid scientist)


> We want policy (politics) to create certainty and stability.

Maybe it's this half of the equation that needs to be examined.


That's not going to change. Certainty and stability is what people want.


Doubtful. There is a way to project confidence while acknowledging fundamental uncertainty, and I think it's probably more effective and sustainable than outright lying. Not being transparent about difficulty creates, in the long run, distrust in authority, because eventually you accumulate enough fuckups. I think the biggest error in modern leadership practices is confusing certainty for confidence.

Anyways your statement is literally begging the question:

["people want certainty and stability" is] not going to change. Certainty and stability is what people want.


Ahem, need - at least from our leaders. It may be that creating certainty amid uncertainty is itself the core of leadership. The stories we are most certain of are the ones we use to run our lives, to make decisions, and take action.

One heuristic for this is the 40-70 rule for decision making: in order to make a decision you should have no less than 40 percent of the information you would prefer to have, and you shouldn't wait to decide once you have 70 percent of it.


> Ahem, need - at least from our leaders.

I'm sympathetic to this. There is a strong argument to be made that this is a need.

> It may be that creating certainty amid uncertainty is itself the core of leadership.

I would agree with this without reservation.

But the phenomenon here is being driven by what people want, not what people need. If they're benefiting from the certainty they get, that's just a coincidence.

Wanting and needing are different things, and while people may need some certainty, they want much more than they need, and they're getting more than the optimal amount.


I remember a story, maybe from The Sea Around Us. Politicians had to decide how many fish could be caught the next year. The scientists said "please reduce the catch by x%, or we believe it will be terrible." The politicians heard the advice and started talking; finally, they decided on a reduction smaller than half of x.


> "Science" increases our risk of the unknown by making us more aware of it.

I'm not sure I understand the phrase "our risk of the unknown". The risk something poses to us is surely the same whether or not we are aware of it—just that our mitigation strategies, and even the awareness that we need to mitigate, change in response to increased knowledge.

Perhaps the idea is that people are more hesitant to make decisions when they become aware that what they had previously taken as absolute is in fact conditional? That seems like a good thing (although I agree that it can be taken to paralysing extremes).


It's worded a bit ambiguously, but I think it means we have to deal with the knowledge of how much we don't know about something. When making policy, every new thing we don't know is a point of contention that can be argued over. In some cases that's beneficial, because it keeps us from making a mistake; in others it's detrimental, because it keeps us from making a beneficial change. But if all policy decisions start tending toward infinite argumentation, as more and more open questions are linked to the topic, that's also a problem.


> Perhaps the idea is that people are more hesitant to make decisions when they become aware that what they had previously taken as absolute is in fact conditional?

Most people are horrible at dealing with uncertainty when making decisions. I don't know why this is. But taking an uncertain landscape, making a decision and then projecting certainty works better than conveying the risks for communicating with everyone but people used to making executive decisions.

So if you have two politicians, one who channels a scientist's healthy (and realistic) scepticism and one who takes a random position and blasts it, the latter will tend to be more popular.


> But taking an uncertain landscape, making a decision and then projecting certainty works better than conveying the risks for communicating with everyone but people used to making executive decisions.

I think it depends on what 'better' means. It works better in the senses of getting things done, and of popularity. But, unfortunately, the things that get done are those that are some weighted combination of (a) rewarding in the short-term and (b) in the interests of the person who's good at projecting an aura of confidence.

If the 'right' decision tends to align with the interests of the decision-maker, then it's great to have that decision-maker pushing it through. But, when the decision-maker's interests are not those of the general public, paralysis might be better than populist marching into short-term gratification.

(On the other hand, I also recognize that not making any decision until you know it's the right one is just a long-winded way of never making any decision. Making decisions about whether and how to make decisions is just as complicated as the non-meta decisions themselves ….)


Interestingly, there is a correspondence to this in doing startups too.


The best the scientific process can do is establish facts about repeatable processes. Even then, it isn't carving facts in stone, but a case of increasingly accurate approximations of reality.

The reason that it is impossible for mere fact to end political dispute is that facts are only one element in policy. Values are more important in the long run and facts can only be used to improve policy around shared values.

People don't agree on values and they never fully will.


> The best the scientific process can do is establish facts about repeatable processes.

And if everyone had the time and resources to discover and digest every fact, facts might be definitive.

But everyone doesn't have time and resources. To compensate, we rely on others to curate facts for us. When we encounter an internally consistent subset of facts that suits our ideals and our interests, we adopt that point of view.

There are infinitely many subsets of curated facts that can be presented as internally consistent. That's why there are so many different points of view.

To complicate things further, it is difficult to get a man to understand something when his salary depends upon his not understanding it.


That's an interesting topic that sometimes gets explored in sci-fi. If an AI is created that is able to learn all knowledge and form it into a single consistent model of reality but it ends up making conclusions we don't want to hear, what are the consequences for humanity and living things in general?


Those “conclusions you don’t want to hear” are not from the AI. They are an embellishment by the sci-fi author.

If an AI did do that, the “conclusions you don’t want to hear” would be artifacts of the AI’s algorithmic process or data set.

Living things in general don’t worry about conclusions. They just live.


Got examples of books where this is done? I'm currently interested in reading some.


The Hitchhiker's Guide to the Galaxy is almost exactly what Mountain_Skies is describing.

If I remember correctly, Peter Watts has a somewhat more realistic take on this in his Rifters trilogy (under the Novels section here: https://rifters.com/real/shorts.htm), where there's a brain in a box that sifts through loads of information and gives advice to political leaders. The trilogy as a whole is more... deep-sea cyberpunk than particularly centered on the brain in a box, though.


Old but good short stories: The Defenders by Philip K. Dick; for a different take, Second Variety, also by Philip K. Dick.

Not a totally omniscient AI, but running large ships and space stations: the Imperial Radch trilogy by Ann Leckie.

AIs with personality running almost every machine and helping rule the empire: the Culture series by Iain Banks.


Sabine Hossenfelder put it as well as I've seen it put:

"Science doesn't tell you not to pee on an electric fence. It only tells you that urine is an excellent conductor of electricity."


That was a good one for sure. She does have some gems.


But the problem is that people aren't even agreeing on facts.


> people aren't even agreeing on facts.

From TFA:

"what happens to politics ... when rhetorical appeals to 'the facts' start to dominate people’s justifications"


There are problems both on the producing side and the consuming side. Any scientist knows that research is often driven by politics, money or ego.

The bigger problems are probably on the consumer side, though, and not only in under-educated social groups. Here's an interesting post I stumbled across recently: "People who trust science are less likely to fall for misinformation -- unless it sounds sciency" (https://digest.bps.org.uk/2021/08/10/people-who-trust-scienc...)

This confirms my sense that we easily glom on to things that "sound right", and if you're a scientist or engineer "sciency" statements sound right to us. Do we really have the time to dive in deep enough to figure out whether it's pseudoscience?


That article is really interesting and something that, as an academic scientist, is pretty obvious. In theory, we have peer-review to keep ourselves honest, but as the article points out (maybe unintentionally), bad science gets through that process (really, as long as the "data" look believable enough and the "experiments" are methodologically sound and the interpretation of the "data" is also reasonable, it's reasonably undetectable to peer-reviewers).

There are a lot of problems with the way science happens, and a lot of bad data is never found out, because it can be hard to show, definitively, that it was fabricated or manipulated without someone from the lab in question speaking up about it (which probably doesn't happen enough).

At a certain point, though, there's too much information and data, good or otherwise, for any one person to parse. That's sort of why we have journalists: to acquire first-hand accounts and deliver them to a broader audience. The problem really arises when journalists significantly editorialize or disregard conflicting information. The Wakefield paper is a great example of that. Some journalists-cum-pundits' gross negligence with regard to the retraction of that paper constitutes misinformation, but it has proven very difficult to discredit, because those actors abuse what we've all agreed is the role of the journalist: to give us valid accounts of actual events.

The study discussed in the article you cited even used a real, but heavily criticized (unknown to the participants) scientific study in their experiment. I think the question is "what is the amount of effort that it is reasonable to expect a lay-person to put into the in/validation of information presented to them?" with the caveat that trust is generally earned over time, but once earned, can be abused (and I use the word abuse very purposefully here, because it is a violation of one's relationship in a harmful manner). Should one be expected to more rigorously critique the statement of a trusted peer because of the potential for abuse?


> Should one be expected to more rigorously critique the statement of a trusted peer because of the potential for abuse?

It's a little counterintuitive, but people don't complain about stuff they don't deal with.


> "what is the amount of effort that it is reasonable to expect a lay-person to put into the in/validation of information presented to them?"

Exactly, I think that's the crux of it


I am immediately suspicious of any scientist (or expert in a subject) who doesn't say "well, actually, it's sort of complicated..." because it always is. Everything is complicated and confounded by innumerable factors, and requires years of study to even see the full shape of.

But, politics and media do not currently thrive on presenting complexity, so if you're going to run the risk of asking a scientist their opinion, you either select scientists who are willing to dumb down the science, or you ignore everything they say that doesn't match what you already believed.


I have a cognitive bias poster on my wall that categorizes about 1/4 of researched biases as "Too Much Information" [1].

In the context of Democracy, the trouble with "actually, it's sort of complicated", is that there is absolutely no way that all citizens can approach everything this way and still have time for, e.g. the Pursuit of Happiness.

In other words, trust (and trustworthiness) is key. We must delegate to someone, whether expert or not.

Traditionally, organizations with cult-like properties envelop people in a kind of "information bubble" that makes deciding who to trust a tractable problem.

The culture of science does this to some extent, but is unable to compete well for a number of reasons.

[1] https://upload.wikimedia.org/wikipedia/commons/1/18/Cognitiv...


Don't we delegate the job of understanding and digesting facts and conflicting information to our representatives, theoretically?


Yes, but delegation is always accompanied by principal-agent problems (like moral hazard). Lobbyists ensure that all but the most morally unimpeachable representatives will inevitably place moneyed interests before their electorate.


Increasingly, our representatives are chosen by what district we live in.


Frankly, the agenda of this book puzzles me.

Is there political counterflow that compromises integrity in some parts of science? Of course there is. For millennia this has been so. From Galileo to Darwin to Haber to Einstein, the political or religious disruptions arising from scientific theory or experiment or technology often prompt someone to argue that governmental policy should not change in the light of new facts or a new interpretation -- not if that change disrupts cherished societal values or impedes vested interests.

It sounds like the author's argument is that political winds WITHIN scientific communities are fomenting bias in their work because they're not satisfied with publishing papers but now want to effect political change. Therefore they tolerate no dissent from an official party line.

Perhaps climate change warrants such circumspection, but I know of no other scientific subdiscipline that does. As such, the rise in politics in ONE scientific subject doesn't justify a book that seems to tar and feather ALL of science.

I suspect a book that attacks only the climate science community was deemed too narrow to attract a broad audience. And of course, exploring that topic wouldn't compellingly break new ground either.

Finally, it seems to me that this is exactly the WRONG time in history to be impugning science or scientists. Without concrete and viable suggestions of how to redress the forces that have broadly compromised scientific integrity (which I doubt the author proposes), chinking away at science's armor can only aid the cause of anti-science, and feed the rising barbarian horde.


> It sounds like the author's argument is that political winds WITHIN scientific communities are fomenting bias in their work because they're not satisfied with publishing papers but now want to effect political change. Therefore they tolerate no dissent from an official party line.

> Perhaps climate change warrants such circumspection, but I know of no other scientific subdiscipline that does.

Anthropology is especially notorious for this, far more so than climate science (!), but you could describe all of the social sciences this way.


To me this article seems to miss the point. Obviously science can't provide objective answers to subjective matters.

The problem we have in society right now regarding science, as I see it, is that we have a significant group of people who disagree about the objective parts.

The people who think the world is flat, the people who think the world's age is measured in thousands of years, the people who think all the world's top climate scientists are part of a massive conspiracy, the people who think COVID is a massive conspiracy, etc.

Our science problem as a society is that those people are mainstream. They're politicians, they're pundits, they're your next door neighbor, and they're all objectively wrong but they use the appearance of science to intentionally spread lies.


Alas, the people most willing to draw bold conclusions from science also put the least effort into being scientific.

Ted Cruz is a good example. He argues that he's a scientist not because he has done science, but because his parents have done science. Thus, he's a "legacy" scientist, and because they emanate from him, his arguments must be scientific.

The man isn't this stupid. But in endorsing such nonsense, his supporters are willing to be. "We don't need no stinkin facts!"

How can reason hope to overcome willful illogic of this magnitude?


Flat earthers are mainstream? Man, I am out of touch.


They kind of were for a bit, but like many other conspiracy theories, it all got rolled up into QAnon in the past few years.

And that is definitely mainstream.


One thing that I have been increasingly aware of is people confusing science with utilitarianism. Science investigates truths about the world we live in. Utilitarianism incorporates science, but is built on subjective moral and ethical assumptions.

I see a lot of people who disagree with utilitarian proposals labeled as "anti-science".


2500 years later, Aristotle is still correct, and the three modes of rhetoric must work in concert or you're doomed to fail.

Logos, Ethos, Pathos: you need all three. "This is true. I am trustworthy. This true thing is important."

It is very frustrating, especially to many of my fellow scientists, that bellowing "THIS IS TRUE" as loudly as possible is insufficient. But it is insufficient.


In your role as a scientist, shouldn't providing evidence of truth (or falsehood) be your main concern?

Statements about importance are value judgements that no doubt you have made, but are not really science and are basically political.

An easy example: "covid restrictions curb covid" may be true, and you can provide experimental evidence. "We need to impose restrictions" makes a judgement that the overall situation under the restrictions is preferable to the one without them. This is not a scientific call; it is a societal judgement.

People too often pretend that an action immediately follows from some identified causal relationship. "Smoking kills you" does not immediately imply you should quit smoking, without the intermediate step of "you value your later life and health more than the ongoing benefits you get from smoking" (disclaimer: nonsmoker).


The science community as a whole needs to do the work of all three. The work of an individual scientist may be able to focus on just the one, and their lives are easier if they work in a context that gives them that specificity of responsibility.

But eventually, someone needs to do the work to convince people "science is trustworthy." And for a variety of reasons (both objective and social), scientists should be on the front lines of that work.

The question of "this is important" is tricky because it must involve a two-way discussion with the audience. But there are cases where a scientist must be making that case, e.g. in order to procure funding.


> The science community as a whole needs to do the work of all three.

This is exactly the source of the trouble. Hearing scientists project their value system onto everybody else destroys their credibility. If people felt that scientists were sticking to the facts and avoiding ideology and values, science would be better received all around.

Whether or not the Earth is getting warmer, and the cause of that warming, are scientific questions. Also science is a discussion of possible remediations and their costs. Which remediation to choose, if any, is a question of values and outside the scope of science.

Also important for science is being clear about levels of uncertainty and ranges of possible outcomes. Recent global temperatures have little uncertainty. Distant-past temperatures have a much higher level of uncertainty, and predictions about ranges of possible distant-future temperatures have yet more uncertainty. I never hear much about this from scientists or the media that cover them though.


> Whether or not the Earth is getting warmer, and the cause of that warming, are scientific questions. Also science is a discussion of possible remediations and their costs. Which remediation to choose, if any, is a question of values and outside the scope of science.

So how do you propose scientists "stick to" the discussion of remediations of "we can remediate X by doing Y at expense Z" without it being portrayed maliciously as pesky interfering scientists saying "we should do Y"?

The problem of lack of trust is hardly the sole responsibility of scientists here.

Do you think discussing uncertainty more is going to raise trust or just be more fodder for the people with non-scientific reasons to oppose action?

I propose scientists "only stick to the science" if and only if everyone else in the world sticks to their wheelhouse as well. Lotta citizens, politicians, and pundits out there without much training in making good ethical decisions either!

As long as the output of a scientist will be interpreted through other people's political lenses, it's fair game for them to frame it politically too.


> As long as the output of a scientist will be interpreted through other people's political lenses, it's fair game for them to frame it politically too.

Not if they want to be credible.


> This is exactly the source of the trouble. Hearing scientists project their value system onto everybody else destroys their credibility.

> sticking to the facts and avoiding ideology and values

I think a few different things are being confused here, but as far as value is concerned, the choice to do science, to take scientific findings into account and to decide what to investigate are themselves the result of value judgements. There is no fact-value dichotomy. There is no "clean" separation between fact and value (as if value were a dirty word contrary to fact).

This fact-value dichotomy can be tied to the materialist worldview which denies objective value because it presumes a metaphysics that renders the world a kind of theater of senseless extension in space. Any value must therefore be a matter of subjective projection (and therefore delusion, putting to one side materialism's inherent inability to account for subjectivity). But this metaphysics is, to put it gently, problematic. The wish to separate fact from value (which I take to occupy one order, not two) is no doubt further encouraged by liberalism's pretensions to neutrality and the inherent tension within Lockean liberalism between science and liberty.

But I do agree that the actual deciding of policy is not to be left to scientists but to politicians and the like. Scientists are specialists who can supplement our knowledge in specific ways that generalists can then take into account along with other data and understanding when making judgements.


The problem is when people aspire to be scientists because they want to be authority figures rather than to find out facts about the world. And then, once they become the authority figure they always wanted to be, they just push their agenda, because they never cared about the facts in the first place.

Of course lots of scientists care about facts first, but the scientists with an agenda are much juicier for journalists to interview and write articles about, so they are who we see.

(I believe in climate change, I am pro vaccine. Just noting this here since many will think that me having the above view means I am a climate change denier and anti vax)


The problem of correct motivation (and here I would argue understanding is the aim of science, not fact collection per se) is a separate problem, but it, too, is a matter of value judgement (what is the correct aim and motivation for science?). The concern for corruption cannot erase the essential value-ladenness of all activity, including scientific activity. Science is both shaped by and shapes value judgements and is itself suffused by them. You cannot escape from value. If you judge something to be valuable, you will act and shape reality according to that understanding of what is valuable. This is the essential nature of practical reason. All action is determined by value judgements.


> what is the correct aim and motivation for science

I think you misunderstood; I am not saying that they try to discover facts to support their views using science. I am saying that they want the credibility of a scientist. You know how people start to listen a lot to the views of a Nobel laureate regardless of whether it is their field of expertise or not? That sort of authority is very attractive to a lot of people, and they will work really hard to get it. I am saying lots of people go into science because they want that authority; they don't have any care at all about doing science.


No, I understood you. What I'm saying is that a) your concern for bad motives is separate from the question of whether science is value-laden (which it is), and b) the question of "what is the correct aim and motivation for science?", meant rhetorically, demonstrates that your judgement about what is a bad motive for entering a scientific field itself involves a value judgement.


> demonstrates that your judgement about what is a bad motive for entering a scientific field itself involves a value judgement.

Yes, I judge people who try to corrupt our view of science. I never denied that.

> our concern for bad motives is separate from the question of whether science is value-laden (which it is)

Science having value is exactly why I don't want people to corrupt it. If you agree with me that science has value then you should agree that we should try to stop people from corrupting it.

If you argue that we can't judge who is corrupting, then I'd argue that you are so out there on the clouds with your definitions that we could just as well argue that a random youtube commenter is also doing science and that is a good thing and that we can't really say that youtube commenters are worse scientists than the people at universities since that is just a value judgement.


I think you're reading things into what I wrote.

W.r.t. value, the only point I was making is that there is no fact/value dichotomy. It doesn't follow that I am therefore arguing that one cannot make value judgements. On the contrary, if no fact/value dichotomy exists and value is a matter of fact, then it follows that we can indeed make value judgements on par with factual claims.

But what I was addressing in an earlier post was the suggestion that there is a fact/value dichotomy and the notion that problems occur in science when value mingles with fact. I rejected this claim by arguing that there is no such dichotomy and by implication that the diagnosis is incorrect. Questions about corruption are fine as far as they go, but they are not relevant to this thread because they do not address the question of fact/value dichotomy and they presume value judgement.


> Also science is a discussion of possible remediations and their costs. Which remediation to choose, if any, is a question of values and outside the scope of science.

This is where I've noticed mainstream political discussion often go off the rails. The best example is when "because science" is used to end conversation on the idea that 100% of the population should wear masks. Science, at least in this case, is clearly not prescriptive, so it can't be applied as a single justification like that. Perhaps science confirms that masks reduce transmission, and therefore 100% of the population wearing masks is certainly one valid design. But one could come up with multiple other designs that would be equally confirmed by science to be effective at reducing transmission. So reducing transmission is not the hard part. The hard part is all the other variables that cause consequences in economics, mental health, other areas of healthcare, etc. Each model needs to be tested for its utility across a variety of factors, but that idea is lost with the "science" cancel cudgel.


I feel it was more that people had problems with the non-negotiable nature of logical thinkers and their plans (add air quotes as necessary). People expected to be involved but weren't, and only plans that ignored them worked. No one liked that.


It just doesn't work: usually only the scientists know the truth, and when they try to get people to help, no one cares. That held until Rachel Carson showed how to use emotions to make people "understand".


Science is politically incorrect.

It will not care about narratives and may bring facts that we don't want to accept.

One example: https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1467-9477....

After this paper, several other papers tried to milk other interpretations.


> But eventually, someone needs to do the work to convince people "science is trustworthy."

It's harder than that though.

Science is complicated, frequently messy and often adversarial, and that is how science makes progress.

There is no single narrative for scientific truth -- even when there is overwhelming consensus, science places high value on the coherent arguments at the margins.

This scientific embrace of complexity has been weaponized against us.

"Society" prefers a clear narrative they can comprehend, and science does not always provide it.

The media used to be relatively responsible stewards of the narrative, but that is very much no longer the case. (not to blame the media -- the media is us)


For an example of how science can end up with competing ideas and differing conclusions from experiments, see the little battle between Steve Mould and Mehdi "ElectroBOOM" Sadaghdar. They each made a few videos trying to explain the Mould effect (the effect where a chain dropped from a jar rises up from the lip as it falls). Each of them make compelling arguments as to why they're right and the other is wrong, but if not for them competing we'd just be assuming that Mould's original explanation was correct.


> The question of "this is important" is tricky

This question is not tricky. It is weighty and scary. If it was tricky, there would be a puzzle you could solve to answer it.

Instead as you say it must first involve listening because people have the freedom to choose what is important to them.

Otherwise, you might assume that people place the highest value on sustaining human life. This can fail if you go to New Hampshire and encounter a person who values liberty more highly than life.


> But eventually, someone needs to do the work to convince people "science is trustworthy."

Yes, and science as an institution has been failing pretty badly at that. I posted this on HN just last week, but it's worth repeating:

1. Science journalism is almost universally terrible, so people already get sold half truths and sometimes even outright falsehoods from allegedly reputable sources. Messaging needs dramatic improvement.

2. The replication crisis has shown that up to 50% of published results in medicine can't be replicated (and up to 66% in social sciences), and there are virtually no incentives to replicate or publish negative results, and too many incentives to data mine/p-hack and publish sensationalized results (in fact, results that fail replication get cited more). There are now some efforts towards correcting this, but it's only just beginning.
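As an aside on why p-hacking matters so much: under a true null hypothesis a p-value is uniform on (0, 1), so a researcher who quietly tests many subgroups and reports only the best one inflates the false-positive rate enormously. A minimal sketch (hypothetical setup, 20 looks at pure noise):

  import random

  random.seed(1)
  ALPHA, TRIALS, LOOKS = 0.05, 10_000, 20

  # "p-hacking": run 20 null tests and keep only the smallest p-value.
  def hacked_min_p():
      return min(random.random() for _ in range(LOOKS))

  hits = sum(hacked_min_p() < ALPHA for _ in range(TRIALS))
  print("nominal alpha:", ALPHA)
  print("actual false-positive rate:", hits / TRIALS)
  # Analytically: 1 - 0.95**20 is about 0.64, so most pure-noise
  # results become publishable "findings".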

I think there's another part that is also underappreciated, which is that the honesty of the public faces of science should be above reproach, and that if someone in a public-facing position loses their credibility by violating public trust, they should no longer be the public face.

The best recent example is Dr. Fauci. He has openly admitted to lying to the public on multiple occasions, such as about whether the public should wear masks and about vaccination levels for herd immunity. It doesn't matter whether you think he did the right thing in those cases; he has unquestionably violated public trust and eroded trust in science as a result.


>In your role as a scientist, shouldn't providing evidence of truth (or falsehood) be your main concern?

Strictly speaking no, but that's more of a long digression on epistemology than what I think you mean. (But think of Indiana Jones here: "archaeology is concerned with _fact_, not _truth_...")

My role as a scientist is to work diligently to understand, as best as we are able, the natural world.

Yet, "I will always be conscious that my skill carries with it the obligation to serve humanity by making the best use of the Earth's precious wealth."


The point, I think, is that laypeople and scientists alike fall further and further into the fallacy of appeal to authority: 'a scientist said it's true, therefore it is true!' (for simplicity, true == a fact correctly expressed).

That thing may be true, but the rising disdain for not accepting a conclusion when the evidence isn't presented alongside is worrisome. The difference between science and religion is that science draws conclusions from reproducible research. Yet - especially now - many people take the naked word of 'a scientist' the same way a religious fanatic takes the word of their spiritual leader.

This definition gave me a lot of clarity: an expert is someone who has, can get, can make, or can cause to be made - and presents - evidence that supports their conclusions.


Statements of importance are indeed value judgements. That's why you have to argue that something is universally good (a virtue) in order to get around the political strife. This does happen quite a lot, but you don't always hear about it. The US happened to approve a 1 billion dollar funding request for Israel's Iron Dome. The vote was 420 to 9.


But that's not a publicly salient cultural issue in the US. It's a simple matter of: do you want AIPAC backing you or your opponent in the next race? Easy calculation, no higher principles necessary.


Is a scientist not also a citizen?


More concretely, there are also concerns with one side judging their own arguments (be they of any of the big three types) as valid without concern for their ability to convince others.

If you treat logos, pathos, and ethos as a self-certified checklist, you are doomed to fail as well. You must provide arguments of each three types that are convincing to others. Above all else, preaching to the choir is (IMHO) the biggest problem in my political environment (USA).


In the context of politicians in the USA, they will not perceive it as a problem until it stops working.

Given how strong gerrymandering is, there's no benefit to reaching out to the other side; just hold the line while scoring political points.


There are many other problems as well, but I think the topic of conversation in this thread is the oratory skills of policy makers, which might explain a few downvotes.


Was going to comment that it is not science that solves political issues, but rather engineering. Even logos, ethos, and pathos together do not prevail over a trebuchet.


Engineering doesn't solve political issues either. War is politics by another means.


I was thinking about this the other day: politics, war, and climate. Americans (of which I am one) are very good at looking at political threats in terms of "others" and "versus": Democrats vs. Republicans, America vs. ISIL, etc. When it comes to fighting a more amorphous but equally real threat, like climate change, we have a hard time grasping it. It causes fire damage to California, Oregon and Washington; flood and wind damage to Louisiana, Mississippi, South Carolina, North Carolina, Georgia, Florida and Arkansas (at least); a migrant crisis at our southern borders; and shipment and food supply issues. If a foreign threat were causing these issues, no political force would stop "either side" from stumping to stop it. Yet, here we are, waiting it out. The Republicans denying and the Democrats in endless debate, unable to take action. Both sides unable to take action against an ongoing asymmetric war, waged upon us by a new form of enemy that we have created over the last century.


Well said! As a Canadian living in the US, I encountered the "us vs. them" culture as very subtle but real. It's like everything is a football game here, and that is the only way to frame things such that it gets attention.


"This is true" obviously needs to come from someone trustworthy or it is irrelevant.

This is why I categorically ignore everything from Dr. Fauci. He's known to have lied to the public before to influence behavior. Since I'd have to verify his claims from some other source anyway, there's no sense using him as a source at all.


I know his recommendations have changed over time, but can you cite well documented instances where he lied?


He said that masks wouldn't be helpful on March 8, 2020. A month later New York had a thousand detected covid deaths a day, so people really should have started to wear masks on March 8, 2020, since deaths lag infections by about a month. If proper advice had been given instead of telling people that covid wasn't a big deal, then maybe that disaster could have been averted from the start. The dangers of covid spread were well documented at the time, based on how quickly it got into Italy, but it was the Democratic party line to downplay the disease to avoid racism against the Chinese, so that was the line Fauci took.

https://www.cbsnews.com/news/preventing-coronavirus-facemask...

> "There's no reason to be walking around with a mask," infectious disease expert Dr. Anthony Fauci told 60 Minutes.

> While masks may block some droplets, Fauci said, they do not provide the level of protection people think they do. Wearing a mask may also have unintended consequences: People who wear masks tend to touch their face more often to adjust them, which can spread germs from their hands.


What I find ironic about the people that criticize Fauci for lying about the need for masks are often somehow still anti-mask.

Like, they acknowledge Fauci lying about not needing masks, which would imply that they should be wearing masks, but will now refuse to wear a mask because they think Fauci is lying about needing masks.


I'm sort of tired of being downvoted for saying this, but the data on masks other than well-fitted N95s is pretty shoddy. Lots of P-hacking, lots of motivated reasoning, and the results pre-2020 differ in notable ways from those post-2020.

The situation gets even murkier when you talk about mask _mandates_ instead of individual decision making. The argument that mask mandates are helpful is tough to support in the face of the differences in the delta-variant curve, for example, in different counties in California.

Just to say it: even daring to compare the results in Contra Costa County and San Diego County, California (which have different mask requirements) got me shadow-banned on reddit. The reasoning here is mostly political, not scientific/rational. No one cares what the science says.


A lot of the problem I think comes from messaging and deliberate bad-faith interpretations of messages.

Claims that "masks slow the spread of COVID" get interpreted as "masks stop the spread of COVID", and so when we have mask mandates and yet COVID still spreads, people use that as evidence that masks are worthless.

It's interesting that people can draw opposite conclusions from the same scenario. COVID has continued to spread despite mask mandates. Some claim that means the masks are worthless. Others (including me) would claim that, despite how bad it is, the spread would be even worse without them.

> individual decision making

In most cases, I agree that people should be able to make their own health choices. You wanna eat McDonald's for every meal and walk less than 50 steps a day? Go for it. Hell, snort a few lines of cocaine for dessert if you want to.

But when it comes to a pandemic, it's different. Sure, the vaccines are 95+% effective, and masks might be X% effective, and social distancing is Y% effective, and so on... but when >30% of the population has zero interest in doing any of that, you can take every protective measure available (besides just staying in your house) and still get the disease from some asshole at the grocery store who doesn't care if they spread it.

Also, consider last year's toilet paper shortage, and the brief gas shortage a few months ago. Individuals will often act irrationally in their own interest rather than in what's good for everyone as a whole.

To think of it another way: at a pizza party, some people will take 3 slices because there might not be enough for everyone, so they want to make sure they get their share. Others might take only a single slice because there might not be enough for everyone, so they want to make sure as many people as possible get some.

Individual decision making only makes sense if people aren't selfish.


I agree that the data on masks can be interpreted both ways. This suggests to me the effect is small and probably second- or third-order (i.e. masks encourage more distancing, and that's actually what matters).

It isn't just that COVID continues to spread despite mask mandates. It's that the curves look nearly identical in areas with and without mask mandates. And, to show their effectiveness, epidemiologists have resorted to pretty serious P-value hacking.

Separately, I find it hard to get worried for my personal safety because of the 30 percent of people refusing to vaccinate themselves. It's just not that hard to avoid the sorts of places where such people are likely to be. And, being vaccinated and healthy makes it less of an issue for me than, say, the risk of a car accident. Sure, I could pass it on to someone else if I get it, but with reasonable precautions I don't think that's likely at all.


> He said that masks wouldn't be helpful on March 8, 2020. A month later New York had a thousand detected covid deaths a day, so people really should have started wearing masks on March 8, 2020, since deaths lag infections by about a month.

I don't see how that is necessarily a lie. It could have been the best public health recommendation he could make at the time based on the available information. Research into how best to use masks is ongoing, so I would not expect today's masking recommendations to be the same as tomorrow's.


No, watch the interview. He says that wearing masks could actually be worse than not wearing them. In a later interview, he admits the suggestion against wearing masks was a lie, because they were afraid of PPE shortages.

It's also not the only lie he's told and admitted to. See his claims about herd immunity, where the numbers kept going up, and when he was questioned on this, he outright said he just gave out numbers that he thought the public would accept at the time.


I don't know which interview you're talking about, but in any event you don't know what his motivation was for saying what he said when he said it. Perhaps he was wrong, perhaps he changed his mind, perhaps he was rationalizing on the fly, or perhaps he lied.

It's clear that what he and other scientists said about masks early in the pandemic certainly changed over time. I think science is like that. People who like simple, certain, unchanging answers can get them from religion or ideology. People who don't mind complexity, nuance, and change are more comfortable with science.

Assuming Fauci was lying because what he said then isn't what he is saying now just isn't logical.


> but in any event you don't know what his motivation was for saying what he said when he said it

He literally explained his reasons for lying.

Fauci admits to lying about masks and explains why: https://www.youtube.com/watch?v=kLXttHlUgK8

Fauci admits to moving the goalposts on vaccination rates and explains it's because of his "gut feeling that the country is finally ready to hear what he really thinks": https://www.nytimes.com/2020/12/24/health/herd-immunity-covi...

> It's clear that what he and other scientists said about masks early in the pandemic certainly changed over time.

This isn't about that, this is about literal deception from a public health figure about public health.


The point of these arguments is to show that public health pronouncements can be political in nature and our chief authorities are not afraid to lie in order to achieve what they believe is a greater good.

It is not the specific content of the lie that is the issue, but the lack of integrity on display. It is used as a retort to "official X declared Y", and is meant to undermine the integrity of official pronouncements in general. There are many who bristled at these initial claims by pointing out (correctly) that promoting "noble lies" is terrible for public health officials and doing so would come back to bite them. For some reason, the medical profession seems to accept noble lies as being justified when the rest of society does not. This goes back to the old saw of doctors lying to their patients about their own health. It's a blemish on the profession, and one that needs to be erased and apologized for ASAP, and IMO, Fauci belongs to that old school and doesn't really get it -- and probably never will.

Also, a protip on your finding it "ironic" that people acknowledge masks were lied about but still don't want to wear them: the literal meaning of "irony" refers to saying something but meaning the opposite. For instance "Sure, I trust you", when the speaker clearly doesn't. There is also situational irony, which is when the opposite of what is intended happens, e.g. trying to kill someone by giving them a poison that ends up curing them. So in this case, the irony would be telling a "noble lie" with the intention of saving lives but actually causing more lives to be lost -- that would be the true irony here.


The "I am trustworthy" part is totally taken for granted and too many supposedly trustworthy people lie straight to the public face just to maintain the narrative they are pursuing.


Ethos is the hardest claim to assert. You have to live that one out; your community must, quite literally, bear witness to your actions which earn trust.


It seems that many like to believe and follow leaders who affirm their beliefs even when they are caught lying repeatedly about important things. Posturing loudly and demonstrating confidence seems to trump all of the other principles for many.

It's similar, but even more pronounced, with chimpanzees.

The problem is that we're not chimpanzees. We are smart enough to make very powerful technology, but not smart enough to use it sustainably.


> Posturing loudly and demonstrating confidence seems to trump all of the other principles for many.

Nice word choice.


This requirement places too high a burden on one person. Only a saint (or God) could achieve the standard claimed here. Objectivity on topics should be sufficient. Scientists who lose their objectivity cease to contribute usefully to the discussion.


Only God could be Objective; that whole Descartes' Demon thing makes True claims based on empirical observations a lot more fraught than you'd really like them to be, even in the best of cases where everyone agrees on: what language game we're playing today, what "an observation" is, that it was faithfully reported, and so on.

Nobody's perfect; what builds the sort of trust that permits action is a history of publicly doing one's best:

I would like to add something that’s not essential to the science, but something I kind of believe, which is that you should not fool the layman when you’re talking as a scientist. I’m not trying to tell you what to do about cheating on your wife, or fooling your girlfriend, or something like that, when you’re not trying to be a scientist, but just trying to be an ordinary human being. We’ll leave those problems up to you and your rabbi. I’m talking about a specific, extra type of integrity that is not lying, but bending over backwards to show how you’re maybe wrong, that you ought to do when acting as a scientist. And this is our responsibility as scientists, certainly to other scientists, and I think to laymen. (Richard Feynman, "Cargo Cult Science")

Do that, and you'll earn your Ethical appeal easily enough.


Not having a conflicting political agenda that coincides with your experimental results goes a long way with establishing ethos, as well.


Ethos is very hard, but pathos is harder, especially in secular society.

Let me be clear, I think arguments about ethos generate the most argumentation and heat right now, but the silent killer is really pathos. People feel comfortable arguing ethos. Most people will not argue pathos openly though.

We have a crisis of pathos in our culture. How do you claim that something is important without an appeal to authority? You can't. We live in a culture that is fragmenting its sources of authority; different groups of people have different sources of authority.

Here's a good example. The external dialogue that a lot of conservatives give on climate change is that scientists can't be trusted. The internal dialogue that a lot of them (though not all) engage in goes something like, "The earth doesn't matter. God is going to come back and set things right. So we don't need to worry about it anyway."


Nah, the Pathetic appeal is easy. It isn't sustainable, but the simplest form of Pathos is "Hey, fuck THOSE guys, right? You're better than they are, you're special! Git 'em!", and we're practically swimming in it right now.


I don't know. Ostensibly, pathos looks easy right now. Politics seems super intense, but it also seems super fake; mostly I think the shows of pathos are fake. Getting people to actually do things and act seems like the hardest thing right now. Posting on social media doesn't count.


> too many supposedly trustworthy people lie straight to the public face just to maintain the narrative

This is particularly apropos in the wake of Russell Brand's recent video [1] on FB's fact checkers for COVID-19 vaccine information: the fact checkers are funded by Big Pharma, have a huge financial stake in Big Pharma, and intentionally hide that relationship instead of adding a notice that they are funded by and invested in the very companies they are checking.

I am starting to read completely disinterested third-party sources more, because weighing the pros and cons of a thing is more challenging when the supposed experts are deliberately concealing material relationships. Some old emeritus professor, already retired, with several PhDs and just a passing bit of information, often gives great criticism of a thing without having any stake in the outcome.

[1] https://www.youtube.com/watch?v=44B-OJcOXxc


It is very frustrating, especially to many of my fellow scientists, that bellowing "THIS IS TRUE" as loudly as possible is insufficient. But it is insufficient.

While I understand your argument, I believe there's something more going on.

Taking COVID, for example. From the pro-science side, we get:

"COVID is a dangerous virus. We should take X, Y, Z actions" from somebody with a PhD in public health, medicine, or similar.

And from the anti-science side, we get: "It's fake. It's the flu. X wasn't perfect, therefore X, Y, Z are all a plot to make you magnetic" from somebody on YouTube with no credential beyond being on Youtube.

I guess what I'm saying is there's something fundamentally broken about how people process information. All the Logos, Ethos, and Pathos in the world doesn't help when a significant portion of the population is brainwashed.


COVID is actually a great example of some of what TFA is about.

Many of the "facts" about COVID are not actually in much dispute. The people who view it as "a dangerous virus" and those who see it as "the flu" are much less far apart than they appear.

Nobody has seriously disputed the R0 values for COVID, nor the (rather) broad range of deaths per 100k cases. What isn't agreed upon is what the implications for policy should be.

Those who view it as a dangerous virus point to its transmissibility and capacity to cause hospitalization, and therefore its implications for public health issues.

Those who view it as "the flu" point to its relatively low death rate compared to other global pandemics of the past, particularly when adjusted for demographics, and correctly note that a given individual's chance of dying from COVID is extremely low (at least in countries with roughly adequate health care systems).

Both can claim "facts" on their side. The question is what policy consequences follow, and that is where the major differences lie. Depending on your perspective, the deaths (and long term illness) caused by COVID are variously worse, better or about the same as the damage caused by policies to contain it. Since making this assessment necessarily involves subjective judgements and questions of morality, it can't be settled by an appeal to science let alone mere facts.

None of this is to say that there are not relatively fact-ignorant people making various cases for certain policy approaches. But we could ignore those people, and the core debate would remain, and it's not a debate about truth or science, but policy.


There are substantial differences (and, I would argue, motivated reasoning) on the subject of mask mandates. To me, that's the area where the political desire to "look like we're doing something" has trumped following the data.

On vaccines, case rates, etc. I agree with what you said.


> I guess what I'm saying is there's something fundamentally broken about how people process information.

As far back as you can go, yes. 2.5M years. But I think the point is not to call what exists "broken," but to understand and then use it to the advantage of humanity. Cult-like behavior is human behavior--i.e. the rule, not the exception. Now how do we use this to advance the species?


Fair enough, but Logos, Ethos, Pathos are also insufficient. And I have no answers on what else is required. It's very frustrating.


Great point. I agree.

For context: I grew up Mormon, which I consider "cult-lite mainstream" on the cult<->religion spectrum. Having gone through that and left the church, I've spent some time deconstructing what influenced me to think the way I did as a believing member.

The best advice I can give is: (a) make friends with people who are inside information bubbles [1]; (b) remember that people are motivated primarily by feelings and needs [2], despite what they say; (c) recognize that people often cover what they're really feeling with political, philosophical, and ideological language, which is very difficult for outsiders to decode and leads to significant misunderstandings, which in turn further separate people and prevent the friendships that are needed in the first place.

[1] see Daryl Davis (https://www.huffpost.com/entry/black-man-daryl-davis-befrien...)

[2] see Nonviolent Communication, a book that I place in the "top 3" most influential books in my life (https://amazon.com/Nonviolent-Communication-Language-Life-Ch...)


> And from the anti-science side, we get: "It's fake. It's the flu. X wasn't perfect, therefore X, Y, Z are all a plot to make you magnetic" from somebody on YouTube with no credential beyond being on Youtube.

Then why do we see articles saying that Youtube etc. bans people with great credentials who are anti-vax? There are people with great credentials on every side of every argument. Great credentials don't stop you from being an idiot or being wrong. If you believed the first person with great credentials you saw, and that person happened to be anti-vax, would you be anti-vax? Sounds like it to me. I'm pro-vaccine, but I am strongly against blindly listening to people with credentials.


Sorry, should have been more clear... I'll take Fauci's word over somebody on Youtube (regardless of that Youtuber's purported credentials). Fauci has a career in public health. With a random Youtuber, I have no idea if they even have the credentials they claim.

And there's a big difference between a scientist saying "people with previous COVID infections have more antibodies than those with vaccinations" and Joe Rogan saying "COVID is fake/just the flu".

The first is potentially true, and most of us aren't qualified either to verify it or to derive any course of action from it; regardless of its truth, vaccination is probably the appropriate action for any individual (better safe than sorry). Using the statement as an argument to avoid vaccination is bad policy.

The second is an outright fabrication, yet we still have a significant portion of the population believing that crap.


There are a lot of weird claims in this post. Firstly, Fauci has publicly admitted to lying to the public on multiple occasions. I'm not sure why his credibility is unimpeachable simply because he's a bureaucrat in public health. Trusting him over a YouTuber "regardless of their credentials" seems like a significant overstatement of trustworthiness.

Secondly, Rogan has never said COVID was fake or just the flu, and the reason people assert that COVID is fake is down to motivated reasoning, which is the same reason Fauci thinks he was justified in lying to the public.

Getting people to change their behaviour starts by not dismissing them or dehumanizing them, and trying to steelman their position so they and you fully understand why they want to dismiss COVID. When that's done, it's often clear where you can compromise. Sadly, that's not what we see going on.


Also, Fauci's "great" career in public health started out by him getting just about everything wrong that he possibly could have with HIV/AIDS.

Including wanting to implement many of the same draconian policies we see today for communities where AIDS was discovered.


You were very clear in the first post, now you changed the meaning completely.


How do your fellow scientists get work through peer review without showing that it is trustworthy?

And how do they get funding to do science without showing it's important?


Yep. Science is three things: space, time, field effects.

Human gossip is society's field effect.

A thing may be true, but politics obfuscates its importance.


What to do when it's true, but the proponent isn't trustworthy? Just say no?


Find another source. If it's true, there should be evidence other than the proponent's words.


> Logos, Ethos, Pathos: you need all three. "This is true. I am trustworthy. This true thing is important."

This ivermectin thing basically proves that logos isn't useful. Ethos + Pathos alone can convince a large population of people.

The Apple "Reality Distortion Field" was never about logic. It was about making people feel good about buying Apple products. That's fine, because Apple has decent enough products (I don't like them myself, but I can see why some others would like them).

But today, we can apply the "Reality Distortion Field" to any subject. Most recently: ivermectin.

-------------

What I don't get: why are people choosing to push snake oil (like ivermectin), instead of pushing the drugs that do work (3 different vaccines, dexamethasone, and monoclonal antibodies)?

Society has developed working treatments for COVID19: dexamethasone cut the death rate in half IIRC, and monoclonal antibodies cut it in half yet again (compounded, that's roughly a 75% reduction). And yet, people are seeking treatments that straight up have no evidence of working.

I can put up papers on the efficacy of dexamethasone + monoclonal antibodies, and how this cocktail saves the lives of countless people across this country. But then I'm suddenly left in a "Russet's teapot" scenario where I'm apparently supposed to prove a negative when discussing ivermectin (even though countless papers fail to distinguish ivermectin from the null hypothesis).

People's brains turn off. Because today's reality distortion fields / marketing / propaganda are much, much stronger than logic.

---------

Let me tell you how to do things in today's world.

1. Automatically find the people who have the poorest logic. Use ads, memes, and other such "low-quality" discussion points to find the lowest functioning brains. For example, clickbait headlines or "Nigerian Prince" scams. The dumber the argument, the better.

2. Reasonable people will ignore you. The only people who will interact with you are people with weaker argument skills. Spend your time convincing _this_ group of your benefits.

3. Make it fun: give them memes to share with their friends. Even if it's a bad / crappy argument, that's okay. That's what memes are about.

4. Sit back and relax as your crowd automatically spreads whatever argument you want amongst their friends and family. Now they're doing the hard work for you.

5. Bonus points: get enough people moving as a crowd, and even smart people start to get drawn into the masses. You'll start finding apologists who make better arguments on behalf of you. Keep up with the meme culture and pick/choose the best arguments. Crowdsource your marketing: the memes that become popular are the arguments you want to use.

At no point is "working" on logos actually beneficial to building a RDF (reality distortion field). You can build ethos + pathos simultaneously by just seeding opinions into a crowd through meme culture.

Bonus points #2: Use really, really bad arguments (the world is flat, let's go to Mars, 9/11 was a hoax, hydroxychloroquine can save you from COVID19) as practice. The better you get at seeding bad arguments, the better you get at seeding any argument.


  > I'm apparently supposed to prove a negative when discussing ivermectin
I wonder if the bias towards ivermectin is because you can take it yourself, whereas with dexamethasone + monoclonal antibodies you probably have to be in serious condition (in a hospital) before you can get them...?

Is it possible there is some psychological issue with "going to the doctor" vs "self help/healing"?


> Is it possible there is some psychological issue with "going to the doctor" vs "self help/healing"?

It's not psychological. It's simply marketing.

It's no secret that ivermectin / hydroxychloroquine makers are benefiting from this snake-oil bullcrap. It's no different from essential oils or other such snake-oil products.

We just didn't care about essential oils three years ago, because it's fine for idiots to waste their own money on snake oil. But when the masses are tricked into distrusting COVID19 precautions and start spreading the virus around even more, it's a bigger deal.


  > It's no secret that ivermectin / hydroxychloroquine makers are benefiting from this snake-oil bullcrap. It's no different from essential oils or other such snake-oil products.
Interesting... if that's true, shouldn't there be legal repercussions?

  > We just didn't care about essential oils three years ago, because it's fine for idiots to waste their own money on snake oil.
Slightly off-topic, but as someone who's suffered severe allergic reactions to people using "essential oils", it baffles me that these (and supplements) aren't more strictly regulated...


Sorry, just one correction:

It's "Russell's Teapot", it's named after Bertrand Russell, not a potato.


To your point, American anti-intellectualism continuously undermines the "I am trustworthy" aspect of science. For a country whose current rise to prominence was so clearly aided by technology, and whose future relevance depends on technical innovation in an uncertain global future...

We are so lucky that COVID is only going to kill a million Americans or so (700k and counting, probably 850k by Jan 1st); if this were the death rate of smallpox or the black death... ye gods. Our lack of respect for science is a huge part of vaccine hesitancy and of our key role in continuing the pandemic.

The news media are terrible because for decades they have reduced science and technology either to simplistic caricatures, Star Trek gobbledygook, or, even worse, the WELL THEY SAID THIS, NOW THEY SAY THAT.

The general media's constant use of the high-school-nerd trope for all science and technology proficiency has been a long-term failure in the progress of our civilization. The only reason it hasn't been worse is that such proficiency generally leads to anything from upper-middle-class money to vast wealth.

But that media trope has always colored all science and technology policy as culturally "well, that's what the nerds say, and they aren't cool or fun".

Possibly it goes back to how most of the ultra-rich """elite""" made their money in America: by control and exploitation. People with influence and position distrust science because it is complicated and unknown and preys on their paranoia, and because, for industrialists, it so often produces complications to their overall plan of "make money by selling products, offloading the actual environmental cost/impact of those products on... anyone else".

Maybe I'm paranoid, but it does seem with the mastery of social media propaganda/manipulation by monied interests, sowing distrust in science is now at an all time high.

So the business-friendly right will distrust science because it threatens their money and power. The apathetic center won't like science because it isn't cool. The left... well, scientists are generally white and male so they are distrusted by identity activists, and the rest of the left is too disorganized to rally around science policy effectively.


Maybe scientists should stick to their labs? You can invent a cure for cancer, but that doesn't automatically make you a good politician or leader.


You need all three rhetorical modes in your lab too. Trust me!


I am convinced scientific results should not be publicized, publicly promoted, or broadly advertised until at least a decade after publication. That's about the timescale it takes specialist communities to cross-check or falsify results.

Of course, this is exactly the opposite of how science works today. There are poorly written press releases galore, and flashy but tenuous results that seldom hold up to scrutiny. There's a reason a journal like Nature is routinely mocked in scientific circles.


Why give a timeline? Just say results are probably real after two independent replications, and there should be significant pop-science hesitation about publicizing anything that hasn't been replicated yet.
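A rough back-of-the-envelope for why replication carries so much weight (assuming, simplistically, that the studies are independent and each has a 5% false-positive rate under the null):

  P(original + two replications are all false positives) = 0.05^3 ≈ 1.3 x 10^-4

Real studies share methods and biases, so they're never fully independent, but the arithmetic shows why two clean replications are such a strong filter.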


My problem with this article is that it doesn’t know how to draw the line between what is science and what is politics.

Science absolutely exists as a thing on its own, “pure” if you will. And alongside it and with it also comes politics, yes! But that’s a feature, not a bug— both play a role. A very different role.

Science is for everything descriptive and inferential. Politics is for everything prescriptive.


Yeah this is one of my pet peeves - those of us in the science community should be saying: "if we do x, we will see y result"

we should not be saying: "we should do x"


Exactly!

Unless of course we're up front about what our goal/premise is. Then it's absolutely appropriate.

For eg.: "If we want to minimize the burden of disease from X in Y conditions, we should take Z course of action."

Other eg.: rather than saying "you should not urinate on the electric fence", a good scientist will say "if you want to minimize the likelihood of unnecessary suffering, you should not urinate on the electric fence."


> Science absolutely exists as a thing on its own, “pure” if you will.

Interesting. I thought the article did a great job of illustrating that while the quoted claim is true, it represents an "ideal" (or, pejoratively put, a "fantasy"): doing any real amount of such ideal science means coordinating with the real world, and that context ensures it never ends up anywhere near pure and unsullied. It's as inevitable as the fact that adding an even number and an odd number will always produce an odd number.
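(As an aside, that parity fact is a one-line check: write the even number as 2m and the odd one as 2n + 1; then

  2m + (2n + 1) = 2(m + n) + 1,

which is odd for all integers m and n.)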


Well, science itself is independent of political motivations. Some alien civilization running the same experiments in a galaxy far, far away will get the same results and, eventually, reach the same conclusions about how the world works.

Politics, being concerned with the allocation of limited resources, does two things to complicate this ideal picture:

1) It decides "which science" gets done right now, which means the progress of science may not be uniform in all areas.

2) It applies research findings improperly to serve non-scientific goals.

Notwithstanding, the science remains just science. And crucially, trying very hard to stop or limit #2 from happening, in particular, is essential in making better _political_ decisions.

Political decisions are inherently about how to manage conflict between social groups, and while it's not really my place to judge the quality of any given political decision, it's hard not to judge the quality of a political decision which is not based in the physical reality that we all share.


Politicians have a lot of overlap in their beliefs, and anything within that overlap gets waved through the legislative process without much comment. We're all left to focus on the remainder, where there isn't much consensus.

"Settling Politics" amounts to everyone agreeing about what to do on every matter all the time. Otherwise politics just isn't going to be settled.

If the question is why science hasn't settled a particular debate, the answer is typically that the facts aren't really in question, and there is a secret debate about who is going to have to wear the costs.


"Settling Politics" amounts to everyone agreeing about what to do on every matter all the time. Otherwise politics just isn't going to be settled.

But it's a misconception to think that the goal of politics is to "be settled". The goal of politics is to make decisions on how to rule the land right now, they don't get to wait 15 years and then try again (yes, some do, but that's a different problem).

> there is a secret debate about who is going to have to wear the costs.

It's very sad that this has become a "secret" debate, because this is the heart of politics. I haven't decided whether I blame politicians or the sensationalist media more for this, but these debates should not be secret. This is not just a problem in the US; in Europe I see the same: we are stuck with career politicians who are too chicken either to state their real beliefs in public or to act on those beliefs in the legislature.


Scientists see a block of wood, and observe that it has fixed dimensions, is made from a dead tree, and weighs a certain amount.

Politicians see a block of wood and ask: how do we use that wood? Do we build a house, make a fence, or carve it into an ax handle?

So, it's really apples and oranges. Scientists use the scientific method to determine the truth. Politicians are not seeking the truth, but are using consensus to govern the actions of their constituents. The consensus can be based on science and truth, or on irrationalities such as fear and hope.


The most recent issue of Scientific American has a piece that talks about how detrimental increasingly restrictive abortion laws are. That is a great example of science being politicized. From a science standpoint, they could just as easily have talked about the tens of millions of babies that have been killed by means of abortion. But the bottom line is that a science magazine should not be expressing an opinion on a controversial moral issue as though it were a matter to be determined by science.


People forget that politics is preference.

Science deals in the objective while politics deals in the subjective (preferences).


Science is conducted by people, all of whom have preferences. Until the day arrives that science is explored only by preferenceless beings, science and politics will continue to have a significant overlap, much of it at the invitation of scientists.


Yes, what is studied by scientists is based on preference (and therefore politics).


The problem comes when a preference is based on something that is later shown to be objectively false.

People are so attached to their preference that they dismiss facts and attack the whole concept of knowledge to maintain that preference.


Who has the objective falsifications? Is it actually a preference of risk reduction and complexity? Would be interested to hear your example.


For instance when science shows that burning fossil fuels causes environmental problems.

The rational move would be to take that into account, factoring the risks and potential mitigations into the preference to keep using them.

What we observe in much of the political sphere however, is often just abject denial to maintain the current preference.


Good example. I wonder if those in power who make money from coal plants know that they are causing some harm along with some good (providing power), and just prefer to leave the pollution issues to be dealt with after they are dead :)


I don't know how objective science is, because we bring all of our biases to the table when we interpret data. A young-earth creationist, an old-earth creationist, a fanatical Darwinian evolutionist, and an ardent panspermia UFOlogist are all going to view the same data very differently, even if they have identical educational credentials. The holes in the evidence for each don't matter, because they are the persuaded.

If we think about truth bending to subjective reality, I am certain that some of the churchmen and scientists opposing Galileo's embrace of Copernican heliocentrism were completely convinced geocentrism was the truth, because their own astronomical observations were subject to slavish obedience to doctrine. The doctrine in their heads influenced their own observations. Narrative overwhelmed the senses.

It is the mad among us who have the rare ability to completely discount all external input and see a thing for what it truly is. And that is why we call them crazy.


> I don't know how objective science is, because we bring all of our biases to the table when we interpret data

You are conflating "science" and "scientist" here. The way science overcomes bias is two-fold: 1) by training scientists to be aware of their own biases, and 2) by having multiple scientists, each with their own biases, try to reproduce the same experimental results.

Your artificial example is just that: they all fail 1).


> Attempts to scientifically “rationalize” policy, based on the belief that science is purified of politics, may be damaging democracy.

Could anyone explain to me what the phrase "may be damaging democracy" means? I've been hearing it a lot over the last several years, and I've been wondering what democratic ideal people who use this phrase have in mind. It looks as if independent opinion-making based on one's trusted sources "may be damaging democracy", because most of us are not equipped to identify fake news; open and free exchange of opinions on social media "may be damaging democracy" for the same reason plus due to the tendency of falling into warring tribes; and now this tagline from the article claims that bowing to the experts may also be damaging democracy. Is there anything that doesn't, and what does it all mean?


> damaging democracy

IMO, it means something like undermining the trust in elected government and law makers.

There's a lot of policy making for which there's no solid evidence because, crudely put, the social sciences are far too sloppy. However, I don't think much policy is based on it anyway. But in the public online debate (whether that's run by trolls or not), many appeal to science in their arguments. Perhaps that has some impact?

The article itself has the same vibe as online debates. The first example of biased science is medical experiments using only men, plus some totally unnecessary anthropomorphization of the reproductive process, which sounds more like virtue signalling than anything pertaining to the topic. Sapir-Whorf-like arguments complete the picture.

Other arguments in the article point out that scientists can have a rather limited vision, resulting in sub-optimal solutions. But as I said above, I don't think much policy making is largely based on science, and the examples given point as much in the direction of tunnel vision by policy makers as the scientists they consult.

> Is there anything that doesn't

Where trust is absent, everything appears hostile. That would make a good fake Latin quote...


A practical example are the Jan 6 riots in DC and the attacks on the integrity of voting. Both peaceful transitions of power and trustworthy elections are bedrocks of democracy and so attacking them weaken democratic societies. At some point a coup succeeds and democracy is dead.


Weren't the rioters convinced that they were saving the democracy that was about to be undermined by what they considered to be a stolen election? If so, weren't they participating in the democracy?

As a thought experiment, consider protests against fraudulent elections in other countries (Russia and Belarus come to mind, but I am sure I saw some other countries in newspaper headlines recently). Would you consider those protests to be an exercise in democracy?


What saddens me about that business, is that the criticism against certain voting machine companies in the US is considered "anti-democratic conspiracy theories" while the same criticism against the very same companies in other countries is considered "pro-democratic".

The thing is, these companies run the elections like a black box. Princeton University has already shown that it is easy to hack Diebold machines, for example, and we have had very suspicious cases in other countries. For example, voting machines in a state election in Brazil registered more votes than voters, and the numbers of "invalid" votes, absentees, and "blank" votes were all identical; when someone asked for a recount, the government's response was to say the machines must be trusted and to fine the shit out of the candidate who complained, for "education purposes."


Democracies don't die from a single stolen election. The remedy is in the courts to determine if the election was stolen and then in the legislature to fix whatever problems allowed it. Convincing people that democracies must be saved by violent revolt is what kills democracy. The true enemy is lawlessness, not the other half of the country.

Fraudulent elections are happening in countries that are, in fact, no longer democracies and have not been for some time.

I'm open to arguments that the U.S. is no longer a functioning democracy, but the ability to still have mostly peaceful transitions of power and for election audits to come back clean reassures me a little bit.


Or a coup does not succeed and the powers that be start looking for ways how to actually strengthen the trust in democracy.

For example, the Swiss constitution of 1848 was the result of a civil war. Its provisions made the division of powers between the federation and the cantons clear, and Switzerland has never had a violent crisis since.


I take it to mean damaging to democratic countries or, in a more abstract sense, damaging to the credibility of the democratic form of government in general. A good example of this is the current effectiveness of disinformation in democratic societies. Authoritarian countries do not suffer nearly as greatly from this.


So, instead of the abstract textbook word "democracy" one could rephrase this more concretely as "damaging to the present political situation in our country"?

When put like this, this statement sounds truly conservative (small c, no necessary relation to parties); but I am hearing this phrase from people who are fine with the disruption of the political arena in the name of what they consider to be progress.


Institutional perversion of soft sciences can't even solve scientific disputes.


Science shouldn't be settling political disputes. Politics is the realm of people's most base instincts and desires - step into that and, like the man wrestling a pig, you'll end up covered in filth while the pig is deliriously happy.

I doubt the wisdom of people who say: I want to stop global warming, so I studied climatology. You're setting yourself up for heartache if the plan is to study and learn enough to become a voice of authority, and then use that voice to control what happens.

It's not like people are terribly confused about what the least damaging course of action is. If resources were unlimited, there would be little disagreement about what to do. Resources are limited, though, so we need to choose who gets what they want and who does not. There are messy compromises that can be made. Tricks that can be played. Sometimes there's enough power that only one side sacrifices. In all of that, science can inform what choices are available, but when it starts trying to make those choices it will get slapped down.


Alvin Weinberg wrote along these lines years ago in his Science and Trans-Science paper [1] from 1972. Once I understood the concept, I see it everywhere these days.

[1] http://www.quantamike.ca/pdf/Weinberg-Minerva.pdf


"Fanatical certitude" isn't exactly the same as "science". Maybe change title to "Why fanatical certitude will never settle political disputes"?

What bugs me about these theses is that they lead people to question science itself, which is the best set of tools we have for reasoning when used properly.


The distinction exists, but it's also rhetorically insignificant. Nobody is conflating the academic research complex with the scientific method. Just as other definitions have changed over time, everyone knows that when someone refers to problems with science, they mean problems with the academic research complex and related topics.

Put another way: in modern colloquial English, what you call science is almost always called the scientific method, and what everyone else calls science is really academia. When people start publicly calling for other forms of reasoning and problem solving to be taught, your argument will become convincing.


Critical thinking is at an all-time low in Western society.

Not enough people are being trained to think critically nowadays, and the societal pressure that used to push officials to vet and verify facts and positions is diminishing. Until that changes, we will fall further and further into the cult of celebrity/personality, which is fundamentally an appeal to authority.

A person reading this MIT article should ask "Why should I believe what this article says?" If the answer is "because it's MIT! Duh!", then that's the problem in a nutshell. MIT is not a substitute for their own critical thinking faculties, nor is an MIT professor/academic/spokesperson/author the absolute arbiter of truth.


I'd actually say critical thinking has seen considerable improvement in the past few years. For one thing, people no longer unconditionally trust the press and government institutions like they did in, say, the '00s, and instead see them for what they are - little more than Pravda and the Politburo. Wars are also a lot less feasible now (thanks, ironically, to the cancer that is social media) than they used to be - people will bypass the media and shit all over the military-industrial complex's saber rattling. The establishment's control over narratives is at historic lows - that's why you're seeing increasing censorship: "consent of the governed" is getting a lot harder to manufacture. I'm pretty sure the NYT would not be able to sell the Iraq war to the public in 2021. I like that. I wish there were a less society-damaging way to achieve the same effect, but there doesn't seem to be one.


I've seen a lot more blanket skepticism and cynicism, but these new views are often as uncritical as the credulity they replaced (and as easily exploited).

I haven't really seen an increase in people doing the hard work of gathering facts, considering new perspectives, and challenging their own assumptions or prior beliefs. It's more like switching from uncritical consumption of NYT to uncritical consumption of YouTube.

I did like your idea that this could be a good thing though, not sure I believe it yet.


If all we get from this is the impossibility of droning children in the Middle East, it's a worthwhile tradeoff in my view. But I feel like we'll get more than that, in time.


We've kept a lot of drones in the region, and have continued using them (though at a reduced pace). I hope you're right that it will become an impossibility but I really doubt it.


These are all very good points. The ability for misinformation to propagate means that manufacturing consent has become a lot harder. As we've seen with COVID, that has some downsides, but it also has upsides because selling war is going to be a lot harder. Every tool has light and a dark side.


Think of it this way: the establishment simply isn't the single source of misinformation anymore. Misinformation was always easy to propagate - easier, in fact, I'd say. See e.g. the Iraq war: 100% disinfo, parroted uniformly and enthusiastically by the entirety of the mainstream press.

Except in the past you could create (via mass media controlled by, quite literally, five people) an airtight, completely impenetrable narrative and feed it to the public, and now the public can get both the information, and conflicting disinformation elsewhere. Oops. Bet the CIA did not think of that when they helped create Twitter and Facebook.


> Critical thinking is at an all-time low in Western society.

I think you are viewing the past through rose-colored glasses.


Fair enough. Perhaps it's better to say that we seem to be in a local minimum right now.


I didn't read the article in its entirety and this comment isn't directed at the content of the article.

Science itself has been politicized: the pursuit of truth has become secondary to the pursuit of funding and to alignment with the political agendas du jour. When I was younger, my faith and trust in science was quite high. As I have aged and seen more of humanity, and how it permeates all aspects of our existence, I don't trust science like I did when I was younger. Now all I see are the motivations of those who are doing the "research".

Having studied statistics in college with the express intent of understanding how it is used in scientific studies, I am well aware that with enough data you can get any statistical result you want. Even better, word it vaguely so it resonates with mainstream media while still giving the authors an out with their peers.


> Even better vaguely word it so it resonates with main stream media

This explanation doesn’t resonate with me. Research most often has incremental results that need to be carefully qualified. Isn’t the far bigger problem that mainstream media takes subtle research results and “simplifies” them for the public by adding certainty and often mis-interpreting the results completely?

Political agendas have recently been systematically trying to erode trust in science. (Because science and truth do threaten some politicians.) The idea that science can't be trusted, as the high-level summary, is exactly what some people want, and it seems to be working. But what is the alternative? We have nothing better. The point of science is to try to protect against motivation and agenda, and it does work sometimes. Even when people are motivated, reproducible methods and peer review do help filter out some of the badness. And if that's not enough: what should we do to improve it?


Media distortion is definitely a problem too, but it's not the only problem.

Solar geoengineering is the hobbyhorse I usually use as an example of this. It's an open secret (see e.g. https://www.nature.com/articles/d41586-021-01243-0) that most scientists in the field won't research it, partly due to safety concerns but partly because they think that decarbonization is the right policy and they don't want to risk "detracting from efforts to rein in greenhouse-gas emissions". Maybe that's the right judgment, but it's hardly free from motivation and agenda.


Apparently a small scale version of that is being used to protect the Great Barrier Reef

https://www.nature.com/articles/d41586-021-02503-9


I think that you will find that science has been pretty much that way since the beginning. Can't do science if you don't have funding. If anything, this is simply more transparent these days.

Grant review sessions most definitely have some amount of cronyism, and that famous luminary in a field will almost certainly get funding even if their grant application is worse than that of somebody with fewer citations or younger and with less of a track record. But there's awareness of it at least. And you'll find scientists on Twitter being very open about it, whereas before you'd mostly only find it at the pub during conferences.

There's a great passage from ET Jaynes that I'm having difficulty finding right now. He was a physicist from the 1940s on, and as a young graduate student he talks about how he had to be very careful about what he studied, because if it had the potential to contradict one of the big names of the field, then it could tank his entire career before getting started. If he hadn't "played ball" early on, we'd never have gotten his later Bayesianism.

Science is always a human process, there will be politics to some degree. Politics can only be minimized, not eliminated completely.


Science used to be a self-funded hobby of the upper classes. Then came the tenure system, where people at least had safe and stable employment. Nowadays there are legions of precariously employed scientists desperately trying to hawk their science.


Replying to myself, I think I got the first part of that wrong. Patronage used to be important for a long time.


Right, and the current situation is much better, and less political, than having to find a patron before you can even really get started.


There's also just an extreme naivety when it comes to the definition of "politics." If you genuinely think that your position is morally and factually correct, you'll pursue that position.

Now, imagine that your position truly is objectively factually and morally correct. (I understand things generally don't work this way, but stick with me for the sake of example.) All you have to do is find a vocal group of people who disagree with the morally and factually correct position, and you can now call it a "political stance" which someone can be "biased towards." (A simple example might be flat-earthers.)

Now of course, this same process can work out in nearly any permutation: the mainstream group could be pursuing an idea which they believe is morally and factually correct, but of course simply be wrong. And then the protesting group, calling out the politicization of the issue, could then be the correct group.

It must also be stated that the closer you get to the hard sciences, the less any of this politicization works. It's worth noting that things such as semiconductors and computer chips all rely on science, and no one is politicizing whether they exist or work. What is politicized is a bit more predictable: questions about the causes of society's ills (i.e., social sciences), questions about the roles of men and women, questions about public policy.

I suppose the point I'm trying to make is that it's quite easy to call one man's truth a "political" ideology. If you poll enough people, nearly anything is "just a point of view" which could be construed as political.


I don't see it as politicized science, but as abuse of science by politics. Just as science doesn't become religious when religious speakers abuse it for their purposes, science doesn't become political when politicians abuse it for theirs. And this abuse isn't new; politicians have done it for ages, it has just hit the fan now.


While I see that funding will tend to constrict the research around the desires of the funders, it's worth considering that 'science' has also expanded into areas that don't deserve the title.

The softer disciplines are particularly susceptible to the style of the day but lust after the cloak of correctness (and status) that hard sciences wear.


> Having studied statistics in college with the express intent of how it is used for scientific studies I am well aware that with enough data you can get any statistical result you want.

Just doing what a good pseudoscientist does: first dismiss any rigorous logical tools...

