As the famous saying goes, "nothing is as important as you think it is when you're thinking about it".
Every scientific discipline inculcates values that may differ from those held by the general public, ranging from how important it is to give credit to the originators of ideas, to the relative importance of different species, to whether naturalness is valuable or not. This isn't bad; it's probably necessary. But it's something that has to be accounted for.
But step outside that field, and they latch on to something that emotionally appeals to them, and all that reason, facts, etc., flies right out the window.
Of course, I never do that myself.
I often wonder whether the epistemology we are "born with" can actually be improved very much (in a general sense), or whether the best we can do is teach domain-specific techniques to override our defaults on those topics.
We also need to deeply care about and understand ourselves.
It would also be good if all those models were used as a basis to discuss divergent interests, and if politics remained politics: a tool to select policies that maximize a majority's (but not everyone's) best interests.
I feel, instead, that the issue is that it's become unacceptable to disclose one's own best interests and to defend them.
Instead everyone produces fallacious models showing that their preferred policy is in everyone's best interests.
Politics is manifestly not about the public good - not even with compromises.
It's mostly about a few unpleasant - sometimes charismatic - individuals pursuing power for their own personal benefit, and covertly that of their funders and sponsors.
Science has nothing to contribute to this, because the entire business is so hopelessly toxic and corrupt that rational debate about policy doesn't even get to first base.
If you want rational policy you don't want politics. It's perfectly possible you don't even want democracy.
What you want is administration - a very different process where mature competent executives and managers who do not have Dark Triad personality traits make intelligent, informed, and compassionate decisions in the public interest based on evidence, expert guidance, and their own good instincts. And their power is strictly limited to the lightest possible enforcement required to do the job.
No country has ever been run like this, but a few have managed to operate like this in selected subfields.
Which is basically inhuman. For an administration of any sort, you need a selection process and the selection process will then be gamed by exactly the people who seek power.
Which is why we have large bodies of political representatives: people will not stop seeking to acquire power and act in their interests and the interests of their backers, but they will be counteracted by people doing exactly the same but pursuing different ends. That’s a very human thing.
Based on what I have seen, most politicians genuinely believe they are working for the common good. Their understanding of what the common good is obviously varies.
In a system like that, politics is mostly about playing the long game. You build a personal brand that gets you elected and re-elected and gives you influence within your party. You build and maintain your networks, you collect political capital by supporting others' goals, and you spend the capital to advance your own goals. You try to find a balance between short-term and long-term goals, because your opponents today may be your allies tomorrow, and you don't want to alienate them by playing too hard.
Unpleasant and toxic people may sometimes survive in politics. They are rarely successful, because politics is all about people skills, at least in the system I know of. (Such people are more common in administrative positions, because their status is more secure in a meritocratic hierarchy.) People focused on specific topics often also have a hard time in politics, because they have a tendency to become unpleasant when things don't go their way in the niche they are interested in.
The public image of politics can be toxic, because the current publicity game requires it. The same politicians often appear quite different behind closed doors, where they are allowed to speak off the record.
All of the other scientists and researchers see this and adjust their own behavior and areas of study to avoid being mobbed.
I have had to become increasingly cautious and wary of trusting all published science, especially that in social areas now.
It isn't rational that you love your wife more than other women. Why constrain yourself to such rules for policies?
Is dying for your freedom rational? Investing in symbolic architecture? Skydiving could be seen as fundamentally irrational. Some still do it.
All depends on how exactly you define rationality.
Overall it might be prudent to value rational arguments; I completely agree with that.
> It isn't rational that you love your wife more than other women. Why constrain yourself [...]
Reading it again, this may convey something different from the point I was trying to make...
It is a preference, and preferences are what form the basis of interests.
> dying for your freedom
Is somewhat different; it's already the execution of a policy (fight) for a preference. And I think it can be a perfectly reasonable outcome, as it's not even hard to find chapters in history during which conditions made it the only acceptable choice.
What problem are we solving, what is or isn't a problem, what constitutes a better outcome, and what types of policies are allowable may be informed by science but are often (and often appropriately or at least inevitably) decided by culture, law, art, force majeure, etc.
This reminded me that the weight range we term "overweight" has lower all-cause mortality than the weight range called "normal weight". (There is another tier above "overweight", "obese", which has high mortality.)
If you look at , a ton of the studies they looked at adjusted for preexisting illnesses like hypertension and diabetes. It's basically like saying people who fall off skyscrapers don't die of cancer.
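The skyscraper analogy can be made concrete with a toy simulation (all numbers are invented for illustration, not real epidemiology): if being overweight raises mortality mainly by causing illness, then "adjusting" for that illness, i.e. comparing only within the healthy stratum, conditions away the causal pathway and makes the weight effect all but disappear.

```python
import random

random.seed(0)

def simulate(n=100_000):
    """Generate (overweight, ill, died) tuples under a simple causal chain:
    weight -> illness -> mortality. Probabilities are made up."""
    rows = []
    for _ in range(n):
        overweight = random.random() < 0.5
        # illness (e.g. diabetes) is partly caused by being overweight
        p_ill = 0.30 if overweight else 0.10
        ill = random.random() < p_ill
        # mortality depends mostly on illness, not on weight directly
        p_die = 0.20 if ill else 0.05
        rows.append((overweight, ill, random.random() < p_die))
    return rows

def death_rate(rows, pred):
    sel = [died for ow, ill, died in rows if pred(ow, ill)]
    return sum(sel) / len(sel)

rows = simulate()

# Unadjusted: overweight people die more, because weight causes illness.
crude_diff = (death_rate(rows, lambda ow, ill: ow)
              - death_rate(rows, lambda ow, ill: not ow))

# "Adjusted": compare only the healthy stratum. The weight effect
# vanishes, because we've conditioned on the pathway it acts through.
adj_diff = (death_rate(rows, lambda ow, ill: ow and not ill)
            - death_rate(rows, lambda ow, ill: not ow and not ill))

print(f"crude difference in death rate:    {crude_diff:+.3f}")
print(f"adjusted difference in death rate: {adj_diff:+.3f}")
```

With these assumed numbers the crude difference is around +0.03 while the "adjusted" difference sits near zero, which is exactly the commenter's complaint about studies that adjust for hypertension and diabetes.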
It is also important to remember the costs and the decrease in quality of life associated with being overweight.
That said, it is possible the lower end of the "overweight" BMI range could be moved up a point or two, but that's probably less than the variance between populations anyway.
Maybe if your life is about preventing cavities you think that's what kids should always have in mind, but life would suck if before every action we asked "will this contribute to tooth decay?", much more than it would suck with a few cavities.
That seems like a very dangerous saying! As a scientist, I am biased, but I think it's also important to remember that nothing is as simple as you think it is when you're not thinking about it.
Sure, it was trying to stir up business; but I think that ad mainly existed because podiatrists really did, in good faith, think everybody should get their foot examined once a year.
You just lose perspective.
Also notable is that sunniness doesn't really determine UV exposure; partial cloud cover can actually result in more UV-B exposure, counterintuitively: https://www.drgurgen.com/are-the-suns-uv-rays-really-stronge... And even full cloud cover doesn't completely block UV. So WA and CA aren't as different as you might think.
Meanwhile, as one might expect, most states that get freezing cold for months at a time have lower skin cancer rates than those that don't when controlling for race: if you don't go outside for more than a few minutes a day, while bundled up under multiple layers, for months at a time, well — you see less skin cancer. Some heavy farming states seem to have more skin cancer, but again that makes sense since farm workers are outside a lot.
Vermont and New Hampshire seem like odd exceptions to this rule — but I suspect there's also just some other selection bias at play. UV exposure from the sun causes skin cancer at high rates in white people; trying to make assumptions about states isn't necessarily an easy thing to do, since many other factors are at play when considering who lives where and how much they go outside.
Also because one doesn't want to get arrested.
> The science policy scholar Daniel Sarewitz goes so far as to argue that science usually makes public controversies worse. His argument rests on the logical incompatibility of policy expectations and what science actually does — namely, that decision makers expect certainty, whereas science is best at producing new questions. That is, the more scientists study something, the more they uncover additional uncertainties and complexities.
I'm not sure I've ever seen this basic contradiction put so cogently. We want policy (politics) to create certainty and stability. "Science" increases our risk of the unknown by making us more aware of it.
Loved the article. (and still an avid scientist)
Maybe it's this half of the equation that needs to be examined.
Anyway, your statement is literally begging the question:
["people want certainty and stability" is] not going to change. Certainty and stability is what people want.
One heuristic for this is the 40-70 rule for decision making: to make a decision you should have no less than 40 percent of the information you would prefer to have, and you shouldn't keep waiting once you have 70 percent of it.
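The rule above is simple enough to sketch as code. This is a hypothetical helper, not part of any formal statement of the rule; the input is your own (necessarily rough) estimate of how much of the information you'd prefer to have you actually have.

```python
def should_decide(info_fraction: float) -> str:
    """40-70 rule sketch: info_fraction is a 0.0-1.0 estimate of how
    much of the information you'd prefer to have you actually have."""
    if not 0.0 <= info_fraction <= 1.0:
        raise ValueError("info_fraction must be between 0 and 1")
    if info_fraction < 0.40:
        return "wait"          # too little information: a decision is a gamble
    if info_fraction <= 0.70:
        return "decide"        # the window the rule recommends
    return "decide (overdue)"  # waiting past 70% just delays the decision

print(should_decide(0.25))  # wait
print(should_decide(0.55))  # decide
print(should_decide(0.90))  # decide (overdue)
```

The hard part in practice is, of course, estimating `info_fraction` at all; the rule's value is mostly in naming the two failure modes (deciding blind, and stalling for certainty).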
I'm sympathetic to this. There is a strong argument to be made that this is a need.
> It may be that creating certainty amid uncertainty is itself the core of leadership.
I would agree with this without reservation.
But the phenomenon here is being driven by what people want, not what people need. If they're benefiting from the certainty they get, that's just a coincidence.
Wanting and needing are different things, and while people may need some certainty, they want much more than they need, and they're getting more than the optimal amount.
I'm not sure I understand the phrase "our risk of the unknown". The risk something poses to us is surely the same whether or not we are aware of it—just that our mitigation strategies, and even the awareness that we need to mitigate, change in response to increased knowledge.
Perhaps the idea is that people are more hesitant to make decisions when they become aware that what they had previously taken as absolute is in fact conditional? That seems like a good thing (although I agree that it can be taken to paralysing extremes).
Most people are horrible at dealing with uncertainty when making decisions. I don't know why this is. But taking an uncertain landscape, making a decision and then projecting certainty works better than conveying the risks for communicating with everyone but people used to making executive decisions.
So if you have two politicians, one who channels a scientist's healthy (and realistic) scepticism and one who takes a random position and blasts it, the latter will tend to be more popular.
I think it depends on what 'better' means. It works better in the senses of getting things done, and of popularity. But, unfortunately, the things that get done are those that are some weighted combination of (a) rewarding in the short-term and (b) in the interests of the person who's good at projecting an aura of confidence.
If the 'right' decision tends to align with the interests of the decision-maker, then it's great to have that decision-maker pushing it through. But, when the decision-maker's interests are not those of the general public, paralysis might be better than populist marching into short-term gratification.
(On the other hand, I also recognize that not making any decision until you know it's the right one is just a long-winded way of never making any decision. Making decisions about whether and how to make decisions is just as complicated as the non-meta decisions themselves ….)
The reason that it is impossible for mere fact to end political dispute is that facts are only one element in policy. Values are more important in the long run and facts can only be used to improve policy around shared values.
People don't agree on values and they never fully will.
And if everyone had the time and resources to discover and digest every fact, facts might be definitive.
But everyone doesn't have time and resources. To compensate, we rely on others to curate facts for us. When we encounter an internally consistent subset of facts that suits our ideals and our interests, we adopt that point of view.
There are infinitely many subsets of curated facts that can be presented as internally consistent. That's why there are so many different points of view.
To complicate things further, it is difficult to get a man to understand something when his salary depends upon his not understanding it.
If an AI did do that, the “conclusions you don’t want to hear” would be artifacts of the AI’s algorithmic process or data set.
Living things in general don’t worry about conclusions. They just live.
If I remember correctly, Peter Watts has a somewhat more realistic take on this in his Rifters trilogy (under the Novels section here: https://rifters.com/real/shorts.htm), where there's a brain in a box that sifts through loads of information and gives advice to political leaders. The trilogy as a whole is more... deep-sea cyberpunk than particularly centered on the brain in a box, though.
Not totally omniscient AI but running large ships and space stations - Imperial Radch trilogy by Ann Leckie
AIs with personality running almost every machine and helping rule the empire in the Culture series by Iain Banks.
"Science doesn't tell you not to pee on an electric fence. It only tells you that urine is an excellent conductor of electricity."
From the TFA.
"what happens to politics ... when rhetorical appeals to 'the facts' start to dominate people’s justifications"
The bigger problems are probably on the consumer side, though, and not only in under-educated social groups. Here's an interesting post I stumbled across recently: "People who trust science are less likely to fall for misinformation -- unless it sounds sciency" (https://digest.bps.org.uk/2021/08/10/people-who-trust-scienc...)
This confirms my sense that we easily glom on to things that "sound right", and if you're a scientist or engineer "sciency" statements sound right to us. Do we really have the time to dive in deep enough to figure out whether it's pseudoscience?
There are a lot of problems with the way science happens, and a lot of bad data is never found out, because it can be hard to show, definitively, that it was fabricated or manipulated without someone from the lab in question speaking up about it (which probably doesn't happen enough).
At a certain point, though, there's too much information & data, good or otherwise, for any one person to parse. That's sort of why we have journalists: to acquire first-hand accounts and deliver them to a broader audience. The problem really arises when journalists significantly editorialize or disregard conflicting information. The Wakefield paper is a great example of that. Some journalists-cum-pundits' gross negligence with regard to the retraction of that paper constitutes misinformation, but it has proven very difficult to discredit because those actors abuse what we've all agreed is the role of the journalist: to give us valid accounts of actual events.
The study discussed in the article you cited even used a real, but heavily criticized (unknown to the participants) scientific study in their experiment. I think the question is "what is the amount of effort that it is reasonable to expect a lay-person to put into the in/validation of information presented to them?" with the caveat that trust is generally earned over time, but once earned, can be abused (and I use the word abuse very purposefully here, because it is a violation of one's relationship in a harmful manner). Should one be expected to more rigorously critique the statement of a trusted peer because of the potential for abuse?
It's a little counterintuitive, but people don't complain about stuff they don't deal with.
Exactly, I think that's the crux of it
But, politics and media do not currently thrive on presenting complexity, so if you're going to run the risk of asking a scientist their opinion, you either select scientists who are willing to dumb down the science, or you ignore everything they say that doesn't match what you already believed.
In the context of Democracy, the trouble with "actually, it's sort of complicated", is that there is absolutely no way that all citizens can approach everything this way and still have time for, e.g. the Pursuit of Happiness.
In other words, trust (and trustworthiness) is key. We must delegate to someone, whether expert or not.
Traditionally, organizations with cult-like properties envelop people in a kind of "information bubble" that makes deciding who to trust a tractable problem.
The culture of science does this to some extent, but is unable to compete well for a number of reasons.
Is there political counterflow that compromises integrity in some parts of science? Of course there is. For millennia this has been so. From Galileo to Darwin to Haber to Einstein, the political or religious disruptions arising from scientific theory, experiment, or technology often prompt someone to argue that governmental policy should not change in the light of new facts or a new interpretation -- not if that change disrupts cherished societal values or impedes vested interests.
It sounds like the author's argument is that political winds WITHIN scientific communities are fomenting bias in their work because they're not satisfied with publishing papers but now want to effect political change. Therefore they tolerate no dissent from an official party line.
Perhaps climate change warrants such circumspection, but I know of no other scientific subdiscipline that does. As such, the rise in politics in ONE scientific subject doesn't justify a book that seems to tar and feather ALL of science.
I suspect a book that attacks only the climate science community was deemed too narrow to attract a broad audience. And of course, exploring that topic wouldn't compellingly break new ground either.
Finally, it seems to me that this is exactly the WRONG time in history to be impugning science or scientists. Without concrete and viable suggestions of how to redress the forces that have broadly compromised scientific integrity (which I doubt the author proposes), chinking away at science's armor can only aid the cause of anti-science, and feed the rising barbarian horde.
> Perhaps climate change warrants such circumspection, but I know of no other scientific subdiscipline that does.
Anthropology is especially notorious for this, far more so than climate science (!), but you could describe all of the social sciences this way.
The problem we have in society right now regarding science, as I see it, is that we have a significant group of people who disagree about the objective parts.
The people who think the world is flat, the people who think the world's age is measured in thousands of years, the people who think all the world's top climate scientists are part of a massive conspiracy, the people who think COVID is a massive conspiracy, etc.
Our science problem as a society is that those people are mainstream. They're politicians, they're pundits, they're your next door neighbor, and they're all objectively wrong but they use the appearance of science to intentionally spread lies.
Ted Cruz is a good example. He argues that he's a scientist not because he has done science, but because his parents have done science. Thus, he's a "legacy" scientist, and because his arguments emanate from him, they must be scientific.
The man isn't this stupid. But in endorsing such nonsense, his supporters are willing to be. "We don't need no stinkin facts!"
How can reason hope to overcome willful illogic of this magnitude?
And that is definitely mainstream.
I see a lot of people who disagree with utilitarian proposals labeled as "anti-science".
Logos, Ethos, Pathos: you need all three. "This is true. I am trustworthy. This true thing is important."
It is very frustrating, especially to many of my fellow scientists, that bellowing "THIS IS TRUE" as loudly as possible is insufficient. But it is insufficient.
Statements about importance are value judgements that no doubt you have made, but are not really science and are basically political.
An easy example, "covid restrictions curb covid" may be true and you can provide experimental evidence. "We need to impose restrictions" makes a judgement about the overall situation under the restrictions being preferable to the one without them. This is not a scientific call, this is a societal judgement.
People too often pretend that an action immediately follows from some identified causal relationship. "Smoking kills you" does not immediately imply you should quit smoking, without the intermediate step of "you value your later life and health more than the ongoing benefits you get from smoking" (disclaimer: nonsmoker).
But eventually, someone needs to do the work to convince people "science is trustworthy." And for a variety of reasons (both objective and social), scientists should be on the front lines of that work.
The question of "this is important" is tricky because it must involve a two-way discussion with the audience. But there are cases where a scientist must be making that case, e.g. in order to procure funding.
This is exactly the source of the trouble. Hearing scientists project their value system onto everybody else destroys their credibility. If people felt that scientists were sticking to the facts and avoiding ideology and values, science would be better received all around.
Whether or not the Earth is getting warmer, and the cause of that warming, are scientific questions. Also science is a discussion of possible remediations and their costs. Which remediation to choose, if any, is a question of values and outside the scope of science.
Also important for science is being clear about levels of uncertainty and ranges of possible outcomes. Recent global temperatures have little uncertainty. Distant-past temperatures have a much higher level of uncertainty, and predictions about ranges of possible distant-future temperatures have yet more uncertainty. I never hear much about this from scientists or the media that cover them though.
So how do you propose scientists "stick to" the discussion of remediations of "we can remediate X by doing Y at expense Z" without it being portrayed maliciously as pesky interfering scientists saying "we should do Y"?
The problem of lack of trust is hardly the sole responsibility of scientists here.
Do you think discussing uncertainty more is going to raise trust or just be more fodder for the people with non-scientific reasons to oppose action?
I propose scientists "only stick to the science" if and only if everyone else in the world sticks to their wheelhouse as well. Lotta citizens, politicians, and pundits out there without much training in making good ethical decisions either!
As long as the output of a scientist will be interpreted through other people's political lenses its fair game for them to frame it politically too.
Not if they want to be credible.
> sticking to the facts and avoiding ideology and values
I think a few different things are being confused here, but as far as value is concerned, the choice to do science, to take scientific findings into account and to decide what to investigate are themselves the result of value judgements. There is no fact-value dichotomy. There is no "clean" separation between fact and value (as if value were a dirty word contrary to fact).
This fact-value dichotomy can be tied to the materialist worldview which denies objective value because it presumes a metaphysics that renders the world a kind of theater of senseless extension in space. Any value must therefore be a matter of subjective projection (and therefore delusion, putting to one side materialism's inherent inability to account for subjectivity). But this metaphysics is, to put it gently, problematic. The wish to separate fact from value (which I take to occupy one order, not two) is no doubt further encouraged by liberalism's pretensions to neutrality and the inherent tension within Lockean liberalism between science and liberty.
But I do agree that the actual deciding of policy is not to be left to scientists but to politicians and the like. Scientists are specialists who can supplement our knowledge in specific ways that generalists can then take into account along with other data and understanding when making judgements.
Of course lots of scientists care about facts first, but the scientists with an agenda are much juicier for journalists to interview and write articles about, so they are the ones we see.
(I believe in climate change, I am pro vaccine. Just noting this here since many will think that me having the above view means I am a climate change denier and anti vax)
I think you misunderstood; I am not saying that they try to discover facts to support themselves using science. I am saying that they want the credibility of a scientist. You know how people start to listen a lot to the views of a Nobel Laureate regardless of whether it is their field of expertise? That sort of authority is very attractive to a lot of people, and they will work really hard to get it. I am saying lots of people go into science because they want that authority; they don't care at all about doing science.
Yes, I judge people who try to corrupt our view of science. I never denied that.
> our concern for bad motives is separate from the question of whether science if value-laden (which it is)
Science having value is exactly why I don't want people to corrupt it. If you agree with me that science has value then you should agree that we should try to stop people from corrupting it.
If you argue that we can't judge who is corrupting, then I'd argue that you are so out there on the clouds with your definitions that we could just as well argue that a random youtube commenter is also doing science and that is a good thing and that we can't really say that youtube commenters are worse scientists than the people at universities since that is just a value judgement.
W.r.t. value, the only point I was making is that there is no fact/value dichotomy. It doesn't follow that I am therefore arguing that one cannot make value judgements. On the contrary, if no fact/value dichotomy exists and value is a matter of fact, then it follows that we can indeed make value judgements on par with factual claims.
But what I was addressing in an earlier post was the suggestion that there is a fact/value dichotomy and the notion that problems occur in science when value mingles with fact. I rejected this claim by arguing that there is no such dichotomy and by implication that the diagnosis is incorrect. Questions about corruption are fine as far as they go, but they are not relevant to this thread because they do not address the question of fact/value dichotomy and they presume value judgement.
This is where I've noticed mainstream political discussion often go off the rails. The best example is when "because science" is used to end conversation on the idea that 100% of the population should wear masks. Science, at least in this case, is clearly not prescriptive, so it can't be applied as a single justification like that. Perhaps science confirms that masks reduce transmission, therefore 100% of the population wearing masks is certainly one valid design. But one could come up with multiple other designs that would be equally confirmed by science to be effective at reducing transmission. So reducing transmission is not the hard part. The hard part is all the other variables that cause consequences in economics, mental health, other areas of healthcare, etc. Each model needs to be tested for its utility across a variety of factors, but that idea is lost with the "science" cancel cudgel.
It will not care about narratives and may bring facts that we don't want to accept.
One example is https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1467-9477....
After this paper, several other papers tried to milk other interpretations.
It's harder than that though.
Science is complicated, frequently messy and often adversarial, and that is how science makes progress.
There is no single narrative for scientific truth -- even when there is overwhelming consensus, science places high value on the coherent arguments at the margins.
This scientific embrace of complexity has been weaponized against us.
"Society" prefers a clear narrative they can comprehend, and science does not always provide it.
The media used to be relatively responsible stewards of the narrative, but that is very much no longer the case. (Not to blame the media -- the media is us.)
This question is not tricky. It is weighty and scary.
If it was tricky, there would be a puzzle you could solve to answer it.
Instead as you say it must first involve listening because people have the freedom to choose what is important to them.
Otherwise, you might assume that people place the highest value on sustaining human life. This can fail if you go to New Hampshire and encounter a person who values liberty more highly than life.
Yes, and science as an institution has been failing pretty badly at that. I posted this on HN just last week, but it's worth repeating:
1. Science journalism is almost universally terrible, so people already get sold half truths and sometimes even outright falsehoods from allegedly reputable sources. Messaging needs dramatic improvement.
2. The replication crisis has shown that up to 50% of published results in medicine can't be replicated (and up to 66% in social sciences), and there are virtually no incentives to replicate or publish negative results, and too many incentives to data mine/p-hack and publish sensationalized results (in fact, results that fail replication get cited more). There are now some efforts towards correcting this, but it's only just beginning.
I think there's another part that is also underappreciated, which is that the honesty of the public faces of science should be above reproach, and if someone in a public-facing position loses their credibility by violating public trust, they should no longer be the public face.
The best recent example is Dr. Fauci. He has openly admitted to lying to the public on multiple occasions, such as whether the public should wear masks and about vaccination levels for herd immunity. It doesn't matter whether you think he did the right thing in those cases, he has unquestionably violated public trust and eroded trust in science as a result.
Strictly speaking no, but that's more of a long digression on epistemology than what I think you mean. (but think of indiana jones here: "archaeology is concerned with _fact_, not _truth_ ...")
My role as a scientist is to work diligently to understand, as best as we are able, the natural world.
Yet, "I will always be conscious that my skill carries with it the obligation to serve humanity by making the best use of the Earth's precious wealth."
That thing may be true, but the rising disdain for not accepting a conclusion when the evidence isn't presented alongside is worrisome. The difference between science and religion is that science draws conclusions from reproducible research. Yet - especially now - many people take the naked word of 'a scientist' the same way a religious fanatic takes the word of their spiritual leader.
This definition gave me a lot of clarity: an expert is someone who has, can get, can make, or can cause to be made - and presents - evidence that supports their conclusions.
If you treat logos, pathos, and ethos as a self-certified checklist, you are doomed to fail as well. You must provide arguments of each of the three types that are convincing to others. Above all else, preaching to the choir is (IMHO) the biggest problem in my political environment (USA).
Given how strong gerrymandering is there's no benefit to reaching out to the other side, just hold the line while scoring political points.
This is why I categorically ignore everything from Dr. Fauci. He's known to have lied to the public before to influence behavior. Since I'd have to verify his claims from some other source anyway, there's no sense using him as a source at all.
> "There's no reason to be walking around with a mask," infectious disease expert Dr. Anthony Fauci told 60 Minutes.
> While masks may block some droplets, Fauci said, they do not provide the level of protection people think they do. Wearing a mask may also have unintended consequences: People who wear masks tend to touch their face more often to adjust them, which can spread germs from their hands.
Like, they acknowledge Fauci lying about not needing masks, which would imply that they should be wearing masks, but will now refuse to wear a mask because they think Fauci is lying about needing masks.
The situation gets even murkier when you talk about mask _mandates_ instead of individual decision making. The argument that mask mandates are helpful is tough to support in the face of the differences in the delta-variant curve, for example, in different counties in California.
Just to say it: even daring to compare the results in contra costa county and san diego county, california (which have different mask requirements) got me shadow-banned on reddit. The reasoning here is mostly political, not scientific/rational. No one cares what the science says.
Claims that "Masks slow the spread of COVID" get interpreted as "Masks stop the spread of COVID", and so when we have mask mandates and yet COVID still spreads, people use that as evidence that masks are worthless.
It's interesting that people can draw opposite conclusions from the same scenario. COVID has continued to spread despite mask mandates. Some claim that means the masks are worthless. Others (including me) would claim that, despite how bad it is, the spread would be even worse without them.
> individual decision making
In most cases, I agree that people should be able to make their own health choices. You wanna eat McDonald's for every meal and walk less than 50 steps a day? Go for it. Hell, snort a few lines of cocaine for dessert if you want to.
But when it comes to a pandemic, it's different. Sure, the vaccines are 95+% effective, and masks might be X% effective, and social distancing is Y% effective, and so on...but when >30% of the population has zero interest in doing any of that, then you can take every protective measure you can (Besides just staying in your house) and still get the disease from some asshole at the grocery store that doesn't care if they spread it.
Also, consider last year's toilet paper shortage, and the short gas shortage a few months ago. Individuals will often act irrationally in their own interests rather than what's good for everyone as a whole.
To think of it another way: at a pizza party, some people will take 3 slices of pizza because there might not be enough for everyone, so they want to make sure they get their share. Others might take only a single slice because there might not be enough for everyone, so they want to make sure as many people as possible get some.
Individual decision making only makes sense if people aren't selfish.
It isn't just that COVID continues to spread despite mask mandates. It's that the curves look nearly identical in areas with and without mask mandates. And, to show their effectiveness, epidemiologists have resorted to pretty serious P-value hacking.
Separately, I find it hard to get worried for my personal safety because of the 30 percent of people refusing to vaccinate themselves. It's just not that hard to avoid the sorts of places where such people are likely to be. And, being vaccinated and healthy makes it less of an issue for me than, say, the risk of a car accident. Sure, I could pass it on to someone else if I get it, but with reasonable precautions I don't think that's likely at all.
I don't see how that is necessarily a lie. It could have been the best public health recommendation he could make at the time based on the available information. Research into how best to use masks is ongoing, so I would not expect today's masking recommendations to be the same as tomorrow's.
It's also not the only lie he's told and admitted to. See his claims about herd immunity, where the numbers kept going up, and when he was questioned on this, he outright said he just gave out numbers that he thought the public would accept at the time.
It's clear that what he and other scientists said about masks early in the pandemic certainly changed over time. I think science is like that. People who like simple, certain, unchanging answers can get them from religion or ideology. People who don't mind complexity, nuance, and change are more comfortable with science.
Assuming Fauci was lying because what he said then isn't what he is saying now just isn't logical.
He literally explained his reasons for lying.
Fauci admits to lying about masks and explains why: https://www.youtube.com/watch?v=kLXttHlUgK8
Fauci admits to moving the goalposts on vaccination rates and explains it's because of his "gut feeling that the country is finally ready to hear what he really thinks": https://www.nytimes.com/2020/12/24/health/herd-immunity-covi...
> It's clear that what he and other scientists said about masks early in the pandemic certainly changed over time.
This isn't about that, this is about literal deception from a public health figure about public health.
It is not the specific content of the lie that is the issue, but the lack of integrity on display. It is used as a retort to "official X declared Y", and is meant to undermine the integrity of official pronouncements in general. There are many who bristled at these initial claims by pointing out (correctly) that promoting "noble lies" is terrible for public health officials and doing so would come back to bite them. For some reason, the medical profession seems to accept noble lies as being justified when the rest of society does not. This goes back to the old saw of doctors lying to their patients about their own health. It's a blemish on the profession, and one that needs to be erased and apologized for ASAP, and IMO, Fauci belongs to that old school and doesn't really get it -- and probably never will.
Also, a protip about calling it "ironic" that people who acknowledge being lied to about masks still refuse to wear them: the literal meaning of "irony" refers to saying something but meaning the opposite. For instance "Sure, I trust you", when the speaker clearly doesn't. There is also situational irony, which occurs when the opposite of what is intended happens, e.g. trying to kill someone by giving them a poison that ends up curing them. So in this case, the irony would be telling a "noble lie" with the intention of saving lives but actually causing more lives to be lost -- that would be the true irony here.
It's similar but more emphasized with chimpanzees.
The problem is that we're not chimpanzees. We are smart enough to make very powerful technology, but not smart enough to use it sustainably.
Nice word choice.
Nobody's perfect; what builds the sort of trust that permits action is a history of publically doing one's best:
I would like to add something that’s not essential to the science, but something I kind of believe, which is that you should not fool the layman when you’re talking as a scientist. I’m not trying to tell you what to do about cheating on your wife, or fooling your girlfriend, or something like that, when you’re not trying to be a scientist, but just trying to be an ordinary human being. We’ll leave those problems up to you and your rabbi. I’m talking about a specific, extra type of integrity that is not not lying, but bending over backwards to show how you’re maybe wrong, that you ought to do when acting as a scientist. And this is our responsibility as scientists, certainly to other scientists, and I think to laymen.
Do that, and you'll earn your Ethical appeal easily enough.
Let me be clear, I think arguments about ethos generate the most argumentation and heat right now, but the silent killer is really pathos. People feel comfortable arguing ethos. Most people will not argue pathos openly though.
We have a crisis of pathos in our culture. How do you claim that something is important without an appeal to authority? You can't. We live in a culture that is fragmenting its sources of authority; different groups of people have different sources of authority.
Here's a good example. The external dialogue that a lot of conservatives give on climate change is that scientists can't be trusted. The internal dialogue that a lot of them (though not all) engage in goes something like, "The earth doesn't matter. God is going to come back and set things right. So we don't need to worry about it anyway."
This is particularly apropos in the wake of Russell Brand's recent video on FB's fact checkers for COVID-19 vaccine information: the fact checkers are funded by Big Pharma, have a huge financial stake in Big Pharma, and intentionally hid that funding, adding no notification that they are in fact funded by and invested in Big Pharma.
I am starting to read completely disinterested third-party sources more, because weighing the pros and cons of a thing is more challenging when the supposed experts are deliberately concealing material relationships. Some old emeritus professor, already retired, with several PhDs and just a passing bit of information, often gives great criticism of a thing without having any stake in the outcome.
While I understand your argument, I believe there's something more going on.
Taking COVID, for example. From the pro-science side, we get:
"COVID is a dangerous virus. We should take X, Y, Z actions" from somebody with a PhD in public health, medicine, or similar.
And from the anti-science side, we get:
"It's fake. It's the flu. X wasn't perfect, therefore X, Y, Z are all a plot to make you magnetic" from somebody on YouTube with no credential beyond being on Youtube.
I guess what I'm saying is there's something fundamentally broken about how people process information. All the Logos, Ethos, and Pathos in the world doesn't help when a significant portion of the population is brainwashed.
Many of the "facts" about COVID are not actually in much dispute. The people who view it as "a dangerous virus" and those who see it as "the flu" are much less far apart than they appear.
Nobody has seriously disputed the R0 values for COVID, nor the (rather) broad range of deaths per 100k cases. What isn't agreed upon is what the implications for policy should be.
Those who view it as a dangerous virus point to its transmissibility and capacity to cause hospitalization, and therefore its implications for public health issues.
Those who view it as "the flu" point to its relatively low death rate compared to other global pandemics of the past, particularly when adjusted for demographics, and correctly note that a given individual's chance of dying from COVID is extremely low (at least in countries with roughly adequate health care systems).
Both can claim "facts" on their side. The question is what policy consequences follow, and that is where the major differences lie. Depending on your perspective, the deaths (and long term illness) caused by COVID are variously worse, better or about the same as the damage caused by policies to contain it. Since making this assessment necessarily involves subjective judgements and questions of morality, it can't be settled by an appeal to science let alone mere facts.
None of this is to say that there are not relatively fact-ignorant people making various cases for certain policy approaches. But we could ignore those people, and the core debate would remain, and it's not a debate about truth or science, but policy.
On vaccines, case rates, etc. I agree with what you said.
As far back as you can go, yes. 2.5M years. But I think the point is not to call what exists "broken," but to understand and then use it to the advantage of humanity. Cult-like behavior is human behavior--i.e. the rule, not the exception. Now how do we use this to advance the species?
For context: I grew up Mormon, which I consider "cult-lite mainstream" on the cult<->religion spectrum. Having gone through that and left the church, I've spent some time deconstructing what influenced me to think the way I did as a believing member.
The best advice I can give is (a) make friends with people who are inside information bubbles*, (b) remember that people are motivated primarily by feelings and needs** despite what they say, and (c) understand that people often cover what they're really feeling with political, philosophical, and ideological language, which is often very difficult for outsiders to decrypt and leads to significant misunderstandings, which usually serve to further separate people and prevent the friendship that's needed in the first place.
* see Daryl Davis (https://www.huffpost.com/entry/black-man-daryl-davis-befrien...)
** see Nonviolent Communication, a book that I place in the "top 3" most influential books in my life (https://amazon.com/Nonviolent-Communication-Language-Life-Ch...)
Then why do we see articles saying that YouTube etc. bans people with great credentials who are anti-vax? There are people with great credentials on every side of every argument. Great credentials don't stop you from being an idiot or being wrong. If you believe the first person with great credentials you see, and that person happened to be anti-vax, would you be anti-vax? Sounds like it to me. I'm pro-vaccine, but I am strongly against blindly listening to people with credentials.
And there's a big difference between a scientist saying "people with previous COVID infections have more anti-bodies than those with vaccinations" and Joe Rogan saying "COVID is fake/just the flu".
The first is potentially true, and most of us aren't qualified either to verify it or to derive any course of action from it; regardless of its truth, vaccination is probably the appropriate action for any individual (better safe than sorry). Using the statement as an argument to avoid vaccination is bad policy.
The second is an outright fabrication, yet we still have a significant portion of the population believing that crap.
Secondly, Rogan has never said COVID was fake or just the flu, and the reason people assert that COVID is fake is down to motivated reasoning, which is the same reason Fauci thinks he was justified in lying to the public.
Getting people to change their behaviour starts by not dismissing them or dehumanizing them, and trying to steelman their position so they and you fully understand why they want to dismiss COVID. When that's done, it's often clear where you can compromise. Sadly, that's not what we see going on.
Including wanting to implement many of the same draconian policies we see today for communities where AIDS was discovered.
And how do they get funding to do science without showing its important?
Human gossip is society's field effect.
A thing may be true but politics is obfuscating the importance.
This ivermectin thing basically proves that logos isn't useful. Ethos + Pathos alone can convince a large population of people.
The Apple "Reality Distortion Field" was never about logic. It was about making people feel good about buying Apple products. That's fine, because Apple has decent enough products (I don't like them myself, but I can see why some others would like them).
But today, we can apply the "Reality Distortion Field" to any subject. Most recently: ivermectin.
What I don't get: why are people choosing to push snake oil (like ivermectin), instead of pushing the drugs that do work (3 different vaccines, dexamethasone, and monoclonal antibodies)?
Society has developed working treatments for COVID19: dexamethasone cut the death rate in half IIRC, and monoclonal antibodies cut it in half yet again. And yet, people are seeking treatments that straight up have no evidence of working.
I can put papers up for the efficacy of dexamethasone + monoclonal antibodies, and how this cocktail saves the lives of countless people across this country. But then I'm suddenly left in a "Russet's teapot" scenario where I'm apparently supposed to prove-a-negation when discussing ivermectin (even if countless papers fail to distinguish ivermectin from the null-hypothesis).
People's brains turn off. Because today's reality distortion fields / marketing / propaganda are much, much stronger than logic.
Let me tell you how to do things in today's world.
1. Automatically find the people who have the poorest logic. Use ads, memes, and other such "low-quality" discussion points to find the lowest functioning brains. For example, clickbait headlines or "Nigerian Prince" scams. The dumber the argument, the better.
2. Reasonable people will ignore you. The only people who will interact with you are people with weaker argument skills. Spend as much time convincing _this_ group of your benefits.
3. Make it fun: give them memes to share with their friends. Even if it's a bad / crappy argument, that's okay. That's what memes are about.
4. Sit back and relax as your crowd automatically spreads whatever argument you want amongst their friends and family. Now they're doing the hard work for you.
5. Bonus points: get enough people moving as a crowd, and even smart people start to get drawn into the masses. You'll start finding apologists who make better arguments on behalf of you. Keep up with the meme culture and pick/choose the best arguments. Crowdsource your marketing: the memes that become popular are the arguments you want to use.
At no point is "working" on logos actually beneficial to building a RDF (reality distortion field). You can build ethos + pathos simultaneously by just seeding opinions into a crowd through meme culture.
Bonus points #2: Use really, really bad arguments (World is flat. Let's go to Mars. 9/11 was a hoax. Hydroxychloroquine can save you from COVID19) as practice. The better you get at seeding bad arguments, the better you get at seeding any argument.
> I'm apparently supposed to prove-a-negation when discussing ivermectin
is it possible there is some physiological issues about "going to the doctor" vs "self help/healing" ?
It's not physiological. It's simply marketing.
It's no secret that ivermectin / hydroxychloroquine makers are benefiting from this snake-oil bullcrap. It's no different from essential oils or other such snake-oil products.
We just didn't care about essential oils 3 years ago because it's fine for idiots to waste their own money on snake oil. But when the masses are tricked into distrusting COVID19 precautions and start spreading the virus around even more, it's a bigger deal.
> It's no secret that ivermectin / hydroxychloroquine makers are benefiting from this snake-oil bullcrap. It's no different from essential oils or other such snake-oil products.
> We just didn't care about essential oils 3 years ago because it's fine for idiots to waste their own money on snake oil.
It's "Russell's Teapot", it's named after Bertrand Russell, not a potato.
We are so lucky that COVID is only going to kill a million Americans or so (700k and counting, probably 850k by Jan 1st); if it had the death rate of smallpox or the Black Death... ye gods. Our lack of respect for science is a huge part of vaccine hesitancy and of our key role in continuing the pandemic.
The news media are terrible because for decades they have reduced science and technology to simplistic caricatures, Star Trek gobbledygook, or, even worse, "WELL, THEY SAID THIS; NOW THEY SAY THAT."
The general media's constant use of the high-school-nerd trope for all science and technology proficiency has been a long-term failure in the progress of our civilization. The only reason it hasn't been worse is that such proficiency generally leads to anything from upper-middle-class money to vast wealth.
But that media trope has always colored all science and technology policy as culturally "well, that's what the nerds say, and they aren't cool or fun".
Possibly it goes back to how most ultra-rich """elite""" made their money in America: by control and exploitation. People that have influence and position distrust science, because it is complicated and unknown and preys on their paranoia, and so often for industrialists produces complications to their overall plan of "make money by selling products, offloading the actual environmental cost/impact of those products on ... anyone else".
Maybe I'm paranoid, but it does seem with the mastery of social media propaganda/manipulation by monied interests, sowing distrust in science is now at an all time high.
So the business-friendly right will distrust science because it threatens their money and power. The apathetic center won't like science because it isn't cool. The left... well, scientists are generally white and male so they are distrusted by identity activists, and the rest of the left is too disorganized to rally around science policy effectively.
Of course, this is exactly the opposite of how science works today. There are poorly written press releases galore, and flashy but tenuous results that seldom hold up to scrutiny. There's a reason a journal like Nature is routinely mocked in scientific circles.
Science absolutely exists as a thing on its own, “pure” if you will. And alongside it and with it also comes politics, yes! But that’s a feature, not a bug— both play a role. A very different role.
Science is for everything descriptive and inferential. Politics is for everything prescriptive.
we should not be saying: "we should do x"
Unless of course we're up front about what our goal/premise is. Then it's absolutely appropriate.
For eg.: "If we want to minimize the burden of disease from X in Y conditions, we should take Z course of action."
Other eg.: rather than saying "you should not urinate on the electric fence", a good scientist will say "if you want to minimize the likelihood of unnecessary suffering, you should not urinate on the electric fence."
Interesting. I thought it did a great job of illustrating that while the quote is true, it represents an "ideal" (or, put pejoratively, a "fantasy"): because doing any real amount of such ideal science happens in the real world, in a real context, it never ends up anywhere near pure and unsullied. It's like the fact that adding an even number and an odd number will always produce an odd number.
Politics, being concerned with the allocation of limited resources, does two things to complicate this ideal picture:
1) It decides "which science" gets done right now, which means the progress of science may not be uniform in all areas.
2) It applies research findings improperly to serve non-scientific goals.
Notwithstanding, the science remains just science. And crucially, trying very hard to stop or limit #2 from happening, in particular, is essential in making better _political_ decisions.
Political decisions are inherently about how to manage conflict between social groups, and while it's not really my place to judge the quality of any given political decision, it's hard not to judge the quality of a political decision which is not based in the physical reality that we all share.
"Settling Politics" amounts to everyone agreeing about what to do on every matter all the time. Otherwise politics just isn't going to be settled.
If the question is why science hasn't settled a particular debate, the answer is typically that the facts aren't really in question, and there is a secret debate about who is going to have to wear the costs.
But it's a misconception to think that the goal of politics is to "be settled". The goal of politics is to make decisions about how to rule the land right now; politicians don't get to wait 15 years and then try again (yes, some do, but that's a different problem).
there is a secret debate about who is going to have to wear the costs.
It's very sad that this has become a "secret" debate, because this is the heart of politics. I haven't decided if I blame politicians or the sensationalist media more for this, but these debates should not be secret. This is not just a problem in the US, in Europe I see the same: we are stuck with career politicians that are too chicken to either state their real beliefs in public, or to act on those beliefs in the senate.
Politicians see a block of wood and ask, how do we use that wood? Do we build a house, make a fence, or carve it into an ax handle.
So, it's really apples and oranges. Scientists use the scientific method to determine the truth. Politicians are not seeking the truth, but are using consensus to govern the actions of their constituents. The consensus can be based on science and truth, or on irrationalities such as fear and hope.
Science deals in the objective while politics deals in the subjective (preferences).
People are so attached to their preference that they dismiss facts and attack the whole concept of knowledge to maintain that preference.
The rational move would be to take those facts into account, folding risks and potential mitigations into the preference to keep using them.
What we observe in much of the political sphere however, is often just abject denial to maintain the current preference.
If we think about truth bending to subjective reality, I am certain that some of the churchmen and scientists opposing Galileo's embrace of Copernican heliocentrism were completely certain geocentrism was the truth, because their own astronomical observations were subject to slavish obedience to doctrine. The doctrine in their heads influenced their own observations. Narrative overwhelmed the senses.
It is the mad among us who have the rare ability to completely discount all external input and see a thing for what it truly is. And that is why we call them crazy.
You are conflating "science" and "scientist" here. The way science overcomes bias is two-fold: 1) by training scientists to be aware of their own biases, and 2) by having multiple scientists, each with their own biases, try to reproduce the same experimental results.
Your artificial example is just that, they all fail 1).
Could anyone explain to me what the phrase "may be damaging democracy" means? I've been hearing it a lot over the last several years, and I've been wondering what democratic ideal people who use this phrase have in mind. It looks as if independent opinion-making based on one's trusted sources "may be damaging democracy", because most of us are not equipped to identify fake news; open and free exchange of opinions on social media "may be damaging democracy" for the same reason plus due to the tendency of falling into warring tribes; and now this tagline from the article claims that bowing to the experts may also be damaging democracy. Is there anything that doesn't, and what does it all mean?
IMO, it means something like undermining the trust in elected government and law makers.
There's a lot of policy making for which there's no solid evidence, because –crudely said– social sciences are far too sloppy. However, I don't think much policy is based on it. But in the public on-line debate (whether that's run by trolls or not), many appeal to science in their arguments. Perhaps that has some impact?
The article itself has the same vibe as online debates. Its first example of biased science is medical experiments using only men, plus some totally unnecessary anthropomorphization of the reproductive process, which sounds more like virtue signalling than anything pertaining to the topic. Sapir-Whorf-like arguments complete the picture.
Other arguments in the article point out that scientists can have a rather limited vision, resulting in sub-optimal solutions. But as I said above, I don't think much policy making is largely based on science, and the examples given point as much in the direction of tunnel vision by policy makers as the scientists they consult.
> Is there anything that doesn't
Where trust is absent, everything appears hostile. That would make a good Latin fake quote...
As a thought experiment, consider protests against fraudulent elections in other countries (Russia and Belarus come to mind, but I am sure I saw some other countries in newspaper headlines recently). Would you consider those protests to be an exercise in democracy?
The thing is, these companies are running the elections like a black box. Princeton University has already shown it is easy to hack Diebold machines, for example, and we have had very suspicious cases in other countries (for example, voting machines in a state election in Brazil registered more votes than voters, and the counts of "invalid" votes, absentees, and "blank" votes were all identical; when someone asked for a recount, the government's response was to insist the machines must be trusted and to fine the complaining candidate heavily, for "education purposes").
Fraudulent elections are happening in countries that are, in fact, no longer democracies and have not been for some time.
I'm open to arguments that the U.S. is no longer a functioning democracy, but the ability to still have mostly peaceful transitions of power and for election audits to come back clean reassures me a little bit.
For example, the Swiss constitution of 1848 was the result of a civil war. Its provisions made the division of powers between the federation and the cantons clear, and Switzerland has never had a violent crisis again.
When put like this, this statement sounds truly conservative (small c, no necessary relation to parties); but I am hearing this phrase from people who are fine with the disruption of the political arena in the name of what they consider to be progress.
I doubt the wisdom of people who say - I want to stop global warming so I studied climatology. You're going to open yourself up for a heartache if the plan is to study and learn enough so that you become a voice of authority and then use that voice to control what happens.
It's not like people are terribly confused about what the least damaging course of action to take is. If resources were unlimited there would be little disagreement about what to do. Resources are limited though so we need to choose who gets what they want and who does not. There are messy compromises that can be made. Tricks that can be played. Sometimes there's power enough so that only one side sacrifices. In all of that though Science can inform what choices are available but when it starts trying to make them it will get slapped down.
What bugs me about these theses is that they lead people to question science, which is the best set of tools we have for reasoning when used properly.
Put another way: in modern colloquial English, what you call science is almost always called the scientific method, and what everyone else calls science is really academia. When people start publicly calling for other forms of reasoning and problem solving to be taught, then your argument will be convincing.
Enough people aren't being trained to think critically nowadays, and increasingly, societal pressure, which used to put pressure on officials to vet and verify facts and positions, is diminishing. Until that changes, we will fall further and further into the cult of celebrity/personality, which is fundamentally appeal to authority.
A person reading this MIT article should ask "Why should I believe what this article says?" If the answer is "because it's MIT! Duh!", then that's the problem in a nutshell. MIT is not a substitute for their own critical thinking faculties, nor is an MIT professor/academic/spokesperson/author the absolute arbiter of truth.
I haven't really seen an increase in people doing the hard work of gathering facts, considering new perspectives, and challenging their own assumptions or prior beliefs. It's more like switching from uncritical consumption of NYT to uncritical consumption of YouTube.
I did like your idea that this could be a good thing, though; I'm not sure I believe it yet.
Except in the past you could create (via mass media controlled by, quite literally, five people) an airtight, completely impenetrable narrative and feed it to the public, and now the public can get both the information, and conflicting disinformation elsewhere. Oops. Bet the CIA did not think of that when they helped create Twitter and Facebook.
I think you are viewing the past through rose-colored glasses.
Science itself has been politicized: the pursuit of truth has become secondary to the pursuit of funding and alignment with the political agenda du jour. When I was younger, my faith and trust in science was quite high. As I have aged and seen more of humanity, and how it permeates all aspects of our existence, I don't trust science like I did when I was younger. Now all I see are the motivations of those who are doing the "research".
Having studied statistics in college with the express intent of understanding how it is used in scientific studies, I am well aware that with enough data you can get almost any statistical result you want. Even better, vaguely word it so it resonates with mainstream media while still giving the authors an out with their peers.
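The mechanism behind "enough data can give you any result" is essentially multiple comparisons: run enough tests on pure noise and some will look significant. A minimal sketch (hypothetical simulation, not any particular study's method) comparing two groups drawn from the same distribution:

```python
import random

random.seed(0)

# Simulate 100 "studies", each comparing two groups drawn from the SAME
# distribution, so any "significant" difference is pure noise.
def fake_study(n=50):
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    # Standard error of the difference of means; both groups have
    # known variance 1 by construction, so SE = sqrt(1/n + 1/n).
    se = (2 / n) ** 0.5
    z = (mean_a - mean_b) / se
    return abs(z) > 1.96  # nominal two-sided p < 0.05

hits = sum(fake_study() for _ in range(100))
print(f"{hits} of 100 null studies look 'significant' at p < 0.05")
```

On average about 5 of the 100 null studies clear the p < 0.05 bar by chance alone; report only those, and you have a "finding" from nothing.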
This explanation doesn’t resonate with me. Research most often has incremental results that need to be carefully qualified. Isn’t the far bigger problem that mainstream media takes subtle research results and “simplifies” them for the public, adding certainty and often misinterpreting the results completely?
Political agendas have recently been systematically trying to erode trust in science. (Because science and truth do threaten some politicians.) The idea that science can’t be trusted, as the high-level summary, is exactly what some people want, and it seems to be working. But what is the alternative? We have nothing better. The point of science is to try to protect against motivation and agenda, and it does work sometimes. Even when people are motivated, reproducible methods and peer-reviewed results help filter out some of the badness. And if that’s not enough: what should we do to improve it?
Solar geoengineering is the hobbyhorse I usually use as an example of this. It's an open secret (see e.g. https://www.nature.com/articles/d41586-021-01243-0) that most scientists in the field won't research it, partly due to safety concerns but partly because they think that decarbonization is the right policy and they don't want to risk "detracting from efforts to rein in greenhouse-gas emissions". Maybe that's the right judgment, but it's hardly free from motivation and agenda.
Grant review sessions most definitely involve some amount of cronyism, and a famous luminary in a field will almost certainly get funding even if their grant application is worse than that of somebody younger, with fewer citations and less of a track record. But at least there's awareness of it now. You'll find scientists on Twitter being very open about it, whereas before you'd mostly only hear it at the pub during conferences.
There's a great passage from E. T. Jaynes that I'm having difficulty finding right now. He was a physicist from the 1940s on, and he describes how, as a young graduate student, he had to be very careful about what he studied, because if it had the potential to contradict one of the big names in the field, it could tank his entire career before it got started. If he hadn't "played ball" early on, we'd never have gotten his later Bayesianism.
Science is always a human process; there will be politics to some degree. Politics can only be minimized, never eliminated completely.
Now, imagine that your position truly is objectively factually and morally correct. (I understand things generally don't work this way, but stick with me for sake of example.) All you have to do is find a vocal group of people who disagree with the morally and factually correct position, and you can now call this "political stance" which someone can be "biased towards." (a simple example might be flat-earthers.)
Now of course, this same process can work out in nearly any permutation: the mainstream group could be pursuing an idea which they believe is morally and factually correct, but of course simply be wrong. And then the protesting group, calling out the politicization of the issue, could then be the correct group.
It must also be stated that the closer you get to the hard sciences, the less any of this politicization works. Things such as semiconductors and computer chips all rely on science, and no one is politicizing whether they exist or work. What gets politicized is a bit more predictable: questions about the causes of society's ills (i.e., the social sciences), questions about the roles of men and women, questions about public policy.
I suppose the point I'm trying to make is that it's quite easy to call one man's truth a "political" ideology. If you poll enough people, nearly anything is "just a point of view" which could be construed as political.
The softer disciplines are particularly susceptible to the style of the day, yet lust after the cloak of correctness (and status) that the hard sciences wear.
Just doing what a good pseudoscientist does: first dismiss any rigorous logical tools...