Based. Nullius in verba [1]. The whitewashing of censorship is accelerating the erosion of trust in the institutions, and it's refreshing to see a long-standing institution take a bold stance.
The problem is that we sometimes have to rely on other people. We can’t all be experts in everything. We can’t all reproduce or witness every experiment.
What we can do is delegate the process.
There’s a difference between saying “whatever X says is true, because X said it”, and “I trust what Y says because Y has explained the process and methods of how it was derived, and I don’t have reasonable doubt to assume Y lied”.
What to do when you have two very different opinions coming from experts:
- Anthony "I am The Science" Fauci
or
- Dr Malone (mRNA inventor)?
Who to believe?
Difficult to say, but these days, sadly, the first (and prudent) thing to do is to follow the money.
While credentials matter, the first thing I check when I read a paper/publication is who funded the work.
Doesn't really work. Social media grifting can be really profitable - it brings in audience, clicks, ad money. Even just audience is enough as long as you can come up with something to sell later.
Even if the original paper was an unfunded honest mistake, grifters can find a way to profit from spreading it, especially if it's novel and sensational.
All claims I've seen that Malone is "the" inventor of mRNA vaccine technology trace back to Malone himself. He possibly has some real claim to be "an" inventor, among the top dozen or so contributors of early advances that ultimately enabled the present vaccines. It's also possible that the contribution he's claiming as his own was basically his supervisor's idea (Felgner's), and Malone is just the one who did the lab work:
Either way, I'm pretty sure Malone's claims about vaccine safety are dangerously wrong. After literally billions of doses of mRNA vaccines, the adverse effects he's been warning about simply haven't materialized. The rate of those effects isn't zero, and surveillance at mass scale has identified adverse effects that the original trials weren't powered to detect; but for now, the risk/benefit ratio for the vaccines looks highly favorable. Malone's baseless assertions otherwise are causing real harm.
That said, please don't take anything above as a defense of Fauci or of censorship. While I'm pretty sure that Malone is dangerously wrong, this pandemic has repeatedly demonstrated that the mainstream consensus can also be dangerously wrong--so like the Royal Society, I'd rather tolerate false information from the contrarians than risk a false mainstream view that can never get corrected. That doesn't mean contrarians are automatically (or even usually) right, though.
"Follow the money" may sometimes be a useful standard, but I don't see the relevance here. Fauci appears to get his money from the government, where he's the single highest-paid employee but still earns less than countless anonymous tech workers. This seems more to me like a matter of prestige, ego and power for both men.
> I'd rather tolerate false information from the contrarians than risk a false mainstream view that can never get corrected. That doesn't mean contrarians are automatically (or even usually) right, though
I don't. I think it's reckless to tolerate contrarian information in public (and in fact, it's causing a much higher death toll during this pandemic).
I agree that specialists should still be allowed to discuss contrarian information, via papers, peer review and the overall scientific process.
But should software engineers (like most of us here) really be granted a platform, a listening audience, for their ideas about virology?
What exactly is the "false mainstream view that can never get corrected"? I'm likely out of the loop to some extent, but with regard to, e.g., masks, that was retracted and corrected; vaccines preventing transmission/reception, that was retracted and corrected. What else am I missing, or is it simply that people don't like "flip-flopping"? Which is strange, because there wasn't massive outrage at the way "cigarettes don't give you cancer" was propagated, or asbestos, etc.
In the two examples that you give, I don't think there was significant censorship of the initially-contrarian viewpoints ("masks are likely enough to protect against SARS-CoV-2 that they're worth a try", "the vaccine is highly effective against death and serious illness but much less so against infection long-term"). Perhaps in part because of that open discussion, those viewpoints indeed quickly became the mainstream.
I was thinking more of stuff like the origin of the pandemic. It's far from proven, but I believe it's entirely possible that (a) the SARS-CoV-2 pandemic originated from reckless virological research; (b) if this research continues, then new pandemics will occur by similar means in future, with similar or worse mortality; and (c) the only force likely to prevent that is public outcry against continued funding of such research. For about a year, discussion of this possibility was grounds for a ban from Facebook. It's only thanks to the huge reputational risks taken by a small number of researchers that this topic has now entered the mainstream. Without those few dozen people, we could easily have landed in a world where that topic remained forbidden indefinitely.
Closer to the Malone case, the BMJ published an article alleging improper conduct by a contractor that conducted some of Pfizer's vaccine trials. (For emphasis, they're not saying that any of these allegations, even if true, would mean the vaccine is unsafe. The point is to expose and correct localized procedural failures before they become patient safety failures.) Facebook marked this as "partly false information", and indicated that users who shared the article would see their posts deprioritized. This has been widely publicized, and Facebook hasn't changed its position.
So if the vaccine were actually dangerous, then would I ever find out? (Again, I strongly believe the vaccine is safe; but I'm asking this hypothetically.) I think I still would for now, because the censorship on the major platforms isn't complete, and I spend enough time on other platforms that they don't have total control anyways. It's not a comforting trend, though.
A group of authors including Ralph Baric, the father of modern coronavirology, published a letter in Science calling for investigation of all possible origins of the pandemic, explicitly including a research accident:
Last I checked, Baric still thought a natural origin was more likely; but the point is that he considers an unnatural origin sufficiently likely that an open investigation is required. The FBI assessed with "moderate" confidence that the pandemic had unnatural origin, while some other agencies assessed with "low" confidence that it was natural and the rest declined to judge.
We don't know how the pandemic originated. Nothing is proven in either direction, but mainstream consensus now absolutely includes the possibility that it originated from such a research accident. Your dismissal of that as "FUD and propaganda" now puts you in a position as fringe as the opposite would have been eighteen months ago. From your other comment, I guess you think I should have been censored for entertaining it back then. But given that the mainstream consensus has since changed--apparently without you realizing--do you believe that you should be censored for rejecting it now?
Unless you somehow become our dictator, you are unlikely to find that the censors' arbiter of truth always agrees with you. Specifically on this website, I believe that Paul Graham is our dictator; and from his Twitter account, I'm pretty sure he disagrees with you on this matter:
He surely can afford to moderate as strictly as he wants. But he doesn't seem too inclined to censorship, so you got to post your comment. I'm fine with that, among other reasons because it gave me the chance to explain the basis for my beliefs, and possibly change your mind (or that of someone else reading).
As to your link, I assume you're aware that the host and guests on TWiV have advocated for and performed exactly the kind of high-risk research that may have caused this pandemic? That certainly gives them special expertise that deserves attention, but to trust only them on this question is like trusting only Monsanto on herbicide safety.
These are some fair points, but I have some objections:
> mainstream consensus now absolutely includes the possibility...
I don't think this is the case... Your belief is obviously different to Ralph Baric's which is yet different to the belief of those who worked in the Wuhan laboratory.
I.e. the scientific consensus was that it was extremely unlikely, Baric now says that this is less likely than the zoonotic origin, and you say "entirely possible".
You can use the catch-all "includes the possibility" to treat all of them together, but if you had to pick numbers for the probability of the lab leak, you'd likely pick different numbers than Baric or others.
I.e. just because something is not impossible but merely "extremely unlikely" it's still reckless to have our media talk about it in the way it has.
If anything, if the real consensus is that we don't know... The media should not carelessly talk about any hypothesis without also mentioning the others, and why they are more/less likely.
This is not a topic that needs to be hashed and rehashed every few weeks: the consequence of treating it the way it has been treated is that now a bunch of people think that the lab leak is what actually happened, and just today I've seen another article which defends Joe Rogan by saying that there's "consensus on the lab leak hypothesis"...
I.e. if there should be censoring about this, both me and you should be censored, for not being concrete and impartial enough.
> I'm pretty sure he disagrees with you on this matter
I'm not sure what this appeal to authority is meant to imply. Of course the decision process for how/when to censor is a delicate one, and ideally left far away from millionaires who think that they are more competent than they actually are.
To clarify, just because a private person owns a platform, it doesn't mean that they should be the only one to make the rules on what content is allowed. They can make things stricter, but they shouldn't be able to make things laxer by allowing what's otherwise illegal (obviously, that depends on jurisdiction, which is why countries censor websites via DNS or routing).
> I assume you're aware that the host and guests on TWiV have advocated for and performed exactly the kind of high-risk research
The only people who describe this as "high-risk" are also the people who believe in a lab leak being actually what happened
Do you think that EVERY "gain of function" experiment is high risk?
> If anything, if the real consensus is that we don't know... The media should not carelessly talk about any hypothesis without also mentioning the others, and why they are more/less likely.
As I said in my previous comment in exactly those words, "We don't know how the pandemic originated". So I'd certainly agree that the media should make that clear. (I mean that as my personal opinion, not a call for censors to force them to.)
Given that we don't know, regardless of whether one thinks a research accident caused the pandemic with p = 0.01 or p = 0.99, I believe that an investigation of all possible causes is required. Ralph Baric and I probably disagree on the exact probabilities, but we agree on the investigation. There are many significant unexplored paths for that, even without the PRC's cooperation, including subpoenas for the records of the WIV's American collaborators.
With millions dead, such an investigation is inevitably political. You'd probably rather the investigation were left to scientific experts, and I would too; but someone has to choose those experts. In a democracy, that job goes to elected politicians. The performance of those politicians is ultimately judged by the voters. Without open discussion, I don't see how the voters could make an informed choice. (I guess the politicians could decide what information the voters deserve to know, and the voters could judge the politicians according to that filtered information; but I assume you see the flaw in that system.)
> I'm not sure what this appeal to authority wants to imply. Of course the decision process for how/when to censor it's a delicate one, and ideally left far away from millionaires who think that they are more competent than they actually are.
I mentioned Paul Graham's beliefs not because they were specially valuable in themselves, but because under present American law, he's probably the person with authority to decide what is censored on this site. If he were inclined to censor, then I don't think he'd decide in your favor.
It seems like you believe American law should be changed, by amending the constitution to eliminate the First Amendment, and the American government should exercise strong powers of censorship over such forums directly. That seems very unlikely to happen. But even if it did, a majority of Americans (including a majority of Democrats) believe not only that an unnatural origin of COVID is possible, but that it's the most likely explanation:
So if the American government were censoring, then do you really think they'd be censoring in your favor? If you think stronger government censorship early in the pandemic might have changed public opinion now, then remember that Trump was president at that time. If his government had had that power, then I can't imagine you'd have been pleased with how they used it.
It seems like you're hoping for censorship in the abstract, in service of perfect truth. That can't exist. Censors are humans, and censorship is subject to the same mistakes and corruption as any other human endeavor, especially those affecting the flow of political power. All of this requires human judgment; and once the wrong humans get the job, any apparatus designed to suppress falsehood works just as well to suppress truth.
> Do you think that EVERY "gain of function" experiment is high risk?
Almost any biological experiment involving genetic engineering (or even just culture with artificial selective pressure) may be reasonably anticipated to cause some gain of function. Most such experiments present minimal risk.
The research of concern is the search for deadlier and faster-spreading human potential pandemic pathogens, whether by laboratory gain of function or by collection from nature in areas with minimal other human traffic and thus minimal risk of natural spillover. This was a concern even before this pandemic, and is absolutely a concern even to those who believe that's not the origin of this pandemic:
> “That’s screwed up,” the Columbia University virologist Ian Lipkin, who coauthored the seminal paper arguing that covid must have had a natural origin, told the journalist Donald McNeil Jr. “It shouldn’t have happened. People should not be looking at bat viruses in BSL-2 labs. My view has changed.”
> It seems like you believe American law should be changed...
Probably yes, especially since the rest of the world still often takes inspiration from what the US does.
But frankly, I don't live in the country in which I was born, and neither of those is the US. It would be enough for the law to be changed in my relevant jurisdictions;
and then, if news.ycombinator.com, but especially other sites with user-generated content (e.g. Facebook), were not compliant, they could just be blocked (forcing me and others to use a VPN, if we'd still want to interact with these sites).
> Censors are humans, and censorship is subject to the same mistakes and corruption as any other human endeavor, especially those affecting the flow of political power. All of this requires human judgment; and once the wrong humans get the job, any apparatus designed to suppress falsehood works just as well to suppress truth.
Definitely true, but not censoring anything is not a neutral decision (just like deciding to censor is not a neutral decision). There are risks either way.
To tie back to your previous point:
> The performance of those politicians is ultimately judged by the voters. Without open discussion, I don't see how the voters could make an informed choice.
But do they make an informed choice, on aggregate?
Are people like Trump and Biden legitimately the best that the US could muster? Isn't this facet of democracy mostly a popularity contest, in which popularity is hugely affected by which claims are most often repeated in the media (and less by the actual competence, policies espoused and reliability track record of those political figures)?
I think democracy can be achieved in a different way (but this is getting off topic).
That's very informative, thank you. I knew about the BSL-{1,2,3,4} rating... But I didn't know that the Wuhan labs were only BSL2
I'll look up more info now, but this is definitely something that should be addressed (and I'd be surprised if something hasn't been done about it already)
Edit: the issue of the BSL level of the laboratories in question seems to have already been addressed:
If you think there's a way to have a democracy without an informed electorate, then it makes sense that you'd be less concerned with censorship. I don't see how that could work, though. I'm not impressed with Biden, and significantly less so with Trump; but I'm also unaware of any system that works better. I didn't grow up in the USA, and I'd prefer a parliamentary system to the USA's republic; but that's a minor question compared to democratic vs. nondemocratic systems, and we're depending--however fragilely--on an informed electorate either way.
> Edit: the issue of the BSL level of the laboratories in question seems to have already been addressed:
I'm not sure what you think is addressed there? In that interview, Dr. Shi confirms that they were working with bat coronaviruses at BSL-2. Various papers published by her group before the pandemic also confirm this. They also had a BSL-4 lab for animal experiments, but experiments on the viruses in cultured cells were continuing at BSL-2. That's what Lipkin thought was "screwed up" (i.e., presented an unacceptable risk, regardless of whether it actually caused this pandemic).
As far as we know, all of Dr. Shi's work was performed in compliance with her institution's safety standards. The question is whether those safety standards were adequate, though--her standards were already a step below Ralph Baric's, and long before this pandemic academics like David Relman thought Baric's experiments were at or beyond the edge of acceptable risk:
Baric and the WIV later submitted a proposal to perform exactly the same kind of research as in Relman's hypothetical, not with SARS-1 and MERS but with novel bat viruses collected by the WIV:
> “We will introduce appropriate human-specific cleavage sites and evaluate growth potential in [a type of mammalian cell commonly used in microbiology] and HAE cultures,” referring to cells found in the lining of the human airway, the proposal states.
That proposal was rejected by the American government for safety reasons, but there's no way to know what work continued in the WIV with other funders.
No, looking at funding is a terrible shortcut. If you can't study the papers yourself, maybe see what other scientists think of their work? For COVID there is plenty of discussion.
If you can't figure it out, hedging your bets is better than choosing a side.
I trust Malone way more. Fauci has a vested interest in continuing COVID measures. Check out his pay (higher than that of Western world leaders). Check also his stock holdings. Still, Fauci has way more credibility than the entire WHO, which has been a laughing stock for the past 2 years.
> The problem is that we sometimes have to rely on other people. We can’t all be experts in everything. We can’t all reproduce or witness every experiment.
Implying that anyone has to have expert knowledge.
> I trust what Y says because Y has explained the process and methods of how it was derived, ...
Doesn't that directly contravene what you said before?
> ... and I don’t have reasonable doubt to assume Y lied”.
Completely different issue, same problem though. Reformulated by syllogism, you are basically saying "I trust that I don't have reasonable doubt, because we can't all be experts in everything. We can't all reproduce or witness every experiment.", which is really a definition of what you might consider reasonable. It becomes a second-order recursive definition: "What expert X says is true, because they do not have reasonable doubt. They sometimes have to rely on other people. We can't all be experts in everything. We can't all reproduce or witness every experiment." Whereby the base case for the recursion is "I don't trust X, because Y said they're an ass", or something like that, where being an arse can come in many colours.
Of course you have to rely on experts, that is not the criticism. The criticism is that people with opposing views were silenced and declared as spreaders of misinformation. It is not hard to craft a counter argument if you have evidence for it. Just stating that you believe that X is wrong is enough. Having X removed hints that you do not have contradicting evidence. If said evidence is not politically communicable, you still should not ban people and remain on your position.
I see the root cause of all this debate as a failure in scientific communication.
When it comes to complex scientific decisions with significant impacts on the public, nuanced and detailed justifications are required. Instead, we often get simple declarations written by PR departments at juvenile reading level. This spawns chaotic and poorly conducted debate in all walks of life.
What is required is a robust and transparent framework for honest analyses of topics.
For example, if the CDC has a specific recommendation, they should provide an outline of the arguments, counter-arguments, assumptions, and supporting data for each part.
It seems that the status quo is to completely ignore any arguments against a given recommendation. This suffocates any honest discussion in the crib. It also suggests to some that the justifications are not robust enough to survive the light of day. If you can't show your work, people will be skeptical that you did it at all.
This is applicable to any science based public policy, but especially obvious to covid policy.
I am pretty skeptical that complex science can be explained to normal people. I just tried to explain to my school that since my child tested PCR-positive then negative for COVID, he shouldn't take a PCR test for 90 days (PCR is highly prone to reporting positive even when he is non-infectious), but instead (as the doctor's note says) should take an antigen test, and if he tests negative on that (and has no symptoms) he can return to school.
I explained this fairly simply but the message didn't get through (to a reasonably intelligent person) and they thought I was saying the complete opposite of what I was saying.
Instead, they are insisting on continued PCR tests (for "more data") even though I have a doctor's note saying to use antigen tests.
This is a symptom of having no centralized framework. You don't need people to memorize or understand the entire message, but you do need a comprehensive reference source. We ship manuals with cars; we don't expect people to memorize them at the dealer.
In your case, wouldn't it be helpful to have a recognized standard reference you could just point them to and let them read, instead of having to explain it and point to some random web page FAQ that they may not trust?
> I explained this fairly simply but the message didn't get through (to a reasonably intelligent person) and they thought I was saying the complete opposite of what I was saying.
If I detected you were being condescending I would do exactly this. Mostly to piss you off for thinking you’re better / smarter than me. You may know more than this staff member on this subject but there’s an appropriate way to explain things. Assuming from the get go they’re too stupid / uneducated is offensive.
You can't discuss a nuanced topic without detail, but it is solvable with parallel mediums. Not every debate fits in a tweet, so you issue long-form communication in parallel. This is common practice for many topics. You wouldn't expect a nuclear power plant or rocket company to conduct all their internal business in self-contained tweets. You expand the medium used to provide the bandwidth necessary. You might have a PowerPoint with a single summary slide, several dozen supporting slides, and then a PDF going deeper.
I would dispute the idea that nuanced /messaging/ can't happen at scale (discussions are a different thing altogether). While I understand that everyone is up in arms about COVID-19 communications these days there are lots of instances around the world of effective communication that in the end had to be fairly nuanced.. most populations understand that vaccines are useful and effective but not perfect, most populations do understand that their particular risk is low but population-level risk is quite high.
I guess we can quibble about the definition of "nuance" but the idea that vaccine science and hospital capacity management is very direct doesn't hold much water with me, maybe others disagree, IDK?
>I guess we can quibble about the definition of "nuance" but the idea that vaccine science and hospital capacity management is very direct doesn't hold much water with me, maybe others disagree, IDK?
This is what I am getting at. If we claim these decisions are science-based, communicate the science! E.g., if X% more people are vaccinated, we will have Y more hospital beds, and Z more elective procedures. Make an honest calculation, show the work, and stand behind it.
> if X% more people are vaccinated, we will have Y more hospital beds, and Z more elective procedures…
But the problem there is you are not asking for nuance, you are asking for.. lies? The answer is that more vaccination means more hospital beds and elective procedures but no one could credibly argue any specific numbers - it was just not possible to make that calculation without enormous error bars, you can only make the nuanced argument that vaccination produces reduced case counts and hospitalization, and that the public has to understand that this will allow for better services against an unvaccinated baseline. The good news is that a huge majority of people do understand that!
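The "enormous error bars" point can be made concrete with a toy calculation. Every number below is hypothetical and purely illustrative (case counts, hospitalization rate, coverage, and effectiveness are all assumptions, not real data); the point is only that when each uncertain input is varied within a plausible-looking range, the uncertainty compounds through the calculation:

```python
# Toy sketch (all numbers hypothetical): why projections like
# "X% more vaccinated -> Y more free hospital beds" carry huge error bars.
# Each input is uncertain, and the uncertainties multiply through.

def projected_hospitalizations(cases, hosp_rate, vax_coverage, vax_effectiveness):
    """Expected hospitalizations given assumed (uncertain) parameters."""
    unvaccinated = cases * (1 - vax_coverage) * hosp_rate
    vaccinated = cases * vax_coverage * hosp_rate * (1 - vax_effectiveness)
    return unvaccinated + vaccinated

# Vary each input within an illustrative range:
low = projected_hospitalizations(cases=50_000, hosp_rate=0.02,
                                 vax_coverage=0.8, vax_effectiveness=0.95)
high = projected_hospitalizations(cases=150_000, hosp_rate=0.05,
                                  vax_coverage=0.7, vax_effectiveness=0.85)

print(f"projected hospitalizations: {low:.0f} to {high:.0f}")  # 240 to 3038
```

An order-of-magnitude spread from modest input ranges: that's why "vaccination reduces hospitalization" can be stated with confidence while any specific bed count can't.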
Alternatively, if you haven't done the math, or it doesn't actually support a robust position, say: "This is not a scientific argument, but we are really hoping this helps. No promises."
Speaking for Ontario, one province over from me, models with error bars is pretty much what the public was presented with every time they imposed new restrictions.
Whether you communicate, or do without communicating, there needs to be accountability.
When there's no accountability, it boils down to power games, which is what we've always had. As soon as you start demanding accountability, you have to justify why you have power and I don't, and that makes people in power extremely nervous, because most of them are really not that bright, and any requirement to use rationality exposes that all too clearly.
Totally agree. There's an inherent tension between protecting the freedom of speech and the potential harm that speech might cause. The current incentive structure motivates people to say the most click-baity outlandish things without worrying about any of the consequences. Fact-checking can never catch up with all the crazy sh!t people come up with. That's why censorship feels like a tempting "easy way out" for combating misinformation.
Maybe one mitigation is to make public figures and media accountable for the avoidable damage their speech ends up causing? E.g., if I listen to your anti-vax radio program and consequently decide not to get vaccinated, before catching the disease and dying, then my family can sue you for damages, as long as it can be established that you purposefully misled your audience or failed to assess the potential harm.
After a few class-action lawsuits like this, public figures and media will probably be more careful when they want to spread misinformation.
Perhaps. I only see it as a "subset" in the sense of: many things have to go right for good communication to happen -- ethos being one of them.
But it's still "upstream" from communication. Philosophy informs action.
Your original point is that our leaders need to get better at communicating nuanced issues. That's a non-starter if the people in charge see nuance as an obstacle and straight up adopt an anti-nuance attitude.
My point is, they're bad at communicating nuance in the way that a bulldozer is bad at building a house. It just wasn't their mission. And that didn't become obvious until after these emails were leaked.
In history, every time you set up an institution it becomes an entity in its own right. Companies, orgs, religions, federal entities and countries all vie for power at their own level and within their own space. The problem with an 'information-control/censorship/verification' institution is… it's exactly that, the age-old 'we are controlling information for the common good'. Every major nation in history eventually evolves to this point. They claim that the internet made this a new thing.
Listen to Lex Fridman, episode 254, with Jay, a professor in the medical department at Stanford. It's entitled 'The Case Against Lockdowns'.
'Social cohesion is a necessity, and mankind has never yet succeeded in enforcing cohesion by merely rational arguments. […] ossification through too much discipline and reverence for tradition, on the one hand; on the other hand, dissolution, or subjection to foreign conquest, through the growth of an individualism and personal independence that makes co-operation impossible.'
-Bertrand Russell
Edit: lol, a little disinformation quip of my own. He's just a professor; he doesn't head the department, after a quick Google search.
I've seen that episode but not made that connection, very interesting.
However, the way the quote ends paints a slightly different picture:
`The doctrine of liberalism is an attempt to escape from this endless oscillation. The essence of liberalism is an attempt to secure a social order not based on irrational dogma, and insuring stability without involving more restraints than are necessary for the preservation of the community. Whether this attempt can succeed only the future can determine.`
Well, all the evidence presented in that episode is exactly the type of information the Royal Society (and every other federal entity in the world) seems to want to suppress. A lot of the information presented is very 'anti-mask', 'anti-lockdown', 'anti-vaccine'. He makes many points that we have devolved into some kind of mass hysteria when it comes to COVID.
Also, the reason I left the rest of the quote out is that it takes away from his argument: despite writing an entire treatise on revolutionary thought, he didn't seem to study most of the 'revolutions' that occurred from those thoughts. You can replace 'The doctrine of __x' with anything in the sentence and it would have rung true historically. When the Tsar of Russia decided that liberalism was needed in the late 1800s, the more people he freed, the further left those freed people swung, until everyone was screaming some kind of collective anarchism forwarded by Bakunin and the Tsar was assassinated.
I agree with pretty much everything you said, but feel compelled to point out that I think the virus origin is only a minor footnote in the communication failures. The topics of lockdowns and vaccination weigh much more heavily in my mind and I think have much greater relevancy for most people. Here the goal of informed consent was largely discarded in favor of manufactured consent. I would love to see the CDC or some authority have a transparent and reputable weighing of the tradeoffs involved.
You would have had higher vaccine compliance if the dialogue had never been about mandates. Just give a health recommendation. You wouldn't have reached everyone, but the science also said that you do not need to.
In some countries we will see mandates with no end in sight. For the third dose? The fourth? The opposition is correct when it says you only get your freedom back if you boot out those responsible for this in the first place. No politician comes out and admits mistakes, not even in democracies.
People talk as if a political and a scientific opinion are equivalent. That is obviously not the case most of the time.
I think that many people have a low opinion of the general public, but I think they are more intelligent than people give them credit for.
If you want someone to take an action, tell them specifically what they have to gain; don't just provide some general platitude that it is good for them.
If you want someone to get boosted, say you have X% reduced chance of death and Y% hospitalization. Make an honest calculation and keep it up to date.
If the data doesn't exist, we have much bigger problems.
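To make that concrete, here's a toy sketch (with entirely made-up numbers, not real trial data) of how a relative risk reduction translates into the absolute figures such a message could quote:

```python
# Hypothetical illustration: turning a relative risk reduction into
# the plain absolute numbers a public health message could cite.
# Both inputs below are invented for the example.

def risk_message(baseline_risk: float, relative_reduction: float) -> str:
    """Summarize baseline vs. vaccinated risk and the absolute reduction."""
    vaccinated_risk = baseline_risk * (1 - relative_reduction)
    absolute_reduction = baseline_risk - vaccinated_risk
    return (
        f"Unvaccinated risk: {baseline_risk:.2%}, "
        f"vaccinated risk: {vaccinated_risk:.2%}, "
        f"absolute reduction: {absolute_reduction:.2%}"
    )

# e.g. a hypothetical 1% baseline risk and a 90% relative reduction
print(risk_message(0.01, 0.90))
# → Unvaccinated risk: 1.00%, vaccinated risk: 0.10%, absolute reduction: 0.90%
```

The point is that "90% effective" and "your risk drops from 1% to 0.1%" are the same claim; the second is the form that answers "what do I personally gain?"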
This was done for COVID, and was received as fearmongering. You have “experts” saying you “will die” if you get COVID, yet stories are passing around the entire time of unvaccinated people alive after getting COVID with no long term issues and never having seen a hospital.
So you have one group of people literally scared to death and that’s why they take the vaccine, and another group seeing other data that makes them say “I’ll take my chances”
Really? I'm speaking from a US perspective, but my opinion is the communication out of the CDC has too much nuance and is too concerned about being scientifically accurate. They forget they're speaking to a largely scientifically illiterate crowd, and that crowd sees the constant back-and-forth not as scientific progress but as a reason not to trust the institution. Never mind that it's often delivered with a tone of condescension people have come to despise from their institutions, and some of their policy recommendations are... debatable to say the least (recommending mass testing for high school sports/band/choir, for a recent example).
Honestly I only trust the CDC because I have the knowledge and capacity to verify what they say and ignore the bullshit. A lot of people don't have that. And there is bullshit to filter.
The message to the public, assuming the CDC is interested in actually convincing people, needs a simple path forward individuals can take to get said individuals something they want, even if it taps into a selfish motivation and isn't 1000% technically optimal. (i.e. Get vaccinated and the mask requirements go away). And it should be repeated in every statement, regardless of new data. It should have been consistent for the past 2 years. It should have a catchy slogan. It should be on fucking billboards. People would still bitch and grumble, and some would never be convinced. But we'd probably have more people vaccinated if that approach had been taken. Instead we confused people and handed ammo to the anti-vaxxers by the truckload, relying on people's better natures and assuming a higher average level of education than actually exists in the American populace.
In short the CDC should optimize for leadership and "good enough" solutions, not strict scientific accuracy and trying to save each and every life in their public statements. Publish the hard science/raw data in more obscure releases that only scientists will care about. And they need a better front person than Fauci. Fauci just radiates highly-credentialed-beltway-insider arrogance however nice he tries to sound and however right he may be.
Of course you could argue such a role should be the place of the executive, and the CDC was trying to fill the gap left by a largely absent Trump administration, and by the time the Biden admin took over Fauci's position as front-man/leadership was already too solidified. Maybe so, but the CDC and public institutions in general have failed the task of public leadership, although thankfully have had success in logistics/vaccine development.
I believe this would have been an even worse strategy for policy acceptance. We were and still are dealing with unknowns, and those unknowns are enough to justify wearing masks as a precautionary measure without a (not so) noble lie being necessary. Society isn't a military unit and has profoundly different dynamics. Some will ignore this advice, but the repercussions of that are very likely manageable.
This type of PR will no longer work with a modern information infrastructure, or will at least be significantly less effective.
> handed ammo to the anti-vaxxers by the truckload
It is a stupid image of an enemy, and policy was crafted with a steady eye on these groups. That is a mistake in leadership if there ever was one. You have provided them with legitimacy and elevated your opposition without anything to gain for yourself - a complete waste. That is neither competent leadership nor politics. Figuratively not looking them in the eye would be better: they are beneath you and don't even warrant attention. Few politicians would have even used the term, at least not the smart ones.
That said, since I oppose mandatory vaccination I am an anti-vaxxer myself in the eyes of many.
The issue that plagues us is a lack of trust in long-standing societal authorities.
We can't follow the entire scientific proof chain for every piece of information we encounter, because we don't have time.
So we rely on authority to some extent, whether that be peer review, government, independent bodies, etc.
We need to be able to trust that these bodies are telling us the truth and aren't seeking to mislead us. Because when we start to doubt them, we then inevitably elect alternative bodies, simply due to limited thinking capacity/time - as explained above it's impossible to do otherwise, no-one derives from first principles every opinion they hold.
The best way, _overall_, to convince someone to do something, is by clearly explaining to them the positives and negatives and letting them come to their own conclusion.
It doesn't always work, and there are specific situations (e.g. someone is holding a gun to your face) in which the cost/benefit analysis is very different - in such a situation, the short term is all that exists, the long term effect of misplaced trust is irrelevant - you simply have to neutralize them.
But in general, I'm absolutely sure that education over coercion is the correct approach for society.
Because if you force them, sure, you've got a short term win, at the long term cost of trust. What is the long term cost of lost trust?
I'm skeptical that education is a solution to collapsing societal trust.
First, the primary organs that would be capable of this education are the media, who have a profit motive to stoke any and all fears (immigration, radiological warfare, world war, coffee being carcinogenic).
Second, I think this distrust goes beyond "the CDC said something wrong once, I don't trust them as much anymore." That could be solved with education. But we're seeing a more fundamental rise in broad-based distrust of any authority figure, and it's a much more instinctive response.
In a world where authority is more powerful and centralized than ever, some are bound to feel powerless. Someone's actions on the other side of the world have more impact on your life than they ever could (pick depending on your priors: a Wuhan wet market customer's appetite, or a Wuhan Institute of Virology employee's carelessness). With such an interconnected world, how could conspiratorial thinking not be appealing to some people?
Power inherently fosters distrust and cynicism. You can have as much pedagogy and education as you want: rational discourse just can't combat a fundamental instinctive reaction, a growing sense of claustrophobia and powerlessness that some people are bound to feel. And the response to this distrust? More control over discourse, over communication mechanisms[0], and ultimately, over citizens. Governments and corporations are more powerful than they ever were. Surveillance, and enforcement of rules, are easier and more automated than ever. How could our primitive, Dunbar-number brain wiring not react with distrust, or even hostility towards (some) authority figures?
It's not just that the CDC and other institutions of scientific authority 'said something wrong once.' I think most people understand and accept that science is not perfect.
Instead, those institutions have shown that they are willing to publish noble lies. Specifically, they are willing to bend the truth and publish false information if they believe it is in the best interest of society / the greater good. Given an entity that has shown willingness to bend the truth 'for the good of society', isn't distrust and cynicism rational?
EPA famously did the same during 9/11. It would have been better if they instead trusted us and said something along the lines of "the dust is full of dangerous asbestos, but we need heroes this day".
This kind of failure happens as a consequence of an intellectual environment: the impossibility of carrying a nuanced message. This perhaps can be solved by "more education", in the broader sense. Until then it's hard to blame individual public officials who try their best to distill a complex tradeoff into a public announcement, where they either speak with an authoritative tone and are criticized for being too paternalistic, or they don't and are criticized for being weak. They are between a rock and a hard place, and various figures are trying to carve their own paths without any rule book; most of them will spectacularly fail.
Ok. Replace weak with "shifty" and "unclear". In any case a public communication that doesn't reassure will quickly be replaced by one that does, by escalating the issue to the nearest political figure who takes on the job of providing such assurance.
I would think lying would make someone seem more shifty than making the facts known.
> that doesn't reassure will quickly be replaced by one that does
Sure, it just doesn't need to be the EPA that does it. Pre-empting politicians and doing their manipulation for them is not justified - politicians are not a source of truth.
Don't get me wrong; I don't think any of that is justified. I just think there is a selective pressure for this behaviour to emerge. High-profile public officials are politicians for all intents and purposes. They shouldn't be, but they are.
> It's not just that the CDC and other institutions of scientific authority 'said something wrong once.' I think most people understand and accept that science is not perfect.
I disagree. I think a big part of the problem is people having unreasonable standards of accuracy or consistency. On COVID, the CDC was never going to have everything perfectly figured out in the beginning, and then consistently repeat the same message from January 2020 to January 2022. There have been too many unknowns and too much of a need to act urgently before everything was figured out. To this day you still have people justifying their distrust by saying things like "I thought we just had to stay home for a few weeks to flatten the curve."
Also, there's an unreasonable expectation that messaging should be all things to all people. If a message is to be widely disseminated and understood, it needs to be simple and clear. But that loses nuance, and some people will launch into distrust because the nuance they demand isn't there. The problem is those goals are contradictory, so someone's not going to get what they want.
> Instead, those institutions have shown that they are willing to publish noble lies. Specifically, they are willing to bend the truth and publish false information if they believe it is in the best interest of society / the greater good. Given an entity that has shown willingness to bend the truth 'for the good of society', isn't distrust and cynicism rational?
I don't think distrust and cynicism is rational in that case. It would be if they "lied" [1] for selfish reasons, but if you assess that their motives are pure, then descending into "distrust and cynicism" is throwing the baby out with the bathwater.
And frankly, if "noble lies" bother you, wait until you learn the kind of ignoble lies people start believing when they become distrustful and cynical.
[1] Quotes because frankly most things people call lies aren't actually lies.
If an institution cannot give me sufficiently accurate facts, why wouldn't it be rational for me to try to figure out the truth (to my satisfaction)?
You may assume that "I" do not have sufficient education and training to interpret state-of-the-art publications, and/or assume that "I" would end up trusting dubious alternative theories, but you do not know. The "average" person may indeed be incapable of finding out the truth by themselves, but I'm really sick of "science-y" people giving blanket statements on what others should believe without actually checking whether they deserve the condescending remark in the first place.
> You may assume that "I" do not have sufficient education and training to interpret state-of-the-art publications, and/or assume that "I" would end up trusting dubious alternative theories, but you do not know.
If you're that smart, you should be smart enough to 1) understand the pressures they're under, 2) be empathetic, and 3) seek out information that may not be adapted for mass communication.
> but I'm really sick of "science-y" people giving blanket statements on what others should believe without actually checking whether they deserve the condescending remark in the first place.
Check with who? The public? You personally? Throughout this whole pandemic, precisely zero "science-y" authorities have had conversations with me on this, so I'm speaking exclusively about mass communications.
The lies people tell to the distrustful and cynical don’t matter much. When you take the time to evaluate things from first principles, someone saying something misleading around you just isn’t that impactful.
Institutions being wrong and misleading is something that is immensely harmful because of trust.
Most of the CDC guidance has been verifiably insane from the start of COVID — often from the CDCs own advice before the pandemic.
> I don't think distrust and cynicism is rational in that case. It would be if they "lied" [1] for selfish reasons, but if you assess that their motives are pure, then descending into "distrust and cynicism" is throwing the baby out with the bathwater.
Perhaps not purely rational, but predictable and consistent with emotional norms.
> Perhaps not purely rational, but predictable and consistent with emotional norms.
That makes no sense to me. On an emotional level, my assessment of someone's intentions is a far more important factor in determining how I feel about them than any statement or group of statements.
> CDC does not currently recommend the use of facemasks to help prevent novel #coronavirus. Take everyday preventive actions, like staying home when you are sick and washing hands with soap and water, to help slow the spread of respiratory illness. #COVID19 https://bit.ly/37Ay6Cm
> “Seriously people — STOP BUYING MASKS!” the surgeon general, Jerome M. Adams, said in a tweet on Saturday morning. “They are NOT effective in preventing general public from catching #Coronavirus, but if health care providers can’t get them to care for sick patients, it puts them and our communities at risk!”
Both of those statements were made at a particular time under a particular set of unknowns, and the only way I could see them as "lies" is if I misinterpreted them by failing to consider that context.
The most egregious false statement is probably the thing about hand washing, but it's not a lie: given how little was known at the time, and the fact that the CDC isn't up in some ivory tower, they had to say something, so they just recycled previous advice. In Feb. 2020, there wasn't much COVID, period, so running around with an N95 at the grocery store would just waste it, especially if it's worn incorrectly (and I saw lots of that, especially using only one strap).
Also, both those statements are far more nuanced than I remember, which I think kind of proves my point about nuance. People claim they want it, but they still strip it away and remember some highly simplified version.
Here's what Fauci said after the fact about the CDC's early stance on masks [0]:
> So, why weren't we told to wear masks in the beginning?
> "Well, the reason for that is that we were concerned the public health community, and many people were saying this, were concerned that it was at a time when personal protective equipment, including the N95 masks and the surgical masks, were in very short supply. And we wanted to make sure that the people namely, the health care workers, who were brave enough to put themselves in a harm way, to take care of people who you know were infected with the coronavirus and the danger of them getting infected."
I think it's pretty clear that they intentionally downplayed mask wearing for the public with the goal of saving masks for hospitals, even though KN95 masks were on the market[1], and wearing masks would have saved lives.
Uncertainty and especially admitting when something is not yet known is perhaps the most important part of science. The point is to do experiments and actually test things to get closer to the truth. One must admit epistemological uncertainty until the actual experimentation.
The CDC still has not performed or directed any large-scale experiments of any kind. They are a political institution, not practitioners of science.
Saying something definitive like 'Masks are NOT effective in preventing #Coronavirus' before the evidence is in is quite anti-scientific.
> The CDC still has not performed or directed any large-scale experiments of any kind. They are a political institution, not practitioners of science.
> Saying something definitive like 'Masks are NOT effective in preventing #Coronavirus' before the evidence is in is quite anti-scientific.
1. The statement you quote came from the Surgeon General, not the CDC. The Surgeon General is a political appointment, and as far as I know, the CDC does not report through him.
2. IMHO, the CDC's problem was actually the opposite of what you are complaining about. It was too scientific. Science is slooooooow, so if you rely on it too heavily then your problem will get "inside your OODA loop" and you won't be able to act quickly enough to tackle it effectively. The CDC has multiple hats, and it needed to push its science hat back in order to wear its crisis leadership hat in front.
Science is not the right tool for every job. It works great for things that change or move slowly (like physical laws) and very poorly for things that change or move quickly. It's an error to expect/demand that every problem be solved with science.
I believe that people at large are better at sniffing out bullshit than the elitists running these institutions realize. Unfortunately most aren't as good at separating out the actual parts being manipulated thereby leading to increased distrust of "science" and to crazy flat earth groups becoming a thing. sigh
I'm also skeptical. An unprecedented percentage of Americans have well over a decade of full-time education, yet here we are. People question its quality or assert that some degree-holding buffoon proves it worthless, but I doubt they have convincing empirical evidence that our population knew more in another era.
I agree with you that this won't happen, simply because (via an evolutionary basis) the maintenance of social order comes before the dissemination of truth. The two might coincide, but where they don't, social order comes first.
This is ultimately what happened during the early days of the coronavirus pandemic. It's what happens during fuel scares, it's what happened during 9/11 as a poster below explains, it's what happens within companies when layoff rumours circulate.
Even within the closest of romantic relationships each partner is not 100% truthful with the other because there are some things that are simply better not said.
Unless the idea is that we descend into some sort of authoritarian nightmare and the general population becomes irrelevant though, I'd argue that social order does require some semblance of trust in power even if only in the basics (e.g. if I don't cross the State, the State won't cross me).
Once that falls apart the whole thing is in ruins.
It wasn't social media that eroded the trust in research and science; it was the introduction of politics.
We need to separate research from politics. It needs to be a separate pillar of the state. You don't want researchers chasing funds continuously and doing short-term research for local sponsors. You want sizeable investments in long-term research projects, high autonomy, and adherence to the scientific method. This way people would trust the results.
Tell an algorithm to optimize for engagement and it will promote the content causing the most engagement: novelty, conspiracy theory, outrage, surprising "revelations" and all sorts of clickbait.
If we removed Twitter, Facebook and Tik Tok and reverted YouTube to the pre-algorithmic (subscription based) era, we'd still have free speech on the internet but probably a lot less bullshit.
Note: Twitter would still promote bullshit even without a ML algorithm. The rules of engagement:
- short content
- individual popularity means content popularity
- retweeting is instant, requires no critical thought
create an environment where clickbaity emotion-viruses spread at an exponential rate, while things that require engaging the brain are ignored. It's essentially a crowdsourced rating system for tabloid headlines, by design.
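A toy sketch of that objective (names, weights, and numbers all invented for illustration): if every interaction counts as "engagement" regardless of its valence, outrage-bait wins the feed over careful content.

```python
# Toy model of an engagement-maximizing ranker. All posts, weights,
# and counts below are made up; the point is only that a valence-blind
# objective naturally promotes outrage and cheap resharing.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    shares: int        # resharing is instant and requires no critical thought
    angry_reacts: int  # outrage is engagement too

def engagement_score(post: Post) -> float:
    # A naive objective: every interaction counts, whatever its valence.
    return post.likes + 2.0 * post.shares + 2.0 * post.angry_reacts

posts = [
    Post("Careful 40-minute explainer", likes=120, shares=5, angry_reacts=1),
    Post("SHOCKING revelation!!!", likes=40, shares=90, angry_reacts=60),
]

# Rank the feed by engagement: the clickbait (score 340) beats the
# explainer (score 132), so that's what everyone sees first.
feed = sorted(posts, key=engagement_score, reverse=True)
print(feed[0].title)  # → SHOCKING revelation!!!
```

Nothing about this ranker "wants" misinformation; it simply cannot tell an angry share apart from an approving one, and the content optimizes itself against that blindness.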
It wasn't social media that decided the government should straight-up lie to people just because it was the best way (in their opinion) to get the best outcome.
A big part of the anti-vaxx movement comes from the fact that the government has made it very clear it will lie and misrepresent data whenever it is politically advantageous to do so.
If you want a very recent example, just look at this interview with someone very high up in the food chain at Bayer (youtube.com/watch?v=qowDwaYx7vI) and then try googling whether or not mRNA is gene therapy. It was considered dangerous misinformation to say what he just did, and yet he is overseeing the production of COVID vaccines.
It's far from the best example, but it's one where I won't have to waste an hour getting proper sources
It's hard to defend this without arguing for allowing the government to lie to people without any defense.
However, even if it delivers to you true information about abuses and lies being told to you, you should still be aware that the system is doing this so that you will be outraged, rather than that you will be informed.
Yeah, this is a classic example where social media spreads falsehoods, including mistakes.
Here the exec is stretching the definition of the term gene therapy to include things that don't alter genes: "mRNA vaccines are an example for that cell and gene therapy". He basically makes a mistake, because the same platform is also used for gene therapy.
To use a software analogy, just because both mRNA vaccines and mRNA gene therapies are software, doesn't mean that mRNA vaccines perform system updates (instead they trip the suspicious behavior detection of the anti-virus and trigger a download for new virus definition files).
Additional subtle distortion happens later on social media when people take this mistake and "shrink" it in the opposite direction "actually mRNA vaccines are NOT vaccines because they're gene therapy".
Joe Rogan then takes that and stretches it further to imply this is why the vaccines don't last long
> This is really gene therapy. It's a different thing. It’s tricking your body into producing spike protein and making these antibodies for COVID. But it’s only good for a few months, they’re finding out now. The efficacy wanes after five or six months. I’m not saying that people shouldn’t take it. But I’m saying, you’re calling it a thing that it’s not. It’s not exactly what you’re saying it is, and you’re mandating people take it
Sinopharm makes a COVID vaccine from dead / inactivated virus. It's used widely in China as well as in the developing world. We've also seen the same reduced antibody neutralisation there, and it isn't necessarily because of the passage of time but to a large extent due to the evolution of the virus to be significantly different from the original.
So yeah, this is a perfect example of how social media spreads engaging novelty - whether it's deliberate disinformation or simply mistakes - and continues to distort it further by continuously applying stretching and shrinking. The info blob continues to evolve to evade people's "immune defences" and cause them to re-share it (e.g. retweet).
This is not something new, traditional media have been doing it for decades ever since the rise of the yellow press, with varying levels of subtlety. The difference is that this here is an "organic", crowdsourced process that yields better, more convincing results, as it has to defeat the "immune systems" of many people to get a good reach via resharing. Only the highest quality misinformation makes it through.
The worst bit about this phenomenon is that corrections don't work well. People who haven't heard the rumor don't care about the correction (it's not interesting), so they don't share it. People who have heard the rumor don't like being wrong, so they don't share it either. So it doesn't matter how many scientists explain the difference; the explanation will never have anywhere close to the reach of the original (likely hundreds of millions of people).
Social media (especially short-form ones like Twitter and TikTok) is just a natural evolution of media trends that were already present in society. TV replacing radio/text and tabloid newspapers are just two examples off the top of my head. If Twitter/Facebook/TikTok weren't there, other alternatives would've been there to fill the void.
Just like we recognized tabloids for what they were, we need to recognize social media for being the same phenomenon disguised in a "grassroots" cloak (i.e. crowdsourced tabloids)
There is nothing organic about the speech that happens on YouTube. Veritasium explains the shift that happened when YouTube switched to an algorithmic model from the subscriber model: https://www.youtube.com/watch?v=fHsa9DqmId8 - you need to carefully manipulate people into clicking your thumbnail based on the picture and the video title.
Making outrageous but believable shit up to sensationalize your content is a well known tactic of tabloids (which does continue today to some extent). YouTube and other social media incentivise exactly that.
>If Twitter/Facebook/TikTok weren't there, other alternatives would've been there to fill the void.
The particular platform that has these characteristics is arbitrary and beside the point. No one is arguing that TikTok's particular implementation of a short-form engagement-optimizing feed is what's dangerous.
It really is just people. Look at all the information about HIV in the 80's, long before social media. People believed some truly crazy, racist, and misinformed things about HIV and its spread.
People are multifaceted beings. Social media AI automatically exploits any and all of our failings in the name of engagement. Traditional media used to do that before to some extent but the current situation is unprecedented in its reach and efficiency - custom tailored to individuals.
Technology is supposed to compensate for our failings, not exaggerate them to the extreme.
It really was crazy. Back in 1985 Neil Young apparently thought that he could catch HIV if a homosexual clerk handled his groceries (obviously nonsense and highly offensive).
"You were -too cautious- about that new disease that was killing everybody in 1985" feels more like a dark joke than a serious claim. Everybody was shocked when Rock Hudson died in 1985.
I strongly protest the concept that the recent introduction of politics is making things worse than they used to be.
If you ask anyone who was senior working in these institutions in the 1970s, things were far more dodgy back then.
Corruption and bigotry were rampant. But it was far easier to cover up back in the day.
Politicization is mostly driven by transparency.
Politicization is not a new thing; it's a fact that there are a lot of extremely conservative individuals in each country, and technology has brought them closer to their institutions.
Society is being brought very closely together, and we're working through the teething issues.
>I strongly protest the concept that the recent introduction of politics is making things worse than they used to be... Corruption and bigotry were rampant.
It is wild how people think things like institutional bigotry aren't politics. How is a scientific organization that excludes large swaths of the population due purely to the circumstances of their birth apolitical? Science has always been political because politics is how people interact with each other. That is true whether we are discussing how governments behave on the world stage or how that big project is being managed at your job. Wherever people exist, politics exists.
Most people accept that science in past generations was, to a certain degree, corrupted by the prevailing prejudices and societal/cultural/political/ideological beliefs of the time.
But, if we acknowledge that was true then – why shouldn't we expect the same to be true now? The prevailing prejudices and ideologies may have changed, but has anything changed about the actual practice of science to make it more immune to those forces? I think a lot of people can't see it, because it is far easier to see when science is being influenced by beliefs you reject than when it is being influenced by beliefs you yourself share.
I think the politicisation of science is inevitable, and science is never going to be entirely politically neutral. Scientists are human beings, and as Aristotle famously said, "man is a political animal". I think science is going to achieve the maximum possible political neutrality when it is either (a) things so clearly established no one can sensibly debate them (which especially happens when scientific theories get translated into widely adopted working technologies–the very fact that the technology works is proof the theories behind it must be mostly correct); (b) things that avoid political entanglements because they have little relevance to everyday human life (e.g. string theory vs loop quantum gravity isn't influenced by broader societal politics because it has no relevance to issues of social/cultural/economic/political power, although even it is surely influenced by the micro-politics of the theoretical physics community, its funding models, etc.)
Take something like human-induced climate change – personally, I accept the mainstream account of that topic as most likely to be correct, but I can't be completely sure that it is right, surely there is some non-zero probability (even if a small one) that it is wrong. But, I expect people in a couple of centuries from now will be able to reach a much firmer consensus – they'll know whose predictions turned out to be right, plus technological and economic developments will likely greatly reduce the number of people with a vested economic interest in the non-mainstream answer. I think the most likely outcome will be that the mainstream account will be vindicated, and almost everyone will acknowledge that the minority were wrong; but it isn't absolutely impossible that it could turn out the other way around. And whichever way it turns out, I think people will see politics as having played a major role – if the mainstream account is confirmed, people will see much of the opposing minority as being led astray by politics; but, conversely, if the sceptical minority turn out to have been right all along, people will reach the same conclusion with respect to today's mainstream.
Economic prospects are certainly worse for the younger generations than for some generations before them, especially with regard to the affordability of living. They have more tech gadgets and more access to information, though...
Are things "improving every decade" at a "vast rate"? What exactly is improving? Doesn't that depend on one's values as to what counts as "better"? Do you mean technological progress, economic progress, social progress?
Perceptions of progress seem to depend a lot on (a) where in the world you are, (b) what things you value, (c) whether you choose to optimistically focus on the positive developments (from the viewpoint of your own values) or pessimistically focus on the negative.
Even if we agree that certain forms of progress are occurring, do those various forms of progress necessarily entail scientific progress? I think the only form of progress which necessarily entails scientific progress is technological progress – but even for that, some scientific fields have much greater technological relevance than others. Continuing technological progress requires continuing scientific progress in technologically-relevant fields, yet it is compatible with going backwards in areas of science which lack direct technological application.
You can be philosophical about "improving", but in the context of the thread and the post, I think we are talking about technological improvement over the last few years as an example of tangible scientific improvement, even if a lot of papers that are published suck.
Okay, but that comes back to the point that some scientific fields are more relevant to technological progress than others.
Getting physics and chemistry right is really essential to technological progress. How can technology advance if your physics and chemistry are wrong?
But what about "softer" sciences, such as psychology or the social sciences? It seems you could be moving in the completely wrong direction with them, and yet still be advancing technologically – because technology largely doesn't depend on them.
And even for the "hard sciences", technology generally only depends on certain areas of them. How much of the Standard Model of physics is actually technologically relevant? Technological progress is strong evidence you've got the technologically-relevant parts of physics right, but is compatible with serious error in low technological-relevance areas. What is the technological relevance of electroweak unification, for example?
We also don't know what we are missing out on. Neuroscience is an area of biological research in which many promises of clinical translation have thus far failed to deliver. Is that because the brain is very complicated, and we just have to wait longer and invest more before we see more results? Or could it be that we are doing the science wrong, and we'd be in a much better clinical situation now if we were doing it right? How can one tell which of those two possible situations we are actually in?
> Most people accept that science in past generations was, to a certain degree, corrupted by the prevailing prejudices and societal/cultural/political/ideological beliefs of the time.
The amount of resources available to spend on research is finite. Distribution of limited resources is politics.
Historically, scientists either were well-off and independent (e.g. Ulugh Beg, Newton, Cavendish), or worked at the expense of very rich personal patrons (e.g. Tycho Brahe). This limited the number of potential scientists, and the amount of science done, quite drastically.
In the 18th–20th centuries this was gradually overcome, as states and industries gave more scientists more resources. By the mid-20th century, the system started to develop cracks, as science became increasingly driven by formal metrics, such as impactful publications. The metrics are actively gamed; worse, since science is unpredictable, honestly checking everything and building a careful picture of reality became a worse career strategy than chasing a flashy but more weakly supported result.
> It wasn't social media that eroded the trust in research and science, it was the introduction of politics.
It's not like the medical system was a highly trusted and respected institution that politicians somehow undermined in the last few months. The loss of trust in the medical system has been literally over 150 years in the making. If you read some books on the history of the rise of homeopathy in the United States in the mid 1800s, the parallels with today aren't exactly subtle.
I'd add that even if it received funding through private enterprise, it would still be infused with the politics of said backers - whether intentionally or not.
The best you could possibly achieve is funding purely through small-time (non-business) donations from the public – and I'm still a little skeptical about the scope of research that could realistically be funded that way.
Only true to a certain extent. Is the judicial power not separate from politics? Maybe not completely, but it has measures in place to secure some type of autonomy.
What I'm talking about is removing micromanagement and ensuring that the institution is trusted.
I would argue that it wasn't just social media. Before the internet, it was relatively easy for a politician to 'pivot', to say something to one group, but quite a different thing to another. Now it is carefully catalogued and these days carefully tracked by anyone with any interest to care.
Social media just put some gas on a fire with engagement optimization. The rest was just human nature.
Thanks for your comment. I broadly agree with your points and find them insightful.
In particular I appreciate that you model authority as an analogue for trust, and trust as an analogue for independent verification.
I think trust is a prerequisite for successful convincing, because it is hard for human brains to overcome the emotional and relational elements of discourse, and we need to be in a receptive frame of mind. [citation needed]
We should study how authorities have built and lost trust, but I don't think we should be rebuilding those authorities using this knowledge. Instead I think focusing on how members of the public and different experts and scholars can build more diverse networks of trust relationships would create a more resilient system in the Internet age.
For example, instead of "I trust the CDC because they are an authority on disease" or "I trust Fauci on COVID because he is an immunologist", what if the common relationship was "I trust Fauci on COVID because his work on Zika and HIV was very interesting".
A related problem is what I call the 'knowledge gap'. Researchers develop a very comprehensive understanding of their chosen subjects, whereas members of the public have only a lay-person level of understanding. As experts become increasingly experienced and knowledgeable, the gap between lay-person understanding and expert comprehension grows. This means researchers are not always able to understand what a member of the public needs to know or how to explain it: how do you compress years of learning into a few minutes of talking, as part of a conversation? Good science communication skills are essential for experts if they want to develop trust relationships with members of the public, and if we as a society want to move from authority and coercion and towards education.
I've posted elsewhere in the thread about my own opinion that I believe the main issue we have is not with trust in the scientific sense (e.g. I think that most people generally _do_ believe that epidemiologists know what they're doing and that the outliers are exceptions), but with trust in the plan of action e.g. the entire chain of process.
Examples, all primarily via trust (I'm not a biologist):
I believe that excessive consumption of sugar can cause diabetes.
I know that alcohol can (will, given enough time) cause liver disease.
I know that driving my car drastically increases my risk of being in a car accident and therefore meeting a swift end, an agonising end, or perhaps a lifetime of disability.
I still engage in those activities despite the downsides because I believe them to be of net benefit, both to myself and to society as a whole. I'm willing to fully argue through that process.
Where I think the issue lies at present is that we're sorely lacking in convincing full-chain argumentation.
A person can trust that the UK health authorities know their stuff, they can believe that coronavirus exists and can be dangerous, but that doesn't solve the problem of whether or not they should skip out on their friend's birthday party or not. It only makes up a very small part of the jigsaw puzzle, the rest still has to be filled in somehow.
A dutiful interior minister will most likely always opt for more surveillance. The safety of every citizen is his responsibility, and he would not leave anything to chance. Even if the minister is no friend of authoritarianism, his position will most likely encourage certain policies in a predictable way.
But that safety has to be balanced at some point. The same is true for any ministry of health.
Dying sober might not be everyone's preference, and optimizing for a trivial metric might do a lot of harm. In fact, I believe most people would be horrified to lead a completely rational and reasonable life.
Agreed, it can be very difficult for anyone to make useful decisions outside of just trusting the existing consensus.
I mean there is no real way for me to know if a vaccine is safe. I assume it is, and prefer to gamble on it not harming me and potentially helping the current situation.
But there is really no way for me to know for sure.
To put it another way, when the Omicron variant emerged, there were newspaper articles that often showed pictures of the spike protein labelled with different mutations.
As it happens, I did research in protein structural bioinformatics, so I have a good idea what the detail in these images 'mean' in a certain sense. However they are of zero use in making a single meaningful decision about how to alter my behaviour (or not!).
Of course, the journalists add these images as decoration, but to me it symbolises the mass of irrelevant detail provided. Which is not the same as saying 'just believe science!' as that is also wrong and unhelpful.
I actually think we may learn a lot by studying whom someone distrusts and the underlying causes. I imagine we would find that most of the time, the distrust is projected distrust—in other words, I distrust person X in my life because they did thing A, and person Y is like X, therefore I distrust person Y as well.
For example, I think many distrust Fauci less because of Fauci himself and his behaviors, and more because his behaviors remind them of their primary care physician who treated them condescendingly, saying they were stupid for asking a question. Or because their parent got Alzheimer's or cancer, they don't know where it came from, and doctors told them their family member would be fine, even though their family is not fine.
I just wish we paid more attention to the transgressions behind the presumed causes of distrust so we can repair them, because I think trust can be highly efficient and distrust highly inefficient at times.
And yet few people want to hear "your family is not fine". Many people don't want honesty, they want comfort, and they (wrongly, IMO) look to experts for comfort. This puts doctors in a damned-if-you-do, damned-if-you-don't situation. It's easy to translate "doctor was honest" to "bad bedside manner".
I think people want both—I believe I do. I want people to be honest with me and considerate. I don't think it has to be one or the other, yet we often haven't learned how to communicate with both.
I can say, "I want to be honest with you, your relative may only have a few weeks left. I imagine that might feel really scary to hear or perhaps you might even feel angry, I don't know. I wanted to be honest with you and to let you know that my team and I will be here for you and your family."
I don't disagree, and it's probably the right option - but most people are lazy thinkers. You want the simplest answer that explains the problem in a way you understand.
If someone is trying to take advantage of a situation that's inherently complicated or unpleasant - it's easy to do so with a simple message that makes it digestible, even if it's wrong or harmful. We've seen this time and time again over the last few years. And then worse still, social media perpetuates it by keeping you in a bubble and pushing you further down a path.
You could be right, and as far as I can tell - that's what we've done.
But is it sustainable to lie to people in order to effect a desired (even if positive) change? How long can it go on for before people just turn off?
Maybe the answer is indefinitely. Empirically at least in my country this doesn't seem to be the case.
I've heard this described as "pandemic fatigue", but it seems to revolve more around taking orders than around believing in or being concerned about the pandemic itself.
People are willing to follow counterintuitive rules for a while, but eventually they need to understand some sort of final benefit derived.
The vast majority of people don't get "not drinking bleach fatigue" because they know it's bloody awful and that if they don't drink it, they're gone.
I see that as a failing of the policy setters and communicators. You can communicate a simple message, but you have to show your work too. Pretending counter-points don't exist only leads to "gotchas" and distrust down the road. This has been the theme of the last few years for me. If you want more than a public policy soundbite, you should be able to get a public position paper describing the assumptions, arguments, and counter-arguments.
This part is a huge society-wide problem that can theoretically be fixed so people can become more informed and active participants in the framework of their daily lives. Don’t ask me how, though, because I just got off work and am too exhausted to do anything except let the news tell me what to be afraid of today, then probably binge The Office and slam a few brewskis until I fall asleep on my couch ‘cause I got work again tomorrow
Even if I do have time and energy, my lack of knowledge on the subject matter would probably lead me to waste that time on dead ends, wrong conclusions, or learning things I will never use again outside of fact-checking one study.
You have to trust yourself to try anyway. Beats doomscrolling even if both are equally meaningless in the long run. Something that helps me is refusing to accept any narrative or conclusion about any topic where it would require a specific named group of people (e.g. members of any nationality, profession, corporation, ancestry, presentation, etc — anything) to be uniformly dumb or malicious.
That might sound silly but it means there’s no downside to entertaining any consideration, not even the downside of my own bad feelings from negative ideas. It totally removes a layer of mental latency I didn’t even realize I had until recently, even about topics where that wouldn’t ever be a question that’s possible to ask.
Instead of getting bogged down checking one story after another it’s easier to ignore entire storyline arcs at a time once they’re more obviously one set of jangling keys after another all vying for my attention at a higher priority than any self-directed interests :)
We, as a society, bought a dog, and it took a literal shit in our communal kitchen. No one wants to clean it up, and frankly, I can't blame you/me/any one individual for it. It's exhausting, but it's a society-wide problem. It sucks that the only response from platforms has been to ban the discussion, rather than pay for a writers' corps to come up with easy-to-parse FAQs explaining why some of the garbage misinformation is a pile of lies, and to write systematic takedowns of odious dreck. But here we are.
The CDC recently dropped to 50% public trust, after repeated failures, mixed messages and outright lies in some cases.
They lie and then act incensed when the public stops believing them. Given the massive damage they've done, I'd consider it justified.
The CDC is an incompetent, bloated and fundamentally dishonest organization that was perfectly comfortable throwing grandma to the wolves via lies (masks don't work!) if it meant their unpreparedness wasn't exposed.
Via anti-masking lies, CDC irreparably damaged their reputation and also likely got people killed who believed them. The damage is deserved.
I remember pre-covid thinking that the CDC was the only federal agency that had its stuff together, but early on in COVID they really came off as inept. They failed to stockpile PPE (masks and such) or even make provisions for procuring the same. Then they told us masks don't work (as you noted) before eventually flipping on that position. Then they bungled the tests (instead of going with the WHO test like the rest of the world, CDC rolled its own which didn't work and was much less efficient to process).
Still, I don't share the distrust of a lot of folks regarding their policies and decisions over the last year or so. I think they're probably making reasonable calls for the available evidence even if hindsight eventually proves them wrong. To the extent their critics are correct, I think most will have arrived at their conclusions as a matter of luck (e.g., someone who knows nothing of virology happens to hear some study or other that hints that the CDC may be wrong and bases their entire worldview on that study without any awareness of the counter evidence).
For long term scientific correctness, what the hacker news crowd could help with tremendously is reproducibility and that means versioning and continuous integration with data, for research software. In context of a paper or doing research, you have to write software anyway.
The other one that pops to mind is probably statistical calculation helping services.
Maybe there are lots of other things as well - I haven't been involved with scientific research in a long time.
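To make the "versioning and continuous integration with data" idea above concrete, here is a minimal sketch of one way a research project could pin its input datasets by content hash, so that a CI job fails loudly if a data file silently changes. This is just an illustration, not anyone's established practice; the file name and digest in `EXPECTED` are hypothetical placeholders.

```python
# Reproducibility sketch: pin data files by SHA-256 so CI can detect
# when an input dataset is missing or has silently changed.
import hashlib
from pathlib import Path

# Hypothetical pinned hashes - in a real project these would be
# committed alongside the analysis code and updated deliberately.
EXPECTED = {
    "data/measurements.csv": "9f86d081884c7d659a2faea6...",  # placeholder digest
}

def sha256(path: str) -> str:
    """Content hash of a file, used to pin a dataset version."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def verify(expected: dict) -> list:
    """Return the files that are missing or whose hash no longer matches."""
    return [
        p for p, h in expected.items()
        if not Path(p).exists() or sha256(p) != h
    ]

if __name__ == "__main__":
    bad = verify(EXPECTED)
    if bad:
        raise SystemExit(f"Data drift detected in: {bad}")
```

A CI job would simply run this script before the analysis; a non-zero exit then blocks the pipeline, which is the "fail loudly" behavior that makes silent data changes visible.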
> Y is better than X, because we end up with a more verdant Earth, cheaper food, better weather, less war (insert reason here).
The problem is that while it's easy to calculate exactly how much less CO2 car A emits than car B, it is close to impossible to use science to predict to any useful degree of accuracy how much cheaper the food, better the weather, or less of the war we will get. It's often hard to even predict the sign, not even trying to determine the magnitude. This means that when scientists or politicians tell you that "we need to do X, so that we get Y", they're very often fooling you, because they have no good idea how much (if any) Y you get for each unit of X.
I think you missed a step in the chain; if you increase the quality of life for humans, what are the consequences for human population growth and the consequences for the rest of the biosphere as a result?
This is not helped by the lack of scientific and logical literacy in the public at large, who will cherry-pick to support either position on a topic and call that "homework done".
The scientific process involves testing the accepted and pushing ideas to their extremes. The fact that there is a test of 5G on mice isn't surprising, but when such tests show a statistically insignificant blip and it doesn't appear in the conclusion, that's not "big science hiding the real truths" – it's that this blip in isolation is a true statistical anomaly...
Absolutely. At least here in the United States there’s a full on collapse of trust in institutions and supposed expertise and for extremely good reasons.
Over and over again those with authority have squandered it. An opioid epidemic devastated so many communities after doctors, health care executives, and McKinsey and Company said the new class of drugs were non-addicting. Our manufacturing sector evaporated after economists and politicians said free trade would benefit everyone. Thousands of young people were sent to die in the Middle East in a war all the experts said was necessary and would be relatively painless.
Those are just the first three examples off the top of my head. I could keep going. We were promised technology would improve our lives. We were promised borrowing money for education would improve our lives. And on and on and on.
All bullshit. All promulgated by supposed experts with impeccable credentials. It’s a full on disaster and the chickens have come home to roost.
There has also been a huge emphasis on institutional misdeeds, driven by the outrage economy.
Nobody takes note of the hundreds of life-improving drugs developed in the same period as the opioid epidemic. Nobody takes note when cops arrest the right guy and save the day. Nobody takes note when a country doesn't militarily threaten or invade the US.
Interesting point. Do you think that, were it not for the outrage economy, the US would tackle it faster, slower, or at the same rate? Maybe the reason for the outrage economy is that there is a fair amount of things to be outraged about.
I personally think the majority of the outrage economy is counter productive. It is important to identify and communicate areas for improvement, but I think most of the solutions generated from the state of outrage are off target. I think emotional outrage is a natural response to some of the injustice in the world, but rarely helpful in actually solving a problem.
What do you mean by success here? Depending on your definition, I think there are times where the answer is yes. There are clear analogs to the trolley problem in public policy every day.
The answer we have generally come to is based on a pseudo-Kantian interpretation of causality. One innocent can die due to inaction, but you can't actively kill one to save many.
We don't allow nonconsensual body snatchers to harvest one person to save many. Alternatively, we may allocate finite funds to save several lives instead of a single life. We may also choose not to spend the money to save a life.
Sorry nobody gives a fuck that our global military dick swinging prevented Senegal and Guatemala from sailing into New York harbor and carting off our treasures.
The outrage economy tends to focus on people and interpersonal drama. Some comedian said this, or maybe some company did something stupid and there’s a walkout.
If anything there’s a massive shortage of outrage. Given the degree of misery the entrenched elite is inflicting on the masses, the fact that we don’t have a general strike or torch wielding mobs swarming over the gates of billionaire’s houses is frankly astonishing.
Outrage is only good in that it generates interest in the topic. There is a surplus of hot-headed, illogical people bogging down any real constructive discussion, and a shortage of people who understand the realistic challenges involved and are working to come up with viable solutions on either side of a given issue.
The problem is that the only viable solution is for an entrenched elite with disproportionate power and money to lose much of that power and influence.
It’s a struggle between the masses and the ruling class about who gets to make decisions in our society, not like lunch meeting where we can find common sense action items to agree on.
How does outrage help overcome this challenge? A good chunk of the country stands with the status quo and the elites on any given topic.
I don't think outrage changes minds, just engages your base. It also encourages the working masses to lash out at each other. Most outrage topics serve mostly to pit the masses against each other, based on skin color, gender, or who has bigger scraps from the economic table.
If you want real change, the solutions are relatively simple IMHO. Ranked choice voting, proportional representation in senate and congress, and less winner take all closer to a Parliament.
Just to clarify, many experts from generals to peacenik linguists to gonzo journalists predicted immediately that the wars would be an expensive and unwinnable series of atrocities. They weren't given airtime.
Which is also why the trust in the institutions that give people airtime is virtually nonexistent, and people on the internet, sometimes those censored experts and sometimes people masquerading as censored experts, are getting all the attention.
You've touched one of the real veins of this issue that very few really want to talk too loud about. There are still a handful of good journalists doing good work out there, but they seem to have been shoved over into "crazy man on a blog ranting" corner, making it much easier for propagandized coincidence theorists to shut down all discussion from them as crazy conspiracy theories.
I think a good analysis of the history and changing landscape of media is highly in order if we want to get to the bottom of this.
As for me, I like to remind people of Operation Mockingbird quite a bit lately, and rest assured the name just got changed after the Church Committee and given the explosion of the internet they are certainly extremely pervasive in the new system.
As CIA director William Casey said: "We’ll know our disinformation program is complete when everything the American public believes is false."
That's not the way I remember it at all. The government were very obviously going into war because they wanted to and scrabbling around for anyone who would support them. You only have to look at the death of David Kelly.
> Lawmakers knew from the beginning the shakiness of the Bush administration’s case for going to war with Iraq, and Biden not only went along with it, he championed it.
> Iraq in 2002 was devastated by economic sanctions, had no weapons of mass destruction, and was known by even the most pro-war experts to have no missiles that could come close to the United States. The idea that this country on the other side of the world posed a security threat to America was more than far-fetched. The idea that the US could simply invade, topple the government, and take over the country without provoking enormous violence was also implausible. It’s not clear how anyone with foreign policy experience and expertise could have believed these ideas.
> Senator Dick Durbin, who sat on the Senate intelligence committee at the time, was astounded by the difference between what he was hearing there and what was being fed to the public. “The American people were deceived into this war,” he said.
- Also from the Guardian link.
And that's just war. We were lied to about sugar; about margarine, about recycling, about oil spills, about agricultural pollution, about public transport. We were lied to about trade, espionage, healthcare. Repeatedly, barefacedly. Lies are the most bipartisan issue in America.
In the face of all this, we're told the enemy is distrust itself, told to trust the science, told to believe that Biden and the Dems are fighting for us. We're told anti-war news outlets and comedians are secretly communist, Russian funded. It's all such utter wank.
E.g., a doctor at a jail tested Covid treatments on inmates to see if they helped. That's a ground-shaking loss of trust! Even where that's one crazy doctor and an isolated case, it's hard for the disenfranchised not to generalize to the rest of the medical establishment.
You are making some pretty uncharitable assumptions here, and frankly I find this entire post very distasteful. Who decides what is "pseudo-truth" instead of "actual truth"? I'll take a wild guess... you are the one who knows the "actual truth" unlike all the ignorant people who disagree with you...
....and that is exactly why no one trusts institutional actors anymore.
> I think your comic is referencing the political "left" vs "right" divide, you are part of "the right"
That's a big assumption. I could very well be part of the left who simply doesn't share "the left" (tm)(c)(pat. pending)'s current party line because I find it to be nonsensical.
> People who exhibit this pattern are usually stuck in a rut of pseudo-truth.
There it is. That arrogance right there. "You don't share my class interests, therefore you are wrong". That, in essence, is exactly why no one trusts institutional actors -- because people realize they've aligned themselves with the bourgeoisie against the proletariat. There's no disguising it anymore. They will do or say whatever those who hold the purse-strings tell them to say.
> It doesn't help that a lot of institutions agree with the actual truth a lot of the time
Which truth is that? The one that said that masks helped halt the spread of the virus, or the one that said it didn't and we should stop hoarding them? Roll 2d6 and consult your Propaganda Master's Guide on page 666 to find the answer.
> People who exhibit this pattern often reflexively disbelieve whatever any authority says, just because they're an authority, i.e. Oppositional Defiant Disorder.
...or they've correctly identified them as class enemies who will do or say whatever it takes to defend their class interest such that you can't believe anything that comes out of their mouth. It honestly doesn't matter if they're right at that point because the lack of trust cannot be amended.
> The thing that makes it really insidious is the mentality they're right and everyone else is automatically wrong.
Indeed. Calling someone a part of 'the right' and saying they're from a side stuck in a 'rut of pseudo-truth' because they point out that institutional actors are biased to their own interest, have a history of lying, and this is why no one trusts them anymore, is quite insidious.
> I ask you: Hypothetically, if what you believed was not actually true, what would convince you of that?
Institutional actors abdicating their posts in favor of someone of proletarian class interest so proletarian ideals can be put to the test without interference.
Edit to add: I found out what happens when you present a video testimonial strongly calling into question vaccine clinical trials - it gets downvoted.
So what happens when you present a video testimonial like this - https://www.youtube.com/watch?v=L2GKPYzL_JQ - and people just dismiss it? People who don't understand that with clinical trials you're meant to keep track of and report all outcomes, and that by limiting reporting to an app with only a few fixed options and no free-form input, you're crafting a very narrow scope of outcomes - literally doctoring the results by minimizing the potential range and severity of adverse events. That is, if people even watch the video or understand the implications of what Maddie's mother has shared; and if not, then you're not able to educate or inform them further - probably because it's a counter-mainstream narrative point, with the mainstream and government arguably captured by regulatory capture from industrial complexes, etc.
Does the video described build or harm trust in our institutions? If this happened with this clinical trial, does that not invite the extrapolation that the other clinical trials were likewise ignored, not given proper oversight, and lacking in integrity?
The interesting thing about trust is it takes much longer to build than to spend.
Given the rate at which scientific institutions have been spending trust over the past few years, I'm not sure how this problem can be alleviated within our current framework.
Let people draw their own conclusions and don’t censor or otherwise penalize them for questioning one interpretation of data vs another. Encourage people to do their own research and investigate who is funding the sources used for that research. There is no need for conspiracy theories when financial incentives exist.
I don't. My point is that we should always be aware of the financiers behind any piece of research, prescribed solution, etc. and teach people to think critically about this.
But someone with a vested interest funding research doesn't inherently mean the conclusion is wrong. If anything, your suggestion would lead to more conspiracy theories, not fewer, as people become more focused on the motives for the research than on the body of the research itself.
It doesn't mean that but it should always be treated as a possible factor, and analyzed in light of other connections. Collusion is very real when billions of dollars are on the line and it is naive to assume otherwise.
>It doesn't mean that but it should always be treated as a possible factor, and analyzed in light of other connections.
But do you not see how this will result in conspiracy theories? You are no longer engaging with the merits of a study and are instead questioning the motivations and sources behind it. You are shifting from a more objective argument to a more subjective one, allowing more room for conspiracy theories.
>Collusion is very real when billions of dollars are on the line and it is naive to assume otherwise.
Yes, which is why "do your own research" is often misleading because those billions of dollars have greatly impacted what research is available to you.
Ok, let's look at a scenario. If a board member of a powerful pharmaceutical company was also a chief executive at a news organization responsible for "fact-checking" claims about that company's medicines, would it be inappropriate to wonder about financial incentives?
You seem to only be looking at the surface of this issue. I agree that it is appropriate to be concerned about conflict of interest, but what happens when you take it one step further beyond your comment? I decide to "do my own research" and plug the drug's name into Google. The top site that comes up is the same news organization's fact check because no one else has a financial incentive to investigate this issue and report the truth. Me "doing my own research" leads me to the same biased information. The "do your own research" portion doesn't fix anything.
>I’d be willing to bet that most people with the gall to do their own research will look beyond the first result that appears on the SERP...
Once again this is thinking that doesn't go beyond the first layer. If this situation is possible for the first link, it is possible for the second and the third and everything on the first 5 pages. However deep this hypothetical researcher is willing to go, if there are enough billions at stake the manipulator can keep pumping out misinformation. They have a financial incentive to lie and no one has a strong financial incentive for truth so how does this researcher find the needle of truth in the haystack of lies?
It goes far beyond the first layer, to the ultimate layer: what is the basis of our presuppositions about knowledge? How do we authenticate what we think we know?
We fundamentally can’t for most of our knowledge given the complexities of the modern world. Which is why the whole “do your own research” suggestion is BS. There simply isn’t enough time in the day to become an expert in everything in order to be able to accurately judge the validity of most research. That is why stopping the spread of misinformation has value because most people do not have the expertise to identify all misinformation.
But people should be helped to draw their own conclusions with annotations where every time some crazy idea is repeated it is automatically flagged with "This idea has repeatedly been debunked by 37 national societies of medicine and 19 meta analyses of 2,000 peer-reviewed studies with a quality deemed high or superb," etc.
Here’s one better: show people how to reproduce the same results given the same set of controls. Reproducibility should be the standard, not committees of debunkers.
As long as those fact checkers are provably biased (I haven't seen any of them fact-check the early statements that masks were useless), they are useless.
If, however, fact checkers were a pain in the butt for mainstream media and politicians as well, they would be much easier to listen to.
This is a commonly suggested solution that fundamentally misunderstands the problem. Most of these people aren't believing falsehoods because they don't have access to better information. It's because they don't trust the institutions that they believe drive science. Additionally, as these annotations only represent an opinion and are subject to change when they aren't correct they likely cause even greater distrust.
Do you mean "Do their own research" like make up conspiracy theories out of thin air and parrot nonsense deep state rhetoric? Completely divorcing yourself from reality? You want more of that?
Thing is, to a large degree, we haven't had to before. If some random person wants to believe the moon landing is a hoax, it doesn't affect my life in the slightest. What's changed is that random person walking around with no mask and no vax. Now, a random person in a Target a thousand miles away from me isn't literally a threat, but when that attitude is supported online and a random someone walks into my local Target, I need to care. Someone can chime in to question the underlying beliefs on the matter of Covid, but the point is that this is an example of an issue where we care. Extremist views like white supremacy or similar violent ideologies are a bit further outside the scope of this discussion, but they're another example of why we care that enough of us agree.
Isn't the fact that we (as in, society) don't agree that it does affect your life an indication that it's actually a borderline issue, though?
Everyone would agree that a person wandering around the street firing live rounds into targets is acting in a dangerous manner.
No-one would agree that a person drinking a cup of tea in their own house is acting in a dangerous manner.
But coronavirus isn't one of those things. It's genuinely a borderline issue, and not because of the science, but because it's almost exactly on the line in which some people consider it to be a disaster and others consider it to be a bit of a nothing burger.
As a result, you end up in a situation as described by rgrieselhuber below.
Some people think that we need to close (e.g.) schools to stop coronavirus. Other people think that we need to stop caring so much about coronavirus, so that we can go to (e.g.) school.
The question of whether there's misinformation is almost tangential to the point because there are huge numbers of people who do not doubt official statistics or modelling but also believe that data or those models are not severe enough to require us to 'stop the world' so to speak.
I guess my thoughts are that, if we can't even convince each other (and from what I can tell neither of us have really fallen for misinformation), what hope is there for the actual 'vaccines cause 5g' nutjobs?
There were people during every atrocity (e.g. genocide) who argued the atrocity didn't affect their lives. Did that make it a borderline issue? No, it did not.
Ok, in your threat model, a person who doesn't wear masks or take vaccines represents (even though the vaccine you took should keep you safe?) an existential threat to you, and therefore you should have the right to dictate their beliefs and actions. However, they may view your actions as an existential threat to them. How to resolve?
You have it backwards. Discussing controversial issues in the clear minimizes conspiracy theories (no, it doesn't completely eliminate them). The volume of conspiracy theories was driven up precisely because our epistemological institutions (e.g., media, academy, etc) abandoned the pursuits of objectivity and neutrality. To get back to a semi-rational state of affairs, we need to get the ideology out of our institutions (or at least diversify it).
Do you think there would be fewer moon landing deniers if the media talked about it earnestly? If legitimate scientific journals published the "research"? Honest question.
No, because there's some margin of people who won't be convinced no matter what. If you've failed to persuade that 2% (or whatever the floor might be) then there's no need to discuss further--those 2% are inconsequential. But if you're at double-digit percentages you can be sure that you're not near the floor.
Moreover, it's not just about the media's coverage of COVID or the moon landing, but about their overall credibility during the prior decade. For example, the media reliably and brazenly lied and misled on racial justice protests from ~2014 through 2020, which almost certainly affects its credibility on other issues (e.g., covid). Of course, the same applies to the academy and other institutions.
Moon landing deniers won't read, never mind believe, legitimate scientific journal investigations. What's needed is Buzzfeed-esque writing, at that level: "top ten reasons why moon landing hoaxers are wrong."
Right, because the media and academic environments following the moon landing were (for all their faults) much more committed to objectivity and neutrality than they have been in the last 10 years.
This is often people not understanding the technical details of the science, so they fall for the logical mistake of "the person on YouTube explained it so simply, it must be true"... Obviously this isn't helped when the people who are supposed to be trustworthy and unbiased in presenting the science are repeatedly caught lying, which drives people to that simpler worldview.
Should I be penalized if my interpretation of the Israel data is that Jews are dirty and should be exterminated? That would be a pretty outlandish interpretation! I would HOPE I would be censored for that!
I think unfortunately we also need to protect science from idiots drawing very bad conclusions from papers they don't understand... That's how we end up with "bacon cures cancer" headlines in the Daily Fail...
This brings up an interesting point I was thinking about but didn't feel like typing until now.
I think the problem could be "science communication" - not bad communication specifically, but its existence in general. There's a casual public interest in science, and an entertainment market for it, so now people who publish research have to weigh that impact when publishing, including its benefits. But most of the interested public is only casually interested (and most vendors of the entertainment have incentives not necessarily aligned with presenting fact, but that's a secondary problem, not my point). It might actually be better if, when someone was interested in something, they had to dig it up online themselves. If we got rid of the whole "science communication" apparatus, truthful or not, we'd get rid of all these problems.
I know it's impossible to do that, but it's interesting to consider that the problem isn't bad communication, but this drive to communicate everything all the time skewing the incentives and public perception.
Do you think 'scientific research' is in a great position from a funding perspective right now?
Have you looked at the press, or out the window, at how "science" is portrayed due to covid? The lack of sanity in both political directions is alarming and depressing at the same time. Both sides of the public are equally, damagingly wrong, yet they pick up and throw around scientific papers as though doing so gives them the right to be the next pope...
If science keeps being paraded around like nonsense in the press, it's only a matter of time until the politicians who make the decisions sit there and say "I don't see any public benefit in researching this."
That day will only come sooner if things carry on as we head into a recession.
Edit:
My proposed fix, as I've suggested elsewhere: keep scientific papers behind some (not-for-profit) walled access. Obviously that has damaging impacts when you say person X from country Y can't read the paper on Z, but frankly maybe the public isn't grown up enough for completely open discourse when they can't understand the material.
Trust wasn't lost first. Talking heads swooped in for money or power, and they sell whatever they have to in order to achieve those goals. There isn't any sort of fact they are arguing for in good faith.
You can add any number of layers of scientific rigor and it won't convince someone who has a favorite talking head.
Is it not still that way? There may be a bias towards believing trusted institutions, but that's because they have reputations for delivering truth based on experiments.
> one derives from first principles every opinion they hold
But it's the only correct way. If credible studies contradict an official agency (more often the disagreement is over the degree of the opinion, not the opinion itself), my advice would be to trust the studies. For instance, MRI contrast agents: the EMA banned most of them while the FDA did not, which is not the advice one would arrive at by reading the studies on the matter.
I believe if you take for yourself the right to neutralize or silence an opponent, he gets to do the same in kind. State authority is mostly based on show and relies on people complying, even in autocracies. So if you do silence people, you should be very sure about it and about the costs. There was never a point in the recent pandemic where that bar was met - or would you disagree?
It is also difficult to come up with an example where suppressing information was warranted. Can you name one? Educated and intelligent people are often very distrustful, especially of authority.
Even if someone doesn't walk us through the entire chain of proof, I'd settle for clearly delineated junctions and paper trails - basically full transparency with the ability to augment and shim any gaps people identify along the way. Nothing would be implemented by default, but it would be great to have a public body of work where someone can spot-check on the fly to see if things line up. It's a lot to ask for with no incentive to push such a thing through, but maybe I'm not seeing a potential profit motive behind such a project.
That's ... not true? Children can learn a basic model of the greenhouse effect in school, and do so in most civilized nations. Whether this is sufficient is another thing entirely, but I would personally contend that (non-propaganda) education can never be 100% effective at causing someone to believe something?
Right, this hits at the more fundamental issue which is that the purpose of stuff like "misinformation" filters is not to affect what people believe, but the second order effect - how they act.
Loads of people are well aware that wearing a mask would result in fewer coronavirus particles being emitted from their faces if they were infected.
But they don't act on that for various other reasons unrelated to the virus.
The same is true of climate change. You can think that it's real and still drive your car for precisely the same reason that you can think that cirrhosis is real and still drink a few beers.
> The issue that plagues us is a lack of trust in long-standing societal authorities.
This misses a lot of the cause. People aren't listening on the vaccine because there were no WMDs in Iraq. All the "experts" said that too.
The fact that huge frauds have been perpetuated at scale against our entire society and nobody has been held to account is a major reason for collapsing trust. It would be very helpful if the parties responsible had at least been censured even if the political class couldn't muster the will to truly apply penalties, but few in the political class have even admitted that anything bad happened.
When trusted authorities perpetuate frauds or display extreme incompetence and then just paper it over (often at others' expense), they are destroying the fundamental trust basis of society. Opportunistic politicians are destroyers of civilization.
What if the scientific proof chain was published as a knowledge graph? This could be easily followed and all questions by skeptics could theoretically be automatically answered.
This is how I learned (approximately) how GPT-2 worked. Start with GPT-2 and it's full of nonsense you don't understand. Follow the references, follow more references. Get to basic neural networks with one hidden layer like you learned about in college. Reverse the chain and build upwards.
It's less about checking whether claims are true but about how they are rationalized. E.g. the claim could be a node "masks should be mandatory" with connections to reason, "masks reduce spread of virus" and papers supporting this policy. These would also need to be connected to nodes showing values, e.g. "reducing hospital admissions is more important than freedom of choice".
In this way it would be more obvious why policies are chosen, and it would provide a way to express current knowledge. As more nodes are added to the graph, the policy prescriptions would change, which would prevent the mistrust that comes from "the health department said do X, now they say don't do X".
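The graph described above can be sketched very simply. Everything here (the node names, the tiny schema, the example paper) is a hypothetical illustration of the idea, not a real system or real evidence:

```python
# A minimal sketch of a policy knowledge graph: policies rest on claims,
# claims rest on evidence, and value judgments are made explicit as nodes.
# All names and contents are hypothetical placeholders.

nodes = {
    "masks_mandatory":     {"kind": "policy",   "text": "Masks should be mandatory"},
    "masks_reduce_spread": {"kind": "claim",    "text": "Masks reduce spread of virus"},
    "paper_hypothetical":  {"kind": "evidence", "text": "Hypothetical supporting paper"},
    "value_hospitals":     {"kind": "value",
                            "text": "Reducing hospital admissions outweighs freedom of choice"},
}

# Directed edges: (A, B) means "A is supported by B".
edges = [
    ("masks_mandatory", "masks_reduce_spread"),
    ("masks_mandatory", "value_hospitals"),
    ("masks_reduce_spread", "paper_hypothetical"),
]

def support_chain(node, depth=0):
    """Recursively list everything a node rests on, indented by depth."""
    lines = ["  " * depth + f"{nodes[node]['kind']}: {nodes[node]['text']}"]
    for src, dst in edges:
        if src == node:
            lines.extend(support_chain(dst, depth + 1))
    return lines

print("\n".join(support_chain("masks_mandatory")))
```

Walking the chain for any policy node then answers the skeptic's "why?" mechanically, and making the value nodes explicit separates empirical disputes (is the claim true?) from value disputes (do we weigh the trade-off the same way?).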
Well, the trust has to be earned by a history of visibly good outcomes. Never forget that there is a long history of cases where the established, scientifically accepted ideas were actually harmfully wrong, particularly in the medical arena. In the USA in particular, medical practice is far from a benign charity, so people are right to question. One example: I had a tonsillectomy when I was a kid. There's no actual evidence that this is good medicine.
On the other hand, I don't disagree that the level of misinformation is ridiculous. It's difficult to understand why someone would trust someone with a popular podcast so much more than licensed professionals. Maybe there shouldn't be censorship, but where are the medical malpractice claims against these people? I think it's all tied in with the general acceptance of the "these statements have not been evaluated by the FDA" supplement industry. As long as your supplement does not do too much direct harm, we allow it.
> It's difficult to understand why someone would trust someone with a popular podcast so much more than licensed professionals.
I'm less convinced that this is a matter of trust, I think it's more about value systems and priorities.
I trust my dentist to recommend me the best filling or crown material for my teeth, even in the face of financial incentives.
But his advice does not tell me, in an absolute sense, whether it'd be worth me paying $1K for a crown if that means I miss the rent next month. It's not a tractable problem.
I can't speak to the US situation, but in the UK, in my peer groups, most coronavirus-related argumentation doesn't revolve around whether e.g. wearing a mask reduces the spread of coronavirus, but around whether wearing a mask is a net positive for society and/or the wearer. The former is a matter for epidemiology; the latter is nowhere near clear cut.
Well, I agree that there are valid arguments about priorities, by which I mean the math works out - the costs are significant either way: excess deaths vs. broader economic impact, or even questioning how much economic benefit comes from allaying fear via an otherwise dubious activity (cloth masks...).
But in the US, we have this extreme irrational vaccine skepticism where the math just does not work out. People believe the vaccine is much worse than its benefit. It's hard to argue with since prominent people are supporting this idea.
I personally think that a lot of vaccine scepticism comes from the perceived force involved. For example, via passports and mandates.
I think once you get into the situation in which people think "I'm being forced to do X", you've created an impossible catch-22 and you've basically ensured that some % double down.
It's hard for me to explain because it just seems intuitively obvious. I really enjoy a beer after work. But if you forced me to drink one, and made me document the fact in order to go on holiday or whatever, I would feel less inclined to do so because I'd instinctively think that something fishy must be going on.
I've personally had the generally recommended number of vaccines at the right times etc, but when it comes to stuff like showing barcodes to get into places, I've faked it on principle, I think it's a dystopian horror and it doesn't surprise me at all that a lot of people have responded by just tuning out and refusing everything.
> I personally think that a lot of vaccine scepticism comes from the perceived force involved. For example, via passports and mandates.
You would think that, and they promote that, but it does not. It is obvious when you consider that all the vaccine skeptics were also vaccine skeptics before there was any hint of force involved.
I guess it depends on what you define as "scepticism".
I know a lot of people who took the first or second vaccine but are refusing further due to the coercion involved.
Or more generally, they behaved "carefully" in the early days of the pandemic but then switched off over time as they tired of being given legal commandments from God rather than sensible advice.
I agree that the same set of people think crackpot stuff like "5g causes corona" or the vaccines have microchips, I don't think those people are really relevant to the discussion.
I mean, the only reason for me to care if someone else is vaccinated is to the extent that I care for their wellbeing.
To me it's analogous to like, telling one of my overweight friends to go on a diet so that I can pay less for the NHS or something. They don't want the advice, it feels rude more than anything else (even explaining this feels a bit ick!)
If it was shown that we could eliminate sars-cov-2 via herd immunity I'd probably be a bit more keen to encourage it but I still don't think I'd support mandates unless we had some crazy ebola mutation and people started dying in massive numbers.
> If it was shown that we could eliminate sars-cov-2 via herd immunity I'd probably be a bit more keen to encourage it but I still don't think I'd support mandates unless we had some crazy ebola mutation and people started dying in massive numbers.
Triple vaccinated and completely in agreement.
An argument can always be made that they fill up hospitals, but that same argument can be made against parachuting, being overweight, racing (or driving in general), all alcohol etc.
If choice is to be meaningful, it includes bad choices.
> there is a long history of cases where the established scientifically accepted ideas were actually harmfully wrong, particularly in the medical arena.
This isn't actually a problem so long as those past errors were caused by the limits of what we knew at the time, and efforts are made to prevent similar issues in the future where possible. It's inevitable that as our understanding of science and medicine evolves, we'll discover that what made sense before is no longer a good idea.
The problem comes when we weren't wrong because of what we didn't understand, but because people who knew better just thought they could get more money if they manipulated results or outright lied. We had the tobacco industry pay off scientists to lie about the cancer risks the industry knew to be a problem. The resulting rise in people with lung cancer wasn't a mistake. We had doctors pushing opioids on people at insane doses because they were paid kickbacks if they did. That wasn't a mistake either.
What we need is strict regulation and oversight so that when science and medicine do get it wrong, it's because we couldn't have known better given what data we had at the time. That'd be a huge step up from where we are now.
This was a problem when past scientists stated that there was a consensus when in fact the underlying evidence was limited and allowed for multiple possible interpretations. If you dig into some of those past errors in medical guidelines you often find very shoddy, limited research which doctors uncritically accepted as fact.
- Accepted fact 1: eating saturated fat causes more coronary heart disease than unsaturated fat (very high correlation, and accepted as causal)
- Accepted fact 2: eating trans fat is bad. A diet high in trans fats can contribute to obesity, high blood pressure, and higher risk for heart disease, because intake of dietary trans fat disrupts the body's ability to metabolize essential fatty acids. Trans fat is also implicated in Type 2 diabetes.
Historically margarine was low in saturated fat but high in trans fat due to partial hydrogenation - hence the questionable "health benefits" of it vs butter.
Currently in most of the developed world vegetable spreads are not allowed to contain significant amounts of partially hydrogenated oils, and thus margarine should be healthier - but there is a powerful dairy lobby so the bad reputation will last for a long while...
Several studies suggest that eating diets high in saturated fat do not raise the risk of heart disease, with one report analyzing the findings of 21 studies that followed 350,000 people for up to 23 years.
Clean water was the single most important advancement in human health of all time. Next was hygiene, especially during birth. Next is antibiotics. Next is anesthesia.
If you look at life expectancy graphs, they mostly correlate with GDP per person.
You can't really see the invention of antibiotics clearly in the curves.
The way I think of it is that of course penicillin increased life spans, but the discovery and industrialization of it was highly correlated with rising GDP.
Antibiotics rock, and I think penicillin is rightly praised as being as close to a panacea as it gets. However, we shouldn't underestimate the positive effects of better hygiene, vaccination, and advances in surgical techniques.
I share the feeling that better nutrition is probably not as much of a significant factor in developed countries. For what it's worth, we are more obese than our ancestors.
Our World in Data has a nice chart of life expectancy by age in England and Wales [1]. I have a pet hypothesis that a good chunk of the increase in life expectancy at younger ages in that chart is due to vaccination and better hygiene, and that, at older ages, a good chunk of the increase is due to advances in surgery.
The effect of antibiotics is probably more uniform across ages. It is noteworthy that penicillin was discovered in 1928, so the increase in life expectancy before then can't be due to antibiotics.
I have no expertise in this area, so you should take all of this with a pinch of salt. The Our World in Data chart is really cool though. The steep fall and rebound due to the Spanish flu, and the contrast with COVID-19 with respect to its effect on different age groups, is also interesting. [2]
No, it doesn't. It acknowledges that a diet for hunter-gatherers who rarely lived to 50 might not be the best diet for people who expend less energy and live far longer.
You should make sure you understand what outliers are. The fact that you can name people from outside modern times that lived a long time is meaningless to the discussion.
They weren't outliers. I simply looked up famous Indians because I remembered one had lived to a startlingly old age, and skipped those who were killed by other people. If their diet caused health problems, you wouldn't see all these folks living to a decent age; all the Indian leaders would have been young guys who died off at 45 from eating too much venison.
I definitely don't think modern diets are better than what I imagine Native Americans ate hundreds of years ago. Venison sounds like a very healthy staple in a diet.
Of course some of us did a better job than others, but enough lived that Humanity (collectively singular) is still here, and to me it seems the only way for anyone to learn anything is via the negative image formed by all the ideas that didn’t work.
Sorry I know this sounds a little woo-woo fruity, but that’s Humanity’s godlike power to transcend the limitations of our bodies’ material dimension where energy can’t be created or destroyed.
Collectively we’re like a blockchain holding a mirror-image dimension of all the best ideas just by virtue of the other nodes not making it, growing over time as our new embodiments get to start at what from their point of view is “zero” despite being unknowably old and full of patches and bug fixes. Storage and retrieval are the same operation!
Sarcasm notwithstanding, it took over 12,000 years of technological advances for agricultural societies to catch up to their palaeolithic brethren in terms of average life expectancy, and even in metrics like height.
I'm not a primitivist by any means (I'm using a pc right now as it so happens) but we continue to have a lot to learn from modern hunter gatherer societies both in terms of social organisation and food consumption.
From a social psychology perspective, censorship is known to increase interest in whatever you ban or try to withhold. Censorship also interferes with educating people and helping them improve their understanding.
People often learn their understanding is flawed by saying something "stupid" and getting a response to that. Censorship fosters a climate of fear where people are less likely to say the "dumb" thing and get it explained.
A good policy with raising children is "There are no bad questions. You can ask (parent) anything and will not get in trouble, even if the answer is Wow, that's a really bad word and means (something bad). Please don't use that at school or I will get called by the teacher."
Tenure track changed papers from vehicles for expressing ideas subject to a strong test into hurdles to jump for the security of your life's work.
Arguably, thesis by body of work is less corrosive, but producing 3-4 papers from a PhD is now basically a signal to the university that you can be that performing seal.
That, and IPR. "a new drug which xxx (in mice)" paper is worth significantly more to the company behind it during share price discussions. (Obviously this is shorthand because IPR has risks in premature publication as well)
> For decades, corporate undermining of scientific consensus has eroded the scientific process worldwide. Guardrails for protecting science-informed processes, from peer review to regulatory decision making, have suffered sustained attacks, damaging public trust in the scientific enterprise and its aim to serve the public good. Government efforts to address corporate attacks have been inadequate. Researchers have cataloged corporate malfeasance that harms people’s health across diverse industries. Well-known cases, like the tobacco industry’s efforts to downplay the dangers of smoking, are representative of transnational industries, rather than unique. This contribution schematizes industry tactics to distort, delay, or distract the public from instituting measures that improve health—tactics that comprise the “disinformation playbook.” Using a United States policy lens, we outline steps the scientific community should take to shield science from corporate interference, through individual actions (by scientists, peer reviewers, and editors) and collective initiatives (by research institutions, grant organizations, professional associations, and regulatory agencies).
Feels like too many people think “published paper = truth”, when really the paper is just the start of a journey towards truth, on which hurdles such as replication, independent scrutiny, and common-sense/real-world testing must be overcome before we can start treating them as reliable. And I think most scientists do actually get this, and what we suffer from is a media environment that jumps on the first sniff of a result and holds it up as definitive proof of something.
So many "misinformation censorship" proponents seem to have quickly forgotten just how much literally everyone in public health has spread misinformation this pandemic. It's unavoidable being wrong and often the cost of making claims without unimpeachable evidence is higher than the cost of being wrong. Name me a public health official who regularly made recommendations this pandemic and I'll show you misinformation.
The real issue here, in my opinion, is attributing malice so quickly and with extreme prejudice. Tensions are high and it seems to have become commonplace to assign ill-intent to the other tribe way too easily. In my opinion the answer to all of these problems is compassion and statistical thinking. Most people mean well and often times the only way you can get people to go to extremes is to negatively reinforce them into deeply entrenched tribal thinking.
It will always be a safer assumption to assume somebody means well even when they are wrong. Humans did not become the dominant species on this planet without cooperation and pro-social behaviors.
There is a difference between "malice" and "intentional misinformation".
Authority figures are not accurately representing risks and rewards of different actions and have deliberately shaped their messaging to achieve outcomes.
Direct from the horse's mouth: https://twitter.com/cdcgov/status/1233134710638825473 "CDC does not currently recommend the use of facemasks to help prevent novel #coronavirus. Take everyday preventive actions, like staying home when you are sick and washing hands with soap and water, to help slow the spread of respiratory illness." from 27 Feb 2020.
They weren't saying this because they thought face masks were ineffective, they said this because they were worried about supplies for healthcare workers. (obvious by reading subtext)
This is a repeated pattern. Making recommendations and sharing "facts" omitting important details in order to push for an unshared outcome.
Nobody is properly sharing how risks have changed, and nobody is making up-front decisions about which risks will trigger which mitigating actions. The goalposts are constantly shifting to match present feelings instead of setting limits based on reason and sticking to them.
The censorship problem comes in when publishers are half-way committing to a narrative which changes by the week, failing to shut down people who are just saying things a subset wants to hear in order to get money and actively shutting down others who are actively trying to engage in reasonable debate.
The fundamental problem is this:
People have different risk tolerance and different priorities when it comes to personal risk, and individual actions affecting the community. These differences in this case are ultimately fairly small but getting blown out of proportion because it's profitable to get people upset.
It is all a moot point now and nobody seems to be able to articulate or admit it yet.
There is very little community protection left available by vaccinating. Vaccinated people can carry and spread the disease, the new variants are a lot less deadly, and a sizable proportion of the population already has natural immunity from getting the actual virus (~75M confirmed cases in the US for ~330M population, with estimates that there may have been as many as 4x as many cases which went uncounted)
It's over, we lost, the outcome we have is what we got and there is very little leverage to change it at this point because the vast majority of what was going to happen has already happened and the prevention tools we have are mostly ineffective at curbing spread.
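The natural-immunity arithmetic behind the comment above can be sketched in a few lines. All of the inputs (~75M confirmed US cases, ~330M population, undercounting estimated at up to 4x) are the comment's own rough figures, not authoritative data:

```python
# Back-of-envelope estimate of the ever-infected share of the US
# population, using the rough figures quoted in the comment above
# (~75M confirmed cases, ~330M people, up to 4x undercounting).
confirmed = 75_000_000
population = 330_000_000

for undercount in (1, 2, 4):
    # cap at the whole population, since people can't be "more than" infected once
    infected = min(confirmed * undercount, population)
    share = infected / population
    print(f"undercount x{undercount}: ~{share:.0%} ever infected")
```

With the 4x multiplier the implied ever-infected share exceeds 90%, which is the arithmetic underlying the "very little community protection left to add" claim; with no undercounting it is closer to a quarter, so the conclusion is quite sensitive to that multiplier.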
Ok then, why is it bad to prioritize healthcare workers, who put their lives on the line in the face of a new unknown disease? That was only for a very short while. I think the behavior of people rushing to buy toilet paper shows that, indeed, had the CDC promoted masks at that point, it would have prevented nurses and doctors from getting the most basic protection while exposing themselves to infection to an incomparable degree.
It's not bad to prioritize healthcare workers, there are plenty of levers the CDC and governments have to do so.
It is bad to prioritize healthcare workers by sculpting messages i.e. manipulating the population.
A message for Feb 2020: "Masks will likely help curb infection for you and transmission for others, masks are most important for healthcare workers and the medically vulnerable, wearing them when you're around people is a good idea, but we need to make sure they get into the hands of people who need them most first"
They essentially decided to borrow against future trust in order to achieve a desired near-term outcome. That may have been the option which led to the fewest deaths, or maybe vaccination rates would be some percentage higher if they hadn't undermined their own credibility. It was (and still is) impossible to know, but I don't think that's the kind of call we want scientists to be making.
If lies slow down the progress of society, and progress keeps curing disease and saving lives, then there isn't really an argument about which path saves more lives: obviously you save more lives with trust, if we agree on the axioms that trust is necessary for progress and that progress will increasingly save lives over time.
> They weren't saying this because they thought face masks were ineffective, they said this because they were worried about supplies for healthcare workers.
This doesn't pass the smell test because governments could easily have banned the private sale of masks.
Many governments have used their powers to confine us to house arrest without a trial, and they expect us to believe they couldn't have confiscated all the masks?
While protecting the supply of masks may have been part of the point, the CDC also believed (along with much of the medical community) that masks did little against viruses because of old research. It took quite a bit of scientific discussion and bickering for them to revise that opinion.
I largely agree with the rest of your points - particularly the outcome we have found ourselves in. Barring the emergence of some horrific new variant (Imagine Omicron with 10x the fatality rate), this is just another bug we can no longer avoid without effectively destroying our existing society - time to move on and live with it.
Our whole family had mild Delta two months ago, and even so we are now finishing quarantine after everyone caught Omicron. Neither vaccines nor previous infection seem particularly effective at stopping reinfection. The virus seems significantly different from previous iterations, meaning it might as well be a different virus altogether.
It felt like a medium flu for the adults and just bounced off the kids.
It’s malice when the CDC again and again lied that natural immunity to covid was inferior to naive vaccine immunity, even cherry-picking data to prove their point. Only recently, after dozens and dozens of papers had refuted their narrative, has the CDC changed its official position on natural immunity, not that it ever got media attention.
The anti-mask and anti-N95-with-exhalation-valve stuff has been the worst here, I think.
If you let folks wear N95s with exhalation valves (highly comfortable), it just makes it easier for folks to protect themselves (and indirectly others).
And for those who don't care, let's distinguish slightly between making sure those who want to protect themselves as much as possible can, and the need to force people down a path which, for the under-29-year-olds, really does not have high mortality rates, and for whom vaccination does not eliminate the disease the way it might something like smallpox etc.
My own view is the battle is largely lost already by the health authorities. Locally we have an indoor mask mandate including offices with all staff fully vaxed, and the natural desire to comply with this has gotten lower and lower if you are pretty spread out in large cubicles etc.
This is a bit of a tangent, but I don't understand generalized masking. At best it flattens the curve, and at this point I believe flattening the curve is dangerous. We have pretty well-defined risk demographics, and if we could have covid seasons come and go quickly, I think it gives the high-risk demographics the best chance of applying multi-layered defenses against a shorter season of risk.
There is of course the "overcrowded hospital" issue, but I have a hard time coming to terms with that being a real problem - if it were a real problem why has hospital capacity been in decline this entire pandemic? Is it an unsolvable problem or is it just not a real problem outside of fantastically clickable headlines? I lean heavily towards the latter but I could be convinced otherwise.
The issue for me is I have elderly parents, one of whom has a serious health condition that makes breathing and other things hard. They wanted to fly out to visit me. The absolute best way to avoid covid for yourself, in my view, is an N95 mask with a vent, because you can actually wear that thing for hours comfortably.
I have friends who treated folks with covid wearing N95's and did not get covid.
So I really wish we could let folks make the personal health decisions here a bit more. If you want to wear a bubble helmet with filters in it - go for it. The full respirator (almost always with an exhalation valve)? Go for it. The N95 with an exhalation valve? Go for it.
Instead, Delta Air Lines says no full cover, no N95 with exhalation valve, etc. So now you have people who don't want to wear masks forced to wear them, and they are preventing folks who really DO want to be protected (while on a flight) from fully protecting themselves. This is all in the name of "science", which really is not science but "public health" mumbo jumbo.
It goes back to this issue of basically not realizing folks have different risk profiles / tolerances. If I'm allowed to wear my N95 with an exhalation valve for 4 hours, I frankly don't care if the guy 2 rows back has his mask off his nose.
The mistake is lecturing everyone (often with bad info) and then blocking folks from solving their own problems. Remember when the antibody rapid tests had to have a chip, an iPhone app and a medical professional in the mix?
And no, I don't believe in those cloth masks, and surgical masks on a plane for 5 hours? Hard to imagine that being highly effective.
The most mindbogglingly dumb part about masks on planes is that there's still meal service on longer flights, at which point everyone takes off their masks at the same time to eat the food. Are we supposed to believe that the virus takes a little nap while everyone eats, and then only goes back to spreading itself when everyone has finished eating?
Detailed rules always lead to insanity. The best way is always to be upfront about the risks, how to mitigate them, and then let people make their own risk assessment and do whatever they're comfortable with.
But instead of saying "Flying carries a risk of exposure to this virus and others, here's what you can do", we're saying "If everyone engages in this piece of safety theater, flying is perfectly safe and fun for everyone!"
Exactly. That's all I want to be able to do. Skip the meal, wear my "medical professionals only" N95 mask with exhalation valve so I am comfortable.
Then I won't care if someone has their mask on their chin while they eat some food.
Instead, I can't wear the mask I want, and we play this stupid game of forcing ineffective masks, in a highly closed-off space, on folks who don't care / don't want to / have already had covid and don't believe the doctors' claims that there is no element of natural protection as a result.
I would imagine if the flight is 5 hours long and the eating time is only 0.5 hours, then there would only be 0.5 hours of breathing virus into the air rather than 5 hours of it. That being said, yes, most would be unmasked at that time and likely to breathe it in; I guess I just see requiring masks most of the time as limiting the total amount of virus breathed into the air.
Half an hour is long enough to get infected, we know this.
If masks work, they work on a mechanistic, physical, level. They filter air, and if you use your masks wrong, or take them off, they stop filtering air.
But if you look at how people are actually using their masks, they pay zero attention to the mechanics of it, and 100% attention to the appearance of it.
Someone who wants to wear an N-95 mask for the duration of the entire flight, and who won't remove it for eating, understands the mechanics of how they work. That behaviour is not theatre.
Someone who wears whatever because they're told to, crams it in their pocket now and then, removes it while eating, and still believes it works, does not in fact understand how masks work. And whoever is making rules and mask mandates that allow for this behaviour also clearly does not understand how masks work. It's all theatre. It's masks as make-believe lucky-rabbit's-foot talismans, and yet we're supposed to believe that the rules are based in science.
I still don't see it as all theatre; maybe I just see it more like a probability function. Not 100% of people will use them properly, no matter how much instruction or how foolproof the design is (and one way to fix improper usage is to improve designs instead of blaming the user). Will 50%? Maybe 10%? OK, if 10% use them 100% properly, what if another 30% use them at 50% effectiveness? I'm making up these percentages, but the point is that I don't believe it's all theatre just because not 100% of the people comply 100% perfectly.
So yes, people can get infected at half an hour. Are their odds much higher to get infected at 5 hours than 0.5 hours? I'd assume yes.
Heck, even people who wear it over their mouth but don't cover their noses, yes, they're potentially leaking covid out of their nose and breathing it in thru their nose. But if they cough, I imagine a percentage of the covid comes out their mouth and gets trapped in the mask...even despite them not wearing it 100% properly.
I'm saying that some of it may be theatre, or maybe much of it is (by theatre being ineffective but trying to appear effective), but I don't think it's all theatre.
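That "probability function" view can be made concrete with a toy calculation. The compliance shares below are the comment's invented ones (10% perfect, 30% at half effectiveness, the rest effectively unmasked), and the filtration fraction for a perfectly worn mask is likewise a hypothetical placeholder:

```python
# Toy model: population-average effect of a mask mandate with
# imperfect compliance. All shares and effectiveness numbers are
# invented for illustration, as in the comment above.
groups = [
    (0.10, 1.0),  # 10% wear masks perfectly
    (0.30, 0.5),  # 30% get half the benefit
    (0.60, 0.0),  # 60% effectively unmasked
]
max_block = 0.5  # hypothetical fraction of particles a well-worn mask blocks

# expected blocking averaged over the whole population
avg_block = sum(share * usage * max_block for share, usage in groups)
print(f"population-average blocking: {avg_block:.1%}")
```

Even with these pessimistic made-up inputs the average effect is nonzero, which is the commenter's point: partial compliance degrades effectiveness without reducing it to pure theatre.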
Personally I don't think any kind of mask works on people who don't want to wear them. I hypothesize the reason general population shows little or no benefit with masks is because wearing masks effectively is significantly more tedious than just having one on most of the time where people require them.
People who don't want masks aren't getting fitted properly for n95's, they aren't shaving, they aren't behaving consciously in the many ways you can behave that render masking ineffective.
Agreed. I laugh because folks have masks on their necks, off their noses etc, when eating, when drinking etc. I don't blame them, a lot of masks fog up glasses etc if they actually are restrictive and don't have an exhalation valve.
I shave (this comes from scuba diving where shaving provides similar benefits). I have my own reasons not to want to get covid. So I like doing some things that I'm confident reduce my risk meaningfully. But I don't lecture others, I solve my own problem here.
What is this "it" you're referencing specifically? I'm not sure which point you're asking about, but none of the points are likely to have great data. Not many masking studies go outside of modeling and the lab, and the ones that do aren't that detailed (that I can recall).
> People who don't want masks aren't getting fitted properly for n95's, they aren't shaving, they aren't behaving consciously in the many ways you can behave that render masking ineffective.
Maybe you weren't saying not shaving renders an n95 ineffective.
I appreciate you sending the source. I guess I still have the same argument. A LOT less effective does not equal ineffective. Or maybe I just see certain words like "ineffective" as binary words, either it works or it doesn't, and have a reaction to that. It may be a lot less effective to have facial hair while wearing an n95 compared to no facial hair and a properly fit n95, but is that still much much more effective than not wearing any face mask?
ineffective is not binary - it depends on your endpoint, which is subjective.
If your endpoint is to not catch covid and someone ineffectively wears their mask such that they would not have caught covid if they wore it correctly - then the mask was ineffective even if statistically they _maybe_ had lower chances of catching covid than without wearing a mask at all.
and I say maybe very strongly here...we do not understand transmissible diseases nearly as well as many would like to think - there are plenty of observable datapoints that correlate mask usage with increased transmission...my gut feeling would be to explain those away with confounding variables, but my gut feeling is not science and no substitute for a testable understanding of the problem and solution
Ah, yeah, I think it was just a misalignment on the word ineffective. I think when I hear many people use it, I assume they mean pointless, ineffective for all, whereas in you saying it, it sounds like you meant ineffective in that particular instance based on what the outcome was. I see you have a lot more nuance in your perspective than I may have been assuming. I'm sorry. Thank you for clarifying.
It's pretty noticeable: if I shave, I get a tight suction feel on an indrawn breath. This is the same experience you get with a scuba mask.
That's interesting that there is still some protection. I could believe it's still better than things like cloth and surgical masks, because the strapping tends to be tighter.
The rule was not made to protect passengers, but to protect the crew, who are stuck in a tin can with hundreds of different people every day. The risks add up over time.
Also, male doctors and nurses are required to shave before donning their masks for a shift, because even the slightest overnight stubble will ruin the seal. The chance of the general public wearing an N95 correctly enough to get the most benefit out of it is close to nil.
Hospitals have been jam packed the entire time. There is data everywhere to support this. Tons of procedures are not being done due to beds and ICUs being full. Ask anyone who works in a hospital. Call one if you'd like.
This is second hand from someone close, but not too close in the military, so details are off but the narrative is on track:
The Mercy was deployed after some communication between New York hospitals, and the understanding at least was that it would mostly be handling emergency, trauma, and routine patients (what military docs are stereotypically good at), and free up capacity for the hospitals to deal with COVID. The "normal" medical stuff promptly dropped in demand with lockdown, between fewer accidents and deferred care. So there wasn't much for the Mercy to do; the few covid patients sent over were about all of an "unknown air-transmissible virus" (the working theory at the time) that it was really equipped to handle, because that wasn't the mission it was expecting.
That makes sense in New York but we knew more about the situation when she was in LA and when the USNS Comfort was in New York later on. (Also I put USS Mercy when it should be USNS Mercy)
"Another of the Navy’s two hospital ships, the USNS Comfort, was sent to New York City to treat non-coronavirus patients. However, after treating only a few dozen patients, the Comfort changed its rules to allow coronavirus-positive patients on the ship.
The Comfort had been docked in New York Harbor in Manhattan since March 30 but departed last month for its Norfolk, Va., home port after treating just 182 patients."
ICUs were at max capacity all the time pre-pandemic. In fact, a study done in 2013 (IIRC) showed that you could pick any random date and any random hospital in the U.S. and there was a 16% chance it was at max capacity.
So again, my question is this: if hospital capacity is truly a crisis, why have staffed beds been in decline since the beginning of the pandemic? Half a million healthcare workers have left their jobs since the start of the pandemic alone. If this is a crisis, why isn't there any initiative to get them back? Instead the only initiative we've seen with regard to healthcare workers is massive layoffs, firstly because of lockdowns, but then later also because of vaccine mandates.
So either this is a terrible problem with no solution, or the problem is exaggerated and it's not really a crisis.
That's just not true. I found the study you're referencing and it says in 2013 mean ICU occupancy was 68%. If you zoom out on the map you'll see the US average is 83%. Did you even look at it? Some states have something like 9 out of 10 ICUs at 100% capacity.
Staffed beds are not in decline, and of course hospitals are trying to get more workers; they are offering crazy pay incentives right now.
The dark gray represents ICUs in the studied time window: the study was published in 2013, but it measured ICUs from 2005 to 2007.
Here's an interesting fact[1]: 2020 hospital capacity in the US after the start of the pandemic never even got up to the capacity projected from 2015-2019 data. So no, I don't believe these scare-story anecdotes I've been reading the entire pandemic. They are not backed by data; they are backed, at best, by bullet points without context designed to scare the ignorant.
I like how the public health folks don't connect the dots. MEDICAL PROFESSIONALS are tired of the stuff they are hearing.
I know some folks who had covid, and just don't want to trigger their bodies systems again with a series of shots.
For some reason, if you have other diseases your immune system likely provides some protection against re-infection, but we are being told covid is different. This just seems so likely to be a lie that it's mind-boggling.
Let healthcare workers test their blood for cross-reactivity with Omicron / antibodies. If you are at or above some threshold (say, a Pfizer vaccine from 10 months ago), you get a card and are good to go.
> For some reason, if you have other diseases your immune system likely provides some protection against re-infection, but we are being told covid is different. This just seems so likely to be a lie its mind boggling.
I don't know what you're talking about. There is some protection; it's just not enough to make things safe. What's the lie?
If you're talking about those people still having to get a vaccine, that's a whole mess of tradeoffs but also not based on lying.
> if it were a real problem why has hospital capacity been in decline this entire pandemic
We filled hospitals over capacity and burned out healthcare workers. They decided to quit rather than work in such conditions. Don't overwork healthcare workers and maybe they won't burnout so quickly.
It should be the other way around: non-essential workers only leaving the home for groceries and other essentials reduces it from 7 to 1.5, keeping distance at the grocery store reduces it from 1.5 to 1.1, wearing masks at the grocery store reduces it from 1.1 to 0.8. Something like that. Made-up numbers.
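The made-up R-number sequence above can be written as composed multiplicative factors, which makes explicit that each measure is a relative reduction on top of the previous ones (all numbers are the comment's invented ones):

```python
# Express each (invented) intervention as a multiplicative factor on R,
# so the reductions compose: 7 -> 1.5 -> 1.1 -> 0.8.
r = 7.0
steps = [
    ("stay home except essentials", 1.5 / 7.0),
    ("distance at the grocery store", 1.1 / 1.5),
    ("masks at the grocery store", 0.8 / 1.1),
]
for name, factor in steps:
    r *= factor
    print(f"after '{name}': R = {r:.2f}")
# Once R drops below 1, each generation of infections is smaller
# than the last, so the outbreak shrinks instead of growing.
```

Writing it this way also shows why the ordering of the messaging matters: the last, least disruptive measures only buy small relative reductions once the big ones are already in place.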
My personal interest is filtering on ingress. I can feel the pressure when I breathe in, the air is going through the mask for sure.
On egress, just from a quick look at the valve, it's going to stop the velocity and either block or drop particles. However ineffective that might be on egress, there is NO WAY it is worse than what I see others wearing (surgical / thin cloth etc.), mask off the nose, mask off while they eat, etc.
Finally, folks historically have a natural "self-interest" motivation. I.e., if you let folks act in a self-interested way (personal protection, personal comfort), you are likely going to be better off policy-wise.
You can do a rough risk computation yourself, I guess. Both ingress and egress are worse if you have a valve, though. Egress does seem to still be a bit better than what other people do (in this model at least, which could be wrong), but one shouldn't set one's standards by "what other people do".
"one shouldn't set one's standards by "what other people do""
In policy matters one can and should.
This, I think, is the stupidity of the scientists here.
If others are wearing surgical and little cloth masks, only wearing them a little bit, but then they say I can't wear my valved N95 (which I do very carefully), it's both totally arbitrary and unfair (my exhalations are still equal to or less than others').
They also miss that the tradeoff might not be between my 5 hours of N95 use in a relatively high-workrate environment and something perfect, but that I will join with just about everyone else and ignore what they say, or go down to something like a surgical mask for comfort, or take my mask off more to "drink" etc.
I'm convinced many folks on HN sit behind computers in empty rooms alone, NOT wearing a mask. If you had to wear a mask every day, these issues would matter a lot more.
The perception, of course, is that you are not filtering egress. Since the objective is (or appears to be) to filter egress, people will obviously look askance.
This can't be resolved without a bunch of additional training.
Of course if we'd done THAT, we wouldn't be in this situation in the first place, and the whole discussion would be moot. %-/
p.s. I worked on-site at a factory doing industrial automation for a lot of the time; I really liked FFP2 masks because they fit better and my glasses didn't cloud up.
Masks with valves were available (because part of one's regular PPE at times before the pandemic), but no longer permitted.
Yeah, that said, FFP2s are, I think, per the WHO only supposed to be for medical professionals, so use is discouraged outside that setting.
"This said, the World Health Organization does not recommend the use of FFP2 for everyone, stating that non-medical face coverings should be used by the general public under the age of 60 who do not have underlying health conditions. FFP2 masks are best used as personal protective equipment (PPE) by healthcare workers or those caring for someone who may have Covid-19 in a community setting. "
"However, whilst this approach [content removal] may be effective and essential for illegal content (eg hate speech, terrorist content, child sexual abuse material) there is little evidence to support the effectiveness of this approach for scientific misinformation, and approaches to addressing the amplification of misinformation may be more effective. In addition, demonstrating a causal link between online misinformation and offline harm is difficult to achieve [25, 26], and there is a risk that content removal may cause more harm than good by driving misinformation content (and people who may act upon it) towards harder-to-address corners of the internet27."
I'm happy that this point has been raised in this report. There's a lot of understandable hand-wringing and fear about online misinformation over COVID, but it was never clear to me that online misinformation is a variable that explains differences in vaccination rates across countries, or to what extent it affects an individual's decision not to vaccinate. Turns out that this is hard to measure.
The CDC messaging was really bad and problematic on several counts right from the get-go. It's not like the US is incapable of messaging about the link between individual behaviour and societal good in the long term (for instance, there were protracted initial battles, but we eventually agreed largely on condoms and HIV, or smoking and lung-cancer risk) or in the short term (as in the lifestyle sacrifices that WWII engendered for wartime production). I wonder what went right in those situations (if they did go right) that went wrong in the messaging this time around.
Over and above the masking, some observations --
1. Reducing transmission risk, I think, was never the primary endpoint of vaccine development; rather, it was preventing hospitalization. But the messaging from the CDC and others on the vaccine implied that transmission would keep declining with increasing vaccination rates, which turned out to be true only to a limited extent.
2. The early messaging was that this would be one wave of infections, and that one round of vaccination would end it all. This was tough to defend given what happened with the flu. I understand that telling people "Well, the vaccine may not give long-lasting immunity and may still leave you susceptible" is not going to promote vaccination, but perhaps emphasizing that vaccine immunity is controlled and safer than an infection that might hit all organs would have been a better way out?
Personally, I think the CDC’s greatest sin was their narrative push against natural immunity. And sadly it was repeated ad infinitum by the media. One paper with an incredibly misleading title, and another so problematic that one can only wonder how it even got published. But the media and “science communicators” didn’t think twice in praising them and compounding on those same lies.
More recently another paper, on covid and childhood diabetes (because now they’ve got to push the jabs on the young ones), has been drawing the same level of scrutiny.
I agree that the link between online misinformation and real-world harms is tough to measure. I've read some studies of radicalization that found meatspace was the primary avenue for radicalizing, even if cyberspace allowed extremist groups to raise more awareness, and I suspect the same is currently true here.
However, I don't think this is some "constant of human nature" and is more likely an accident of our current society. As people grow up with and on the internet, I would expect this to change: I would expect the link between online communication and radicalization to strengthen over time.
I mean, removing / censoring dumb shit will just empower people; it's a kind of martyrdom, and the conspiracy thinkers will see it as Proof that there's a Cabal out there that is Suppressing the Truth. This already applies to the warning labels that are put on videos, Facebook posts, Spotify podcasts, etc.: if you're skeptical about the establishment, you'll see those as proof that the cabal is meddling and trying to censor these opinions. It makes people more curious.
Trust the [government] scien[tists]ce. As if "science" is a pure endeavor beyond the reach of human corruption. How do I know who is acting in my best interest? Are we paying their salary, or are corporations and government agencies paying them? Who are they beholden to? Are our interests the same?
We all aren't in the same boat. Certain decisions are harmful to certain groups and beneficial to others. Lockdowns, for example, decimated small businesses and crushed the working class. Amazon and many corporations, on the other hand, welcomed the lockdowns as a boon to their businesses.
We are not one nation. We are not one world. These centralized authorities are sanctifying the plundering of certain groups on behalf of others--or so it seems.
The three recommendations are literally the antithesis of what modern news media does today, which is, in turn, largely due to the political motivations of the media organisations and their owners. I very much doubt that the royal society does not know this already.
You miss the point. "Scientists" literally lied about the nutritional issues, as well as the smoking. They were "bought off" by companies. Now, I'm sure you have 100% confidence that the "current administration" is 100% by the book and honest in all of their dealings, while you might have had less than 100% confidence that the previous administration was. My point? You trust one group, but not another. I might trust the other, not your group. These "groups" (I mean political, business, whatever is controlling the narrative) will lie, cheat and steal to get "their" agenda across.
That last point is 100% undeniable!!! History has absolute clarity that political forces will force the narrative to their desired viewpoint, and say what you will, deny that to your own peril and ignorance.
So, do I "trust the science"? I have an MSEE (not a scientist, but educated). I trust the science; I just don't trust who controls the narrative, and historically this has a larger effect on what "science" concludes than, uh, the actual science.
So, be as ignorant to the forces that control the narrative as you like. I'll remain skeptical, and look at both sides.
Oh, and when the "forces that control the narrative" decide they must censor "one side", I'm hella suspicious of that.
Major Kudos to the Brits for recognizing this. Most sane thing I've heard out of anyone in a while.
My statement was made without position on politics, nor did I mention an administration or my own personal predilection to a given affiliation. Believe it or not, I voted for George W Bush. But you might want to be careful before you drown in that pool of hypocrisy you just waded into.

You see, in our country there is this little thing called the First Amendment, and while science journals have governing bodies, they are not in the purview of the governed. There is nothing to say that what anyone prints is true; your party made sure of that when they eviscerated the fairness doctrine.

Additionally, businesses paying science journals for printed articles is a tradition as old as time, and one that respectable science journals have worked around using a system of rating known as impact factor. Far be it from me to inform you that pay-for-play journalism is one of the factors that leads to bad ratings from the scientific community. However, everyone is entitled to a free press, and that includes pay-for-play shady journals owned by big tobacco. But do NOT lay the blame for that at the feet of science.

We've done what we can to ensure journalistic integrity with ratings and measures; it's not a perfect system, but we're not and have not been willfully pushing misinformation out the door. Pay-for-play is, has been, and will continue to be a blight on science literature, because we do live in a society that has freedom of the press. Anyone is entitled to print anything regardless of factual content. Respectable scientific journals at least have a system in place to punish those who abuse the system. So at least educate yourself on the basics of journal integrity while you reply to my comment on a machine that was built entirely on the edifice of science.
I'm not a Christian, but how life emerged is not something that experimental science will ever be able to answer. It's literally outside the domain of science.
Abiogenesis is a burgeoning field with a lot of data and several partial answers. Even without a full understanding of the entire field, or a concrete holistic answer, we can still say with full confidence that it didn't involve magic words spoken only 6,000 years ago. We can at least say that.
I’ve said it before and I’ll say it again. Regardless of where you stand on specific issues like censorship, public health and misinformation, I think it’s really dangerous how counter-consensus opinions cannot be expressed in the open without individuals facing intense public scrutiny. There needs to be some kind of correction mechanism in case the consensus opinion is misinformed.
Maybe it would help to put it in terms of a (not-so-implausible) scenario folks here are more familiar with. Suppose one day a niche topic in computer science became the subject of mass public concern, e.g., a computer virus escaped the lab and programmable computers became heavily regulated. Anyone caught running unapproved software could face criminal charges. "But Turing Machines can be good, creative tools", you protest. "No, we cannot predict their behavior and unsanctioned software endangers us all," says your friend. As a programmer whose creative freedom depends on libre software, how would that situation make you feel?
That might be what comes to pass in the far future. It's possible for something to be a good and creative tool and also capable of terrible things.
I think the gain-of-function research that may have led to covid could have had interesting applications, but it also (possibly) led to something terrible, so it's probably best to regulate it more.
In fiction, AI is banned in Dune and Warhammer 40k because of past events. It makes sense that if something really bad happened because of AI, you might want to put regulations on its development.
I think that banning general purpose computers only makes sense if you believe AI cannot be controlled. A more reasonable approach would be to design the program in such a way that, by its very nature, it fulfills certain criteria that cannot be escaped. Think of a linear type system or substructural logic. You have a core calculus which keeps track of resources and effects so that computation has no way to propagate outwards unchecked. Safety is not so much imposed by some external guard condition, but woven into the very fabric of computation itself.
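A minimal sketch of that idea (Rust chosen for illustration; the names are hypothetical): Rust's ownership discipline behaves like an affine type system, where a value can be consumed at most once, so an effectful "resource" cannot be duplicated or silently propagate outward.

```rust
// A resource carrying some effectful capability.
struct Effectful {
    name: String,
}

// Taking the resource by value consumes it; after this call the caller
// can no longer touch it, and the compiler enforces that statically.
fn run_once(r: Effectful) -> String {
    format!("ran {}", r.name)
}

fn main() {
    let job = Effectful { name: String::from("job") };
    let out = run_once(job);
    // run_once(job); // would not compile: `job` was already moved.
    // Safety here is woven into the type system, not imposed by an
    // external guard condition.
    println!("{}", out);
}
```

The same move-semantics discipline generalizes to full linear types (use exactly once) and effect systems that track what a computation is allowed to touch.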
It doesn't seem possible to me to ban general purpose computers. Any society that tries would, at best, lose to the ones that don't. It seems impossible to me to have enough surveillance to both prevent unauthorized code and still allow effective, competitive software development.
It may not be technically possible, but that won't stop misguided people from trying to regulate Turing machines and exploit their vulnerabilities. Rather than putting so much energy into regulation, surveillance and offensive measures, I suspect the "winning" strategy is to focus on secure, verifiable computing. Whoever masters this first will grow an immunity to (human and AI) cyber-threats and gain a significant competitive advantage in the digital economy.
The extent to which truly secure computing is possible depends in large part on which of the many possible "cryptographic universes" we live in (that is, on how complexity classes like P and NP relate to one another).
There are different kinds of secure computing. The kind of security you are referring to is based on cryptographic schemes like homomorphic encryption or zero-knowledge proofs. The kind of security I am referring to is language-based security.
It depends on the specific opinion. I don't think there's anything wrong with people not being allowed to express the opinion that minority groups should be systematically exterminated.
Transparency for a start. Leaving our most important decision-making to only those who have the right credentials is like security through obscurity -- credentials can always be hacked by sufficiently motivated actors. Write down a program describing the logical steps you took to reach some decision, and release it. Once it's open source, formalize all the steps using type theory so anyone can verify it. It's the only reasonable way to do secure computing.
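As a toy illustration of that last step (a hypothetical sketch in Lean, not a claim about any real auditing system): a formalized claim carries its own machine-checkable proof, so any reader with the checker can re-verify it independently instead of trusting credentials.

```lean
-- A decision step reduced to a proposition plus a proof term.
-- Anyone running the Lean checker can verify this for themselves;
-- no authority or credential is involved in accepting it.
theorem audit_step : 2 + 2 = 4 := rfl
```

Real decisions are of course far harder to formalize than arithmetic, but the verification model is the same: the proof, not the author, is what you trust.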
There's little difference between arbitrary arbiters (read: mostly robots) flagging content as "inaccurate" and outright banning content (to me, it is very similar to the yellow Star of David badge). A more reliable solution would be letting users actually pick arbiters they trust, as you can do with social media on the Fediverse (implicitly dismissed as "fringe" in the article).
Censoring scientific information to me feels like a continuation of science's replication crisis.
The tooling for all classical techniques seems to have been broken as of late: free speech has lost critical thinking, and science has lost replication.
Agreed. Science itself has nothing to say about that, though, and I fear that castigating everyone to "believe the science" and censoring antiscientific opinions are two deeply counterproductive activities, in that they give people the impression that this is indeed what science is about.
I was always taught that information is the product of data, and that bad data will yield bad information. With this in mind, I feel that the term `misinformation` is perhaps not the best choice of words. In the context of science we have data, then information, and finally facts. So with that in mind, I appreciate what the Royal Society is getting at in effectively rejecting a form of no-platforming.
With that said, debate is and has been the due process of much great science, and yes, science learns more from its mistakes than from any unattainable goal of being 100% right all the time.
For without the ability to question and debate things that to some may appear as facts, we stifle innovation and creativity, which are the crux of scientific advancement; science, by its very definition, sits on the edge of fact and falsehood.
I wonder if this statement from the Royal Society is more about the situation in Britain than about the rest of the world.
Rationality is probably the only remaining weapon humanity has to survive, and it sure is threatened, especially in light of the modern social media business model.
Odd how there's no study on reducing the massive manipulation being done for the greater good.
The pandemic has really shown how weak and sleazy all institutions are.
And while 'scientific misinformation' means there's at least a modicum of science in place, wait until you see what people are really sharing privately!
The disaster that will come is well deserved to all governments and institutions.
There used to be a time when transparency was seen as important. Well, no transparency if it's for the greater good!
From what I see people are not stupid. And I suspect they are getting better at detecting manipulation too. In the end, what the state / institutions say will be seen as just noise. A sort of advertisement with no real information.
Authorities don't seem to understand that their trust is earned. It's like they can't even conceive that it can be eroded. Lies and half truths are told so often and for so minor issues that it can't possibly make sense from a cost-benefit perspective, if behind the scenes it's indeed "noble lies" and "greater good", as some generous people seem to think. If you tell a noble lie every other day, nobody will trust you. Why should they?
I have a feeling that what we're seeing is a technological issue. I don't believe that most people in politics understand or know how to use the internet, so to them their tactics make sense: obfuscate and manage information for the citizens, because what's needed is their approval, not understanding. Common tactics for this include simply pretending there is no information which runs counter to their narratives, or slandering those who speak out against them.
They don't seem to realize that the internet makes their tactics extremely visible to anyone who has a memory and access to a search engine. All of their statements are semi-permanently somewhere on the internet, in chronological order.
They do realize there's a problem however, and their response has been to infiltrate the floundering but still widely respected legacy media companies and attempt to get tech companies to support their narratives. This is also clearly visible to many people, but not known widely enough to stop them.
In effect, they're competent in 20th century propaganda techniques, but unable to convincingly adapt them to the 21st century, and that's why we're seeing this insanity. There are a lot of people who distrust them out of instinct, but are unable to articulate their distrust with proper information, and so come up with wild claims (like Bill Gates microchipping people). There are others who straight up reveal the truth (Snowden, Assange) and get "taken out" one way or another.
Many other people have a vested interest in wanting to believe the institutions, either out of fear or because they are benefiting from the status quo in some way - ie. the Zoom class. They rationalize the unethical propagandizing and attacks on human rights as being "for the greater good" and participate in the essentially state-sponsored slandering of opposing voices.
What's dangerous is that people are becoming more and more detached from reality, one way or the other - propaganda or conspiracy theory. Of course, on some level the propagandists are right: there is a lot of misinformation on the internet. They however refuse to acknowledge that they are on the other side of the same coin, or that they are in fact the ones with all the power.
The Joe Rogan podcast with Dr. Robert Malone: is it misinformation? Dr. Malone certainly has credentials that none of us have, and yet he is censored. Is science only one voice? Should whoever expresses a different opinion be censored, even if that person has the credentials to speak on the matter?
It's not. There's a heap of research behind their statements too, but all that research is also 'wrong' because the authorities say so.
In a lot of places there are currently more fully vaccinated people in hospital per capita than unvaccinated. That's interesting, but I don't see a single part of 'the consensus' investigating it, just dismissals.
Saying science is only the consensus is dangerous. A lot of the studies published as 'peer-reviewed science' have poor methods or data (e.g., excluding groups that negatively affect the outcome, or presenting relative risk as highly significant against a low absolute risk), but that's fine; science is working properly when it's wrong that way, because science changes as we learn. Except when it says what we don't like.
And that's the strangest thing: the science has changed the whole time (covid isn't airborne, then it is; masks don't work, then they do, then only N95s do; the vaccine will stop it dead (yes, Fauci said this), then it will reduce spread, then it just reduces hospitalization and death, then it's better than nothing, then we're working on a new one because the old ones don't work). All the while refusing to even be curious about the side effects, because of the greater good, and then vaccinating children because 'the risk is not zero', even though the risk of side effects, including death, is also not zero.
But hey, shut up Rogan etc, they make us feel uncomfortable about our accepted 'truths.'
It is misinformation because it is outright wrong. Follow Malone for a couple of weeks and you will see he has nothing else to share but: Vaccines are bad, Vaccines are killing people.
> As we prevent three deaths by vaccinating, we incur two deaths.
> "Are we headed for the situation where the ~30% unvaxxed will be devoting their lives to operating whatever is left of the economic infrastructure and serving as caretakers for the vaxxed?"
This is what got him banned from twitter.
Why don't you try to investigate a bit yourself? Do people with credentials have no possible motive to spread misinformation, and only the motive to "save humanity"? Sad to see this on HN.
I listened to the podcast he did with Joe Rogan. 95% of what he talked about was fairly convincing and I agreed with it pretty strongly (or it was mundane and uncontroversial). 5% was questionable and less convincing. Based on Malone's history (I think he had a strong, rare allergic reaction to one of his COVID shots that very nearly killed him) it seems likely that he argues in good faith. My prior here is that I already considered the harms of censoring good-faith incorrect arguments greater than the benefits.
The things he mentioned in the podcast that I agreed with and honestly are pretty convincing:
- censorship of criticism of COVID medicine online is out of control and dangerous. How can you ethically give a drug if you're not allowed to publicly question whether it's safe or not? He gives a ton of examples of very mundane statements by many different people on many different platforms that led to disproportionately negative impacts on their lives relative to the fairly mundane claims they made.
- conflicts of interest exist and are not to be taken lightly. The FDA, CDC, drug companies, are all basically the same people. There is a profit motive to minimize the harms of certain drugs. This is not a hypothetical -- see oxycontin, tobacco, etc. There is reason to be suspicious of "health authorities".
- a lot of people seem to have crazy, obviously wrong/outdated views of COVID that are orders of magnitude off from the actual risks, and are very militant about policing those views and imposing them on others through government action. In some respects it resembles mass hysteria (this is the "mass formation psychosis" meme that went the rounds on the news, where he compared it to the rise of the Nazis. This was taken out of context largely I think, but you should listen to it for yourself).
- the nuance is lost in discussion of covid these days. You're either a science-based smart person or a conspiracy theorist antivaxxer. There is no in-between, and the truth is probably not absolute on one side even if it's heavily leaning on one side.
The thing he mentioned that I disagreed with or found less convincing, but are worth mentioning because they raise points that can and should be addressed:
- Malone's most questionable claim, and the one he caught the most flak for, is that the risk of serious side effects in some populations as an adverse reaction to the mRNA vaccines can exceed the reduction in risk from covid-19. These are populations like kids, 20-year-olds, or people with a history of allergic reactions to vaccines. He claims that while the data on its own suggests the opposite is true, the side effects of vaccines are underreported due to flaws in the federal vaccine side-effect reporting system, and that deaths by covid are overreported by hospitals.
(He gives a fairly reasonable argument for this, and I think it's worth listening to for yourself rather than reading the same mostly out-of-context sentences repeated in news articles reporting on the podcast. But after further research I am mostly unconvinced -- the main counterargument that persuades me being that over 100 countries have given the mRNA vaccines, and you'd expect at least one of them to notice such a serious effect if it existed.)
I have not responded without listening to what he said, I followed him for quite some time to see what exactly he had to say.
I do not like censoring by big tech either, but when they take down outright lies which actually go viral and change people's opinions, I am no longer sure. Nuanced facts and data do not go viral. Tweets with controversial information do.
Serious side effects and risk-benefit calculations are very nuanced and take much more effort to bring up and share [1]. He presents a very one-sided story, every single day. That is not helpful.
He took very selective parts of the news that align with his opinions and tweeted just that. Thanks to twitter's censoring, I can't even share those :facepalm: but you can look up archived data [2]. It is not even a single person; they have a pretty good group doing it every single day (Peter McCullough, I am sure you have heard of him) [3] [4].
I find it difficult to create a censorship model that can distinguish between "nuanced counterfactual take" and "contrarian falsehood". I think a lot of people already know this is a problem (e.g. censorship of the Wuhan lab theory).
I am fairly biased though -- I would rather 100 dangerous falsehoods get shared (even if it results in a lot of people believing wrong things) than even 1 true fact get censored, and that puts me in the weird position of often defending people and takes I disagree with and dislike.
It's misinformation because it is wrong, and deliberately so. We know that because it is contrary to all the actual data. You are aware that for every 1 Robert Malone, there are 99 people who think he's full of bullshit?
> Misinformation is information that is incorrect.
No, misinformation is true information that is misleading (as to some other topic), morally wrong, etc.
If it is not true, then it is “not” information. It is “dis”-information.
(From Fluoride pdf)
Update: E.g. Consider the statement: “According to the FBI, African-Americans accounted for 55.9% of all homicide offenders in 2019.” That is information. It is knowledge which is conveyed / stored / etc. Some people use that fact to argue that AAs are inherently violent people. In the context of such an argument, the fact is potentially misleading / misinformation. The premise is true, but not the conclusion.
An example of disinformation would be to say: “According to the FBI, African-Americans accounted for 85.9% of all homicide offenders in 2019.” That’s not true. That’s “not” information.
When any kind of censorship, including self-censorship, total or partial, goes into a publication or its silencing, it transforms a piece of knowledge into ammunition for propaganda instead of science.
PS: same happens with true art.
You shouldn't be. Are you anxious to censor Brainfuck for example? [1]
When you censor, you make the case for an argument to become weaker not stronger. You are hiding the knowledge to show another way in which something is invalid. The end result invariably will be that you make what initially could be science, to become propaganda.
The natural selection of paradigms does not require censorship anyway. Reality will reveal itself and always win in the long run. Propaganda is always an epochal phenomenon that doesn't pass the test of time. It's like a counterfeit wannabe classic posing as timeless but always revealing itself as some kind of fraud in the end.
And without unconditional love of truth and reality, you make it impossible to do true philosophy, and hence true science.
Hehe I'm quite neutral actually as I'm more of a dynamic language kind of guy.
I'm a lot more into Bret Victor's The Humane Representation of Thought https://vimeo.com/115154289 and really far away from ideas like "The Compiler is Your Friend".
I care more about concepts that are meaningful for humans, and I find the machine-level technicalities needed to get them working frivolous.
All this pro-science stuff and I never see an opinion from anybody from the only relevant scientific field: epistemology. Why do we have to pretend this is in good faith when they don't even believe themselves?
Why not just create an open database which would contain important comments, critique and known cases of misinterpretation/misuse for every scientific paper (uniquely identified by a DOI)?
This is of obvious public interest, primarily targeting a wide audience of non-experts (e.g. "reporters raped by scientists" and those reading the news) rather than scientists, who already know what they are doing when reading papers. So it would have to be funded by a charity or a government.
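A minimal sketch of what such a DOI-keyed registry could look like (Rust; all type and function names here are hypothetical, just to make the proposal concrete):

```rust
use std::collections::HashMap;

// The kinds of annotations the proposal attaches to a paper.
#[derive(Debug)]
enum Note {
    Critique(String),
    KnownMisuse(String),
    Correction(String),
}

// The registry: annotations keyed by the paper's DOI string.
struct Registry {
    by_doi: HashMap<String, Vec<Note>>,
}

impl Registry {
    fn new() -> Self {
        Registry { by_doi: HashMap::new() }
    }

    // Anyone can contribute a note about a paper.
    fn add(&mut self, doi: &str, note: Note) {
        self.by_doi.entry(doi.to_string()).or_default().push(note);
    }

    // A reporter checks a paper's annotations before citing it.
    fn notes_for(&self, doi: &str) -> &[Note] {
        self.by_doi.get(doi).map_or(&[], |v| v.as_slice())
    }
}
```

In practice this would sit behind a public API, with moderation and provenance per note, but keying everything on the DOI is what makes lookup trivial for non-experts.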
The lack of public understanding of the scientific process, and of the ability to parse published articles in context, over the last year unfortunately shows the desperate need for scientific outreach to explain the current state of research to the public.
And obviously nobody listens anyway because everything has become overly polarised.
Maybe journals should be behind some sort of access restriction (not cost) so that the scientific method is less disturbed by the public and political ideologues...
From the article:
'The report defines scientific "misinformation" as content which is presented as fact but counter to, or refuted by, the scientific consensus - and includes concepts such as ‘disinformation’ which relates to the deliberate sharing of misinformation content.'
They define misinformation in terms of a "scientific consensus". That alone seems problematic[0].
Because people don't need to be scientifically literate to express their desires in the form of a vote, which is mainly concerned with tasking representatives with what people want, not how they're going to get it.
It's perfectly coherent to think that voting is a good mechanism to express preferences while at the same time thinking that your average person can't really validate scientific claims.
I don’t think there’s nearly as much daylight between these as you’re suggesting. Just look at how current beliefs in COVID or climate science fall along political preference lines. Scientific literacy is absolutely important to express coherent preferences on how policy should address these matters.
My take would be that precisely because we've democratized science there is now such a thing as a political preference on scientific matters. Which shouldn't exist, because science should not be in the business of adjudicating political preferences. And we now often have the worst of both worlds. Politicized scientists and a pseudo-scientific public.
Discourse about values in regard to covid or climate is sensible. Do you want more freedom, more safety, fewer deaths, more economic growth? That's something every person has a meaningful position on. The safety of mRNA vaccines? That's not a layman's question, and these factual, technical questions are where misinformation enters the picture.
And the twisted thing about the debate is that saying this often creates accusations of elitism. But really, if scientific literacy is required to have coherent political preferences, what does that say about the right of participation of people in a democracy who are simply not educated or smart enough?
<< But really, if scientific literacy is required to have coherent political preferences, what does that say about the right of participation of people in a democracy who are simply not educated or smart enough?
Ellul would likely argue that a prerequisite of any propaganda is the obligation to hold a preference -- regardless of whether the individual knows anything at all. The purpose is to include him/her in the machine.
Personally, to me it is akin to my aunt telling me computers are the devil's creation (she did) and will be the downfall of us all (possible). Yeah, maybe basic literacy is necessary to participate.
You shouldn't expect voting to yield good decisions even if voters are individually rational [1], which they mostly aren't [2]. However, voting does have a big civil rights benefit: it lets minority groups protect themselves from the tyranny of the majority in cases where they have a strong preference one way (e.g. because their rights are being blatantly violated) and the majority has a weak preference the other way. A few percentage points one way or another can swing elections, so a sufficiently pissed-off minority can have surprising political power on the margin.
I've always felt voting was a bad idea. It drives candidates to waste inordinate amounts of time and money on campaigning. For many posts, the ancient Athenians selected people at random (sortition), usually from a pool of willing candidates. A large enough random selection would be representative. This would work well for the US House of Representatives.
Your comment implies that you don't want representatives, you want other people to vote for whatever it is you want. All I can say to that is: ask God for the serenity to accept those things you cannot change.
It's been used in student government and there's been some research behind it. I originally heard about it on an episode of Malcolm Gladwell's podcast.
I don't trust myself to vote. I feel that being informed enough to make an informed vote is a full-time job, and I don't have that "luxury". Otherwise you might as well roll a die or vote for the pretty face. Or just not vote.
Elections are determined by people who don’t think about that. If you are probably more informed than the average voter, you can improve the signal:noise ratio.
I'm probably less informed. I don't follow the news (or really any mainstream media) because signal to noise ratio is too bad and there's too much negativity. I have no clue what our politicians are up to.
If there were widespread voter fraud, how would you hear about it? Knowledge mechanisms have become so centralized I’m not certain the general public could be informed anymore.
For some people, democracy means letting people choose from options pre-approved by experts, because the people, in their view, can't be trusted to be correctly informed, or to come up with useful options.
As much as I love being relatively unrestricted on the internet, is there actually any evidence that censorship decreases trust in institutions?
China is notorious for censorship but from speaking to friends there it seems everyone is totally on board with vaccines and very few people doubt climate science. People know everything is censored but trust in science, experts etc is relatively high.
> As much as I love being relatively unrestricted on the internet, is there actually any evidence that censorship decreases trust in institutions?
> China is notorious for censorship but from speaking to friends there it seems everyone is totally on board with vaccines and very few people doubt climate science.
A consideration: could it be that acts of censorship decrease trust in a society that is already divided?
Yeah I can definitely see how this could be true in the current political climate.
More generally though, it seems to me that:
1. A minimal censorship information environment is essential to scientific progress
2. The average person lacks the epistemological foundations to handle a minimal censorship information environment
If these are both true, it seems like we should either:
1. Invest massively in improving people's epistemological models or
2. Apply a censorship gradient (i.e unrestricted for the purpose of academic exploration but have minimum standards for wide-reaching, publicly shared information)
I have mixed feelings about this. On the one hand, I'm glad to hear education prioritized and the dangers of censorship discussed; on the other hand, I don't like the sound of "monitoring and mitigating evolving sources of scientific misinformation online". It comes off as, "we need to crack down on the smaller platforms because people can think and say what they want there." And the final non-committal comment from Vint Cerf, representing google here, reads simply as, "what we've been doing is fine and we should keep doing it." I would have expected something more like, "we need to encourage productive conversation around controversial topics instead of silencing minority voices by pretending we can find the truth with an algorithm." Purposefully preventing someone's voice from being found, or taking away their opportunity to make a living through sharing, is still censorship.
There’s a major distinction between misinformation and disinformation.
Disseminating DISinformation should be considered fraud, provided it can be proven beyond a reasonable doubt that the purveyors knew the information was false. Otherwise, any provably-false thing is misinformation.
It's noisy as hell in most sciences. It takes a loooong time for true consensus to be established, and much longer to challenge an existing consensus. So "provably false", while admirable, is not applicable to any of the topics the authorities are upset over, like vaccine efficacy/safety, climate change and so on. Much less to the appropriate reaction, like vaccine mandates or which energy sources we should invest in. Anyone who claims consensus is either lying or ill-informed.
Just throwing my 2c into the well, as someone who used to be highly "pro-science" but lost confidence in much of academia and the validity of scientific research in general after starting a PhD and seeing how the sausage is made.
The biggest problem science is facing is not an external threat from a rabble of ignorant science deniers, but the complete degradation of quality within the scientific institutions themselves.
Most research is flawed or useless, but published anyway because it's expedient for the authors to do so. Plenty of useful or interesting angles are not investigated at all because doing so would risk invalidating the expert status of the incumbents. Of the science that is communicated to the public (much of which is complete horseshit), the scientists themselves are often complicit in the misrepresentation of their own research as it means more of those sweet, sweet grants. The net result is the polished turd that is scientific research in the 21st century.
"Educated" people can say what they want about how important it is to believe in the science and have faith in our researchers and the institutions they work for.
The fact remains that if one were to assume that every single scientific discovery of the last 25 years was complete bullshit, they'd be right more often than they were wrong.
> Most research is flawed or useless, but published anyway because it's expedient for the authors to do so.
For the record I now know a professor at one of the premier institutions in the world, who is a total fraud. Their research was fraudulent in grad school. Their lab mates tried to raise concerns and nothing happened. That person graduated, with everyone on the committee knowing the issues. Then they got a premier post-doc position. People in their lab (who I caught up with at a conference) mentioned their work was terrible. Now they’re a professor at a top tier university.
Along the way, everyone knew and when people tried to bring up concerns higher ups in the institution suppressed the knowledge. Mostly because their fraudulent work was already cited tens of times.
This wasn’t directly in my field, but I saw it go down and followed it.
In my day job, I just throw out papers that don’t publish datasets and code. Most cs work is equally useless. It’s all a farce.
EDIT: I recommend the book for some insights - “Rigor Mortis: How Sloppy Science Creates Worthless Cures, Crushes Hope, and Wastes Billions by Richard Harris”
> I just throw out papers that don’t publish datasets and code
There was an interesting research paper that claimed you could raise your IQ with a computer game called dual n-back, and a lot of papers tried to replicate it, but my mind was blown when I realized none of them used the same code. None of them actually shared the code they used for their research, and yet they all claimed they were testing the same thing when they obviously weren't.
To me, refusing to share the source code to a program that could be used to replicate your research seems like a big middle finger to science itself. It shows a total disregard to study replication.
Exactly, it'd be like omitting the "methodology" section of the paper. There's no way to verify or prove wrong a paper if you don't know how they reached their conclusions.
Even more than sharing code, I think the key is to share raw data, postprocessed data and describe methods as exceptionally clear equations and pseudocode.
Some well-intentioned papers share code which is hard to run years afterwards. It's sometimes much simpler to reimplement things than to get the code to run, if they are described accurately.
Besides, some articles hide ugly things, nasty tricks and lies in the code, which make their results a lot less believable and valid. Being super upfront about models in terms of equations and pseudocode is important.
Of course, we should also have standards to make code reproducible. Perhaps depositing a VM with everything ready to run.
> Perhaps depositing a VM with everything ready to run.
The danger then is that the VM has so much undocumented complexity that if anything goes wrong, or goes "well" when it shouldn't, no one can explain why. Which also reintroduces a vector to hide nasty tricks.
This. The point of sharing reproducible steps, rather than the experiment itself, is that the work can be fully reproduced independently, not just that the results can be independently verified to show what the paper claims.
Results that haven't been independently replicated are suspect. There are just too many factors that can lead an experiment to give some results that are not transferrable or not relevant.
The worst aspect of this is the lack of will or funding to replicate, replicate, and replicate again all significant results that get published. Post-processed data can be altered, but a TB of raw data is meaningless as well if it hasn't been produced properly, has been obfuscated, or is weirdly formatted.
Data availability is a red herring for the vast majority of the science being done right now (almost everything that does not depend on a multi-million-dollar experiment). If data availability were an end in itself, we would just have moved the goalposts and have a data quality problem instead of a reproducibility problem.
> Perhaps depositing a VM with everything ready to run.
Yes, this is hugely important. We need clearer requirements for what constitutes properly 'published' data, code and methods. It should include all raw data (both used and discarded) as well as complete snapshots of each preliminary and intermediate processing step in a time-stamped chronology.
This is an area where the expertise and best practices of software development, documentation and source control could help inform standards and tooling which would dramatically improve the quality of scientific publishing and replication.
> Some well intentioned papers share code which is hard to run years afterwards.
Umm, not really? How many years are we talking about here? Because even COBOL is runnable. Sure, some OS-related quirks may need changing, but getting the raw source is much more likely to expose a subtle flaw in the researcher's method than the equations alone. For example, an incorrect negative check (comparing to a string vs. < 0) or an off-by-one error.
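A toy example (invented for this comment, not from any real paper) of the kind of subtle flaw that reading the source exposes but the published equation never would:

```python
def mean_buggy(xs):
    # The paper would say "mean = sum(x_i) / N", which is correct.
    # The code, however, has an off-by-one that silently drops the
    # last sample -- invisible unless the source is published.
    total = 0.0
    for i in range(len(xs) - 1):  # bug: should be range(len(xs))
        total += xs[i]
    return total / len(xs)

def mean_correct(xs):
    return sum(xs) / len(xs)
```

Here `mean_buggy([1, 2, 3, 4])` gives 1.5 instead of 2.5, and nothing in the methods section would ever hint at it.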
Don't think COBOL; think Python 2.1 (not 2.2) with this specific version of PIL, and some of the code implemented as a CPython module (for speed) in C that only compiles properly with one specific EGCS fork that adds support for IA-64. Parts of the code are written in slightly-buggy inline IA-64 assembly, but it's okay because after a system call, that particular operating system makes sure those registers are zeroed each loop iteration, if Python's line-buffering its `print`s so that the system call gets consistently run.
Also, the Python script crashes on loading the data files unless there are two (but not more than two) spaces in its full file path. This is not documented anywhere.
Yeah. I can easily run FORTRAN code from my PhD supervisor's PhD supervisor, written way back in the 1980s, but I cannot run some of the Python scripts written by a post-doc 10 years ago. It's a mess of abandoned libraries and things that work only with some specific versions of some libraries, and compiled modules that rely on the dodgy behaviour of an old compiler. Perl seems to be better, if only because people used to rely much less on external libraries.
But properly runnable code is not the solution either. I can count on the fingers of one hand the downloads of some of the codes we've published (easily compilable modern Fortran), and AFAIK nobody has ever published anything using them. Having a multitude of codes available does not mean much if nobody runs them, assuming they can even be compiled. And I would guarantee that none of the scientists who download these codes would be able to understand what they do in any detail.
Indeed, that's not proper science, too many moving parts. If it cannot be easily replicated by anyone at any time, it is just an experiment which would need more refinement to get published. No experiment should be accepted if it requires a Rube-Goldberg machine.
Sharing code and data can be as harmful to science as it is beneficial. You don’t want to reuse the same instruments that conducted the last experiment to validate it.
Independent replication is inherently expensive, but also critical to the field at large. Some sort of code vault that releases the code and data after a period could be a solid compromise.
Well, yes, but no. You want to avoid any systematic bias, which you run the risk of if you reuse tooling/instruments/code, but source code is also something which can be peer-reviewed.
Reproducing the code base from scratch just isn't going to be tractable for large scientific pieces of software (e.g. CERN).
Better to have the source code and dig through it carefully as part of the review and replication than insist everyone write their own cleanroom copy just to avoid systematic bias.
Suppose someone decided to build another LHC, but to save money they would use exactly the same design, construction crew, and code used to build the first one. Would you consider that a perfectly reasonable choice, or would it seem risky?
That said I am all for peer reviewing code, but that’s not where this ends up. People are going to reuse code they don’t bother to dig into because people are people. The obvious next step would be to reject any paper that’s reusing code from a different team, but that’s excessive.
That said reusing code or data is much safer than reusing code and data.
I’m going to be upfront and say I don’t understand your position. You seem to be making a number of questionable assumptions: 1) That people would blindly reuse code to the harm of science, including on a multi-billion-dollar project like the LHC. That’s not likely to ever happen.
2) That rejecting those who plagiarize others’ code in a peer-review process is somehow excessive or problematic.
I can’t begin to understand where you are coming from.
I'm not sure if I have exactly the same concerns as the person you're replying to, but I've definitely noticed problems coming from this form of "replication" through just re-running existing code. I don't think that means code shouldn't be released, but I think we should be wary of scientific code being reused verbatim too frequently.
If existing code is buggy or doesn't do what its authors think it does, this can be a big problem. Even if the idea is correct, whole swathes of downstream literature can get contaminated if authors didn't even attempt to implement an algorithm independently, but just blindly reused an existing docker image that they didn't understand. I consider that poor practice. If you claim that your work is based on Paper X, but what you really mean is that you just ran some Docker image associated with Paper X and trusted that it does what it says it does, instead of independently reimplementing the algorithm, that is not replication.
In older areas of science the norm is to try to replicate a paper from the published literature without reuse of "apparatus". For example, in chemistry you would try to replicate a paper using your own lab's glassware, different suppliers for chemicals, different technicians, etc. Analogies here are imperfect, but I would consider downloading and re-running an existing docker image to be dissimilar from that. It's more like re-running the experiment in the first lab rather than replicating the published result in a new lab. As a result it can, like re-running a chemistry experiment in the same lab, miss many possible confounds.
Of course, you do have to stop somewhere. Chemistry labs don't normally blow their own glass, and it's possible that the dominant way of making glass in your era turns out to be a confound (this kind of thing really does happen sometimes!). But imo, on the code side of things, "download and rerun a docker image" is too far towards the side of not even trying to replicate things independently.
For that reason a special multimedia sharing tool was created, which is now called the WWW, so the international physics community could share their code, papers and CAD designs for the LHC experiments. Quite a success for them, but the rest of academia is still resistant.
It's not reasonable, but for different reasons. Building a carbon copy of LHC would not add anything new in the way of science. A better example would be LIGO. Adding another gravity wave detector of the exact same spec but halfway around the world would be fantastic, because it increases our triangulation ability, and keeping the engineering, hardware, and software the same reduces cognitive load of the scientists running it. Yes that means any "bug" in the first is present in the second, but that also means you have a common baseline. In fact there will inevitably be discrepancies in implementation (no two engineering projects are identical, even with the same blueprint), and you can leverage that high degree of similarity to reduce the search space (so long as subsystems are sufficiently modular, and the software is a direct copy).
The original comment was with respect to some n-back training program. There's so many other potential places of bias in an experiment like that, that you'd be foolish not to start with the exact same program. If an independent team uses a different software stack, and can't replicate, was it the different procedure, software, subjects, or noise?
The first step in scientific replication is almost always, "can the experiment be replicated, or was it a fluke?" In this stage, you want to minimize any free variables.
It's a matter of (leaky) abstractions. If I'm running a chemistry replication, I don't need the exact same round bottom flask as the original experiment; RBFs are fungible. In fact I could probably scale to a different size RBF. However, depending on the chemistry involved, I probably don't want to run the reaction in a cylindrical reactor, at least not without running in the RBF first. That has a different heating profile, which could generate different impurities.
Likewise, I probably don't need the exact same make/model of chromatograph. However, I do want to use the same procedure for running the chromatograph.
Ideally, that would be a concern of the peer review. When I finished my undergraduate degree, I had to present a paper describing a program. I had to show that program working as part of my presentation. Anyone reading my "paper" and knowing I got a passing grade already knows the code I supplied works with the data I supplied, so I don't think it's that important for them to run it themselves.
Essentially, this would be like trying to reproduce a paper and starting by checking if their mathematical apparatus is correct. It's not useless, and it can help detect fraud or just plain bad peer review of course, but I wouldn't call that an attempt to reproduce their results per se.
It could be a nice quick sanity check, in the sense that if they've completely lied or you've completely misunderstood how to use the program, you won't get the same results. So it could tell you that you shouldn't even bother trying to replicate their claims. But there's a risk that people might mistake re-running the code for reproducing the findings of the paper.
The paper is entered into the scientific record, not the code, which will inevitably become obsolete (old language, old frameworks, implementation details tied to old hardware or operating systems, maybe some source code will be lost, github will go out of business). If the code is necessary, then crucial details have been left out of the paper, so it is not reproducible (although there are some journals that let you submit code as an artifact).
The risk here is that the original code contains a bug that is inadvertently reproduced by the replicating scientists after reading it. It can easily happen in some fields that some computation looks very plausible in code, but is actually incorrect.
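For a concrete (made-up) example of "looks plausible in code, but is actually incorrect": the textbook variance identity, translated literally into floating point:

```python
import statistics

def variance_naive(xs):
    # Textbook identity Var(x) = E[x^2] - E[x]^2, translated literally.
    # It looks plausible, but when the mean is large the two terms nearly
    # cancel and floating-point error dominates; it can even go negative.
    n = len(xs)
    return sum(x * x for x in xs) / n - (sum(xs) / n) ** 2

data = [1e9 + x for x in (0.0, 1.0, 2.0)]  # true variance is 2/3
naive = variance_naive(data)               # collapses to 0.0 here
stable = statistics.pvariance(data)        # numerically careful: ~0.6667
```

A replicator who re-derives the computation independently will catch this; one who faithfully transcribes the original code will reproduce the bug along with the result.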
"To me, refusing to share the source code to a program that could be used to replicate your research seems like a big middle finger to science itself."
I used to work in HPC (current PhD student; I switched specialties) and I was extremely surprised that not only did these people not share code (the DOE actually encourages open-sourcing code), but they would not even do so upon request. Several times I had to get my advisor to ask the author's advisor. Several times I found out why I couldn't replicate another person's work. It is amazing to me that anyone in CS does not share code. It is trivial to do, and we're all using git in some form or another anyway.
Sharing source code for an HEP experiment is not that easy, or sometimes even possible. A lot of the work is done by different groups, a whole shared framework is used, and a lot of raw and reconstructed data is involved. Even if the data were available, analyzing it would require a lot of people with a lot of resources. So even if it were all made available (which would be a lot of effort in itself), it wouldn't make much sense for replication purposes.
So what? Tough shit! It's still simply the only way to replicate or audit something.
If it's hard or tedious... well, so what? So is life. Most of science is exactly that rigor.
You might as well say no one else can grow a potato because of all the work your farm had to do all year to arrive at your crop. Yes, any other farmer will have to do all the same stuff. It's an utterly unremarkable observation.
How is it possible to peer review a paper when the only people qualified to do so are the ones involved in the research? Seems like a massive issue. Don't want to cast too much shade here, but a fair number of people have called out high energy physics for having problematic methodology. See Constructing Quarks by Andrew Pickering. Ultimately, it should be CERN's job to release the data, yes, even if it is terabytes in size, because that's the whole point of science.
>How is it possible to peer review a paper when the only people qualified to do so are the one's involved in the research. Seems like a massive issue.
Peer review isn't what it's chalked up to be. Basically it's two other people in the same field saying "There's no blatantly obvious issues with this paper" and even that isn't always guaranteed. Reviewers don't make any efforts to actually replicate the research, crunch the data, etc.
Re running the authors code with their data will likely just repeat any methodological issues they had (either accidentally or fraudulently).
In medical studies, one level is to reevaluate the data from the same set of patients to see whether any bias or errors crept in, perhaps with a different or newer statistical methodology. The best is for the study to be repeated with a fresh set of patient data to see if the underlying conclusions were valid.
Reading this, I am reminded of a quote by Edsger W. Dijkstra from 1975 [1]:
> In the good old days physicists repeated each other's experiments, just to be sure. Today they stick to FORTRAN, so that they can share each other's programs, bugs included.
"Re running the authors code with their data will likely just repeat any methodological issues they had"
Yes, but that's useful too.
COVID-19 lockdowns largely kicked off in the west due to an epidemiological model from Imperial College London, written over a period of many years by the now notorious Professor Neil Ferguson.
When his team uploaded a pre-print of his paper and started sending it to government ministers, the code for his model wasn't available. They spent months fighting FOIA requests by claiming they were about to release it, but just had to tidy things up a bit first. When the code was finally uploaded to GitHub the world discovered the reason for the delay: the model was a 15,000-line C trash fire of race conditions, floating-point accumulation errors and memory corruptions, in which basically every variable was in global scope and had a single-letter name. It was a textbook case of how not to write software. In fact it was a textbook case of why variable names matter, because one of the memory corruptions was caused by the author apparently losing track of what a variable named 'k' was meant to contain at that point in the code.
Not surprisingly, the model didn't work. Although it had command line flags to set the PRNG seeds these flags were useless: the model generated totally different output every time you ran it, even with fixed seeds. In fact hospital bed demand prediction changed from run to run by more than the size of the entire NHS Nightingale crash hospital building programme, purely due to bugs.
And as we now know the model was entirely wrong and the results were catastrophic. Lockdowns had no impact on mortality. There are many people who looked at the data and saw this but here's just the latest meta-analysis of published studies showing that to be true [1]. They destroyed the NHS which now has a cancer treatment backlog and pool of 'missing' patients so large that it cannot possibly catch up, meaning people will die waiting for treatment from the system they supported with their taxes for their entire lives. They destroyed the tax base, leaving the government with an unpayable debt that can be eliminated only via inflation meaning they will have soon destroyed people's savings too. It's just a catastrophe of incompetence and hubris.
The incorrectness of the model wasn't due only to programming bugs. The underlying biological assumptions were roughly GCSE level or a bit lower (GCSE is the exams you take at 15/16 in the UK), and it's quite evident that high school germ theory is woefully incomplete. In particular it has nothing to say on the topic of aerosol vs droplet transmission, which appears to be a critical error in the way these models are constructed.
Nonetheless, even if the assumptions were correct such a model should never have been used. Anyone outside the team who had access to the original code would have seen this problem immediately and could have sounded the alarm, but:
1. Nobody did have access.
2. ICL lied to the press by claiming the code had been published and peer reviewed years earlier (so where was it?)
3. Then when it was revealed the model wasn't reproducible, they teamed up with another academic at Cambridge and lied again by publishing a report+press release claiming it actually was reproducible and claims otherwise were misinformation.
4. And then the journal Nature and the BBC repeated these false claims of reproducibility.
All whilst anyone who looked at the GitHub issues list could see it filling up with determinism bugs. If you want citations for all these claims look here, at a summary report I wrote for a friendly MP [2].
So. It's good that the Royal Society is telling people not to engage in censorship, but their justifications for taking that stance reveal they're still living in lala land. By far the deadliest and most dangerous scientific misinformation throughout COVID has come from the formal institutions of science themselves. You could sum all the Substacks together and they would amount to 1% of the misinformation that has been published by government-backed "scientists", zero of whom have been banned from anything or received any penalty whatsoever. For as long as the scientific institutions are in denial about how utterly dishonest their culture has become we will continue to see a weird inversion in which random outsiders point out basic errors in their work and they respond by yelling "disinformation".
The meta-study you cite has quite bizarre conclusions. It basically states that isolation behavior is highly effective at preventing Covid deaths, but that lockdowns were a bad predictor of such behavior in people. But, they still seem to attribute huge economic impact to the lockdowns, not the pandemic and natural response (voluntary behavioral changes) to it.
Overall it seems that they would have been better served by adding lockdown compliance to their models, which would likely explain much of the difference. It's absurd to claim that lockdowns don't work when a country like Vietnam (100 million people, first detected cases of COVID-19 community spread outside China) had <100 total COVID-19 deaths in 2020. Strict targeted lockdowns, strict isolation requirements after every identified case, for contacts up to the third degree (a contact of someone who was a contact of someone who came in contact with a patient) - these all must have played a role, and any study that fails to explain such extreme success is simply flawed.
Are you sure it states that? Maybe you mean this paragraph:
"If this interpretation is true, what Björk et al. (2021) find is that information and signaling is far more important than the strictness of the lockdown. There may be other interpretations, but the point is that studies focusing on timing cannot differentiate between these interpretations. However, if lockdowns have a notable effect, we should see this effect regardless of the timing, and we should identify this effect more correctly by excluding studies that exclusively analyze timing."
"It's absurd to claim that lockdowns don't work when a country like Vietnam (100 million people, first detected cases of COVID-19 community spread outside China) had <100 total COVID-19 deaths in 2020"
Many countries claim an improbably low level of COVID deaths. That doesn't mean their numbers can be taken at face value, although poorer countries do seem to be less badly hit, simply because they have far fewer weak, obese and elderly people to begin with.
It's certainly not absurd to claim lockdowns don't work. You're attempting to refute a meta-analysis of studies looking at all the data with the example of a single country. That's meaningless. A single data point can falsify a theory but it cannot prove a theory. To prove lockdowns work there has to be a consistent impact of the policy.
It is completely absurd. "Lockdowns" is shorthand for "people not coming into contact with other people in a way that spreads the virus". Because the virus cannot teleport through walls, lockdowns necessarily prevent transmission.
This is the sort of mis-use of logic that has led to so many problems.
"Lockdowns" is shorthand for "people not coming into contact with other people in a way that spreads the virus"
It's shorthand for a set of government policies that were intended to reduce contact, not eliminate it, because "not coming into contact with other people" is impossible. People still have to go to shops, hospitals, care homes, live with each other, travel around and so on even during a lockdown.
Your belief that it's "completely absurd" to say lockdowns don't affect mortality is based on the kind of abstract but false reasoning that consistently leads epidemiologists astray. Consider a simple scenario in which lockdowns have no effect that's still compatible with germ theory - you're exposed to the virus normally 10 times per week every week. With lockdowns that drops to 5 times a week. It doesn't matter. Everyone is still exposed frequently enough that there will be no difference in outcomes.
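To spell out the arithmetic behind that scenario (toy numbers only, assuming independent exposures with a fixed per-exposure infection risk; `p_infected` is just an illustration, not an epidemiological model):

```python
def p_infected(per_exposure_risk, exposures_per_week, weeks):
    # Probability of at least one infecting exposure over the period,
    # assuming each exposure is independent with the same risk.
    return 1 - (1 - per_exposure_risk) ** (exposures_per_week * weeks)

# With a 5% per-exposure risk over 10 weeks, halving exposures from
# 10 to 5 per week lowers the cumulative risk, but both remain high:
print(p_infected(0.05, 10, 10))  # ~0.994
print(p_infected(0.05, 5, 10))   # ~0.923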
"Because the virus cannot teleport through walls, lockdowns necessarily prevent transmission"
Viruses can in fact teleport through walls: "The SARS virus that infected hundreds of people in a 33-story Hong Kong apartment tower probably spread in part by traveling through bathroom drainpipes, officials said yesterday in what would be a disturbing new confirmation of the microbe's versatility."
you're exposed to the virus normally 10 times per week every week. With lockdowns that drops to 5 times a week.
In this scenario, lockdowns work fine, but are insufficient! Perhaps with stringent use of N95 masks during contactless food ration delivery it can be reduced to 0.05 times per week. Then the pandemic ends and we go back to normal.
Also, if you only go out half as often, that's not exactly a lockdown either. A lockdown is people not coming into contact with other people in a way that spreads the virus. It's not people still often coming into contact with other people in a way that spreads the virus but not as often as before.
You don't appear to realize that food gets to your front door via a large and complex supply chain that involves many people doing things physically together at almost every point. Your definition of a lockdown is physically impossible even for cave men to sustain, let alone an advanced civilization. It isn't merely a matter of "works but insufficient".
This kind of completely irrational reasoning is exactly why lockdowns are now discredited. The fact that the response to "here's lots of data showing that lockdowns didn't work" is to demand an impossible level of lockdown that would kill far more people than COVID ever could simply through supply chain collapse alone, really does say it all.
> A lockdown is people not coming into contact with other people in a way that spreads the virus. It's not people still often coming into contact with other people in a way that spreads the virus but not as often as before.
That might be a definition issue. Here in Germany, we had several lockdowns. Many (most?) countries would say we had no lockdown at all.
The question isn’t whether lockdowns with compliance work, the question is how effective is lockdown as a policy. If you create a lockdown policy, what is the impact of that?
> First, people respond to dangers outside their door. When a pandemic rages, people believe in social distancing regardless of what the government mandates. So, we believe that Allen (2021) is right, when he concludes, “The ineffectiveness [of lockdowns] stemmed from individual changes in behavior: either non-compliance or behavior that mimicked lockdowns.”
> Third, even if lockdowns are successful in initially reducing the spread of COVID-19, the behavioral response may counteract the effect completely, as people respond to the lower risk by changing behavior.
Since it is an obvious consequence of the germ theory of disease that isolation stops the spread of disease, the only real question is if lockdowns are efficient at enforcing isolation (and in what conditions, with what costs etc). At best, the paper concludes that they are not, and that government propaganda about the risks of the disease works as well or better.
> You're attempting to refute a meta-analysis of studies looking at all the data with the example of a single country. That's meaningless. A single data point can falsify a theory but it cannot prove a theory. To prove lockdowns work there has to be a consistent impact of the policy.
I would say the null hypothesis would be that lockdowns do work, by relatively simple, almost mechanical, principles (lockdown forces isolation, isolation means the virus can't physically get from one person to another).
And there is basically no country that did well in the pandemic without significant isolation. Whether that isolation was caused by effective, if perhaps draconian, lockdowns (such as in China or Vietnam) or by cultural norms and self-preservation (such as in Finland or Norway), this remains true.
The real exceptions are countries that successfully isolated themselves from the outside world, and thereafter isolated only those carrying the virus, through border closures and strict quarantine requirements plus testing - Taiwan and New Zealand are examples of such.
For this type of response, using broad statistical studies that equate very different ground-level phenomena (lockdowns varied wildly in form and in the degree of compliance, but the paper ignores all that in its quantitative data) to make it look good on a graph is mostly misleading - but what more can you expect from three economists dabbling in epidemiology and sociology?
"The model generated totally different output every time you ran it, even with fixed seeds." - I remember seeing code takedowns of the model from anti-lockdown people who repeatedly cite this issue.
But there is a valid reason for this to happen, and it doesn't mean bugs in the code. If the code is run in a distributed way (multiple threads, processes or machines), which it was, the order of execution is never guaranteed. So even setting the seed will produce a different set of results if the outcomes of each separate instance depend on each other further in the computation.
There are ways to mitigate this, depending on the situation and the amount of slowdown that's acceptable. Since this model was collecting outcomes to create a statistical distribution, rather than a single deterministic number, it didn't need to.
That the model was drawing from distributions is also why different runs could produce vastly different results: individual results may land at different ends of a distribution. It's the distributions that are sampled and used, not single numbers.
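A quick Python illustration of why a fixed seed alone can't guarantee bit-identical output once execution order varies: floating-point addition isn't associative, so the order of a parallel reduction changes the result.

```python
import random

# Floating-point addition is not associative:
a = (0.1 + 0.2) + 0.3
b = 0.1 + (0.2 + 0.3)
print(a == b)  # False: 0.6000000000000001 vs 0.6

# So if parallel workers accumulate partial results in whatever order they
# finish, fixing the PRNG seed alone cannot make the total bit-identical.
random.seed(0)
values = [random.uniform(-1e6, 1e6) for _ in range(10_000)]
forward = sum(values)
backward = sum(reversed(values))
# The two orders agree closely, but usually not to the last bit.
```

Whether that last-bit divergence stays a harmless rounding wobble or balloons into wildly different outputs depends entirely on how the model is written.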
Regarding the GCSE level comment, my concern was the opposite, that the model was trying to model too much, and that inaccuracies would build up. No model is perfect (including this one) and the more assumptions made the larger the room for error. But they validated the model with some simpler models as a sanity check.
My view on criticisms of the model were that they were more politically motivated, and the code takedowns were done by people who may have been good coders, but didn't know enough about statistical modeling.
> "The model generated totally different output every time you ran it, even with fixed seeds." - I remember seeing code takedowns of the model from anti-lockdown people who repeatedly cite this issue.
But there is a valid reason for this to happen, and it doesn't mean bugs in the code. If the code is run in a distributed way (multiple threads, processes or machines), which it was, the order of execution is never guaranteed.
Then there's literally no point to using PRNG seeding. The whole point of PRNG seeding is so you can define some model in terms of "def model(inputs, state) -> output", and get the "same" output for the same input. I put "same" in quotes because defining sameness on FP hardware is challenging. But usually 0.001% relative tolerance is sufficiently "same" to account for FP implementation weirdness.
If you can't do that, then your model is not a pure function, in which case setting the seed is pointless at best, and biasing/false sense of security in the worst case.
As you mention, non-pure models have their place, but reproducing their results is very challenging, and requires generating distributions with error bars - you essentially "zoom out" until you are a pure function again, with respect to aggregate statistics.
It does not sound like this model was "zoomed out" enough to provide adequate confidence intervals such that you could run the simulation, and statistically guarantee you'd get a result in-bounds.
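A minimal sketch of that pure-function shape (toy model, invented names): keep the RNG local, so the output depends only on the inputs and the seed, and compare reruns with a relative tolerance.

```python
import math
import random

def model(n_trials, seed):
    # Toy "model": a pure function of (inputs, seed).
    rng = random.Random(seed)  # local RNG, no hidden global state
    return sum(rng.random() for _ in range(n_trials)) / n_trials

run1 = model(10_000, seed=123)
run2 = model(10_000, seed=123)

# Same inputs + same seed => identical output on one machine; across
# FP hardware or parallel schedules, a relative tolerance is the honest check.
assert run1 == run2
assert math.isclose(run1, run2, rel_tol=1e-5)
```

The key design choice is that nothing outside the function's arguments influences the result, which is exactly what global mutable state and race conditions destroy.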
I reckon the PRNG seeding in such a case might be used during development/testing.
So, run the code with a seed in a non distributed way (e.g. in R turn off all parallelism), and then the results should be the same in every run.
Then once this test output is validated, depending on the nature of the model, it can be run in parallel, and guarantees of deterministic behaviour will go, but that's ok.
I didn't develop the model, so can't really say anything in depth beyond the published materials.
I just found it odd at the time how this specific detail was incorrectly used by some to claim the model was broken/irredeemably buggy.
Edit: Actually, in general, perhaps there's one other situation where the seed might be useful, assuming you have used a seed in the first place. Depending on the distributed environment, there's no guarantee that the processes or random number draws will be run in the same order. But it might be that in most cases they're in the same order. This might bias the distribution of the samples you take. So you might want to change the seed on every run to protect yourself from such nasty phantom effects.
My understanding is the bugginess is due to *unintended* nondeterminism, in other words, things like a race condition where two threads write some result to the same memory address, or singularities/epsilon error in floating point calculations leading to diverging results.
Make no bones about it, these are programming faults. There's no reason why distributed, long-running models can't produce convergent results with a high degree of determinism given the input state. But this takes some amount of care and attention.
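One mechanism behind the floating-point divergence described above can be shown directly: FP addition is not associative, so a parallel reduction that combines partial sums in a different order each run can produce different results. A contrived but real example:

```python
# Floating-point addition is not associative: the same three numbers
# summed in two different orders give different results.
vals = [1e16, 1.0, -1e16]

left_to_right = (vals[0] + vals[1]) + vals[2]  # the 1.0 is rounded away
reordered = (vals[0] + vals[2]) + vals[1]      # the big terms cancel first

print(left_to_right, reordered)  # 0.0 1.0
```

In a real simulation the discrepancy is usually confined to low-order bits, but in chaotic or threshold-driven models even that can snowball into visibly different outputs.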
> So you might want to change the seed on every run to protect yourself from such nasty phantom effects.
That's a perfect example of what I mean where a seed is actually worse. If you know you can't control determinism, then you might as well go for the opposite: ensure your randomness is high quality enough such that it approximates a perfect uniform distribution. Adding a seed here means you are less likely to capture the true distribution of the output.
The other takedown reviews focused on the fact that there was non-determinism despite a seed, without understanding that's not necessarily a problem.
Agreed on the second point about not having a seed, but I added the "assuming you have used a seed" caveat because sometimes people do use the seed for some reproducible execution modes (even multi-thread/process ones), which are fine, and it's just easier to randomly vary the seed generation rather than remove it altogether when running in a non-deterministic mode.
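The "vary the seed, but keep the machinery" pattern can be sketched as follows (a hypothetical setup, not code from the model): draw a fresh seed each run and log it, so any individual run can still be replayed.

```python
import random
import secrets

# Draw a fresh seed per run instead of deleting seeding altogether,
# and record it so the run can be replayed later.
seed = secrets.randbits(64)
print(f"run seed: {seed}")  # log this alongside the run's outputs

rng = random.Random(seed)
sample = [rng.random() for _ in range(3)]

# Replaying with the logged seed reproduces the draws exactly:
rng2 = random.Random(seed)
replay = [rng2.random() for _ in range(3)]
assert replay == sample
```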
"There are ways to mitigate this, but since the model was collecting outcomes to create a statistical distribution, rather than a single deterministic number, it didn't need to."
This is the justification the academics used - because we're modelling probability distributions, bugs don't matter. Sorry, but no, this is 100% wrong. Doing statistics is not a get-out-of-jail-free card for arbitrary levels of bugginess.
Firstly, the program wasn't generating a probability distribution as you claim. It produced a single set of numbers on each run. To the extent the team generated confidence intervals at all (which for Report 9 I don't think they did), it was by running the app several times and then claiming the variance in the results represented the underlying uncertainty of the data, when in reality it was representing their inability to write code properly.
Secondly, remember that this model was being used to drive policy. How many hospitals shall we build? If you run the model and it says 10, and then someone makes the graph formatting more helpful, reruns it and now it says 4, that's a massive real-world difference. Nobody outside of academia thinks it's acceptable to just shrug and say, well it's just probability so it's OK for the answers to wildly thrash around like that.
Thirdly, such bugs make unit testing of your code impossible. You can't prove the correctness of a sub-calculation because it's incorporating kernel scheduling decisions into the output. Sure enough Ferguson's model had no functioning tests. If it did, they might have been able to detect all the non-threading related bugs.
Finally, this "justification" breeds a culture of irresponsibility and it's exactly that endemic culture that's destroying people's confidence in science. You can easily write mathematical software that's correctly reproducible. They weren't able to do it due to a lack of care and competence. Once someone gave them this wafer thin intellectual sounding argument for why scientific reproducibility doesn't matter they started blowing off all types of bugs with your argument, including bugs like out of bounds array reads. This culture is widespread - I've talked to other programmers who worked in epidemiology and they told me about things like pointers being accidentally used in place of dereferenced values in calculations. That model had been used to support hundreds of papers. When the bugs were pointed out, the researchers lied and claimed that in only 20 minutes they'd checked all the results and the bugs had no impact on any of them.
Once a team goes down the route of "our bugs are just CIs on a probability distribution" they have lost the plot and their work deserves to be classed as dangerous misinformation.
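The unit-testing point above is worth making concrete. Once a sub-calculation is deterministic given its inputs and seed, it becomes testable; the function below is invented for illustration, not taken from the actual model.

```python
import random

# Hypothetical sub-calculation of an epidemic model, made deterministic
# by threading an explicit seed through it.
def daily_infections(susceptible, rate, seed):
    rng = random.Random(seed)
    return sum(1 for _ in range(susceptible) if rng.random() < rate)

def test_deterministic():
    # Same inputs + same seed -> same output, so a regression test can
    # pin a golden value once it has been validated.
    r1 = daily_infections(1000, 0.1, seed=7)
    r2 = daily_infections(1000, 0.1, seed=7)
    assert r1 == r2
    assert 0 <= r1 <= 1000

test_deterministic()
```

A model that mixes kernel scheduling decisions into its outputs makes this kind of test impossible to write, which is the point being argued.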
"My view on criticisms of the model were that it was more politically motivated"
Are you an academic? Because that's exactly the take they like to always use - any criticism of academia is "political" or "ideological". But expecting academics to produce work that isn't filled with bugs isn't politically motivated. It's basic stuff. For as long as people defend this obvious incompetence, people's trust in science will correctly continue to plummet.
If you check the commit history, you'll see that he quite obviously didn't work with the code much at all. Regardless, if he thinks the model is not worthless, he's wrong. Anyone who reviews their bug tracker can see that immediately.
> To the extent the team generated confidence intervals at all (which for Report 9 I don't think they did), it was by running the app several times and then claiming the variance in the results represented the underlying uncertainty of the data, when in reality it was representing their inability to write code properly.
Functionally, what's the difference? The output of their model varied based on environmental factors (how the OS chose to schedule things). The lower-order bits of some of the values got corrupted, due to floating-point errors. In essence, their model had noise, bias, and lower precision than a floating point number – all things that scientists are used to.
Scientists are used to some level of unavoidable noise from experiments done on the natural world because the natural world is not fully controllable. Thus they are expected to work hard to minimize the uncertainty in their measurements, then characterize what's left and take that into account in their calculations.
They are not expected to make beginner level mistakes when solving simple mathematical equations. Avoidable errors introduced by doing their maths wrong is fundamentally different to unavoidable measurement uncertainty. The whole point of doing simulations in silico is to avoid the problems of the natural world and give you a fully controllable and precisely measurable environment, in which you can re-run the simulation whilst altering only a single variable. That's the justification for creating these sorts of models in the first place!
Perhaps you think the errors were small. The errors in their model due to their bugs were of the same order of magnitude as the predictions themselves. They knew this but presented the outputs to the government as "science" anyway, then systematically attacked the character and motives of anyone who pointed out they were making mistakes. Every single member of that team should have been fired years ago, yet instead what happened is the attitude you're displaying here: a widespread argument that scientists shouldn't be or can't be held to the quality standards we expect of a $10 video game.
How can anyone trust the output of "science" when this attitude is so widespread? We wouldn't accept this kind of argument from people in any other field.
At the time, critics of the model were claiming the model was buggy because multiple runs would produce different results. My comment above explains why that is not evidence for the model being buggy.
Report 9 talks about parameters being modeled as probability distributions, i.e. it's a stochastic model. I doubt they would draw conclusions from a single run, as the code is drawing a single sample from a probability distribution. And, if you look at the paper describing the original model (cited in Report 9), they do test the model with multiple runs. On top of that they perform sensitivity analyses to check erroneous assumptions aren't driving the model.
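The multiple-runs approach described above amounts to re-running the simulation with different seeds and summarising the resulting distribution. A hedged sketch, where the "model" is a stand-in Gaussian rather than the real simulation:

```python
import random
import statistics

# Stand-in for one run of a stochastic simulation; the real model would
# be far more expensive per run.
def simulate(seed):
    rng = random.Random(seed)
    return 100.0 + rng.gauss(0, 10)  # e.g. some projected outcome

# Many runs, one seed each, then summarise the ensemble:
runs = [simulate(seed) for seed in range(100)]
mean = statistics.mean(runs)
half = 1.96 * statistics.stdev(runs) / len(runs) ** 0.5
print(f"{mean:.1f} (95% CI {mean - half:.1f} to {mean + half:.1f})")
```

Whether the real model was run enough times for such an interval to be meaningful is exactly what's disputed in this thread.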
I have spent time in academia, but I'm not an academic, and don't feel any obligation to fly the flag for academia.
Regarding the politics, contrast how the people who forensically examined Ferguson's papers were so ready to accept the competing (and clearly incorrect https://www.youtube.com/watch?v=DKh6kJ-RSMI) results from Sunetra Gupta's group.
Fair point about academic code being messy. It's a big issue, but the incentives are not there at the moment to write quality code. I assume you're a programmer - if you wanted to be the change you want to see, you could join an academic group, reduce your salary by 3x-4x, and be in a place where what you do is not a priority.
Your comment above is wrong. Sorry, let me try to explain again. Let's put the whole fact that random bugs != stochastic modelling to one side. I don't quite understand why this is so hard to understand but, let's shelve it for a moment.
ICL likes to claim their model is stochastic. Unfortunately that's just one of many things they said that turned out to be untrue.
The Ferguson model isn't stochastic. They claim it is because they don't understand modelling or programming. It's actually an ordinary agent-like simulation of the type you'd find in any city builder video game, and thus each time you run it you get exactly one set of outputs, not a probability distribution. They think it's "stochastic" because you can specify different PRNG seeds on the command line.
If they ran it many times with different PRNG seeds, then this would at least quantify the effect of randomness on their simulation. But, they never did. How do we know this? Several pieces of evidence:
2. The program is so slow that it takes a day to do even a single run of the scenarios in Report 9. To determine CIs for something like this you'd want hundreds of runs at least. You could try and do them all in parallel on a large compute cluster, however, ICL never did that. As far as I understand their original program only ran on a single Windows box they had in their lab - it wasn't really portable and indeed its results change even in single-threaded mode between machines, due to compiler optimizations changing the output depending on whether AVX is available.
3. The "code check" document that falsely claims the model is replicable, states explicitly that "These results are the average of NR=10 runs, rather than just one simulation as used in Report 9."
So, their own collaborators confirmed that they never ran it more than once, and each run produces exactly one line on a graph. Therefore even if you accept the entirely ridiculous argument that it's OK to produce corrupted output if you take the average of multiple runs (it isn't!), they didn't do it anyway.
Finally, as one of the people who forensically examined Ferguson's work, I never accepted Gupta's either (not that this is in any way relevant). She did at least present CIs but they were so wide they boiled down to "we don't know", which seems to be a common failure mode in epidemiology - CIs are presented without being interpreted, such that you can get values like "42% (95% CI 6%-87%)" appearing in papers.
I took a look at point 3, and that extract from the code check is correct. Assuming they did just one realisation, I was curious why - it would be unlikely to be an oversight.
"Numbers of realisations & computational resources: It is essential to undertake sufficient realisation to ensure ensemble behaviour of a stochastic [model] is well characterised for any one set of parameter values. For our past work which examined extinction probabilities, this necessitates very large numbers of model realizations being generated. In the current work, only the timing of the initial introduction of virus into a country is potentially highly variable – once case incidence reaches a few hundred cases per day, dynamics are much closer to deterministic."
So looks like they did consider the issue, and the number of realisations needed is dependent on the variable of interest in the model. The code check appears to back their justification up,
"Small variations (mostly under 5%) in the numbers were observed between Report 9 and our runs."
The code check shows in their data tables that some variations were 10% or even 25% from the values in Report 9. These are not "small variations", nor would it matter even if they were because it is not OK to present bugs as unimportant measurement noise.
The team's claim that you only need to run it once because the variability was well characterized in the past is also nonsense. They were constantly changing the model. Even if they thought they understood the variance in the output in the past (which they didn't), it was invalidated the moment they changed the model to reflect new data and ideas.
Look, you're trying to justify this without seeming to realize that this is Hacker News. It's a site read mostly by programmers. This team demanded and got incredibly destructive policies on the back of this model, which is garbage. It's the sort of code quality that got Toyota found guilty in court of severe negligence. The fact that academics apparently struggle to understand how serious this is, is by far a faster and better creator of anti-science narratives than anything any blogger could ever write.
I looked at the code check. The one 25% difference is in an intermediate variable (peak beds). The two differences of 10% are 39k deaths vs 43k deaths, and 100k deaths vs 110k deaths. The other differences are less than 5%. I can see why the author of the code check would reach the conclusion he did.
I have given a possible explanation for the variation, that doesn't require buggy code, in my previous comments.
An alternative hypothesis is that it's bug driven, but very competent people (including eminent programmers like John Carmack) seem to have vouched for it on that front. I'd say this puts a high burden of proof on detractors.
He is unfortunately quite wrong, see below. I don't believe he could have reviewed the code in any depth because the bugs are both extremely serious and entirely objective - they're just ordinary C type programming errors, not issues with assumptions.
Also food and other required resources are similarly unable to teleport through walls, so the people involved in growing, transporting, preparing and delivering them to your door can't do "lockdown" like the minority who are able to work from home.
I have been adjacent to industrial and academic partnerships where both the university and company wanted to maintain the IP of the work and in their minds that extended to the software. Paper was published and the source code was closely guarded. I wondered how people could replicate the findings without the fairly complex system the researchers used.
> Along the way, everyone knew and when people tried to bring up concerns higher ups in the institution suppressed the knowledge.
As you are doing here. I don't see a name or any specifics anywhere in your comment. Of course not including any names or specifics also means you could be making it up for a good story. We have no way of knowing.
Presumably OP does not have hard evidence and is only relating an anecdote. There isn’t much of an incentive to waste time and resources fighting for academic integrity or some other high-minded concept like this. Barring those incentives we’ll be left with anecdotes on forums and a continued sense of diminished trust in academia.
OP sounded very confident that the persons allegedly involved _are_, in no uncertain terms, not just definitely total frauds but also definitely engaged in a giant fraud conspiracy that definitely goes all the way to the top. If what they meant was "someone told me once that..." they could have said that instead, but they've chosen to word things very very differently. At best they've drastically overstepped reasonable limits of what claims one is able to rightly make, and that assessment feels extremely generous.
Yes. Exactly. Thank you for sharing what I was going to share. Corruption exists where it is allowed by the people who act out of cowardice.
As an aside, I've worked with plenty of academics and while I sometimes thought their research area was stupidly low stakes, the only researchers that I thought were truly wasting time were the ones that had to do research for a medical degree. Basically forced research.
Now I went to a premier university and I'm friends with some smart cookies, but I don't buy for a second the overall theme of this comment chain. There is a reason the West is incredibly wealthy and it isn't because our best and brightest are faking it.
There is a lot less fakery in science than poorly-designed studies, misleading endpoints, underdocumented or incorrectly documented methods, and cargo-culting. The success of the process comes from having a good filtration process to sift through this body of work, and the idea that there will always be some people in the system doing actually good work.
That said, I have also witnessed plenty of low-level fraud: changing of dates to match documentation, discarding "outlier" samples without justification or even documentation, etc. Definitely enough to totally invalidate a result in some cases.
Your first comment was effectively "Everyone knows this person is a fraud and no one is willing to stand up and put a stop to it".
Your second comment was effectively "Well I don't know they are a fraud and I'm not going to be the one who upsets people by trying to put a stop to it".
I don't say this as a criticism of you. I say this as a defense of the people you are criticizing. Stopping people like this takes a lot of work and often some personal risk. Most of us aren't willing to do it despite us pretending otherwise.
>Stopping people like this takes a lot of work and often some personal risk.
I don't recall where I first heard it, but the principle that it takes 10x the amount of energy to refute bullshit as it does to generate it certainly seems to apply in this case.
A post-doc of a collaborator of a professor of mine was once found to have doctored data. The professor and her collabs only found out when they looked at the images and found out that there were sections that seemed to have been copy/pasted. Getting the papers retracted was a gruelling process that (iirc) took over 3 years.
>The people in charge were given evidence, face much less risk, and it's their job to put a stop to it.
That is an awfully authoritative statement to make based off what OP shared.
But either way, this has been proven time and time again whether we are talking about simple corruption like OP mentioned or more serious forms of injustice. People will judge themselves based off motivations and others based off actions. We can excuse ourselves because of the personal inconvenience that acting will cause us. But other people don't get that luxury. They get criticized purely based off that lack of action, because if we were in their situation surely we would do the right thing. Being honest about this is an important step to actually fixing the system, because we need to identify the motivations which lead to the lack of action in order to remove them and encourage action. Simply vilifying people for not acting accomplishes nothing.
> That is an awfully authoritative statement to make based off what OP shared.
Which part do you disagree with? Maybe the 'much' on much less risk?
'given evidence' is true unless OP made up the story. (And even if OP made up the story then we're judging fictional people and the judgements are still valid.)
I think 'less risk' goes part and parcel with being administration rather than someone lower rank reporting a problem.
And it seems clear to me that it's their job.
So, criticism. Which is not a particularly harsh outcome. And doesn't necessarily mean they made the wrong decision, but them providing a justification would be a good start.
Inaction is often excusable when it's not your job. It's always important to look at motivations, but vilification can also be appropriate when there's dereliction. Sometimes there are no significant motivations leading to lack of action, there's just apathy and people that shouldn't have been hired to the position.
>'given evidence' is true unless OP made op the story.
OP never mentioned evidence. People just "knew" this person was a fraud but there was no mention of the actual evidence of fraud. The closest thing to evidence is that "their work was terrible" but that isn't evidence of fraud.
>I think 'less risk' goes part and parcel with being administration rather than someone lower rank reporting a problem.
We have no idea the risks involved. It would be highly embarrassing for a prestigious school and/or instructor to admit that they admitted a fraud into their program. Maybe this isn't even the first time this has happened to these people. Would you want to be the person known for repeatedly being duped by frauds? Maybe that would ruin your reputation more than looking the other way and letting this person fail somewhere else, where you would not be directly tied to their downfall. It is also incredibly risky to punish this person without hard evidence, as that can lead to a lawsuit.
These are not meant to be definitive statements. They are just hypotheticals that show how we can't judge people's motivations without knowing a lot more about the situation.
This website is pretty easy to post on anonymously. It takes five seconds to make a throwaway and another three to post a name.
While I agree that there are frauds out there in academia and elsewhere, I have no reason to believe that you’re not yourself some sort of fraud. You’ve essentially posted “I have direct knowledge of [unspecified academic fraud in which somebody claimed to have direct knowledge of [unspecified academic fraudulent conclusion] but can’t back it up] but won’t back it up”
Your overall point is… what? Fraud perpetuated by cowardice? Self interest? An overall sense of apathy towards the truth? Your comment could be construed as any of those.
There are people that love to spread fear, uncertainty and doubt without having to rely on being truthful. People that are intentionally misleading with the sole intent of leveraging people’s biases and emotions to confirm folks' notions and whip people up into an artificially-created frenzy make statements like yours.
Serious questions:
1. Do you actually give a shit about this big fraud you’ve brought up but not revealed?
And
2. Why did you post?
> 1. Do you actually give a shit about this big fraud you’ve brought up but not revealed?
To expose it directly here would likely have little effect generally, but would have an outsized effect personally (damaging trust). If it did have an impact it would likely expose those involved (many careers). I am not in the command chain, I have seen the evidence and it's overwhelming. But I'm not in a position to enact the requisite change.
That said, I don't actually give a shit about this particular case. It's widespread, insanely widespread. Most studies / work cannot be replicated.
> 2. Why did you post?
As an anecdote to highlight something that I've seen. I also linked to several other informational pieces with public accounts (so don't trust mine, fine -- trust theirs).
> To expose it directly here would likely have little effect generally, but would have an outsized effect personally (damaging trust). If it did have an impact it would likely expose those involved (many careers). I am not in the command chain, I have seen the evidence and it's overwhelming. But I'm not in a position to enact the requisite change.
Sorry, I don’t mean to pick at you but… what?
If you were to anonymously post the name of an academic fraud, you personally would necessarily be found out, and it would ruin multiple careers?
The powers that be know who you are, know of the fraud and its nature and those involved? You’ve been privy to this big juicy secret that’s shared by many, but if its content were to be revealed, you would certainly be the one pointed out?
Are you the only person that could reasonably know about and publicly object to this fraud? If so, is it a necessary function of your social or professional life to cover up this fraud? If so, I’ll go back to “why did you post?” (“Sharing anecdotes” isn’t really an answer to “Why did you share this anecdote?”)
Not sure why people are that hostile. I'm sure that identifying someone can also identify you if the circle around the person is small enough, and if said person is powerful enough, people choosing to believe him over you could end careers, yes. He'd have to say "well, I worked on this particular study with him, and the data was made up vs actually collected."
I think it's more "how dare you hint scientists are corrupt!" that kind of drives the outrage.
> This website is pretty easy to post on anonymously.
Correct me if I am wrong, but this website, and the owners are in U.S. of A. If so, an appropriate court order would force the disclosure of logs, IP addresses, accounts, etc. There are plenty of examples where sites had to disclose sufficient details to track an 'anonymous' writer down.
With multi-million (billion?) dollar endowments on the line, I would be worried to presume just a throw away account would work.
VPNs exist and are trivial to use. As is the Tor Browser. Yes it's probably compromised already, but the FBI is not going to show their hand chasing a random libel case.
People on HN are so odd sometimes. Ah yes, I must be a liar if I don't want to spill all the details my friends told me in confidence and ruin our relationship.
Sure, sometimes people lie on the internet but this is not an outlandish story.
“Best case” scenario everyone loses their funding, including the colleagues who did report this stuff. The institution, the lab, etc would lose their funding. Those wanted to get PhDs in those labs will lose theirs. It’ll have a serious negative impact on everyone, even those who did the correct thing.
This is why academia is as corrupt as it is. They all evaluate each other’s paper, give each other grants and anyone who exposes anything loses their entire careers.
The two parent comments read to me (uninvolved) as wholly throwing all people, institutions, grant selection and everything else under the bus without distinction. The drivers for that are unknowable, but I guarantee that neither these statements, nor their complete complement, are Truth. Proof is that without sufficient distinction, nothing claimed is distinguishable enough to even weigh. Add that the selection pressure in many institutions is many orders of magnitude more than ideal, and the consequences of the outcomes, similar.
I don't know, which puts me exactly where I was ten minutes ago. They have no control over the situation, and if they hadn't posted I would know even less, so it's not suppression.
They have complete control over how complicit they are in preserving the coverup of acts they allege to know are definitely true.
> and if they hadn't posted I would know even less
The reality is that right now you know exactly as much as before they posted because what they posted was unsubstantiated. They may as well have said that their name is Princess Peach and that they're pregnant with the mustached child of a certain Italian plumber for all the good it does you.
One of three things must be true:
1) They have real firsthand knowledge that the claims are true and they're actively deciding to protect the identity(ies) of a conspiracy of rampant fraudsters whose actions are so egregious that they tarnish the very essence of the scientific academy itself.
2) They don't have any real knowledge that the claims are true, and the story is rumormongering.
3) I swear I had a third one, but now that I've written those two I can't think of what it was. I'll leave this placeholder here in case it comes to me.
Of course, there are frauds in any industry/profession.
But in my experience (math, science, & engineering), it is actually far less prevalent than in other places.
Forget the overzealous #sciencetwitter people. I have found that academia is one of the rare places where people at the absolute top of their field are often actually modest & aware of their ignorance about most things.
>I have found that academia is one of the rare places where people at the absolute top of their field are often actually modest & aware of their ignorance about most things.
This view doesn't really align with how public policy gets designed - while those people exist, they aren't the problem (or are only insofar as they aren't voicing their positions loudly enough).
Coming from a place of modesty and acknowledging limitations is not the "believe science" movement.
Eric seems lightly inclined to fringe theories and self-importance, but nothing I'd call fraud. Bret has been pushing some pretty unfortunate stuff though, including prophylactic ivermectin as a superior alternative to vaccination:
> “I am unvaccinated, but I am on prophylactic ivermectin,” Weinstein said on his podcast in June. “And the data—shocking as this will be to some people—suggest that prophylactic ivermectin is something like 100% effective at preventing people from contracting COVID when taken properly.”
He wasn't just claiming that ivermectin might have some efficacy against SARS-CoV-2 (possible, though I doubt it), or that the risks of the vaccine were understated to the public (basically true; but it's a great tradeoff for adults, and probably still the right bet for children). Bret was clearly implying that for many people--including himself, and he's not young--the risk/benefit for prophylactic ivermectin was more favorable than for the vaccine. There was no reasonable basis for such a belief, and the harm to those who declined vaccination based on such beliefs has become obvious in the relative death rates.
The first article I've linked above is by Yuri Deigin, who had appeared earlier on Bret's show to discuss the possibility that SARS-CoV-2 arose unnaturally, from an accident in virological research. This was back when that was a conspiracy theory that could get you banned from Facebook, long before mainstream scientists and reporters discussed that as a reasonable (but unproven) hypothesis like now. So I don't think Bret's services as a contrarian are entirely bad, but they're pretty far from entirely good.
What's fraudulent about them is not their papers (there are none of any relevance to speak of) but their character. They are both self proclaimed misunderstood geniuses who have been denied Nobel prizes in spite of their revolutionary discoveries (in 3 different fields, Physics, Economics and Evolutionary Biology). In actuality they are narcissistic master charlatans with delusions of grandeur.
Then the comment by ummonk above is off topic because there is no credible claim of scientific fraud. Lots of people are blowhards with odd opinions. So what?
And the ones that are masquerading as having Nobel-worthy research chops to get the audience to believe their gripes about the scientific establishment are on topic enough.
It seems you don't like them for some reason, but complaining about the scientific establishment isn't fraud and has nothing to do with censorship. So what's your point?
One claims to have discovered a Theory Of Everything, putting forth a paper riddled with mathematical errors and carrying the caveat that it is a "work of entertainment". The other claims to have made a Nobel-worthy discovery that revolutionizes evolutionary biology.
I wouldn't use the word 'charlatan' as leniently. Without commenting on the validity of their work, the 'mainstream opinion' about them is most certainly negative. Yet instead of pivoting elsewhere, they stick by their convictions. They might be right or wrong on their opinions, but they are hardly doing it to win any favours. And it's certainly not wrong to stand by something you believe even if the mainstream discredits you; time will tell who was right.
I've got no dog in this fight, but I've listened to some podcasts by Bret Weinstein and Heather Heying, and compared to the absolute nonsense I see on TV today, it is a breath of fresh air. They're reading scientific articles, discussing implications in long form, and have been open and honest about their mistakes.
I have not seen or heard of Eric Weinstein so I can't comment.
I'm not sure what kind of bar you're using to compare your chosen media, but it seems extremely high, and I'd like to know what you consider to be suitably informative.
They really haven't. They continued to double down on ivermectin and other COVID-era flim flam as it came to light that more studies were dodgy or outright fraudulent. It's pretend science theatre from a former small-university lecturer who managed a couple of research papers in 20 years. For example, telling the audience with a straight face that it doesn't matter if the studies going into a meta-analysis are biased because the errors will cancel out.
>There are many examples, at the end of the day... the people in the institutions will protect themselves.
This extends well beyond the science itself, for what it's worth. It's an open secret in every department that professors X, Y and Z have the same level of sexual restraint as Harvey Weinstein. It doesn't stop the "woke" institutions doing everything they can to protect these professors' reputations, though.
> Their lab mates tried to raise concerns and nothing happened. That person graduated, with everyone on the committee knowing the issues.
I understand that after some time it would be an embarrassment for the department, because they hired and vetted them, etc. But why was it tolerated initially? Was it a case of nepotism? It seems they must have had some kind of special privilege or status to skate by so easily despite the accusations.
I agree that there is an integrity problem in some areas of research (due to incentives), however the Eric Weinstein reference is laughable. I'd consider him a prime example of pseudo science for dollars / influence with zero credibility. Not saying he can't be on the right side of an issue some times, just that he's a particularly untrustworthy source, and the way he gets to his conclusions is just... wow.
> Most cs work is equally useless. It’s all a farce.
Hey careful there, it must depend on the area because all the type theory and functional programming research comes with and always came with proofs and working code. It couldn't be more rigorous and useful.
it's almost like our fav person Ayn Rand got that right [0]. What is pure science? It's little without a consciousness, and there are a lot of perverse incentives. For one, I found the requirement to publish (something "worthwhile", which is subjective) during my PhD so stupid. You can work hard and cleverly in science for 4 years yet only debunk stuff or not even find anything at all (though that by itself should be publishable!), or you can get lucky and publish a lot. All the publish-or-perish pressure does is force subpar stuff into the community. Like you two, I also got a bit disillusioned with non-applied science as we have it.
Edit: I don't want to say (like Ayn Rand, who can be pretty black and white) that it is all bad and we should do away with it, but it's something we should be very aware of and try to build in mechanisms to protect ourselves from these effects.
Probably. I was being sarcastic; if you've been on HN for some time, you know that Ayn Rand is not universally loved here. That's ironic at times, because she matches quite well the opinions of people here wrt government involvement in the market, love, and now indeed state-sponsored science.
Now think about research on chemicals, everybody has a different source, different quality control (most academic labs do 0 qc on incoming chemicals). I have bought chemicals from major and minor vendors and I could tell you all kind of horror stories... Wrong molecule, inert powders added to increase weight, highly toxic impurities... Now you add that to assays and academics that have been optimized for years to scream "I HAVE A NEW DRUG AGAINST X" anytime they stare too long at the test tube...
This is absolute baloney. I've ordered numerous research grade chemicals from multiple suppliers and not once has any of them been the wrong one nor outside of stated purity grade — and I regularly checked, since it's standard practice. If a solid organic material is in a lower grade of purity it is typically recrystallized.
Now, yes, impurities — even minor ones — can have significant effects. But that tends to be in rare circumstances and chemists are quite aware of the need to check for it where it's most needed such as catalysis research.
No one is going to scream "I have a new drug" for something for which the composition is unclear.
I don't know what world you live in, but it isn't one of a typical North American nor European university research lab.
This was my work: checking the quality of chemicals entering the lab. NMR, MS, IR... And over 15 years I have seen dozens and dozens of cases. Now most labs call HPLC with UV sufficient for quality analysis. Lots of things looked "fine" that way, that's for sure. Note that I was in the drug discovery world, not in the inorganic chemistry world where things are usually of much better quality.
I'm biased - I'm a researcher at a big research institution and finished my PhD 15+ years ago. Seeing how the sausage is made shouldn't cause one to become less "pro-science". It should give people perspective that the scientific process is a process performed by humans, with all their flaws (ego, ambition, competition, etc). Furthermore, science is all about making hypotheses and seeing how they work out - with the understanding that many are wrong for all kinds of reasons (bad idea, bad motivations to push known bad ideas, etc.).
When I hear people disillusioned with how science works, more often than not I see someone who is forced to see that the mental model they had of how it works doesn't line up with how it actually works. It's messy, and has always been messy. Just read some history books, and you'll see "scientific" ideas in the 19th and early 20th century that persisted for a long time simply because some people disliked specific other people with opposing ideas. You'll see horrible egos, non-scientific arguments to reject ideas, people scheming and trying to undermine each other, and so on. Basically, humans being humans to each other. There never was a time when science was performed in the idealized model that we sometimes wish it was done: it always involves people being people, flaws and all.
Plus, I don't know any working scientist who doesn't assume most scientific discoveries are complete bullshit (or, more likely, they aren't bullshit as much as they ultimately contribute no new knowledge or understanding). We all know how the game works and why people put ideas out there (especially with the increased volume of publications people are expected to have now). What keeps people like me going is that I know most ideas are dumb or flat out garbage, but good ones emerge from that churn too and push things along. I like to think that some of my dumb ideas, when I throw them into that churn, will also occasionally push things along.
Don't get me started in the "believe in the science" line. The fact that we've turned that phrase into a political cudgel is not helping the situation.
I agree. One can be sceptical of the system while still working in the system and trusting that there will be a small number of winners that compensate for the sea of noise.
I feel the HN community might resonate with how startups are seen currently. Most of them fail, some are outright fraud, but a few completely change the game.
The model to assess the scientific enterprise should be similar to a VC. Most will fail but the few big successes will more than compensate for the losses.
Noise is one thing. There are plenty of instances where humans have had to pick out signals from seas of noise.
The anti-pro-sciencers are claiming there is not just noise, but a sea of deliberately fabricated conspiracy theories concocted over many decades with an ultimate end goal of putting tracking chips in them.
GP here - I think we're honestly mostly in agreement!
It's obviously not a completely worthless institution - there have clearly been some impressive discoveries despite all of the bullshit.
I think the main source of distrust comes from the jarring disconnect between the way that the institution is presented to the public vs. the way it actually functions in real life. "Messy" is a beautiful way to describe things from the inside.
Your comment makes it sound like there's an agenda, some entity consciously misrepresenting how academia works. But academia is really quite accessible to the public nowadays.
The problem, in my opinion, is how external entities treat science. Some examples:
* Media reporting low-grade papers as "Science", papers which constantly contradict each other: "Scientists found out $something causes cancer", then a year after that, "Scientists found out $thatsamething cures cancer"
* Pseudo scientific drugs being sold for a health-and-beauty-obsessed general public with little to no effect
* Political think tanks funding flawed research to push their own agenda, then heavily echoing that research in their own circles for political gain
It seems that academia has become a foundation for other entities to cling to for profit, and now it's academia's reputation that has been shattered.
>Your comment makes it sound like there's an agenda, some entity consciously misrepresenting how academia works.
There absolutely is.
Scientists and universities are constantly declaring themselves to be bastions of reason and enlightenment, and their research that they themselves know is shit to be of great significance and rigor.
I'm not even talking about the outright corruption that you mentioned - just the more mundane things like researchers publishing papers with completely useless/misleading information (e.g. testing a ML algorithm on a single dataset and declaring it as a breakthrough in neurosignal decoding) just to get those updoots from your mates.
Even when something in academia doesn't have an agenda, it will be interpreted by everyone else as if it does.
Despite all the brilliant minds and appearance of professionalism and pride and campus pageantry...
it's easily a much more insecure, drama-filled, immature, and petty social dynamic and environment than most high schools.
That seems, imo, to be due to a combination of academia itself and a govt (or govt-style) administration apparatus, where everyone is constantly trying to convince everyone else of the importance of what it is they do.
The problem here is that your initial post, critical of how academia is presented, reads as very black and white to me.
I get the feeling that you claim that it is all fraud and lies, and that those who disagree can see no wrong with it at all.
Then someone responds with a nuanced reply, something that I, as someone who lives in a university town in Sweden, have the impression that almost everyone in academia actually agrees with.
And then it turns out that you also mostly agree. So, now I'm asking myself (and you): Why did I read that first post of yours as so hostile? As someone who was actively trying to create mistrust?
Is it just because how the debate climate is on the internet, or should you perhaps consider trying to be a bit more nuanced when communicating on the internet?
>Why did I read that first post of yours as so hostile? As someone who was actively trying to create mistrust?
I think this happens all the time - popular opinion swings far in one direction, criticisms then go in the opposite direction as disillusionment and problems get exemplified, again too far, back and forth, rediscovering the lessons of the past and settling down.
Thanks so much for the realistic take. I'm in the software industry with a bachelor's degree and currently considering going back to school because I want to try research. Your comment is a very refreshing take. HN is absolutely loaded with anti-academia posts that are rather demotivating to read.
So in science the game is never to be caught lying, and they usually define bullshit as lying. I'm with you in that getting paid to write two six-page papers per year that contain absolutely no new information is bullshit. Not because what's in the papers is a lie, but because pretending society needs to invest in you so that in the future we are all more productive or happy or wise is a lie. Science itself is not bullshit and most papers are not bullshit; what's bullshit is that we supposedly need so much useless science just so that some useful discoveries will emerge from within it, which is a strong underlying premise under which a lot of scientists get paid.
As a person working in science (long past PhD), I disagree.
There are tons of problems in science: the grant system, pressure to publish, etc. Also, everyone in science is still a person with standard people problems: ego, desire to be right, wanting recognition, etc. And some fields have had more issues because of poor statistical methods and the like.
But the self-correcting nature of science is still its biggest redeeming feature. And therefore unless I can figure out things by myself, I'll trust scientific consensus. I may not trust individual papers, as people make mistakes, may have an axe to grind, but I'll believe the consensus. For sure occasionally even scientific consensus will be wrong.
Science does eventually self-correct, but unfortunately it takes far too long to do so.
One area I've studied pretty extensively is the history of cancer treatment. In the long story of the history of cancer treatment, it is absolutely scandalous how often the scientific consensus was wrong and persisted for years in spite of the evidence. For example, the radical mastectomy for the treatment of breast cancer continued to be used for many years, leaving many women disfigured, in spite of wide evidence that it did not produce better outcomes vs more restrained breast tissue removal.
In the history of science, many of these kinds of bad ideas have persisted simply due to deference/seniority - the incentives are all stacked towards paying your dues and not challenging the status quo and absolutely not towards being right/following the actual scientific method. There is a reason the saying "Science advances one funeral at a time" exists - as Max Planck noted: "a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents die, and a new generation grows up that is familiar with it.”
I saw this first hand walking with my wife through two years of intensive cancer treatment. It seemed impossible to break through the ‘not standard of care’ wall to even incorporate low risk adjuvant therapies.
Overall it just felt like she was a hot potato and nobody wanted to put their name on *anything* outside of protocol. Even blood work. She was treated at the James Cancer Center in Ohio and we got second opinions from Cleveland Clinic and MD Anderson. These are all fairly well regarded institutions in cancer treatment. I was expecting strong opinions and got hand-waving and reluctance to interfere with any treatment selected by her primary oncologist. In the process of all of this I read hundreds of studies and research papers, and spoke with numerous PIs, trial coordinators, and industry reps. I couldn't get any traction for anything and came away feeling a bit hopeless.
After it was all over I started making public offers of $25k as a starting point just to review her case from end to end, to assess the quality of her care and determine if anything could be learned from it. The only takers I got for that were lawyers who were hoping to twist it into a malpractice case, which I wasn't interested in.
The experience left me extremely bitter about the current state of healthcare. After a while I was able to develop some empathy for the providers. They're trapped in a system that mortgages their future with student loans and directly threatens them with litigation and insurance costs. They have to stay on the rails or risk financial ruin.
This seems to be a particularly salient issue in the area of medicine, since many practice but may not actually “do science” unless they’re affiliated with a university or research hospital.
Most doctors are more like engineers than physicists.
They're more like airplane mechanics than engineers. Few doctors design treatments or procedures but they're very careful about performing them according to the right practices.
I just got the horrifying image of a lead engineer who refuses to move on from Java 9 because "it's all you need," but they're working on human bodies. Yeesh.
I just got the horrifying image of a lead engineer who jumps onto a newest JS framework of the day because "it's exciting," but they're working on human bodies. Yeesh.
But the difference came about only due to relentless informed written criticism of scientists, many of whom happily described their critics as spreading misinformation (or similar words from their era).
"Even individual fanatic scientific advocates of the Einsteinian theory seem to have finally abandoned their tactic of cutting off any discussion about it with the threat that every criticism, even the most moderate and scrupulous ones, must be discredited as an obvious effluence of stupidity and malice"
People often present science as some sort of free-floating edifice that "self corrects" through mysterious mechanisms. It doesn't. Scientists with wrong ideas have to be explicitly corrected by other people, some of whom will be random Swiss patent clerks and other outsiders. Therefore you cannot have science without free speech, because otherwise there's no way for bad ideas to be corrected. It's as simple as that. And yet, academic "scientists" are often at the forefront of demanding it be shut down.
Sounds like someone pushing quackery, honestly. "Science-based medicine is flawed and sometimes make you feel bad! Obviously, you need quackery!"
This also paints a too-simple picture, since there are obvious things like the promise of quick recovery if you do simple stuff (cancer being a big one) and the alternative medicines seeming cheaper than regular medicine (It may be the only treatment you think you can afford in the US). Not to mention the dismal state of "Health classes" - I'm a little over 40, and those classes mostly just told you about the parts of the body, the food pyramid, and that sex was bad and would probably kill you through disease unless you only had sex with another virgin.
I don't think most people really try to argue against "science" as is, or even the scientific process in the large scale. The gripe we have is against people who first do bad science (or popularize them), refuse to admit they're wrong, then patronizingly tell the public to trust them anyways, and still have the audacity to reassure us that "it will all work out (50 years later)".
What’s weird is it makes sense to trust scientific consensus, even though scientific consensus is also kind of territorial and protective.
It always fights against anything new, and will try to be opposed to anything against the status quo. It is so confident it is right it actually feels like it is doing the right thing by trying to suppress ideas opposed to the establishment.
Kind of like Galileo, I guess, and the religious leaders of his day are like the intellectual elitists of our day.
Thinking about the Copernican revolution, it made sense that the establishment was opposed to it: not because it wasn't factually correct, but because it was knowledge people weren't ready for, and it was too unpredictable what would happen if the idea became dominant.
It may be useful to separate science itself (with its self-correcting nature and such) from academia and the community/economics of scientists and research. Furthermore, if the incentives for publishing research are wrongly aligned, can you really trust published papers en masse?
Scientific consensus can be a hard thing to define sometimes. Climate change is an example where there are enough studies out there that a meta-study can prove consensus, but what about something more recent and difficult to measure, like the effectiveness of covid vaccines/measures?
You open the newspaper to an article on some subject you know well. In Murray’s case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the “wet streets cause rain” stories. Paper’s full of them.
In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.
I’ve read State of Fear, and I agree that Crichton isn’t a great source for scientific facts.
But this quote isn’t really a criticism of science. It’s a criticism of journalism.
I think most people here on HN agree with his premise regarding the quality of science journalism. I’m not so sure about his conclusion that all journalism should be treated with the same suspicion.
His books all have a scientist doing evil because of their lust for grant money.
"State of Fear" is a polemic against climate science.
If all you read was the comment I responded to, then it's no surprise that you don't think I provided enough information to criticize his books. Read his books, then read my criticism.
>His books all have a scientist doing evil because of their lust for grant money.
How unrealistic. Scientists like Peter Daszak and the EcoHealth Alliance would never do anything evil or untoward, such as using grant money to perform dangerous gain-of-function research on coronaviruses, demonstrably lying to the NIH about the scope of this research, and then misleading the public on the COVID-19 lab-origin theory while failing to disclose their financial and professional ties to the lab in question [0][1][2].
The original comment you made, which was downvoted (but I upvoted it) conveyed to me an emotion of angry refutation. (Which I'm not judging, so don't start about tone policing)
You seemed to be saying "don't pay attention to his quote because he's clueless about climate change", but the quote is about people who aren't, say, climate scientists, being gullible about the topic.
It's a self-referential quote. If you refuted it, does that mean that non climate scientists can judge climate science?
It seemed like an amusing paradox, and I guess I'm curious about your motivation.
My motivation is that I'm a scientist (before and after a long Silicon Valley career) who is annoyed by writers who write a bunch of novels where the villains are scientists corrupted by their lust for grant money. It's a trite complaint that is used in politics to claim 100% of scientists are biased, and plays a big role in the pushback against environmental and climate science.
Not op, but I think that the post that you replied to has a valid point.
Sure, the quote in itself is great, but the post that you replied to is pointing out that the person who made that quote is himself doing the same thing as those journalists that he is criticizing.
When you say you are no longer highly 'pro science', would you say that you would rather people take advice from scientists, or from people with a background of "Ultimate Fighting Championship color commentator, comedian, actor, and former television presenter"? Because that's the level we're at at the moment.
The current climate we're in means that consensus science from 50+ years of research and medical practice around the world is being questioned. Not some BS CS paper on a performance improvement in something that will be read and forgotten about, or that some new medication/fruit/treatment has some miracle cure that others can't duplicate.
We're in a world where people think that Bill Gates is inserting microchips into people and that Covid is a hoax by the world government and no one has died. Academia may be broken - but don't lose faith in science.
>When you say you are no longer highly 'pro science', would you say that you would rather people take advice from scientists, or from people with a background of "Ultimate Fighting Championship color commentator, comedian, actor, and former television presenter"?
I would prefer that the general public take advice from scientific researchers with a grain of salt, with an understanding that the system is far from perfect and much of what is published either contains methodological flaws that lead to false conclusions or cannot be reproduced at all.
I also think a little bit of common sense would go a long way.
I too have worked in academia, and in partnership with it, over three decades, so I've seen how the sausage is made. So I understand and recognise some of your points. Still, we can and should be pro-science; but we should recognise that science in academia is extremely political, and increasingly bending to marketing pressures and industry influence.
A big problem, as I see it, is the unwillingness of academia to engage in debate with the public. Whilst academics nowadays are supported and encouraged to engage in "outreach", the nature of that is almost always in broadcast mode.
In the age of social media, science needs to learn how to communicate, and to engage with non-tenured people asking good, evidence-based questions. Sure, academics get bombarded with abuse and crackpots, but that should not mean ignoring absolutely everybody who approaches them with reason and civility, as so often happens.
Trust is breaking down in society. Is there any wonder, when trust grows from interaction?
> Sure they get bombarded with abuse and crackpots, but that should not mean ignoring absolutely everybody that approaches them with reason and civility; as so often happens.
I think it is a really hard problem. Figuring out if someone is arguing in good faith is hard. And engaging with someone arguing in bad faith is really draining. Beyond the difficulty, people worry that engaging with some people will draw even more responses, many of them in bad faith. And responding only partially often leads to "yeah, you are responding to them, but what about X".
I think in the end, wading through all the abuse is just so painful that many people decide to ignore input. And doing anything that might cause more abuse to flood in is reflexively avoided, since that abuse actually hurts.
Solving this is going to require specialization. Because being able to wade through the abuse is not something everyone can do. It will also require proper incentives to actually do this, because the people who can handle the abuse aren't going to do so out of charity.
Isn't the simple solution just not to engage with comments on social media? Honestly, that's where the problem is: trolls and partisan bots. Have face-to-face conversations between scientists and skilled interviewers, where the scientist supplies the answers and the interviewer generates the questions and keeps the conversation moving along and interesting for the viewer. Publish the content on social media, since that is really the only thing it is good for, but don't read the comments. We have specialists in this area already; they're called journalists. All they need to do is make a phone call, set up the interview, and record it.
That all sounds disturbingly like you kinda are okay with Joe Rogan and Bret Weinstein spreading nonsense about how vaccines are killing people and how ivermectin is better than vaccination.
Joe Rogan doesn't spread anything. He interviews experts in the field and listens to what they have to say. Currently everyone is angry because he interviewed Robert Malone, the inventor of mRNA vaccines. You can read about Robert Malone's contributions here:
"Dr. Robert Malone is the inventor of the nine original mRNA vaccine patents, which were originally filed in 1989 (including both the idea of mRNA vaccines and the original proof of principle experiments) and RNA transfection. Dr. Malone, has close to 100 peer-reviewed publications which have been cited over 12,000 times. Since January 2020, Dr. Malone has been leading a large team focused on clinical research design, drug development, computer modeling and mechanisms of action of repurposed drugs for the treatment of COVID-19. Dr. Malone is the Medical Director of The Unity Project, a group of 300 organizations across the US standing against mandated COVID vaccines for children. He is also the President of the Global Covid Summit, an organization of over 16,000 doctors and scientists committed to speaking truth to power about COVID pandemic research and treatment."
Do you really think it is unreasonable to listen to what someone with this kind of a background has to say about mRNA vaccines?
Malone worked on that stuff almost 30 years ago, and hasn't been involved in any way, shape, or form with the vaccines that exist today. Moderna's platform has a decade of work behind it and he had nothing to do with it, and the studies that show that it's safe and effective have nothing to do with him, nor does he have any particular knowledge about them that a smart grad student in a lab wouldn't have.
So no, I don't think his random commentary is particularly relevant or reasonable to listen to, any more than Jaron Lanier's haterade @ the tech industry is relevant just because he was working on shitty VR in 1990. Some people are dinosaurs and just need to be ignored, it's not unusual for people to grow to hate the things that they worked on when they were young and get increasingly unhinged arguing against them.
I don't think Rogan should be censored, for sure, 100% and without any reservations - free speech should be absolute, I'm old-school in that respect. But I absolutely think anyone paying attention to this crap should be socially scorned as a dumbass, and TBH I can't get my chuff up too far over people trying to convince Spotify to boot Rogan over this, free markets are also sacrosanct and that's what this is (even if I also would vastly prefer that Spotify holds to a hardline free-speech ethos and leaves him up).
Joe Rogan is a both-sides-ist. He shows people who speak the truth, and people who speak lies, and then claims to be impartial because he showed both sides.
Imagine this in any other field. It is not exactly reasonable to separately bring both a Nazi and a non-Nazi and claim to be impartial. In that scenario, most likely the normal person would sound relatively ambivalent, while the Nazi would be extremely agitated and emphatic that urgent action must be taken.
Several things really struck me about this comment:
> consensus science from 50+ years of research and medical practice [...] being questioned
This is precisely what scientists should have been doing! I'm afraid having your work done for you by unqualified amateurs is a kind of punishment for skimping on it for too long.
> a background of "Ultimate Fighting Championship color commentator, comedian, actor, and former television presenter"
This is essentially a class-ist argument: Someone who tells dirty jokes on netflix couldn't possibly, by the essence of their being, decide for themselves which scientist to talk to. What's particularly striking is that this comes after the author seemingly accepts the parent's premise that academia is broken. What I'm hearing is "Yeah, we haven't earned your trust, but at least we're not those people". This is not a winning argument, and certainly not a scientific one.
> We're in a world were people think that Bill Gates is inserting microchips into people and that Covid is a hoax by the world government and no one has died
This too seems heavily influenced by class. When people believe "Bill Gates is inserting microchips" they are factually wrong but tentatively right. They are powerless in politics, while oligarchs, including foreign ones, have outsized influence. Wild conspiracies are how these feelings find expression in those who have no other way of expressing them. By taking everything literally ("no, Bill Gates is not literally inserting microchips"), their enemies can avoid dealing with the issues a more charitable interpretation would be getting at.
> don't loose faith in science
To the extent that science is used as the underwriter of various political projects, I think people are perfectly fine losing faith in it. 5 decades of "belief" in science hasn't done much good, so what's the point? On the other hand, I'm not worried that people will stop using material science... By and large mistrust is very well placed.
> By taking everything literal ("no, Bill Gates is not literally inserting microchips"), their enemies can avoid dealing with the issues a more charitable interpretation would be getting at.
This is so on the mark - too often we forget to focus on what someone is _really_ trying to communicate, and take their words at "face value" - which is not communication at all.
> This is precisely what scientists should have been doing! I'm afraid having your work done for you by unqualified amateurs is a kind of punishment for skimping on it for too long.
How else do you think scientists reached consensus, exactly? Maybe the problem lies with the unqualified amateurs, who are, well unqualified to do that?
Scientific consensus is reached gradually over the course of decades, after a theory has stood the test of many, many challenges. No scientific theory is absolute and final, and challenges are welcome, but they should be scientific, i.e. based on real data and statistically grounded analysis. Yes, it's hard to change consensus, not only in science but in any kind of community. This does not mean everybody in that community is corrupt. "Extraordinary claims require extraordinary evidence". If changing a consensus was easy, it would not be a real consensus.
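One way to make "extraordinary claims require extraordinary evidence" precise is Bayes' rule: the lower your prior on a claim, the stronger the evidence (the likelihood ratio) must be to move the posterior appreciably. A toy sketch, with made-up numbers purely for illustration:

```python
def posterior(prior: float, likelihood_ratio: float) -> float:
    """Posterior probability of a claim, given a prior probability and a
    Bayes factor P(evidence | claim true) / P(evidence | claim false)."""
    odds = prior / (1 - prior) * likelihood_ratio
    return odds / (1 + odds)

# Ordinary claim: prior 0.5, modest evidence (Bayes factor 3) is persuasive.
print(posterior(0.5, 3))    # 0.75
# Extraordinary claim: prior 0.001, the same evidence barely moves it.
print(posterior(0.001, 3))  # ≈ 0.003
```

The same piece of evidence that settles an ordinary question leaves an extraordinary claim almost exactly where it started, which is why overturning a decades-old consensus takes far more than one surprising study.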
>> a background of "Ultimate Fighting Championship color commentator, comedian, actor, and former television presenter"
>
>This is essentially a class-ist argument: Someone who tells dirty jokes on netflix couldn't possibly, by the essence of their being, decide for themselves which scientist to talk to.
No it's not. Getting all the background knowledge and staying up to date with all publications in a given scientific field is very difficult and time-consuming. Expertise does exist, and takes years to build. Somebody who tells dirty jokes on netflix likely does not have the time or energy to do that. Maybe, but quite unlikely.
> To the extent that science is used as the underwriter of various political projects, I think people are perfectly fine losing faith in it.
What is the alternative? If no policy should be based on science, then what? Tea leaves? Crystal balls? Shamans? Prayers? Goat sacrifices?
> 5 decades of "belief" in science hasn't done much good.
You really need to elaborate here.
Please, please, please. Do not confuse science (the process of discovery) with scientific institutions (the groups of humans with all their defects).
> What is the alternative? If no policy should be based on science, then what? Tea leaves? Crystal balls? Shamans? Prayers? Goat sacrifices?
I'm afraid you're lacking imagination. Also, the irony of these being historical examples rather than made up seems to be lost on you. Humans are strange.
On a more serious note, if you were interested in what a modern society looks like that is no longer guided by science, you could study Russia. It seems a cynical blend of religion, nationalism, and imperialism is what it is. I'm not advocating for this, I'm describing what it looks like.
>> 5 decades of "belief" in science hasn't done much good.
> You really need to elaborate here.
I specifically prefaced it with "as underwriter of political projects". Over that time, the #1 influence on policy has been economics, and besides covid all of the most pressing issues today are economic: stagnation, deindustrialization, inequality, and inflation. People are asking: if policy over that period was built on such a solid scientific foundation, how come we've ended up here?
>> consensus science from 50+ years of research and medical practice
>This is precisely what scientists should have been doing!
My reference was to the years since we've virtually eliminated the deaths from polio, TB, and measles and other diseases in countries that have used vaccines against them. There have been studies that have shown that some vaccines cause side effects in some people, but none have shown to be comparable to the millions of lives that have been saved from debilitating diseases.
> 5 decades of "belief" in science hasn't done much good.
Says the person writing a comment on HN through a network of hardware and software unimaginable 50 years ago, made possible at least in part by scientists and academic researchers making real breakthroughs. My comments were to the poster who thought scientific research was a polished turd because of his experience in one part of academia.
Just this weekend I was reading that more than 60% of the research into cholesterol is funded by the egg industry. Research funded by industry overwhelmingly shows things in as positive light for industry whether it does or not.
This kind of example is common. It tells me it's worth questioning research and looking deeper at whatever is released. When someone questions research, look at the merits of what they say.
Exactly. The guy who is an "Ultimate Fighting Championship color commentator, comedian, actor, and former television presenter" doesn't have a conflict of interest, which makes him less biased and more believable.
BIG conflict of interest… if someone goes on his show stating something like “brushing actually destroys your teeth, you should take X supplement which I developed and read my book” he gets money and viewers interested in this “no brushing” thing. If the conclusion is that you should brush and use toothpaste he doesn’t have a podcast.
He’s in the industry of going against “mainstream [ie scientific] knowledge”… which provides him an endless stream of salesmen who cherry-pick scientific papers that support their schtick… and a lot of interested listeners who get an excuse for everything they don’t feel like doing (“oh, this guy on Rogan said brushing destroys your teeth, so I guess I’m better off leaving them alone”)
> BIG conflict of interest… if someone goes on his show stating something like “brushing actually destroys your teeth, you should take X supplement which I developed and read my book” he gets money and viewers interested in this “no brushing” thing. If the conclusion is that you should brush and use toothpaste he doesn’t have a podcast.
This seems to be equally valid for basically any media (using brush and toothpaste is not news). Would you say that all media is in BIG conflict of interest? Or how is Joe Rogan fundamentally different?
It's different because that's the niche he's focusing on. He's not the only one... Oprah? (to a less sensationalistic extent), Gwyneth Paltrow? Malcolm Gladwell also had huge success with books that cherry-pick scientific publications to "prove" things that go against conventional notions in various fields.
We know that traditional media has analogous biases towards celebrity scandals, and towards FUD and polarization in politics, and that bottom feeder broadcasters with no ethics thrive on those.
... He does it with lifestyle / well-being specifically, inviting people that peddle diets, supplements, etc.
That's not fair to say. The guy who is "Ultimate Fighting Championship color commentator, comedian, actor, and former television presenter" has a massive conflict of interest, namely between generating viewership and presenting factual information. Everything he presents is a choice on a gradient between those two.
These snake oil salesmen have quickly cottoned on to the fact that they can take disenfranchised and usually uneducated groups and regurgitate them their own opinions with the air of authority, thus gaining an almost fanatically loyal following.
It's the new age televangelists, but instead of preaching salvation for tithe, they preach actually dangerous drivel that gets people killed. It feels like a low point in human evolution, where our tools enable sociopaths to mislead clueless masses on an industrial scale.
And do you know what is the worst aspect of this? That we can't ban them in good faith, because banning information is a slippery slope, a one-way street that we mustn't take a single step on. The only way out of this quagmire is education, but it's a long slog that will pay off in decades' time, and only if we make a great effort now. It's literally the tree we have to plant even though our current generation won't get to enjoy its shade.
Less biased I agree with. To believe him, though, I need to look at the merits and details. Sometimes he says things that I could argue are true. Other times I can make an argument that he shared complete hogwash.
Being less biased doesn’t make one more true. Both can be wrong
I'm a fan of Joe Rogan, but the "doesn't have a conflict of interest" argument doesn't seem correct to me. The point of Joe's podcast is that his biases and motivations are transparent, not that they don't exist. Joe is interested, like many people, in contrarians. It's a heuristic that leads him to interesting conversations, and those kinds of interesting conversations are why the podcast is so popular. Oftentimes people who seem one-dimensional in media coverage are shown to be much more than meets the eye when placed in a long-form non-confrontational setting. There is no guarantee of factualness or good faith in any conversation on the podcast, and in many cases he will bring in verifiably crazy people just to see what they're really like -- these are some of the most popular episodes, by the way.
Even if he isn't in a conflict of interest, that doesn't mean he has any kind of qualification on any scientific matter. Scientific debate should continue in the scientific community, not on podcasts and social media.
1) That is not the choice to be made. I don't think there are more than a small handful of scientists in the debate. Scientists tend to be much more circumspect about things, and don't come to consensus quickly enough to agree on the measures that were rammed through.
2) The scientists who are involved are unusually politically motivated. Most of the scientists I can name off the top of my head are in government (the likes of Fauci in the States or Kerry Chant of Australia). These people are in a position of arguing that their opinion should be given emergency power to decide ... most aspects of the social sphere. This is an excellent time to be looking beyond credentialism to actual arguments and trade-offs. Without the "noble lies", please, let's make policy decisions based on facts.
3) I still haven't found anyone who can argue why the vaccine mandates aren't human rights abuses. We're talking the really basic stuff like right to work, right to assemble, right to religion, etc. I don't care who we listen to as long as they are arguing on the side of those basic freedoms. If Joe Rogan is arguing for those things, I'd rather people listened to him. Those concepts have a better track record than technocrats. We can worry about run-of-the-mill stupidity after we've dealt with the real risks here.
So is the choice between “covid is a hoax and Bill Gates’s microchips” and “follow the official guideline whatever it says”? What happens to opinions in the middle, like “vaccinate people at risk, not those who already had covid”? Or “covid may well have leaked from a lab”? In which of those categories do they fall?
The former is universally an excuse to not get vaccinated. If we had very reliable vaccines it would be reprehensible for so many people to refuse to do their part to immediately end the pandemic, but given current data it is not so clear.
The latter is pretty much moot.
You do have to understand that when people say "I got banned because I said COVID may have leaked from a lab" they don't actually mean "I got banned because I said COVID may have leaked from a lab". It's significantly more likely that they got banned because they said COVID definitely leaked from a lab funded personally by Dr Anthony Fauci to create bio-weapons against America to aid the spread of communism with Chinese characteristics because he is secretly an illegal immigrant from Kenya.
>> When you say you are no longer highly 'pro science' would you say that you would rather people take advice from scientists or from people who have a background of "Ultimate Fighting Championship color commentator, comedian, actor, and former television presenter". Because that's the level we're at at the moment.
That's a strawman. He's interviewing doctors, scientists etc. He's not putting forward his own views, apart from when he recounts his own covid experience.
It's like saying the left gets all their news from John Oliver.
> We're in a world were people think that Bill Gates is inserting microchips into people and that Covid is a hoax by the world government and no one has died. Academia may be broken - but don't loose [sic] faith in science.
Human bar code tattoos financed by the Gates Foundation. Opponents are discredited because they get an insignificant detail wrong.
We see the same thing in the reporting of the Canadian trucker protests. There are 10,000+ peacefully protesting people, the media find three with a swastika flag (which could even be a literal false flag operation), and reports "right wing protests".
They also report the "desecration of an unknown soldier's memorial", which was a single woman basically walking on the monument. Compare that with "mostly peaceful" BLM protests, where whole cities were burning and monuments were actually destroyed.
> The current climate we're in means that consensus science from 50+ years of research and medical practice around the world is being questioned.
(A) In some aspects, 50+ years of accepted medical science was wrong. "Droplet dogma" [1,2] was wrong. Now we know that respiratory viruses spread by airborne aerosols [3].
The epistemological tradition of modern medical science seems to be almost solely based on randomized controlled trials (RCTs). They think everything should be studied and verified the same way that new drugs are studied. We use a lot of physical safety measures, like seat belts and airbags in cars, hard hats in construction sites, etc. because they make physical sense, and their protective properties have been measured in engineering laboratory studies. Like using crash test dummies when crash testing cars. Engineering and physical sciences have methods to achieve and test knowledge, that are not an RCT with real people.
Now the topical topic is face masks. Engineers know respirator masks can filter over 95% of fine particles, but a sizeable part of the medical science still doesn't believe in face masks, due to lack of decisive studies using an RCT-based setup.
(B) But then we also have a surprisingly strong anti-vaccine movement. In this case, the consensus of medical science is correct, and vaccines work.
So we have A, where medical science was wrong and a bit stubborn to correct itself. Science as whole was not wrong: Engineers and physicists worked out the details of aerosol spread of the virus and the filtering properties of face masks. And then we have B, a genuine anti-science movement.
But what to do? If we didn't question the established dogma as used by the authorities, then we would still believe in the droplet dogma, and we would still think face masks are useless. But if we allow the questioning of the established dogma when the dogma is indeed wrong, then how can the general public know when old dogmas are questioned for good scientific reasons (A), and when not (B)?
We're also living in the world where certain governments are mandating vaccines that pose 50% higher excess risk of myocarditis compared to an infection[1] for those aged under 40, according to a peer-reviewed, UK population-level study.
Or shutdowns of discussion that some health authorities, like Sweden, have (IMO correctly) judged that there is no clear benefit to vaccinating kids under 12 against COVID [2].
The issue I see is less on the fringe 'microchips' but more around the complete shutdown of discussion of complex topics are not binary.
You should also take all the benefits and complications into account when looking at vaccines. The complication rate of infection for those under 40 is far from zero and assumed to be significantly higher than the complication rate of the vaccines. Discussion is good but your argument sounds less like a discussion and more like a statement.
Your first study does not show what you claim it does, or at least it is a gross simplification.
These statistics are at the 10-per-million scale, i.e. 0.001%, which is smaller than fatality rates for unvaccinated under-40s.
That is to say that there is a selection bias for people who did not die, and this is a significant oversight in your interpretation.
The argument he/she's making is that specific findings are being suppressed, so nobody has a full and accurate picture. Thus you cannot refute it with a statistical argument that assumes you do have the full picture.
That is the argument you are making, and I struggle to see in what world being published in Nature is "being suppressed".
They make the claim "vaccines that pose 50% higher excess risk of myocarditis compared to an infection", and this is a misleading claim.
The risk to an individual of taking the vaccine is more sensibly measured against what happens to somebody who isn't vaccinated, not what happens to somebody who isn't vaccinated and also didn't die.
This is like comparing injuries in people who survive jumping out of a plane with and without a parachute. How important is it if people without a parachute break their arms less often?
Their study isn't an attempt to answer the holistic question of whether vaccines are saving lives or ending them in aggregate, and they never claimed it was, so I don't see what's misleading about it. The claim dannyw is making is about the studies and discussion that isn't being published in Nature, or at all.
Your counter-claim is that this alternative question is what they should have been discussing, and if they had been, it'd be misleading to focus on the risk of myocarditis alone. Which is correct, but then they'd also have to take into account injuries and deaths from other non-myocarditis vaccine side effects, the costs of medical care not given due to the spending of resources on vaccines instead, QALYs and so on. But the sort of institutions that fund such research don't want to know about vaccine downsides, so don't fund any research into it, and moreover expend considerable effort to suppress whatever little research does get done.
> We're also living in the world where certain governments are mandating vaccines that pose 50% higher excess risk of myocarditis compared to an infection[1].
If everybody else takes a vaccine it is almost always far safer for you not to take it. Vaccines help society by stopping illnesses spreading, they just need enough people to collectively accept the personal risk for the good of everyone else - the same was true for Polio, TB, measles.
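The "enough people" in that argument has a classic back-of-the-envelope form: an epidemic stops growing once the immune fraction exceeds 1 - 1/R0, where R0 is the basic reproduction number. A minimal sketch (the R0 values below are illustrative assumptions, not measurements):

```python
def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune so that, on average,
    each infection causes fewer than one new infection (classic 1 - 1/R0)."""
    return 1 - 1 / r0

# Hypothetical pathogen with R0 = 2.5: about 60% immunity needed.
print(round(herd_immunity_threshold(2.5), 2))   # 0.6
# Highly transmissible pathogen with R0 = 12: roughly 92% needed.
print(round(herd_immunity_threshold(12.0), 3))  # ≈ 0.917
```

This simple model assumes sterilizing, evenly distributed immunity; as the reply below notes, vaccines that reduce severity without blocking transmission don't fit it cleanly, which is exactly where the collective-action argument gets complicated.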
It's when you get immune to getting infected by a virus - antibodies neutralize the virus particles before they can do any harm or meaningfully replicate. All current (and probably future) SARS-CoV-2 vaccines don't do this. Through a combination of several vaccinations and infections the hope is that you get close to sterilizing immunity.
Funny thing about that is he always says that he is the only one speaking up because everyone else has incentives (gets paid) to shut up. Imagine that: a pandemic breaks out, you invent the cure, and get nothing in return. I would want to burn down the world.
It says he discovered a way to put mRNA into human cells. We don't even know that, really - we know he ran an experiment doing so.
It does not say he knows anything about the things he talked about on JRE: hydroxychloroquine or ivermectin, or anything about the specific mRNA being delivered to fight COVID (unless you are suggesting that ALL mRNA insertions kill people), or anything about immunity at all, or anything about Israel/Palestine, or anything about epidemics, or anything about monoclonal antibodies, or anything about mass formation psychosis (except to the extent he is trying to make it happen).
I have a proposal to make things easier: why don't you and your friends give us a list of people allowed to talk about ivermectine, or mRNA, or Israel/Palestine, so we can more easily put them all on one channel that will give us the Truth and make it easier to protect us from all of the other articles, podcasts, etc ? I've seen some regimes do that and it works much better.
Is this comment satire? Robert Malone is the person Joe Rogan interviewed, and that everyone is up at arms about. He is literally the first name that appears in the first sentence of that article you linked.
"We're in a world were people think that Bill Gates is inserting microchips into people"
Well, as far as I remember, the root of that conspiracy has a true base: Bill Gates really did talk about the possibility of a health chip injected into people as a vaccination certificate. So no chip coming with the vaccination, but a digital health certificate under the skin. But now it is framed as if he never even spoke about chipping people, and I cannot even find the original quote on reddit anymore.
"The current climate we're in means that consensus science from 50+ years of research and medical practice around the world is being questioned."
And too many things are getting thrown into the same bucket. Because the mRNA vaccinations are not 50 years old, but brand new. I can very much understand, being sceptical of a cutting edge proprietary technology, to be injected into my system. And I cannot verify the build process.
I have to trust government inspectors to do that.
And as a matter of fact, one of the scientists involved with the original mRNA technology Dr.Malone advises strongly against it.
And the process of getting the data for the safety, is not perfect either, as a whistleblower showed:
Now sure - when reading a bit about Malone, it seems he left the realm of science and entered crackpot territory some time ago, and the data scandal is not as bad as it sounds - but they are real events nonetheless.
Also, despite it being repeated as a mantra that the vaccines are totally safe, there are strong indications that people died because of the vaccinations. Not because the vaccines are a poison developed by shape-shifting reptiles, but because some production facilities messed up and some production batches got contaminated with various rubbish, which can happen in the real world when people are in a hurry.
So there is a chance that a vaccination does not help, but harms you as an individual. This chance might be very, very small, and statistically speaking it probably still makes sense to get the shot. But the possibility exists. And it is not really communicated this way, probably out of fear that since people do not know statistics, they will misinterpret it.
But this is what fuels conspiracy theories. I mean, the flat earthers are out of the realm of reason anyway - but not everyone who is sceptical of the vaccination and the official data about them, is a crackpot tinfoil hat nuthead. Or am I?
Well, I know that I got angry being told by my doctor that my vaccination (AstraZeneca) was totally safe - and only learning about the contaminations by chance the next day. Something the doctor should have known. So this was bullshitting me, which I do not like. Now I think I do understand the reason for this bullshitting - the negative placebo effect. Meaning if I believed the vaccination was dangerous for me, it actually might be. I suspect many of the reported heart attacks as side effects are showing just that. People got so scared and stressed that they developed real symptoms. While people who stay calm about it have a better chance of not getting bad side effects.
So this is the problem as I see it. There are people who can handle real data and people who cannot. But we are all getting treated the same, as idiots. (And I believe people will never learn to deal with real data when they are always only getting comfortable lies).
"So shut up and take the shot. The one we prepared for you."
I understand people getting upset with that, as I do not like being treated as a sheep either. I am not a medical researcher, but I can read papers. And I would like to choose for myself whether I trust Moderna, Biontech, AstraZeneca, Sputnik or Sinopharm. But I do not even get that choice. For some reason BBIBP-CorV, the one from Sinopharm, which uses a very safe old-school vaccination principle of deactivating real virus, is not approved in the EU, despite being in use worldwide and approved by the WHO. People vaccinated with it are counted as unvaccinated here. Why is that? Is there some data which is not shared? It is hard to tell for anyone not deeply familiar with it.
I don't think many people outside of academia truly understand how much scientific research has degraded over the years.
There are an extremely small number of fields in which the overall quality is still quite high (mathematics, etc.) but overwhelmingly the social sciences, medical science, etc. are wastelands of p-hacked, low-N, biased, poorly designed studies that can't be replicated (not to mention the outright frauds and absolutely rampant plagiarism).
Every intelligent person should be deeply, deeply skeptical of papers published in particular fields over the last ~20 years.
> I don't think many people outside of academia truly understand how much scientific research has degraded over the years.
150 years ago we had phlogiston alongside Maxwell on electromagnetism. Science is always a mixture of more and less wrong stuff. It's people doing work, for good reasons and bad reasons. Some of them are crazy, some are corrupt, and many are doing their best in good faith.
Unless you think we had a special good period in, say, the twentieth century that we've retreated from. It did seem to be an acceleration.
But there's at least some pretty amazing biotech going on this century.
I think most every activity sector has the phenomenon that 80-90% of the people/companies in it are a pointless waste of time and money. It's the price we pay for the 10-20% treasure.
> 150 years ago we had phlogiston alongside Maxwell on electromagnetism.
First, phlogiston theory was considered obsolete by the late 1700s, so you're off by a couple of decades.
Second, that's not even remotely comparable. While phlogiston theory is incorrect, at the time it was first proposed it seemed as good a guess as any other, and remained a viable explanation for how combustion worked until experiments proved it wrong. What's happening today is that respected researchers at reputable institutions publish results that they know are wrong or statistically meaningless, in order to game the academic system towards awarding them greater respect and influence. The problem is fraud, not ignorance.
What's the basis to believe that this is worse now than other times in history, or on average?
I strongly suspect there's a survival bias, where the past fraudulent and otherwise incorrect stuff is forgotten in favor of the great stuff. So it always feels like today is the worst time ever.
> What's the basis to believe that this is worse now than other times in history, or on average?
We produce more science now than ever before. If you have 10 scientists and 9 of them produce rubbish it won't take you long to read a paper from the one who doesn't.
If you have 10,000,000 and 9,000,000 produce rubbish, you can spend a thousand lifetimes reading nothing but rubbish.
Social science != Science. (OK I over-react but come on, the field seems barely literate in statistical methods last I looked, i.e. mean != average for all cases)
Medical science has a massive reproducibility problem largely brought on by the demands and pressures of industry funding.
Chemistry is still allowing us to make better turbine blades for jet engines and Physics is still explaining the weird world of quantum mechanics.
Just because some fields can't keep their facts and figures straight we shouldn't tar the whole of "scientific research" with the same brush.
With the dizzying speed, complexity, and volume of reference materials, the single word "degraded" does not have enough descriptive power to even start to be interesting. It's just so much more of everything, so much faster, plus all the factors named here.
This comment would have been more apt a decade ago, but we've already seen a substantial correction and changes to the 'rules' in response to the replication crisis. There's an explosion in the amount of shite published in garbage-tier predatory journals, but most of that is noise that isn't making it into any decision-making.
There are a ton of problems with how science is done and most of it comes down to the same root cause: greed. You have corporations buying off scientists (if not to rig experiments/falsify results, then to bury unfavorable findings), journals willing to publish any garbage with little (if any) meaningful review, then media companies willing to turn that junk science into a click-bait friendly press release for a company or industry.
The corruption of science into advertising and science by press conference is a huge problem, but it's not the fault of science itself. It's the institutions we've built around science that are responsible. The way we choose to handle funding, the universities, and the journals, these are all systems that were created and they are all systems that can be replaced or reformed. The underlying framework for scientific observation and testing are still solid and remains the best way to further our understanding the world we live in, but we need accountability and regulations in place to keep the output high quality.
It's the same issue we have with medicine. Not enough accountability and oversight has allowed for things like doctors taking kick backs from pharmaceutical companies (the opioid crisis was a good example, but it's been going on for ages) and people like Stella Immanuel who can tell her patients their illness is caused by demon sperm and alien DNA but she still gets to keep her license to practice medicine.
Without regulation and oversight every system is vulnerable to corruption and failure. It doesn't mean you should throw away the system, it means we aren't doing our job to keep it functioning.
> You have corporations buying off scientists (if not to rig experiments/falsify results, then to bury unfavorable findings), journals willing to publish any garbage with little (if any) meaningful review, then media companies willing to turn that junk science into a click-bait friendly press release for a company or industry.
What reason does a corporation have to pay for science that they know will not work? That's self-defeating. At best it's only advantageous in the very short term, before you try to actually sell a product that doesn't work. (One more reason these absurd pre-revenue SPACs are horrendous.)
Hypothetically: Let’s say you manufacture Volkswagens, and your small engine diesels include a defeat device that allows your cars to pass emissions tests by enabling pollution controls only during emission tests.
When the world learns of your treachery, you sponsor scientific research in which monkeys are exposed to your modern VW diesel tail pipe emissions and also model year 1990s Ford diesel fumes, to show that you’re really pretty harmless compared to old pickup trucks. Your research is trash, the conclusions meaningless, and scientists cash your checks.
Hypothetically: you manufacture cigarettes..
Hypothetically: your fracking is injecting heavy metals into my groundwater..
> What reason does a corporation have to pay for science that they know will not work?
Profit. The only reason a company does anything. You see it when companies pay for research so that they can get "X product may reduce risk of cancer" in the headlines, or when they want to put "Scientifically proven to Y" on their product labels. Even if it takes funding 100 studies to get the results they're looking for they can just bury the results of the first 99 that contradict their soundbite or ad copy.
The tobacco industry paid off scientists to lie about the cancer risks of their products so that they could continue to profit from killing their customers. DuPont did the same thing. The company knew about the dangers of PFAS for decades, but they went around funding research on it and pulling that funding the minute the results showed their product was harmful. They also paid off a scientist whose job was peer review to protect their interests while hiding his ties to the company.
There are a ton of ways lies disguised as actual science can make money for those with no morals and corporations don't have morals, just shareholders.
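The "fund 100 studies and bury 99" tactic mentioned above works because of simple multiple-comparisons math. A minimal sketch (illustrative numbers only, assuming independent studies each with a 5% false-positive rate):

```python
# Probability that at least one of n null studies "succeeds" at
# significance level alpha -- i.e., produces a false positive the
# sponsor can publicize while burying the rest.
def prob_at_least_one_false_positive(n, alpha=0.05):
    return 1 - (1 - alpha) ** n

print(round(prob_at_least_one_false_positive(1), 3))    # → 0.05
print(round(prob_at_least_one_false_positive(100), 3))  # → 0.994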
With the current Covid pandemic, I could be wrong, but it seems most studies are looking at the effectiveness of vaccines and little to nothing at the effectiveness of natural immunity. I say this as someone who was one of the first to get vaccinated. Not that I'm saying the pharmaceutical companies sponsored the studies. So to answer your question: sometimes what is not funded is probably more telling.
It seems even more like there are groups of people who are dead-set on certain catchphrases (like "natural immunity" or it used to be "herd immunity") no matter what facts they are presented with.
A lot of the antivax folks cling to the idea of natural immunity being better somehow, but who in their right mind would choose immunity from getting infected with a virus that can make you very very sick (or dead) and cause you to infect many others around you if they could otherwise get immunity from a free vaccine with none of those problems?
I am in South Africa. We initially struggled to get the vaccine, and the first phase was literally a clinical trial for health workers. We eventually did get stock, but it seems by then a significant number of the population had got Covid. The Omicron variant doesn't seem to have affected us as much, and surveys have shown that up to 80% of South Africans have natural immunity from having had Covid. Yes, there are some papers on the topic, but government policy does not feature anything related to natural immunity. It wasn't a case of getting intentionally infected; rather, it was a case of getting vaccines later than other nations.
It's certainly not good that so many people were forced to roll the dice initially because of lack of access to a vaccine that already exists. As new waves of variants get spread around data has been being collected so we can learn more about the immunity given by both vaccines and prior infection. Your comment mentioned a lack of research being done, but said nothing about government policy. How do you think government policy should have changed in relation to natural immunity?
From what I've seen a prior infection may give stronger protection than vaccines which is great for people who caught the virus and survived without long term health issues, but that's no comfort for the people who didn't. I hope that access to vaccines has improved in South Africa since whatever degree of protection we get from either an infection or a vaccine doesn't last very long. I'll be trying to get my 4th shot in a couple months.
> How do you think government policy should have changed in relation to natural immunity?
This is my own opinion based on my observations in South Africa. Yes, it is unfortunate we could not get access to the vaccine in the quantities and at the time we would have liked, but the lesson is we need to improve our ability to manufacture vaccines. There are some promising initiatives in this regard. I digress though. South Africa keeps an eye on weekly deaths [1] and can then work out excess deaths. The excess deaths during the recent Omicron wave, which peaked at the end of December, were significantly lower than the deaths during previous waves. Government continues to encourage vaccination, but the restrictions to reduce the spread of Covid are at the lowest we have had. Even during the peak of the Omicron variant over New Year's, restrictions were being lifted [2]. So without coming out and saying it, it looks to me that the SA government acknowledges that natural immunity seems to be helping keep hospitalisations and deaths low. This is despite most of the research in SA tending to place less emphasis on natural immunity and focusing on the vaccine instead. So all communication from government continues to stress the need to vaccinate, yet restrictions continue to fall. It could be that I have not read enough, but the government's attitude has been welcomed.
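The excess-deaths comparison described above is simple arithmetic: observed weekly deaths minus a baseline expected from pre-pandemic years. A minimal sketch with made-up numbers (not real South African data):

```python
# Hypothetical weekly death counts -- illustrative only, not real data.
baseline = [950, 940, 960, 955]      # expected deaths from pre-pandemic years
observed = [1200, 1500, 1400, 1100]  # deaths recorded during a wave

excess = [o - b for o, b in zip(observed, baseline)]
print(excess)       # → [250, 560, 440, 145]  per-week excess
print(sum(excess))  # → 1395  total excess for the period
```

Comparing that total across waves is the kind of signal the comment says the SA government appears to be watching.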
It's not only greed, but corporatism. People in institutions are entrenched and will defend their own interests even when it's immoral and stupid. And at the end of the day most will have zero doubts about their integrity and rightness, because the human mind is amazing and allows for a large degree of cognitive dissonance.
The sad thing is that there was always money to be made by having a greater understanding of our world and developing new technologies. Investing in research pays off very well, but it requires long term thinking. If you only care about next quarter's profits and growth you aren't going to make that kind of investment. Not when you can manipulate science today and get a lot of money right now. 3M and DuPont poisoned the world and even their own children with PFAS for decades after they knew it was harmful because they couldn't resist the money they'd make doing it. For a lot of people, no amount of wealth is ever enough.
Assuming that that paper is not false, it highlights lots of areas/reasons why things can go wrong.
Nonetheless, flawed as it is, the 'scientific method' is the best we've got. "How do you get people to discover reliable knowledge without getting sidetracked by bias, bribes, career incentives, corporations or politics" is actually a very hard problem to solve. The cobbled-together bunch of institutions, practices and traditions that we call 'the scientific method' needs to be understood for what it is - a patchy, error-prone but just about serviceable method for fighting our own biases and shortcomings.
Always good to look for ways to improve it I guess but overly cynical or overly optimistic takes on it don't help much. We need to have a realistic view of it.
The difference versus academia of old is that science used to be the hobby of rich gentlemen who were under no financial pressure to pay their mortgage and feed their families. Nowadays, I have no doubt everyone is in their field because they love it and want to further it, but if you have to choose between unemployment and taking on projects you didn't choose, most can live with the latter.
So the solution is to somehow decouple it again. Give universities a blank cheque to employ scientists to work on whatever for science's sake, not for business. Sure, that's obviously open to exploitation in a new way, but we only need one major discovery for the whole thing to be worth it, which we can't get with small incremental research.
> Give universities a blank cheque to employ scientists to work on whatever for science's sake, not for business.
Personally, I favor the opposite—stop awarding universities the "indirect costs" part of grant awards unless they agree to cap the hiring of grant-seeking investigators. The number of grant applicants needs to be balanced with the supply of public funds, and I don't think the public is willing to pay much more than they already are. And I say this as someone who is currently grant-funded.
I don't know which field you studied, but I think that in physics it's not quite that bad. There might be some research published that's just wrong or fraudulent, but not that much. I think it's because in physics, outside maybe high energy particle physics, it's usually feasible to check other people's experiments. In theoretical physics it's certainly common to work through other people's results before extending them.
However, I would say there's a lot of pointless research. Lots of graduate student and postdoc working hours are spent to produce results that no-one could possibly think are interesting or useful just because papers need to be produced.
I think it's the reduction of the system into a career mill (PhD for students, tenure for postdocs and grant money for professors) that ruins it. Science is inherently too unpredictable to be reduced into an algorithm that just about anyone can use to produce knowledge.
>I don't know which field you studied, but I think that in physics it's not quite that bad.
It's not that bad because it doesn't need to be. In theoretical physics I could have spent all day every day working on problems that have no physical evidence behind them, at a top-tier university. This is fraudulent because physics should have something to do with the real world. If you want maths for maths' sake, go into maths.
In your opinion, when was most scientific research not flawed or "useless"? Remember, the Einsteins, Clarks, etc of the world were wildly exceptional outliers.
Do you really think that if you were to look at most research done between 1897 and 1922, the overall quality would be higher? And that there would be less social factors competing with the integrity of work?
> In your opinion, when was most scientific research not flawed or "useless"?
How long has it been since replicating someone else's study to validate the results was routinely part of the scientific process?
>The replication crisis is an ongoing methodological crisis in which it has been found that the results of many scientific studies are difficult or impossible to reproduce. Because the reproducibility of empirical results is an essential part of the scientific method, such failures undermine the credibility of theories building on them and potentially of substantial parts of scientific knowledge.
Isn't that an economic problem rather than a scientific one?
Science doesn't exist in a vacuum, it still needs humans to do grunt work and those humans need to be paid, and there isn't much money in replicating an existing study and saying "yep, that's what we thought".
Even new research needs funding. The point is that you can't trust the findings of research to be accurate if it hasn't been proven to be reproducible. When it comes to science it doesn't do any good to fund research into X if you don't actually do the grunt work and part of that work needs to be seeing the initial results carefully reviewed, and then replicated.
Right now, far too often "review" is a rubber stamp and replication never takes place. That's because often science isn't really being done. If you're Tropicana you might happily fund study after study after study tweaking it each time until you get the results you're looking for so that you can get "OJ may reduce risk of cancer" into the headlines, then bury the results of all the research you funded that contradicted that, but that isn't science it's just advertising. In a better world, anyone involved in that kind of shit would be blacklisted as disreputable if not charged with something.
Research that isn't or can never be replicated is just barely better than speculation, and not really worth much of anything. If someone wants to fund science, we should be insisting that the process is actual science and the results are meaningful.
But again, this is an economic problem. In a perfect world, we'd have an infinite fund for doing science and you couldn't publish a paper until your results were reproduced.
But we live in a capitalist society where incentives are profit-driven (mostly). That's a reality regardless of whether you think it's good or bad.
That's why we need strong regulation and oversight. We know humans are highly vulnerable to greed. We can't (and arguably shouldn't) change that. We can however put measures in place to limit the harm we do to ourselves because of it.
> Isn't that an economic problem rather than a scientific one?
I'll add the other sentence from the first paragraph of the source above.
>Because the reproducibility of empirical results is an essential part of the scientific method, such failures undermine the credibility of theories building on them and potentially of substantial parts of scientific knowledge.
You have to fund it, simple as that. Tenure is granted to professors who get grants. Grants are gotten by publishing papers. Papers are published by conducting novel research.
One thing we could do is mandate that some portion of all grant money has to be given to independent researchers who will work to confirm your findings. I could imagine some downsides to doing this, but at least it would put money in play.
Also, another problem is that the people doing research are usually grad students working toward a Ph.D. No one wants to do a Ph.D. confirming someone else's results. You'd need some other workforce to do the work of reproducing research. Again, doable, but there needs to be money allocated for this task.
> Do you really think that if you were to look at most research done between 1897 and 1922, the overall quality would be higher?
I suspect yes, because the poor research was simply not being done. Science was not a career, there was no pressure to publish, it was the pastime of an elite few.
> I suspect yes, because the poor research was simply not being done. Science was not a career, there was no pressure to publish, it was the pastime of an elite few.
One of the critiques I hear about Academia these days, even in hard sciences, is that it is now significantly more dominated by the sons and daughters of elites (who don't have to take out loans and can afford unpaid research opportunities etc. etc.) So it might be on its way back to that.
Not to mention the "publish or perish" ethos that has become such an integral part of promotion & retention policy (and grants) in so many academic institutions.
Perhaps you are imagining less imperfect, less egotistical, less insecure scientists in a past Golden Age of Science. I for one have stopped believing such an era ever existed, or will ever exist. But I continue to believe that the scientific process will continue to serve humanity well despite the terrible flaws of us humans attempting to apply it. Yes, I believe in the scientific process, that it will self-correct, that the attacks by bad actors and self-deluded participants will come out in the wash.
As a doctoral student I had a paper rejected because I didn't toe the dogmatic line of one prestigious reviewer (seriously, the review feedback lacked only the word "dogma"). Terrible, right? But this is a common theme over centuries as near as I can tell, and here we are, living in a magnificent future!
We will continue to benefit from the scientific process, despite all our individual flaws, and despite how easily we each fool ourselves (and sometimes others).
The landscape is different today, which attracts a different sort of character. Back when scientific pursuit was a passion, you mostly had the creative and innately curious types who were drawn to the field all working on the same problems.
Now you have a large swathe of careerists with a handful of curious types, which invariably means the curious types get drowned out in the discourse and ultimately chased away from scientific pursuit.
The incentives are different as well. Peer review and showcasing ideas/work for grant money are new things. All of your work has to go through a committee of your non-curious careerist B-tier colleagues. If they don't like what you're doing, you won't get published and you won't get a grant.
It's now become a Job with the objective of getting all your work approved by committees. So it's different today.
I personally know two people who left Ph.D. programs because of fraud from their professors/departments in research. I think a lot of it stems from working out an experiment or theory for years and finding something that invalidates it. They then cover it up or embellish data to keep funding and not lose all the work time.
At the heart of the issue, as you arrived at, is that our researchers are constantly under pressure to meet some funding goal, constantly looking for their next grant to continue work. This pressure should not exist. It is my belief that, in absence of that pressure, whatever 'fraud' is happening will no longer need to exist. Of all things, researchers should not need to be worried about funding.
As a researcher, although I love the idea of a pressure-free job, I'm also keenly aware that I'm being paid by tax money and therefore should be accountable for my productivity (or lack thereof). I would rather have more robust review of research activities, and reliable consequences for misconduct, than freedom from financial pressure.
I am also somewhat skeptical that removing funding pressure will reduce fraud. It may reduce the amount of show publications, i.e., work that doesn't really make any progress but isn't wrong either. But in my experience outright falsification of results comes from entitlement. A fraudster feels that they are entitled to a certain level of achievement and recognition for it, and is willing to make something up when they're not getting that recognition. I believe that the pressure that drives a person to commit fraud is internal, not external.
I think you aren’t leaving any room for scientists to be mistaken, which is an essential part of the process. In fact, it’s impossible to prove you’re correct in science, only to falsify a statement.
Your claims have merit, but they’re stretched slightly too far. The best work is done among the chaos. And doing stellar work is often a lot more chaotic than it appears in camera-ready papers.
Thank you for sharing. I did not expect to watch the entire 1.5hrs, but this story was exceptionally well prepared and presented, educational, and surprising about how far a bit of fraud can go even in hard sciences. Also, why these systems work the way they do.
> if one were to assume that every single scientific discovery of the last 25 years was complete bullshit, they'd be right more often than they were wrong.
Has that maybe been true since the beginning of 'natural philosophy'? Perhaps things seem (much) worse now because we have survivorship bias, the same way classical music seems higher quality than pop, because the 90% crud[0] has been long forgotten.
> Has that maybe been true since the beginning of 'natural philosophy'?
Unlikely. In its infancy, science -- aka "Natural Philosophy" as it was called at the time -- had to establish its credibility. It did so by delivering results.
The science system isn't perfect, scientists are just people, and if you start out with an overly naive view of what the science system is like (as I once did!) you might get disappointed.
Still, your take is wildly, overly cynical.
At the end of the day science is nothing more than systematically looking for explanations that help to make predictions. There is no way that that doesn't converge on something useful, since every explanation that is of any consequence will ultimately be put to the test and it will become entirely obvious whether it is useful or not.
And so here we are: computers compute, drugs are effective, airplanes don't fall from the sky. Is there uncertainty at the margins? Sure! But no one who proclaims to know things better than "the scientists" due to some fundamental flaw in "science" as a whole has anything to say that would be worth listening to.
Because that's not how the world works. Understanding things is not about picking the right team, it's about deeply engaging with the data and formulating logical hypotheses. And once you're doing that, you're now also a scientist.
Edit: To add a bit of personal experience to the somewhat theoretical argument above... I've met a lot of good people and a lot of assholes in academia. I've certainly met a lot of people who liked to make their results sound a bit more important than they probably really were. Never once have I met someone who didn't ultimately care about drawing valid conclusions from good data.
> "Educated" people can say what they want about how important it is to believe in the science and have faith in our researchers and the institutions they work for.
This comment shows the problem in the argument. Science does not require belief or faith. In fact the idea of faith is the antithesis of science.
Science is about the hypothesis best supported by evidence, which remains current knowledge until a hypothesis comes along that is better supported by the evidence.
Questioning is required, inquiry is required, not faith.
The problem is when you would expect someone else to have faith just because you do. Or equally, that you would expect someone to find an explanation compelling just because you did.
People who don't understand speak with conviction as if they do, but then resort to this chained together appeal to authority as their reasoning.
Yes, I'm sure everyone has a home lab they go to for verification when they read the newspaper. I'm not being snarky - this illustrates a core problem in epistemology, trust and social systems (in case the NIH/Fauci/Wuhan debacle already hasn't).
Which exactly illustrates my point. You need to intelligently reason about what you have been told, understanding that you may not be a domain expert and that your conclusions will almost certainly miss something that the domain experts have taken into account.
When authorities present a claim about a new virus and say it's going to be dangerous, do you think "authorities always lie about everything", or do you think "well, viruses are a thing; I know they can mutate, because if I get the flu this year, it's no guarantee I'll be immune to the version that comes around next year"? You've also seen evidence that vaccines are generally safe and work, because most people have had them and, largely because of that, people in the western world for the most part don't suffer from polio.
You could say, nah, they lie, this is plot to force us to have genome altering vaccines that will install mind control chips that kill kittens. (yeah, over the top I know)...
Most people have a degree of trust (not faith, very different), until proven wrong, that public health figures are actually trying to improve public health.
I don’t know what is going on in this thread. Did science sneak into your house(s) and kill your pet cat?
I don’t know what Universe you all live in, but I’m in my 40s and scientific research has made incredible strides just in my lifetime. This includes academic research, industrial research, both together or looked at separately. We have made amazing progress in understanding the human genome, the internals of living cells, devising incredibly powerful semiconductor devices, probing the mysteries of the Universe, developing useable machine learning technology and other areas. We have a goddamn space telescope at L2. Even the “controversial” areas are incredibly successful. In the aggregate, climate change papers from the 1980s and the 1990s broadly predict the warming we can measure today, and COVID research (despite being about two years old) has gifted us vaccines that provide enormous measurable benefits and were delivered on a timescale that previous generations would be amazed by.
Yes, at the micro level there are some individual bad results as there have always been and a few fields have had difficulty with replicating small-effect-size experiments, but even there we only know about these issues because scientists pointed them out and began improving their techniques. In the vast majority of sciences, if you pick a point 20 or 40 years ago and look at the tangible progress we’ve made, it’s enormous and more impressive than anything we have accomplished in any other area of human endeavor. I’m not sure if people here are angry at the world, or if they’ve just raised their expectations to unattainable heights, but it’s awfully depressing watching people ignore this progress.
The natural sciences offer a unique opportunity for an individual to change the world on their own. Institutions and peer review exist only to verify that the individual offered the best new model of the world.
Academia has been taken over by an advanced persistent threat of social sciences that are all completely corrupted. That corruption infects and touches every aspect of the academic's life cycle these days. Funding determines most research, and doubly so the conclusions. The infection politicizes everything and rewards those who use scientific branding to support the establishment.
I think the expression "see how the sausage is made" is widely misused, here included.
While you might not have enjoyed your stint in the sausage factory, sausages have a long, successful history. Handling meat safely is hard, and it has been done with varying levels of success over the years, but historically sausages weren't much worse than any of the other meats despite the nature of their creation, and with modern food standards they are pretty much safe when handled properly (of course proper handling depends on the type of sausage).
If you didn't like what you saw and, as a result, decided not to eat any more sausages, this seems more like a gut response on your end than any particularly deep insight into sausage production. People are generally aware that some of the steps in the process aren't nice to look at, but they judge the product based on the end result.
I also disagree. Nobody documents or publicizes the small labs doing small "useless" stuff. The kind of stuff that seems irrelevant but slowly piles into a mountain of new information. As a PhD myself, many times I thought "this research is quite insignificant," and what we found was truly small potatoes, but I can only hope that, in the aggregate, all of us who did small stuff during that period advanced science in significant ways.
It’s often useful to have the obvious or insignificant already established in the literature allowing one to focus on more elusive and revolutionary research!
This highly depends on the field. In physics the barriers are very high. I mean, not necessarily for things to end up on arxiv.org, but to be published in a peer-reviewed journal. Although that probably doesn't hold for every field, and perhaps isn't even feasible because the data is just not there. Thinking about medicine, every single data point is probably a lot of work (and, speaking of human trials, four-digit expensive).
Across the academy, there is a movement away from the mission of seeking truth to the mission of seeking justice. The exact sciences are about a decade behind the social sciences in this transition.
Nah. Just because societies and journals in STEMier fields are moving towards being more inclusive in broader practices does not equate to a fundamental difference in what counts as science.
Change the economic incentives in academia and research. If right now universities and researches are rewarded for publishing studies/papers no matter the quality, change the rules so that you're rewarded for quality, not quantity.
I think research quality isn't the primary problem. Yes, there are bad-quality papers being published, but it usually depends heavily on the institution. The real problem here is the unending demand for novelty. I find it weird that academia has a very high risk appetite for research that's "shiny and new" instead of focusing on reproducibility. They'd rather have a bunch of "new" topics with flaws in their methodologies than verify old ones that have a solid foundation but are boring. The thing is, a lot of advisors are incentivized to do so, especially in private universities where they can advertise these new papers just to entice more students to enroll despite their exorbitant tuition.
I'm speaking from first-hand experience at a private university whose image heavily depends on whatever "shiny and new" thing their students somehow invent. My university spends so much advertising those things while constantly raising prices at the same time, a shady business model. One of my business professors, who teaches at the same institution, also subtly questions the ethics of these kinds of universities.
A lot of institutes are also pressuring their PIs to produce not only novel work but also highly translatable and patentable work. I know of some institutes that want to preview any work a PI wants to publish or speak about in a public venue, in case any of it can be patented. Keep in mind that most of these places are funded through government grants and taxpayers, but none of the money from these patents is ever funneled back to them.
I'm not in a position to do any of that, sorry. What do you recommend I do right now to obtain the best available public health information for myself and my family?
(1) Obtain a solid understanding of statistics, (2) formulate a hypothesis, (3) collect multiple viewpoints, data and arguments related to that hypothesis, (4) sift through the collected data and judge their quality based on reproducibility and history of the author, (5) be prepared to not reach any solid conclusion
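Step (4) above - sifting collected data and weighing study quality - is what meta-analysts formalize. A toy sketch of fixed-effect inverse-variance pooling, with made-up effect estimates (more precise studies get more weight):

```python
# Toy inverse-variance pooling of effect estimates from several
# hypothetical studies -- a crude stand-in for step (4) above.
studies = [  # (effect_estimate, standard_error) -- made-up numbers
    (0.30, 0.10),
    (0.10, 0.05),
    (0.25, 0.20),
]
weights = [1 / se**2 for _, se in studies]
pooled = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)
print(round(pooled, 3))  # → 0.145: dominated by the most precise study
```

Note how the pooled estimate lands near the low-variance study's 0.10 rather than the naive average of 0.217 - one reason "counting headlines" misleads.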
Making a decision for your and your family's health is orthogonal to all of that.
For public health choices for your family, I say do what the majority of experts in the field are doing. And only listen to actual experts specifically in the field of study that pertains to the public health issue. I.e., don't listen to a non-practicing celebrity cardiologist (Dr Oz) about immunology and virology.
For example, you will be hard pressed to find a virologist, immunologist, or epidemiologist that isn't vaccinated and having their 5yrs and older kids vaccinated.
I also like to get my health information directly from doctor groups like the American Academy of Pediatrics. Another good resource is the websites of groups like the Mayo Clinic, the Cleveland Clinic, or the editorial board of the New England Journal of Medicine. All of these groups are made up of many expert doctors and issue guidance based on input from a lot of experts.
They are rewarded for impact, which weighs quality over quantity, but only in a complex way that mixes social dynamics with profit motives, i.e. the journals.
The journal editors have a lot of power. As a start they could devote at least half of their articles to reproduction studies instead of novel research.
Science and academia have pushed with all their might over the past few years to convince anybody with a functioning brain that random YouTube videos are, in fact, better sources for truthful information.
You only need to look at all "the science" which gets ignored by people pushing an agenda to know it's not about science, it's about power and control.
The ones who are ignoring "the science" to push agendas are the ones currently without power and control, who wish they had it. Since they cannot earn it honestly through correctness, they try violence.
All of this may be true, but it leads to the question: if not science, what do you recommend instead?
Not being a virologist or epidemiologist, I have to listen to somebody. Are you saying I'd actually be better off listening to Art Bell and Joe Rogan than official sources like Fauci? If so, that's an extraordinary claim if there ever was one.
Accept that there is no source of info you can just believe at face value.
You need to collect information from across your infosphere and intensively de-bias and cross-check it.
It'll take a lot of effort. If that's not worth it to you, go agnostic on things you don't have time to figure out, or join a tribe and shout the slogans with your tribe-friends. Your life choices will depend on your values.
This has been my method for quite some time. It is very time consuming. I rarely reach conclusions, but I get close enough to feel confident in my choices.
Agree 100%. And the same is true for a lot of other things, they're just not as contentious.
Every time I get on a plane I have to trust the science and engineering behind it, I have zero clue why it works but I can't become an expert in aerodynamics just to go on holiday. When I drink milk I have to trust the science behind pasteurization, I trust the physics calculations when going up a high rise and the same for when I take medicine.
I have to trust someone for 95% of my day-to-day actions, and while a system like "science" is flawed, it's the best we've got so far. All up for improvements, but let's not throw the baby out with the bathwater.
I mean you raise a valid concern, it's a form of corruption, "muddying the waters", or CV padding that should be combated.
But doesn't low quality science impact the universities and papers that host these things? Wouldn't e.g. Elsevier want to keep charging money for high quality content? Are people and organizations not losing their reputations, jobs, and income over this?
I don't think switching over to laypersons' podcasts is a valid alternative to bad science.
"But doesn't low quality science impact the universities and papers that host these things?"
Sadly, no, it doesn't.
"Are people and organizations not losing their reputations, jobs, and income over this?"
They are not. The system doesn't actually self correct in any visible way. The people who fund bad science either can't tell the difference between good and bad science, or don't care, or both.
"Wouldn't e.g. Elsevier want to keep charging money for high quality content?"
You mean the companies that happily publish gibberish written by spambots as "peer reviewed science"?
"Hundreds of articles published in peer-reviewed journals are being retracted after scammers exploited the processes for publishing special issues to get poor-quality papers — sometimes consisting of complete gibberish — into established journals. In some cases, fraudsters posed as scientists and offered to guest-edit issues that they then filled with sham papers. Elsevier is withdrawing 165 articles currently in press and plans to retract 300 more that have been published as part of 6 special issues in one of its journals, and Springer Nature is retracting 62 articles published in a special issue of one journal."
If that's the only way to hear the full panel of scientists (as opposed to only those who are in bed with Big Pharma), then I'll gladly listen to whatever podcast they are on. And I don't need liberal-lefties to tell me who I should listen to and how I should make my opinions, thanks.
Much more abstractly, even if everyone had perfect motivations, it's science's role to be progressively less wrong over time by invalidating faulty hypotheses/theories but almost never to be conclusively correct. Any claim that we understand how or why something works should be taken with a mountain of salt. Claims that a certain observation were made are usually much closer to actual science than any claims of a conclusion.
For example, Aristotle (generally a very influential and intelligent man) believed scallops emerged spontaneously from sand because of the limits of what he could observe back then. The observation itself wasn't wrong as far as the absence or presence of visible scallops, but what he didn't realize was that there was an entire system he couldn't observe with the tools at his disposal, and this ignorance led to an incorrect conclusion.
Even Isaac Newton, who forever changed the world with his work on physics and calculus, was an alchemist as well and had some weird ideas about how to treat the plague of 1665-1666 (https://www.theguardian.com/books/2020/jun/02/isaac-newton-p...). These were also born out of observation and Newton just doing the best he could with the tools he had at the time.
Overall, I believe it's important both to believe in science and scientists (even in the face of those cynically exploiting the system) and to realize that even at its best, scientific information is ultimately about observations, not the ultimate truth of systems, even if that's the unattainable goal of science.
> Most research is flawed or useless [...] if one were to assume that every single scientific discovery of the last 25 years was complete bullshit, they'd be right more often than they were wrong.
If this were true, then you could easily provide numerous examples of discredited papers published in journals like those of the AAAS and National Academy of Science, and yet you didn't. Your comment is worthless.
After years of studies and working for a university I have to agree, at least for the most part. However, I think there is significant variation in the quality of science between different fields.
Some fields, like the social sciences, have long been very politicized and ideological, and this situation hasn't changed for the better. I wouldn't take most studies from fields like social sciences, business, and the humanities very seriously, especially qualitative ones.
Secondly, pay and working conditions in academia generally suck, which means most actually competent people avoid it like the plague. This also has an effect on the quality of research. Becoming a researcher these days, in my country at least, generally means becoming one of the working poor; who wants that?
Many responses are already commenting that science (the process) is by design capable of addressing these issues, but that timespan can be longer than we’d like.
One major point in this comment, and in many others in support, is the observation that the quality of research output is increasingly poor and degraded. It reads to me as though there is an implicit assumption that, because of this, the overall process is no longer valid.
It’s not at all clear this is the case, and jumping to that conclusion is taking a very emotional and unscientific stance. Many academics get a shock at some point early on when they look into how things actually work at a small scale, and often later in their careers come to see the effectiveness of consensus and reproduction (or don’t care and just try to get credit for them and theirs). Jumping to the conclusion that science is somehow falling apart requires taking about change over time, not point observations.
Now I’m not providing evidence either way, and I definitely see reasons why it could be the case. But at the same time the volume of scientific output is larger than it has ever been, and many fields are developing at an unprecedented rate. And we’re seeing the result of this in rapid technological advancement in many fields, so it’s clearly not all complete horseshit. It’s possible, likely even, that because advancement is comparatively rapid, getting to solid consensus merely appears to take longer.
You have hard science where the research is based on repeatable experiments in a lab where you can clearly isolate every other effect.
You also have softer sciences where experiments are difficult or impossible, and where people try to come up with the best approach. Climate science falls in that category (you can’t really experiment with Earth’s atmosphere, at best backtest your model to check it is compatible with your data).
The problem is that medicine falls within both of those categories. Some parts are fully replicable (like test 3 anti-allergy creams on the same skin at the same time), but many others don’t have the luxury of being able to experiment on humans. So you end up relying on correlation analysis with low samples and a million possible variables. Some bits like epidemiology are more akin to climate science.
The problem with these softer sciences isn’t that they aren’t science (the progress of medicine over the last century is prodigious), but that it’s hard to trust any study individually. So when someone points to a paper that shows a reduction of X by Y% if you do Z, my first reaction is skepticism. So I am struggling with the whole “follow the science” thing, and the idea that any objecting opinion must be crushed.
I spent 6 years at a university in the top third in the US, working on nanotech. My respect for research is not the same after that. I felt everybody runs after Science/Nature papers and funding. I haven't found honest curiosity to discover new things. The funding organizations are also partially to blame: instead of funding good ideas/projects, they started influencing and interfering in the research for their own interests.
Yes, but there is still hope. Not in the current academia (IMO rotten beyond repair) but in human nature.
Because have you noticed how when something "real" hits the science shelves, any honest research grounded in reality and careful thought, it goes like hot cakes? The research doesn't even have to be revolutionary: anything repeatable, or at least sanely executed. The bar is incredibly low.
Everyone is starved for real ideas… or at least real implementations. Anything authentic, in the sea of marketing BS.
Which I find a hopeful sign! As long as that human instinct persists, the truth bubbles up. People still "feel" wrong when lying, which is great. When that stops (a question of culture?), we're in real trouble.
Funny aside: The front page of today's national news: an elected university rector had to step down. Reason? Academic fraud!
> The biggest problem science is facing is not an external threat from a rabble of ignorant science deniers, but the complete degradation of quality within the scientific institutions themselves.
You're correct, in that what you describe isn't a threat to the ivory towers of 'science'.
Unfortunately, here in the real world, outside of academia, those people's unassailable belief in all flavors of bullshit has quite a negative effect on the quality of life of the rest of us. But at least I can sleep soundly, knowing that 'science' isn't being harmed by them.
When, say, the planet burns down around us, because of inaction on climate change, I'll take solace in knowing that the science behind climate change was never in any real danger - unlike the rest of us.
Given the boogeyman in the late 90s was that we'd be out of coal by 2020 (I can dig out the material from British schools to show you this!), I think we can survive the over-exaggeration of how soon, and how alarmingly, the end of the world is coming.
As I keep saying, why can't we just make the planet better? Why is that so difficult? What's the risk?
I think you're hitting an important point here. The academy at its best is meant to be like a hospital for people's ignorance about their subjects and neglectful, intellectually lazy practices. What you're commenting on right now is the many forms of proverbial sicknesses that have gotten out of control. I personally think if the situation is going to be improved, it's a "physician heal thyself" situation, where some of the very people guilty of p-hacking and junk science are even now being called to come forward and change, taking on a risk of their careers as a sacrifice in exchange for a real possibility for us all to move forwards.
I'm assuming you entered a field where you weren't prepared for the egos and controversy surrounding a topic.
This is what a lot of students hit if they're ill prepared (this is not a bad thing on behalf of the student). I remember my advisor saying he saw it as his responsibility to shield new students from anything crazy controversial until we knew how to walk for ourselves, make plots quickly and easily, and understand the jargon and materials... aka until we were mid-way through, then we should be thrown to the wolves to form our own opinions on topics.
You're now falling for the Daily Mail mistake of "big sensational headline in field X, therefore it must be false."
Just look through the Nobel Prizes to understand what directions and leaps fields of science are making, and ignore the self-perpetuating egos that end up on the front of 'Nature' or the Daily Fail.
We have cloned animals, we have perfected fiber optics, we have discovered the nature of the Higgs boson and gravitational waves, and I don't even understand enough about chemistry to know why or how that field has advanced, but it clearly has.
Unfortunately I'd suggest your supervisor failed to prepare you for your PhD.
Don't worry, I can point to several papers with maths flaws that impact the result by 2 sigma, but nobody cares about those results because it doesn't change the overall message, "We saw nothing after intensive scrutiny and looking for it," despite it being on flaky mathematical groundings.
That 'rabble' of science deniers is growing very quickly (just look at the evolution deniers in the US) and they keep making arguments like "you can't take the people out of science".
If the rabble gets big enough, politicians listen and then when politicians are listening to anti-science people on whatever topic, funding inevitably gets cut or re-directed and scientific research suffers.
Don't worry, I'm not talking about the types of research that support Raytheon by making bigger bangs, but research that cures cancer in babies and infirmity in adults and makes better materials for the world...
I don't disagree that there's degradation of quality. There should be a lack of trust in our institutions (and really a need to invest in them more). What confuses me though is why there is trust in sources that have so much less reason to be trusted. The forces undermining the quality of scientific institutions are at play outside those institutions as well, and outside the framework, there's far less of a defense against them.
We're seeing all kinds of chicanery that I had thought had largely been consigned to history, but it's all making a comeback.
> The biggest problem science is facing is not an external threat from a rabble of ignorant science deniers, but the complete degradation of quality within the scientific institutions themselves.
With respect, you can make this criticism about every single field and discipline. This is because the endemic problems under discussion have nothing to do with science per se, but are unique to the human condition.
The optimistic outlook is that science is an integral part of human progress, technological or otherwise. The implication here is that even if there are a lot of rubbish papers out there, they will be weeded out, because progress demands practical concepts that actually work. So the dodgy work of "scientists" who rort the grants system will ultimately be discarded and forgotten.
This forum and all the infrastructure that supports it was based primarily on hard science and conducted in the "hot path" of tech progress where inaccurate results are both unacceptable and quickly falsified: mathematics (Shannon, etc.), physics, chemistry, materials engineering, electrical engineering, and computer science.
There's a reason technology has changed the world significantly in the last 50 years and other scientific areas (e.g. medical research) haven't as much.
What other areas have you experienced? What you're describing is just human nature, every field of human endeavour is like that, because they're run by humans.
Despite this, we have made unimaginable scientific progress over the last few hundred years, and we are continuing to progress at an even faster rate.
The rabble of ignorant science deniers aren't a problem for science, but they are a problem for policy. Reality itself can't sue climate change deniers for defamation, and for all intents and purposes they have control of policy in the world's largest economy as long as the US senate is deadlocked.
As someone who also works in academia, it is astonishing how many people with no background in the field come into this thread to cast doubt on you. There is a serious degree of conservatism among liberals who think that science and academics should not dare be questioned.
Funny how "don't believe, always question everything" has transformed back into "believe in science".
Believing in science is a very old thing: the Maya believed in their eclipse-prediction science so much that they even made human sacrifices to save the sun from being eaten.
We should teach the public not to "believe in science", but the main skill of any scientist: to try to find errors in the theory you think is right with more ardour than in the theory you think is wrong.
I think we got there by mixing up science and policy. In an emergency (like a pandemic) you can’t have an atmosphere of doubt and uncertainty in decision making; someone has to make the tough calls and stand by them. Unfortunately we don’t have a clear delineation between “science” (what and how we know, and how certain we are) and “policy” (what we should do based on this information) in the public discourse: the same talking heads are often representing both. Then we naturally end up at a place where we have to either “believe in science” or cast doubt on policy decisions, which seems suboptimal.
Why can’t you have an atmosphere of doubt and uncertainty in decision making? Doubt is always better than false certainty. Public figures lying during the pandemic have only prompted more people to be distrustful of science and official policy decisions.
Because then nothing would ever get done. We would debate endlessly whether declaring the war on Germany was a good idea, or whether going to the Moon is a stupid goal, etc. etc. There’s certainly no shortage of arguments against any decision. Once the decision is made, continuing to doubt it is often not helpful.
I actually think this was one failure in our response to the pandemic. Take N95 masks or testing or what have you. There’s been so much back and forth on how well things work, and so many changes of messaging (often driven by real studies, thus reflecting the uncertainty we’re talking about!), and I think that’s the cause of some of the distrust you’re talking about. In an alternate reality the CDC would have come out on day one to say “N95s are obviously good, and so are widely available rapid tests, so let’s make a ton of them” and stayed on this message. It’s the uncertainty and doubt that create distrust, not the other way around.
But people were debating all these things: war on Germany was not declared for a very long time, and the war in Vietnam was eventually abandoned; I'm not sure whether to consider that good or bad.
The initial stance on masks was to lie that they were useless so that people wouldn't buy masks that were needed for doctors. So if the CDC were to stay on message, it would have stayed on that message, not the one that masks are good.
Your proposed approach, where a few people make decisions and the rest are not even allowed to debate, cannot create trust; it can only create a USSR or a North Korea.
I’m not sure how the discussion got to this place. Please allow me a reset. My point is only that ideally people in charge of policy should be upfront that they’re making decisions based on imperfect information, and it should not require people “believing in science”. This would have the effect of leaving the science, with all its messiness, to scientists, and would be better for everyone involved.
A sidenote on CDC, the null hypothesis is institutional inertia: there were no mask mandates during many recent pandemics so CDC was simply following the long standing policy. This doesn’t require lying: if you believe that a policy is useless and will actively harm supply of masks it makes sense to come out strongly against it. I’m open to evidence to the contrary though.
Not sure what you are saying, but I do believe that we scientist need an extra big dose of humility and we need to learn how to better accept when we are wrong.
Those are unsubstantiated statements about domains you know very little about.
These days science is so specialised and lacking narrative that you can reach PhD level without any notion of what's really going on in your field, and even less into others.
Wow man...totally nailed it on the head. And anytime you bring up questions around experimental design or anything, 99 percent of people are already yawning and not interested in anything you're saying at that point. Everything happening right now regarding the vaccines and COVID is alienating so many people and guaranteeing they will never listen to the government or any other "trusted" institution ever again.
Where did you study, what was your concentration, and did you finish? I'm not sure whether anyone on HN ought to care about your opinion. I'm not convinced that the sausage factory you experienced was a hot dog cart.
> The fact remains that if one were to assume that every single scientific discovery of the last
> 25 years was complete bullshit, they'd be right more often than they were wrong.
This is not a fact, it's a claim (it may even be hyperbole; we don't know for sure). But even at that, you're overlooking a very important perspective: there are more important and less important discoveries (results), and thus the effect of them being wrong should also be weighted. I'd also be inclined to think that the more important ones are scrutinized more and are thus likely to be right more often.
The great and unique thing with science, not ignoring the issues you've raised, is that it can correct itself. And it's better at it than everyday people are. So yep, the issues mentioned by you and others should be corrected (and some of these, like the replication issues and the publication bias, etc. are well known which means that I'm sure they are being worked on). But the general public also has to learn that science is still our best chance to understand reality.
Because most science (and thus reality) deniers usually just look at it as if it were a simple choice between "do I believe them or not", as in "do I let myself be lied to/misled or not". Whereas the real choice is: who do I rather believe? The science guy, or myself (without knowing much about anything, basically)? Which will give me the better results? Because when I have to make a decision (e.g. vaccinate or not) then it will be a decision between these two. (Just like "vaccinate or not" is really a decision of "contract the virus with or without being pre-immunized by the vaccine", and not "do I want to risk the side effects or not".)
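That last framing can be written out as a toy expected-risk comparison. A minimal sketch, where every probability is a hypothetical placeholder I made up for illustration (none of these numbers come from any study):

```python
# Toy expected-risk comparison for the "with vs without pre-immunization"
# framing. Every probability below is a made-up placeholder, not data.

def expected_harm(p_infection: float, p_harm_if_infected: float,
                  p_side_effect: float) -> float:
    """Combined chance of a bad outcome: side effect plus harm if infected."""
    return p_side_effect + p_infection * p_harm_if_infected

# Hypothetical placeholders: pre-immunization lowers harm-if-infected
# but adds a small side-effect risk.
unvaccinated = expected_harm(p_infection=0.3, p_harm_if_infected=0.02,
                             p_side_effect=0.0)
vaccinated = expected_harm(p_infection=0.3, p_harm_if_infected=0.002,
                           p_side_effect=0.0001)

print(vaccinated < unvaccinated)  # True for these placeholder numbers
```

The point isn't the specific values; it's that the honest comparison is between the two composite outcomes, not between "side effects" and "nothing".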
Feels like this is part of a larger pattern of corruption up and down institutions in Western civilization. Everything is gamified, monetized, and rigged.
Respectfully, I disagree with you: it's mainly because of social media. Academic degradation definitely contributes (it gives conspiracy theorists ammo), but if you've been exposed to the echo chambers that drive anti-science conspiracies, I think you'll realise how toxic the movement and social media are. Mainstream social media is the gateway, and it surfaces these influencers to the curious, starting with less harmful theories like Ivermectin, then moving on to antivax, homeopathy, and mistrust in doctors and traditional drugs.
A lot of these influencers have been deplatformed and have flocked to alternatives like centralised Telegram groups and niche social media. Non-conspiracy people treat conspiracy people like outcasts; no one likes arguing with no conclusion when the answer is 'I don't believe your source', so they retreat to these echo chambers for social acceptance. I've seen this happen to people I know: they just doom scroll and share anti-tech, -media, -pharma, -government memes. They become rabid in their beliefs and there is no pulling them out. I've seen this firsthand infect people who are "smart": university educated, career success. It's a shame; part of the decent people I knew is gone. There is a simmering undercurrent of "I'm smart and you're an idiot", and it's probably felt both ways. I'm not from the US, but I've read accounts of what Fox News does to people and it feels similar.
Or, such narratives are created, just as narratives that all politicians are corrupt, that meritocracy is wrong, as an attempt to weaken western democracies.
Most nations spend a notable portion of their defense budget, on disrupting opponents via whisper campaigns, and dividing an opponent from within.
This has been going on through all of recorded history.
Disrupting belief in any important part of your opponent's society is key. The goal is to cause societal dysfunction, an inability to work cooperatively.
Why fight a war, if your opponent self-destructs from within? Or, almost as good, loses some capacity to make war?
Also good, is if your actions cause your opponent to change regimes, and become aligned with your goals.
So how do you think this gets solved? What do you think some of the root causes are?
Semi-related: I was reading about the history of some of the "US greatest universities"; Harvard, Yale, Princeton, etc. were actually Christian universities? Didn't even Cambridge, founded in the 1200s, have a similar history?
I'm not sure of the complete validity, but it's an interesting perspective for sure.
I even remember reading that Einstein worked alongside Father Lemaitre, a Belgian Catholic priest (originator of the Big Bang theory), on some groundbreaking work?
I've always joked that maybe G-d gave us the computer because he took pity on us: it took 4000 years for man to calculate several hundred digits of Pi, and who knows how many great minds were dedicated to hand-calculating only a more "perfect circle". Didn't even Archimedes die calculating Pi?
"Nōlī turbāre circulōs meōs!" a Latin phrase, "Do not disturb my circles!". It is said to have been uttered by Archimedes—in reference to a geometric figure he had outlined on the sand—when he was confronted by a Roman soldier during the Siege of Syracuse prior to being killed.
The root cause is that scientists are experts in operating tools. Just like many programmers can write code, it doesn't mean they understand the process.
The medical profession is largely still the snake oil industry it was hundreds of years ago, and Govt legislation enables their monopoly.
I have as little to do with the NHS as possible because these so-called experts are experts in minutiae and not the big picture. They prescribe what they have been taught. The Govt/NHS also limits a GP's prescribing ability, so private healthcare should be the gold standard, not a communist-era make-work scheme like the NHS.
You try getting a doctor to explain their thinking, most will not, and when you challenge them they go all jihad on you, in a western country of all places!
This just demonstrates cognitive dissonance in what they have been taught. But then, any country, institution, or individual that believes in sky faeries just shows humans can hold diametrically opposed beliefs which are irrational, and that should be cause to be struck off any so-called professional register!
> The report says there is little evidence that calls for major platforms to remove offending content will limit scientific misinformation’s harms and warns such measures could even drive it to harder-to-address corners of the internet and exacerbate feelings of distrust in authorities.
Let's move towards reviewable moderation.
I'm happy to see this issue recognized by research. I believe secret removals, where authors of content are unaware it's been removed, are the primary problem. Removal in and of itself might not be a big deal if we could all review what's been removed.
> Investing in lifelong information literacy – Education on digital literacy should not be limited to schools and colleges. Older adults are more likely to be targeted by, and be susceptible to, online misinformation.
Is this a prank, i.e. obvious misinformation?
The prior point, that surveillance is needed, following the claim that individual platforms shouldn't be relied on for censorship, would also widely be considered plain wrong.
It's really questionable how science is to be defined, as soon as social aspects enter the picture.
I think we cannot avoid another Dark Ages and deurbanization without reputable authorities, information, and criticism of popular dis/misinformation.
Deplatforming creates martyrs and conspiracy theories, as does the lack of communication or the lack of reputable authorities.
I don't know how we can scale the limited time of subject-matter authorities except to direct their cannons at the biggest sources and individual instances of nonsense.
The biggest problem is the labelling of anything outside the 'consensus' as unscientific 'misinformation.'
This is particularly problematic where it is the media and government representatives determining what is consensus.
Many scientific breakthroughs did not come from repeating the established consensus and a lot of scientific consensus has been horrifically wrong in the end.
I find it interesting that here on HN we seem to have more conversations on censorship than actual world problems. We have two countries on the brink of war, rampant poverty/homelessness and limping economies in just about every country, and all we can talk about is why we should censor misinformation. Anyone else feeling we’re hopeless as a species?
I wonder if AI can help better vet science to see if there are flaws. I was looking at a paper recently[0] about how Swedish men cause more emissions than women. I feel like it has big issues that could be fairly easily checked with an AI system looking for logical fallacies and statistical errors. I think most research is BS and it takes too long to tell if it's real or fake. The most common outcome, unfortunately, is that research is fake.
My issues are that more men than women live in rural areas[1], so on average men will have to drive more and use more fuel. This data also seems to come mainly from a study in 1998, but they continuously say "men today"; the data is from two decades ago. It also suggests that buying local food is better for the environment than importing, but they probably don't grow food in Stockholm, so it would come from regional areas. Men also need more calories, so even though women emit more from their food, men are actually doing much better considering they are generally recommended to eat about 1.25x the calories of women[2]. I'm also very sceptical about the energy used for clothing and furnishing in the study.
This study was widely shared in places like NPR[3], The Guardian[4], CNN[5], CBS[6] and The Independent[7]. I feel like the issues I've pointed out are super obvious, and if there's a reason they are not problems then they should talk about why those issues are not relevant, especially in the news articles.
I haven't read the whole paper; I actually just looked at the study to see how much clothing emits per person (after an article on fast fashion was posted on here). It's very concerning to me that this is published in "pro-science" newspapers. Perhaps even worse is that the original paper is not even really casting blame, but the articles all do.
I've never seen anything like mass formation psychosis until now. It truly is astonishing to see how many people can be so easily hypnotized with the right application of fear and hope.
I was still a rule-following, shepherd-led, church-loving schoolboy back then. So no. I was the zombie back then, so I never formed those memories to 'member.
> The report says there is little evidence that calls for major platforms to remove offending content will limit scientific misinformation’s harms and warns such measures could even drive it to harder-to-address corners of the internet and exacerbate feelings of distrust in authorities.
We should never forget that Galileo almost burned for spreading what was "misinformation" back then. The science WAS in the hand of the church (whatever little science it was), so he was actually going against "scientific" consensus. I'm sure many more were actually burned, but I'm no historian.
Misinformation isn't harmful in itself; what is harmful are the choices people make based on that (mis)information. That's what legislation should fight, that's what society should fight.
Instead of censorship, how about more information, more education? People, even antivax and the like, have a good measure of cognitive capacity. We should strive to educate and inform them, and accept that some people are just gonna think differently. If they are 5%, well, that's alright. Maybe one of them is the next Galileo.
Science is being co-opted by corporations. They did the same with the so-called "socialism" or "liberalism".
Science is by its own nature meant to be challenged. Absolutist tyrants don't like this.
The only positive for me from this misinformation plague is that as soon as the next issue comes along all of the anti-Covid vaccine sentiment will be completely forgotten.
Don't believe me? When was the last time you heard about the MMR vaccine causing autism?
The "good" part of this is that rejection of science, politicization of science and the use of this issue to manipulate people who somehow claim to be "free" yet are the most easily manipulated aren't new issues. There was the MMR autism issue, resistance to polio vaccination and resistance to smallpox vaccines before that.
The damage is of course huge. The US is sitting at >3,500 Covid deaths per day currently. Annualized this is well over 1 million people per year (although it'll likely trail off), 90-95% of whom are now unvaccinated. These are people who are now steadfastly choosing to die. The government has moved on from this (eg the updated CDC guidelines on isolation to keep the economy going). People seem to have moved on (IME).
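The daily-to-annual figure above is simple arithmetic and easy to sanity-check (3,500/day is the comment's own number, not verified here):

```python
# Annualize a daily death count (naive: assumes a constant daily rate,
# which the comment itself notes is unlikely to hold as it trails off).
daily_deaths = 3_500
annualized = daily_deaths * 365
print(annualized)  # 1277500, i.e. "well over 1 million per year"
```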
I don't think we should be worried about censorship of misinformation. Credible-sounding misinformation is easy to produce and hard to refute (aka the bullshit asymmetry principle). Leaving it out there does more harm than good (IMHO). You'll never completely get rid of it but the less of it out there the better.
As to what to do about this, the only cure is to teach critical thinking at a relatively young age so people can ask questions like "who is telling me this?", "what are their sources?", "why shouldn't I trust the CDC?" and "who benefits from me believing this?". Conservatives know this which is why they're busy passing laws to prevent such teaching.
It's why college graduates tend to be more liberal: colleges generally teach critical thinking and that tends to kill conservatism. Of course this is predictably painted as colleges being hotbeds of political indoctrination. So here we are.
A big problem with misinformation being out there is how people can fall into the trap of both-sidesing the issue (which is a logical fallacy, for the record).
> I don't think we should be worried about censorship of misinformation.
With my parents coming from a communist nation (Yugoslavia), I strongly disagree with this. Who determines what is truth and what is misinformation? One of the major arguments communists used to oppress speech was preventing panic and the spread of misinformation. (Misinformation being anything that wasn't pro-communist.)
Imagine that the worst and most corrupt people you can think of are now in charge of controlling truth and misinformation. That is why the freedom of speech is a fundamental human right - no one should have that kind of power. I agree that we need better fundamental education and critical thinking, but not less rights and more power for large entities.
The situations aren't comparable because in your case (the former Yugoslavia) you're talking about state censorship and control of information by threat of criminal prosecution. The US specifically has protections against this via the First Amendment.
Instead we have private actors choosing not to be in the misinformation business. Various hate groups and such have been deplatformed by Twitter, Facebook, etc. Those fringe sources still exist. They're just in darker corners of the Internet. They're not entitled to free, mass distribution just to spew lies without recourse.
Lastly, it's worth pointing out that we censor things all the time as it is when there is an overriding interest in doing so. Examples: criminal cases involving minors (ie their names), naming certain victims in court cases, courts sealing records, publishing of military secrets, publishing of dangerous information (eg how to make a biological weapon), information to doxx/swat people and so on.
You're missing the point. Who gets to determine what is 'misinformation?' I think you'll find if you look back a year or so it was the government representatives and 'MSM' defining it, not leading scientists (some of which have since been 'discredited' because they supposedly suddenly turned to nuttiness).
And little seems said, but I can't help noticing the quite communist behaviour of demanding the state mandate 'choices' for the populace wholesale 'for the greater good.' Especially knowing the vaccines don't stop the virus or the spread.
Protecting information is not censorship, and your example highlights an incredible ignorance of what censorship (which is a government tool the first amendment is specifically designed around) actually is.
Oh I understand the point completely. A quick look at your comment history says everything (eg [1][2]). You have, for whatever reason, fallen down the rabbit hole of misinformation and propaganda that turns you into an anti-vaxxer, and you've lost any objectivity or any ability to reason about the information you receive that supports or contradicts your view. If it supports your view, it's correct. If it doesn't, it isn't. The reasons for this can vary: they're hiding things, they're lying, they're misunderstanding, they've been bought by Big Pharma, they're trying to control you, whatever. I honestly don't care.
But what I do know (as evidenced by your history) is that having reached that conclusion you now work backwards. You're against "censorship" of misinformation because that misinformation contradicts your anti-vaxxer beliefs and therefore it's bad. I'd bet money that if we were talking about censoring information that showed vaccines are safe and work, you'd have a completely different view.
Your right to promulgate propaganda isn't being violated here. Platforms just aren't required to distribute it and use their judgement about what they want on their platform.
See that's the thing, it's so easy to just say 'oh an anti-vaxxer' with a smirk and not think more. But I have done a lot of research and informing myself over time and things very much don't add up. Yes the vaccines do have some efficacy and effect for the elderly, but beyond that the research doesn't support it, unless you just listen to the hand-picked 'experts' the media chooses.
The data doesn't lie. Were you aware Pfizer finished their phase 3 trial after 3 months partly because 97% of the control group got vaccinated? Is that good science?
Are you aware many countries count a covid death as any death within 30 days of a covid diagnosis?
Are you aware in the US that doctors and hospitals receive compensation specifically for recording a covid case but not for other illnesses?
Are you aware that in the UK, out of 170K+ deaths, less than 100 were under the age of 20, yet vaccines are pushed to all with a higher side effect profile in the young?
Are you aware that in Pfizer's presentation for approval of their vaccine for under 12's that they didn't even demonstrate a clinical outcome?
Did you see Bourla himself say two doses do nothing against Omicron, three 'a little,' and yet they're still being pushed/mandated?
And look at Ivermectin. The response has constantly been 'more research needed' yet no big investment, even though the investment in the vaccine was unprecedented. Why wouldn't masses of funds and research be put towards it if it looked promising (which a number of studies, though not all, showed).
I change my mind if the evidence and data are strong. I don't dispute that the vaccine does work to an extent for specific groups, but you have to work really hard not to see massive problems with things like mandates and vaccinating kids based on the data. Locking down seems to have worked best, but that's political suicide now, so 'just get more jabs' that don't stop spread or hospitalisation is the go...
What's the objective reality on the lab leak theory? Effectiveness of cloth masks? Has that objective reality changed? Has the perception of that objective reality changed? Where do you get your objective reality from and how is it infallible?
We don't know, but what we do know is that the truth of the lab leak theory is irrelevant to whether we should stop the virus, and is primarily only mentioned as a distraction from that.
Actually the interesting part of that theory being true or not is Fauci funding gain-of-function research in those labs through Peter Daszak (i.e. https://www.msn.com/en-us/health/medical/nih-took-scientists... - gets worse the more recent findings you come across) and the unwavering support the NIH have given him. Certainly raises strong questions regardless of your starting position.
> The Shannon–Hartley misinformation theorem states: the rate at which misinformation can be transmitted over an analog channel is inversely proportional to the average IQ of the channel participants.
Not providing a platform for or promoting scientific misinformation is not the same thing as censoring scientific misinformation. The problems of science will not be solved by promoting pseudoscience or scientific misinformation. That's just taking a bad problem and making it worse.
The way I see it, this problem is unsolvable. The issue is that the human species simply isn't equipped to deal with the modern information age.
Biologically, we're still hunter gatherers, cognitively optimized to deal with tiny scale societies and its limited information diet.
Even in our era of culture (starting only 10K years ago, with agriculture) we dealt with almost no information at all. What we consume in information in a single day now is the information load of an entire year just centuries ago.
Even in my lifetime, early 80s, information was limited, and slow. When I grew up, we had 3 national TV channels, a few magazines, and a library of dated books.
The current age is not "business as usual", it's brand new. Information is unlimited. Information moves at breakneck speed. Information has broken free from institutions, all citizens are producing it. Information has no accountability, anybody can say anything.
We're not equipped for this. We don't "scale" in this way. That's why I don't see a true solution. But that doesn't excuse us from trying...
Academics should produce more idiot-proof takes that are proportional, connect with the real world and include counter points, rather than producing a simplistic paper for the sake of producing a paper to meet some perverted internal metric. The emphasis should be truth and the purpose of it, outcomes, not checklists.
Traditional media should stop turning themselves into a Twitter account or juicy Tumblr blog. Stop thinking in narratives and us versus them type of reporting as is so typical of US media.
Social networks should ban paid advertising for anything that has a societal impact. Further, social networks should stop promoting the most extreme, dumb, controversial opinions into the mainstream, as this normalizes outrage, division and misinformation. The reasonables should win, not the crazy ones.
This last part is perhaps key. There will always be misinformation but the way it spreads is the real issue.
Say a controversial character has 100K followers. The character posts something new and it's now anyone's guess how many of their followers see it. My guess is that it's fairly low, perhaps 10K. If it would end here, it's fair game.
But that's not how this works. At all. There's a retweet button. This single button brings it under the eyes of countless additional audiences, and it spreads like wildfire. Because the take was controversial. As part of this effect, the character gets even more followers, and so on.
At the end of the day, you basically take a village idiot that would normally be completely isolated in their opinion, and hand them the microphone to address an audience of millions.
In fact, the only reason this person had 100K followers to begin with was because of this effect: richly rewarding controversy, bad takes, etc.
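The amplification mechanism described above can be sketched as a toy cascade model. All the numbers here (view rate, retweet probability, audience per retweeter) are hypothetical illustrations, not platform data:

```python
def cascade_reach(followers=100_000, view_rate=0.10,
                  retweet_prob=0.01, avg_retweeter_audience=200,
                  rounds=3):
    """Toy model of retweet amplification; every parameter is hypothetical.

    Each round, a fixed fraction of current viewers retweet, exposing
    the post to their own audiences; total reach compounds per round.
    """
    viewers = int(followers * view_rate)  # initial organic reach: 10,000
    total = viewers
    for _ in range(rounds):
        retweeters = int(viewers * retweet_prob)
        viewers = retweeters * avg_retweeter_audience
        total += viewers
    return total

print(cascade_reach())  # 150000 under these made-up parameters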
I think the US has caught this cancerous idea that the average person can't be trusted to think for themselves. The amount of condescension in that thought process is disgraceful. They think someone has to tell the average person how to think. And the baffling thing is that supposed liberals think that thoughts should be policed by corporations, government, and self-styled intellectuals (which seems so much against what classic liberals stood for). I think Matt Taibbi said it best recently: the most dangerous lies come from government and powerful institutions. People really need to stop focusing on dumb culture-war stuff like trying to cancel Joe Rogan and really think hard about whether they want a society where Google and Facebook and whoever screams loudest on Twitter are the deciders of societal truth. I don't want that.
No, it’s that the average person cannot make decisions about complex subjects.
This is not a new development: the boundaries of scientific knowledge - across all fields - have required a large amount of specialization for a while now. The barrier to entry is a PhD, and even then you’re only an expert in a small part of your field.
If you suddenly decide to jump in without context and make judgements about topics you are not qualified to talk about all while ignoring the state of the art, can you really expect others to take you seriously?
I think the real cancer is the rapid rise of distrust of subject matter experts.
It didn't actually take an expert to see that Daszak's paper in The Lancet was making claims that weren't backed up. Basic logical and scientific thinking with a little background knowledge were more than enough to raise serious doubts as to his bias.
Intelligent people who questioned Daszak's paper got memory-holed. Their posts were removed, had their reach limited, were placed under warning banners, etc, simply for raising the mere possibility of lab leak origin.
I could give many more examples of "subject matter experts" giving us ample cause to throw them in prison, never mind distrusting them - WMD lies, false flag attacks, staged incidents, etc.
All reported and repeated with zero gorm. No accountability. Dissenters get smeared, fined, jailed. Protestors get blocked off, boxed in, ignored, mocked, minimized. Rare politicians speaking truth get marginalized, smeared, mocked, ignored, cheated. Threatened.
Seeing this get worse and worse for decades, how can you seriously argue that our distrust is the problem? Distrust is more than justified, it's demanded. Anything else is ostrich shit.
By distrust, do you mean “blindly believe SMEs to be wrong just because they’re an SME” or “fail to blindly believe what SMEs say?”
If people don’t understand a complex subject, then I think they should neither endorse nor reject SMEs’ opinions on the matter. People don’t have to have a belief about everything. Maybe they’re right. Maybe they’re not. Blind trust is dangerous.
I don't know, I think we trust people to make decisions about complex subjects all the time when it comes to their personal lives.
I think the problem is this, in a reasonable society we would say: here are the relevant facts, here's the supporting data, make up your own mind. In reality what our society now does is says "_you must accept conclusion X, and if you don't you're an idiot_" with absolutely no real justification.
> In reality what our society now does is says "_you must accept conclusion X, and if you don't you're an idiot_" with absolutely no real justification.
I don’t agree with the “no real justification” part. But even if we set that aside, context matters! We are almost 2 years into a global pandemic with multiple vaccines widely deployed and extremely well-tested, yet a significant minority is still denying the vaccine’s efficacy.
To be quite frank, after witnessing how the world has reacted to this pandemic, I am now much less surprised by the ignorance depicted in zombie movies and shows :)
When I say no real justification, I don't mean that a justification doesn't exist, I mean that it's rarely given in a coherent way.
I think a big problem with the vaccine messaging (from people and institutions) is that it ignores human psychology. To give maybe a silly example, let's say I was hosting an office party and I brought a soft serve ice cream machine. The value of ice cream speaks for itself, so I imagine I would be giving out a lot of ice cream. There are probably a few people that wouldn't want ice cream though. Maybe they're lactose intolerant, or on a diet, or they have tooth sensitivity. Let's say I start pressuring these people, mocking them and generally getting in their face about it. People would start to think something is up, right? What if I then said "everyone must have ice cream, or you're fired"? People would suddenly start wondering: why is this guy so insistent that I have ice cream? Is there something in the ice cream? Is this some weird Jonestown thing going down? I imagine a lot of people would refuse the ice cream on principle.
I feel like with the vaccine and the mask mandates "not getting sick from a horrible virus" should be able to sell itself. But people are now really distrustful, partially because the messaging has been so over the top forceful, often contradictory or misleading, and partially because the vaccine has been overhyped (remember when Biden said if you got the vaccine you wouldn't get sick? That's just demonstrably untrue.) And now people think that if they just double down on this it's going to help, but like in my ice cream example, it's just going to make people wonder what on earth is going on.
Unless you have some sort of real power over people, attempting to bully them into seeing your point of view just is not an effective way to convince anyone.
Medical experts have had a worse time of it than, say, physicists or chemists though? And strains of economics experts are understandably not trusted depending on one's income class, but putting economics on par with physics, chemistry and biology is complicated in itself.
The most troublesome part of expert mistrust was one scientific group going after another, as with scientists of one specialty turning into epidemiologists overnight and claiming that vaccines have contributed to thousands of deaths. As a statistician, I had to deal with junk statistics and regressions in just such a pre-print that caused a lot of panic in my little research community here.
The entire country is founded on the notion that the average person shouldn't be trusted to decide any individual issue, and instead should elect trusted representatives to do the research to understand the individual issue and decide according to how the voter would have decided, if they had researched the issue.
Joe Rogan, Matt Taibbi, Glenn Greenwald; these are the cancers, not the concept of representative democracy. Attempting to spread lies and unconsidered science on other people's platforms, and then pretending like they're entitled to those platforms is arrogant, completely wrong, and degrades what it means to own property. If you have to let people use your services, your property is no longer your own; it is owned by the government, it is nationalized.
I want private companies to be able to decide how their platforms get used. If Spotify doesn't want to deplatform Joe Rogan, that's fine, but they should deal with the consequences, which is also fine.
I've seen very little evidence that representatives have any special understanding of issues. Have you ever watched Congress ask tech people questions? Taibbi and Greenwald do excellent real journalism and dive extremely deep into the topics they cover, whether it be Wall Street, the NSA, etc. I can understand the backlash against Rogan to a degree because he's a comedian talking about things over his head at times, but I'm saddened by the way the woke mobs have turned on good journalism from Taibbi and Greenwald because they won't toe the party line. I've yet to hear coherent criticism of those two other than an emotional "fuck those guys".
Both Taibbi and Greenwald are not good reporters. The facts they report are not accurate and often intentionally misleading, the predictions they make don't come true, they're terrible at overcoming their own biases, they prefer to say inflammatory things over saying truthful things, and they've been ostracized from the larger journalism community for these reasons, not their refusal to toe any given party line.
There are plenty of other people who do all of the things you seem to like about those two, but who are also good at journalism and haven't otherwise fallen victim to their own greed through moral turpitude.
And you may think representative democracy doesn't work, but I think you need to understand that it's more or less all the West has going for it. If you throw out representation, you throw out more or less every modern democracy in the world, and I'm personally not willing to do that. The alternatives are ghoulish and chilling to think about.
You say this with zero examples or evidence, and yet having read their reporting for many years they always have good evidence and examples. So frankly, I find them a lot more credible than you.
I'm not suggesting removing representative democracy, you're creating a strawman there, I'm saying that starting at the assumption that everyone is a moron that must have their information stream managed is bad for democracy; and that your average representative has more blind spots than we care to admit.
Besides, frankly your average representative doesn't actually do that much research into most topics, they either take the party line or do whatever their biggest donors ask them to do. So in essence you're not generally represented by Jim Bob, you're represented by oil companies or insurance companies or hedge funds etc..
What is 'representative' democracy in the age of weekly/(daily?) polls that break down your constituents' views on every last issue? The blind leading the blind...
For all his massive faults, I think Trump demonstrated that trying to appease people rather than having a strong point of view is a losing game. I live in a red state, so I've talked to quite a few Trump voters, and the one thing I've noticed is that a lot of them really sharply disagreed with a lot of his policies, but they liked that he didn't back down or equivocate. I'm not saying this with any love for Trump of course, but I think he demonstrated how modern campaigns tend to be pretty badly run.
The US has caught this idea that the average person can't be trusted to think for themselves because the average person can't be trusted to think for themselves. That's the problem.
So share the argument against if you are so compelled to reject it.
Please also reject all other scientific advances that have made it possible for you to live a longer life, no matter how small those advancements seem to be. For if you choose to reject the scientific community and their solutions, then you should refuse the rest, right down to the safety of the roads you drive on now. Making this argument is a luxury of people who are privileged beyond their abilities.
In all honesty, we probably owe the existence of Covid to the scientific community doing gain of function research. We never did find evidence of it in a bat, either in a wet market or elsewhere in Wuhan. We do know it existed in the WIV lab though.
Do you know that the particular strain of Covid-19 was in WIV without a doubt, or is it still speculation? You could argue that indeed this was a lapse of protocol through disposal of lab specimens, but does that mean we need to condemn an entire group of people for the mistakes of a few within a group that was studying this gain of function? I just don't buy that we should somehow make this into a holy war against an entire country if indeed this was a mistake in the scientific community within that country. If this were to happen in the U.S. (I'm assuming you are American, but this can be applied anywhere) then whom would we have to blame? What about the mistake of dropping nuclear bombs on Hiroshima and Nagasaki? That in itself is a specifically horrific mistake. Even if it is proven that a lab was the origin, this virus will ultimately be forgotten, like the people who dropped the bombs themselves and the people who made nuclear bombs a reality. We still pay the price for the mistakes of nuclear armament through the logistics of disposal and through multiple entities, now proven to be unstable in the long run, holding that power. This is why biological warfare is outlawed and why nuclear war is something to be avoided, based on our first-hand experience of what that could potentially look like for all of humanity. A mistake has many outcomes. Don't assume that just because something may have happened, it was with intent by proxy.
A healthy society must have a diversity of opinion, and that includes being allowed to be against vaccination and expressing that viewpoint.
The overton window is getting ridiculously narrow right now and it's definitely not for the benefit of society, truth nor probably even for the general health.
The evidence for the benefits of vaccines can speak for itself, so there is really no point in maligning anyone.
But there is definitely a lot of valid debate on who should be vaccinated and how much, and minimizing the overton window will stifle that debate.
You don't understand what is for the benefit of society then. In the case of western societies it's clear that the only ones benefiting from the anti-vax opinion are the healthcare industry. What do you gain from rejecting something that can extend your immunity and also protect against the chance that some for-profit industry (western healthcare) will take advantage of you when you don't have the ability to object, where the outcome of objection is death?
Do you also count people who have lingering symptoms for over a year (in the UK half a million, or ~ 1% of the entire population so far) or continue to be unable to do their day to day activities (quarter of a million in the UK, or 0.5% of the entire population so far) as recovered? [1]
The bias of this mathematician and the rest of the Royal Society working group regarding vaccination is irrelevant to the subject of these papers, which analyse how and why misinformation is spread, and how correct information can retain credibility.
Reflexively trying to discredit them while being too lazy to read beyond the headline is a great way of making your own position look highly dubious though.
Dubious assumes that my agenda is based on laziness, which is in itself lazy. I read the article, and my opinion is in alignment with noting that the question-everything-you-don't-understand camp supplies the majority of anti-vax arguments, regardless of the efficacy of the thing being argued against. No one ever claimed that these "vaccines" were a permanent solution. Everyone intelligent and capable of following along knew that getting "vaccinated" was not going to last more than 6 months, due to the fact that this was a rushed operation. No one claims that this was a tried-and-true solution, and most agree that it was and still is a stopgap until we reach a greater immunity. How you choose to reach that is truly up to you. You can catch the virus multiple times and take a higher chance of possible death, or take the vaccine and still catch it with a lower chance. It should be your choice, but it also has ramifications for your pocketbook and your further chance of "getting ahead" within this western healthcare system that so obviously is a bane to the income potential of individuals. When I argue for "vaccination" I'm not just advocating for your health but for your future income, and that you should not succumb to the powers that be that will happily take your money downstream, but in larger sums. I find it hilarious that people argue for questioning something that could potentially up their odds of getting bamboozled down the line.
EDIT: it boggles my mind when people argue against censorship yet hit the downvote button anyway.
No and this is where I disagree. The people that disagreed with my opinion believed that I was trying to censor theirs with my own. I posted my opinion and a lot of people made assumptions about it without asking questions and made their own response. I have an extreme opinion according to some people. I'm also not the best authority on all of this but the people that pile on are assuming my opinion is an affront to their safety/security. Do I care that these people may have a chance of a high hospital bill if they catch this thing when they could just as easily get something for free to protect them against this disease that right now has high cost of care once that care is codified. Hell yeah! Keep arguing against one thing when it can ultimately keep you safe from catching a cold that is lucrative for the healthcare business at the moment due to political division. My opinion isn't based in propaganda and is based more in a concern for the long term financial ramifications for most. Watch how many people have to claim bankruptcy, furthering their descent into wage slavery after catching something that can be postponed until it becomes endemic.
> Can we get an idea of the bias of this mathematician concerning vaccinations? Specifically the Covid-19 vaccines.
> Be vigilant in finding out what the background and intentions are of people making claims such as these.
I swear every time anyone says anything interpreted as "anti-vax", the very first thing people do is launch ad hominem character attacks.
Do you say this about the people pushing mandates hard?
> When hundreds of millions of people have received a vaccination with minimal outcomes of death there is no longer an excuse to argue against it
That's just, like, your opinion. Even if that's the case, people should still be free to choose. There are plenty of people who trust the data and are vaccinated and are against mandates. This will come off as rude but I don't care: people like you are a far bigger danger to society than people who don't believe in vaccine mandates.
If that's my opinion and people are free to their opinions, then why should I or anyone else be downvoted? The mere act of downvoting an opinion is in fact censorship, the exact thing that all of the arguments against mine are railing against.
You're taking an experimental prophylactic against a disease with a 99.9% survival rate. Probably higher if you're under 50. And we still don't really know how safe it is, because the 'gold standard' data (double-blind testing) is absent, save for two months of Pfizer data. Your benefit has apparently outweighed your cost. Fine. Mine hasn't. I like to read any and all opinions, regardless of 'bias'. We all have biases, whether we realise it or not. I'm intelligent enough to figure them out. I don't need 'we' getting 'an idea' of 'the bias'. I'll probably get flagged/banned/cancelled for my opinion, but such is life.
Even if you're wrong, I think it's interesting that the only lever they think they have to change your mind is "force you." The idea of having a serious dialogue is so far from their minds (maybe because of decades of training in PR disaster scenarios - chemical spills, manufacturing accidents, drugs that kill - where the establishment is unambiguously in the wrong?) that even when they're right, they act like they're wrong.
If the CEO of CNN wanted to convince their toddler to go to the bathroom, they'd start by compiling dossiers to discredit anyone with large bladders.
('They', in this, refers to the media/corporate PR/press release system that is used to determine public opinion.)
What I'm a bit confused about is why they put so much focus on lecturing folks, demanding things, and so little focus on things that would help.
I wore a mask before almost anyone else, except folks in Chinatown where I am. I was told not to believe "misinformation" way back then. This was when they were telling us masks did nothing, to only wear one if you are symptomatic, etc.
Hahaha I was the same way. I rebelled against scientific authority and wore a mask. It was only months later I was called a sheeple doing whatever the globalists wanted for wearing a mask.
Your bias shows heavily. Where's the double blind trial for covid effects and death rate (which seems to be more around 1%), not to ignore the very present long term effects that affect more people? Why put such a high level of doubt on the vaccine which billions of people took, but not on the disease that prompted its creation?
It does not matter how smart we are, the virus is dumb.
As many other people realized too late, you can't negotiate with a virus. "I'm smarter than the scientists" is the wrong poker bet here (and you are risking your life for a small vanity ego boost, no less).
Well, personal experience trumps this kind of argument. I've had two shots and a booster and have yet to catch Delta, Omicron, or its current sub-variant. I happily go to venues and restaurants knowing I have a certain window to exercise that freedom within the confines of science, and when the time comes to require a booster to give me another 6 months or so until a pan-coronavirus vaccine is validated, so I can live my life more normally than unvaccinated people who have a higher chance of catching it and getting a worse outcome, I will choose the vaccine. Some will argue that they fear nothing, and that is their choice. I would rather take any advantage I can get scientifically to improve my chances and make it hard for this virus to overcome me. If I can give an opinion that both protects me for a period of time and helps others avoid being overcome by the completely unregulated healthcare industry of this country and the exorbitant amount of money it extorts from people, then I will continue to scream from the rooftops in spite of the shortsightedness of the arguments against it.
I agree, and I'd like to add that there is a difference between censorship of information and stopping propaganda. Some groups make a claim (say, against vaccination), which is fine by itself, but when people run with it and amplify it to influence others, then it goes into propaganda territory.
Healthy debates need a good context and a platform that allows for that. I don’t know of many platforms except here in HN and perhaps the debates I had in IRC channels.
In any context. If someone thinks or knows something is propaganda they should exercise their free speech to point that out.
Not even propaganda should be censored.
Then again propaganda often implies it’s coming from a government. In that case the people and their representatives should have the power to prevent or retract statements if needed.
Maybe not, it is certainly debatable. But propaganda implies (to me at least) it has gone beyond debate. It also implies a tool to influence others. Aka a form of weaponized information. I do think there is a gradient where one could say “this goes beyond free speech” , for instance as in your example.
This is so obviously likely to be totally false it's mind-boggling.
We've been told by experts that
a) masks don't work - despite the fact that they've historically worked against other airborne pathogens.
b) that a cloth mask is fine.
...
and the list goes on.
I ignored this and believed in the "misinformation" of N95 with a vent which dramatically increases comfort and wearability.
Now we are told that natural immunity from our immune system, which has handled MANY, MANY influenzas and diseases of the past, doesn't work against COVID, despite evidence from things like SARS and MERS that the immune system remains pretty darn amazing, with longer-lasting protection.
You can't actually get hard data on this claim, despite insanely high case rates that should make analysis trivial.
So again, a likely lie from the vaccine pushers: that if you've been infected with covid you are not at reduced risk of re-infection.
I'll give it three months and watch them eat crow again.
Honestly, there is no excuse for this crap from those who insist on lecturing us about "dangers to society". We need honesty, integrity and facts, not lecturing, hectoring and lies.
They are not necessarily wrong.
(This is an anonymous account so I'll state my qualifications: none regarding medical stuff)
I think coronaviruses we knew before were not immunizing either, which is why people catch the cold over and over (I catch it every year, sometimes twice a year).
What's wrong is pushing simultaneously the view that natural immunity is not very effective against SARS-CoV-2, but that vaccines somehow are, which is indeed a disingenuous proposition on its face.
They are wrong because theirs is certainly a take. It's not a good take, and it's bigoted and the definition of an ad hominem, but I guess that's what hiding behind throwaway accounts is for.
I do not think you understood my point. That, or you don't appear to know what an ad hominem is. Here, take this scenario. A group of people come to me holding a paper that is white and has black writing on it. These people show me the paper and say, "Behold, this is a black paper with white writing on it". I look at that paper, clearly see they are telling a lie, so I dismiss them and avoid them. I do not wish to associate with liars. The same people come to me afterwards and say, "the vaccine is safe, effective and necessary". Based on my previous experience with them, I am highly skeptical of this claim. Could they be telling the truth this time? Of course. Should I be inclined to believe them, seeing as they lied previously? I suppose that is for each person to decide. I am personally not inclined to believe people who lied to me previously. This was my point. I find it very hard to trust people who lie to me. Here, let me give you an example of what an ad hominem looks like: "You have a throwaway account, therefore your argument is invalid".
[1]: http://en.wikipedia.org/wiki/Nullius_in_verba