Imagine a random guy posing as a construction worker, redirecting people away from some street, then laughing at everyone's gullibility for trusting his word instead of demanding proof of his credentials first. So what's the conclusion here? That people are gullible and should start demanding credentials from everyone in an orange vest? Do you really want a world where everybody mistrusts you and nobody believes your word without cryptographic proof or something?
Note that I am not claiming this is the situation with p-hacking, but I've heard a lot more about p-hacking than just this article, and a prank wasn't really helpful for me either. Someone reading about a prank like this would have no way to know whether to view it like the situation above. For all they know, it could be a similar kind of not-so-funny prank. Shouldn't there be some evidence provided that p-hacking is a real problem, to actually convince people that the vulnerability you've identified merits addressing?
If you were a mapmaker hired to make maps of the city for people, yes you should absolutely be shamed for trusting some random guy on the street without verifying it.
What the study exposed is that science journalists, whose literal job is to understand this stuff and explain it to the general public, suck at their jobs. You absolutely can blame them for just publishing whatever anybody says without verifying it.
> What the study exposed is that science journalists, whose literal job is to understand this stuff and explain it to the general public, suck at their jobs.
I was under the impression that the job of a journalist was to get the attention of readers to generate ad and subscription revenue.
Not that I disagree with the claim that they should explain things to the general public to accomplish the above, but that's not how our system is currently set up. If doing a poor job makes them more money, that's the real issue.
I’m related to a well-known science journalist. They absolutely consider it their job to verify.
There are irresponsible people who try to cram in more visible output by skimping on less visible quality. Everyone else grumbles about those reporters behind their back, like a developer whose code is undocumented and insecure.
Setting incentives to deter that behavior is management’s responsibility, as in any industry.
Since it’s an open secret in the business: The New York Times Science section has failed to do this in notable situations over the last decade.
This is not the job of science journalists. The job of science journalists is to make money for someone. The honor system is just the name that people who don't understand corruption give to no system at all. If you want journalism to be better, you probably have to make advertising and PR illegal.
I think you missed my 2nd paragraph. What I was saying was: no, this article didn't show that. The only thing it's exposed so far is that "science journalists don't watch out for this vulnerability". Well, so what if they don't? There are a million things people don't watch out for. Is this one actually causing problems in reality, or is this a hypothetical thing that doesn't occur in practice? To show this is a real problem requires providing evidence of real-world p-hacking in the wild that has had significant actual adverse consequences.
No, the prank pointed out multiple failures in different stages of the process.
> But even if we had been careful to avoid p-hacking, our study was doomed by the tiny number of subjects, which amplifies the effects of uncontrolled factors. ... Which is why you need to use a large number of people, and balance age and gender across treatment groups. (We didn’t bother.)
This poor design should have been reason enough not to publish the study - p-hacking or not. But...
> Our paper was accepted for publication by multiple journals within 24 hours. Needless to say, we faced no peer review at all. The eager suitor we ultimately chose was the International Archives of Medicine.
It got published anyway in a "journal" with a fancy sounding name that doesn't actually do any rigorous peer review.
The team then issued a press release that the press regurgitated without any scrutiny. If they had bothered to look, and if they were good at their jobs, they might have noticed that the journal was not reputable. Anybody familiar with any academic field can recite off the top of their head the reputable journals they'd want to be published in. If it's not one of those, it deserves a closer look. That's without even doing some basic googling about the team behind the research, which would have uncovered that they were nothing but a website.
The problem here is a breakdown in society. Why do sham journals exist? Why do their press releases reach the ears of journalists? Why is there no web of trust for credibility? Why have all our institutions (flawed as they may have been) collapsed?
That was the whole point. Science journalists don't do the jobs we think they do.
Our presumption is that they do not trust anything as fact until they verify and vet the information, i.e., do what journalists do: report what's true.
But that's not actually their job anymore. Their job is to get eyeballs. It doesn't really matter what's true.
In no sense is this limited to science journalism. Bizarrely, the most reputable journalists are probably at TMZ, because they get sued for libel if they are wrong.
When there is no cost for being wrong, there is no incentive for truth.
There are analogies that make this seem bad (fake construction worker) and analogies that make this seem good (ethical security researcher).
We need to go past the analogy and make a deeper assessment of the factors that matter. For example, what does the bad actor stand to gain? How much harm can they cause? How often is this happening?
For example, if the fake construction worker scheme was used to rob people and it was happening regularly, it might be worth doing something about it.
And "doing something about it" doesn't necessarily mean throwing away trust. There are a bunch of other interventions when you look at the problem holistically, e.g. can we reduce the benefit, reduce the prevalence, reduce the net harm?
> Do you really want a world where everybody mistrusts you and nobody believes your word without cryptographic proof or something?
That's why we have a balance with indemnity/damages, which can also be punitive, and which simply states: "Of course you can fool people. However, better not to, otherwise you will regret it, buddy."
I side with you. Trust does not mean you cannot be fooled in the first place. That's why it is called trust.
A better analogy would be a fake construction worker lying about a street being closed and a bunch of people (the press) spreading the lie to the whole town without checking whether it is true.
The fake construction worker has to go, but that's only half the problem.
Any system (in this case the system of how information originating in science makes it into the general public brains) needs to be tested from time to time to see if it works.
This prank was a test, the system failed (to stop this from reaching headlines).
Your analogy would apply to someone posting a fake article on one of the news sites. The correct analogy would be the council repeatedly shutting down streets for decades based on any submitted plan, without checking credentials, and then someone pointing it out by submitting a provably wrong plan.
p-hacking and journalists publishing junk science or downright lies is a very well known problem. People have been pointing it out for years. Publicly shaming them via a story that will sell well is just using the systemic flaw to patch itself. It won't change the net number of sham stories these journalists publish, and if anything it slightly increases the quality by marginally diluting the harmful stuff pushed by snake oil salesmen with something comparatively harmless.
Not going to try to pick apart your analogy or think of a better one. But the study was real. They did “real” science (poorly). They didn't lie about results or invent data. What they did is likely done every day by people who don't even realize it, or worse, by people who do fabricate results.
What they showed was how easily terrible science gets published to the masses as if it were reliable.
Incompetence (negligence) has to stop being an excuse. Sure, punish it less than malicious lies, but it needs to be at least as bad as, say, speeding in a car.
Unfortunately, we all live in a world where we must verify whom to trust, based on corroboration and behavior over time, and be willing to stay skeptical until something is confirmed. One study is never enough; maybe even a few are not enough, given the history of studies and opinions being reversed. This is regardless of what people *should* be doing.
I want a world with massive, disproportionate punishment for people who do such things. Make dishonesty a life-shortening activity, where you spend a minimum of a year in jail.
> "I want a world with massive disproportional punishment for people who do such thing. Make dishonesty a life shortening activity, were you spend a mintime of a year in jail."
Having been raised to truly believe that "honesty is the best policy," I would in theory be inclined to agree with you, but we sadly live in a world where dishonest people would be absolutely guaranteed to find a way to abuse such a system to harm every honest person they felt a random urge to, while avoiding any consequences themselves (probably by pawning those consequences off on some nearby unsuspecting decent person). I have no faith or hope that humanity as a whole is headed anywhere but straight to Hell (if I were foolish enough to believe such a place exists). The vilest of humanity have taken control of pretty much every aspect of society, and the rest of us are just their playthings until the miserable end they've wrought comes to pass.
One of the main points in science is that credentials shouldn't matter (lots of great research is done by people without a Dr title and lots of shit by those with a Dr title). The research was published, so they could actually have gone and looked at it easily. And given the timescales and stakes involved, they should have.
Not following a construction worker's directions can be pretty bad, while following them costs you at most a bit of time.
This is also the actual job of science journalists, and it takes roughly three minutes to look up a paper and check basics like sample size and effect size. They definitely have the time.
Also, there is a metric fuckton of junk science. Heard of the replication crisis? For your analogy to hold, a third of the men posing as construction workers would have to not be construction workers!
This isn't a bad construction worker OR an ethical security researcher story.
This is a straight-up exposé of "journalists" and nothing more. They print "experts say" and "studies show" stories when they don't or can't even do the most rudimentary verification.
So yes, trust IS a beautiful thing, and he's demonstrated that any trust you still have in "trusted sources" (looking at you, fact-checkers) is misplaced.
We knew it - but in a kind of theoretical way. Like you know that there is poverty in the world. (Yes, it kinda sucks, but it doesn't suck too much because it doesn't really affect you.)
Now we know it in a much more tangible way that affected us (perhaps you yourself fell for it; I at least heard it in my friends' circle), and that made us act.
Seems to me more like meta-exploitation: clickbait "exposing" clickbait.
If this helped you I'm glad. I haven't seen this call to action you're describing but if you want to discuss what actions we can take, that sounds productive & I'm interested.
I'd like to suggest that your attitude towards poverty may be something to examine; your remarks about it were strange. Consider for instance that when you say to a stranger on the internet, "poverty kinda sucks but it doesn't really affect you," it's entirely possible that that stranger is currently experiencing it (this is not the case for me, just a thought experiment).
I was going to say these results are a byproduct of a systemic problem in society; you beat me to it in a slightly different way. The problem lies on both sides of the spectrum, the deceivers and the deceived, and it points to a problem with US, not THEM.
This is absolutely brilliant and a fascinating walkthrough of how to make bad science seem "legitimate". Kudos to the journalists here, this is good work. I'm especially amused that they mentioned that online commenters usually did a better job of fact checking than the publications!
It's weird, in some countries the tabloids aren't exactly respected, but they are taken very seriously, even though they're the same sort of absolute trash.
This is very much the case in the UK with quite a number of papers like the Daily Mail and the Sun, and it's the case in Germany with Bild in particular. They have a status which no American would give, say, the New York Post.
I thought the Daily Mail was in tabloid format and had been for a long time, but I think the Daily Mail is called "a tabloid" because of its market positioning, rather than its format. At the turn of the century, the high-circulation, sensationalist daily newspapers were all tabloids, and the more sober of the major daily newspapers were all in the larger broadsheet format. At least, in England they were; I don't know about the other nations of the United Kingdom.
The Guardian (via the Berliner format) and the Times have since gone tabloid (and the Independent went from broadsheet to tabloid before going digital-only), but the Telegraph and the Financial Times are still broadsheets. People still include the Guardian, Times and Independent among "the broadsheets".
Wikipedia suggests the Daily Mail was also once a broadsheet, and became a tabloid in 1971.
As much as the Daily Mail is trashy, it usually has more details than the other papers. A month back there was a murder outside a Korean restaurant in London. The BBC and others had only passing details[0] (though they eventually got to the same place 4 days later[1]).
> Witnesses said violence broke out near the restaurant. // One man who was in a building on the side road said: "There was shouting in the street and it escalated into a full-blown row. There was a weapon and someone fell to the ground." (BBC[0])
Compared to The Daily Mail's[2] article:
> The incident is alleged to have involved workers at the restaurant, which claims to be the first in the UK having opened in 1975. // A worker in a nearby shop said the violence had broken out inside. // Another onlooker said: 'I saw two men outside the Korean restaurant arguing. They were really shouting at each other. // 'I couldn't hear what they were saying but they appeared to be Korean. Then they both went inside.' (Daily Mail[2])
It's the Daily Mail that gives the more accurate account. With the BBC and no real details, you might believe from this incident that you could get stabbed near Oxford St. The extra details in the Daily Mail indicate it was an altercation between workers -- something that has been confirmed now that the chef has been charged.
Given the propensity of my mother (in Australia) to read vague articles and decide that I'm going to get murdered overseas, sometimes more information is better.
Note: I know there are plenty of examples where the Daily Mail has stepped over the line and not in the public interest.
I wouldn’t necessarily always conflate the Daily Mail’s “more detail” with “more accurate detail”.
I personally know the subject of one of the Daily Mail’s (and other red top papers) more lurid stories and most of the “details” are completely fabricated. The ones that aren’t are not newsworthy.
Unfortunately, their approach seems to be to print what they like, even if it’s total fiction, knowing that the average person doesn’t have the resources or inclination to go after them.
I think British newspapers have more in common with American TV news than with American newspapers. Don't Americans take Fox News as seriously as Brits take the Mail and the Sun?
Millions read Bild every day, unfortunately, and other publications in Germany and the rest of the world also ran the story. The Hacker News submission is missing a 2015 tag.
This is why I don't actually trust newspapers or news sites for any medical advice, or for statements like "a glass of wine a day will help you live longer," or some nonsense like that.
In fact, the best place to find this information is the primary literature. You can use Google Scholar and Sci-hub to do your own research. For most issues, it's actually not too hard to understand the primary literature, and it will be much more accurate than the crap you read on popular websites.
The primary literature is a huge waste of time in so many cases though. It can be really frustrating. Even if there's no problem in one of hundreds of representational practices, you can find out later there was some other compromise of the data or methodology (and some are HUGE).
And then if you point out a problem with a research undertaking or study when it is brought up, you can get some set of true-science-believers looking at you like it can't possibly be. It's unreal sometimes.
It's really no wonder people spread their subjective experiences as if they are science. Institutional "science" is great as a set of models and perspectives, but when you dip into the product you can start to wonder if you are effectively doing the equivalent of product research by watching the home shopping network.
In so much of the world today, "science" is a brand, and everyone is told to listen to "scientists" because they "follow the science" and know the best way. The genpop couldn't possibly even begin to wrap their heads around text written above a 5th grade level, so they are told by "authorities" that they need to listen to the "scientists."
Seems quite difficult. For example in his study the sample size was a problem:
"Here’s a dirty little science secret: If you measure a large number of things about a small number of people, you are almost guaranteed to get a “statistically significant” result."
"A woman’s weight can fluctuate as much as 5 pounds over the course of her menstrual cycle"
As a regular person, how would I know what the correct sample size for this type of study should be? Remember, that number isn't some fixed value but a range that increases confidence (sorry if I'm using that wrong), and it's all subjective.
I would also have to think about uncontrolled factors, like menstruation, as having an effect, since he didn't mention it in the original study.
This is far too complicated for the majority of people.
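To make the quoted claim concrete, here's a minimal Monte Carlo sketch (a sketch only: it assumes two pure-noise groups of 5 subjects and 18 independent outcomes, numbers picked for illustration rather than taken from the study):

    # How often does "measure many things on few people" yield at least one
    # "significant" result when there is no real effect at all?
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_trials, n_outcomes, group_size = 10_000, 18, 5
    hits = 0
    for _ in range(n_trials):
        for _ in range(n_outcomes):
            a = rng.normal(size=group_size)   # "chocolate" group, pure noise
            b = rng.normal(size=group_size)   # control group, pure noise
            if stats.ttest_ind(a, b).pvalue < 0.05:
                hits += 1
                break

    print(f"P(at least one 'significant' outcome) ≈ {hits / n_trials:.2f}")
    # With 18 independent tests at alpha = 0.05 you expect about 1 - 0.95**18 ≈ 0.60

So even with nothing going on, the odds of a headline-friendly result are better than a coin flip.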
Plus, a single study, even a well-done one, still doesn't necessarily prove anything. It generally requires replication to confirm. People have this idea that if they find a study that says something and it looks very legit, it must be true. But studies come out and are later found to be flawed or straight up wrong all the time. Primary sources are not a great source of information for most people.
"But studies come out and are later found to be flawed or straight up wrong all the time."
I didn't even think of this. You could be looking at a study that turned out to be disproven or even fraudulent (though maybe those are removed from journals). This is why there are experts in particular fields who keep up to date and should be the source.
The problem is the current wave of anti-intellectualism, which turns everything into a black-and-white, easy-to-digest situation. A person lies once? Don't trust them. A newspaper printed one or more articles that turned out to be false? Don't trust them. Doesn't matter how long ago, or how many other truthful articles; they are done. However, as the GP tried to point out, they don't offer any reasonable solutions for how you are supposed to get reliable information.
I wonder if the true goal of some people who push equal mistrust of historically standard sources of information is to make it easier for people to lie.
The current problem is scientism, not anti-intellectualism.
The number of times I was told that I should shut up about covid because I'm not an epidemiologist was astonishing. What's more, I was criticizing their _computer models_, something I do have a degree in.
In his book The Atheist’s Guide to Reality, philosopher Alex Rosenberg defends his conviction that “the methods of science are the only reliable ways to secure knowledge of anything.” His philosophy is called scientism and is held by many of the world’s skeptics. In the spirit of his anti-supernaturalist leanings, Rosenberg asserts that “science provides all the significant truths about reality, and knowing such truths is what real understanding is all about.” In other words: if science can’t prove it, it’s not worth believing.
Blind support of what people think science is without the understanding of what science actually is.
If anyone asks you if you "believe in climate change" then you're talking to someone who does not understand science and instead has fallen for scientism. The actual question is "do you understand climate change" since belief is not required.
The definition of belief is "an acceptance that a statement is true or that something exists."
I blindly believe in almost everything in science. For example gravity, I don't understand gravity beyond some of its effects.
You think that's strange? Should I say I don't believe in gravity? Belief and understanding are two different things.
Let me pivot here and say: "in most situations you should just believe whatever the majority of experts in a particular field say is true, because that has the highest probability of being true."
I largely disagree with the term p-hacking, as it's not specific enough about the nature of the violation (omitting experiments, straight-up [partial] fabrication of data, ...). A problem is that often it is odorless. What is the odor of omitted data, omitted experimental runs, ...?
Regarding "data dredging", i.e. testing many candidate correlations, I have mixed feelings. As long as all tests are mentioned, or at least the number of tests is provided, I don't see a problem really.
A stream of 100 papers stemming from 1,000 different studies, each truthfully based on a single test rejecting the null hypothesis over its dataset at p=0.05, and a single study testing 1,000 properties and finding 100 significant at p=0.05, have the same expected number of spurious and real findings. A scientist with more programming skills or support, more compute, etc. will simply find similar-quality results faster. Scientist B may intuitively feel jealous about the quantity of similar-quality results that scientist A picks up with a dragnet approach, and the jealousy may or may not be justified (not having the same amount of compute, support, training, etc.), but that does not justify accusing scientist A of fraud in those cases where there is none. Just as one factory worker might work faster than another (after part of his job on the factory line became automated), that doesn't mean the job is being done less well.
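A minimal numerical sketch of that equivalence, under the simplifying assumption that every tested effect is actually null (so all "findings" are spurious):

    # 1,000 single-test studies vs. one study running 1,000 tests: the expected
    # number of spurious findings is alpha * 1,000 = 50 in both cases.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    alpha, n_tests, n = 0.05, 1_000, 30

    def spurious(n_tests):
        # Count p < alpha results among n_tests comparisons of pure-noise groups.
        return sum(
            stats.ttest_ind(rng.normal(size=n), rng.normal(size=n)).pvalue < alpha
            for _ in range(n_tests)
        )

    print("1,000 studies x 1 test each:", spurious(n_tests))   # ~50 expected
    print("1 study x 1,000 tests      :", spurious(n_tests))   # ~50 expected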
What disturbs me the most about the article is the following:
>Luckily, scientists are getting wise to these problems. Some journals are trying to phase out p value significance testing altogether to nudge scientists into better habits.
This worries me enormously. Plenty of non-scholarly articles, like news articles, mess up units, especially when time is involved. Product information also often contains rather intentional unit confusion. I hope the solution there wouldn't be to simply ban units!!!
If some people use erroneous arithmetic, should we collectively ban numbers too?!
In the face of malpractice we shouldn't do away with the theory, we should think of ways to force practice to adhere to theory.
Again, I fully agree with your suggestion to set tighter significance levels. When we observe readers complaining about the huge number of results, and about their low trustworthiness, the most obvious solution is to set stricter significance levels. At some point the levels become tight enough that explaining away outright fraud gets very hard, because the probability that an effect later discovered to be nonexistent was merely a spurious false positive at the stated p-value becomes embarrassingly low.
If the dataset sizes are chosen large enough, having separate test and validation sets can help tremendously.
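A minimal sketch of that idea (all numbers illustrative, all data pure noise): dredge one half of the data for "significant" outcomes, then re-test only those on the untouched half; almost nothing should survive.

    # Exploration/confirmation split: spurious findings from the first half
    # rarely replicate on the held-out second half.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n_outcomes, n = 200, 100   # 200 measured outcomes, 100 subjects per group
    data = {k: (rng.normal(size=n), rng.normal(size=n)) for k in range(n_outcomes)}

    explore_hits = [
        k for k, (a, b) in data.items()
        if stats.ttest_ind(a[:50], b[:50]).pvalue < 0.05        # exploration half
    ]
    confirmed = sum(
        stats.ttest_ind(data[k][0][50:], data[k][1][50:]).pvalue < 0.05
        for k in explore_hits                                    # confirmation half
    )
    print(f"{len(explore_hits)} 'findings' in exploration, {confirmed} survive confirmation")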
Doing a legit study is hard, and expensive, because there are a million things to control for. Finding even 30 people who live basically identical lives, and are physically identical, and who are prepared to actually do something specific to themselves is a major barrier.
But 30 is a bare minimum. 60 is better (i.e. 30 active, 30 control), and the ideal is in the thousands.
Even the fact of taking money to be in a study is already non-normative, and has to be controlled for.
So, take any study on human behaviour with a huge pinch of salt.
That said, some studies can be replicated, and correlate well with what we experience. Dunning-Kruger and loss aversion are easily (and often) replicated, but also correlate well with everyday experiences.
When it comes to diet and weight-loss though, well, consider it all rubbish IMO - 99% is rubbish, and figuring out the 1% is hard.
Sample size needed depends on effect size. If there's a disease that has resulted in death within a year in every recorded case in history (assume it's a common disease for the sake of argument), and your pill cures 2 out of the 7 people in your study? Damn, you've got a fantastic pill there!
My favourite example is if you have a pill you think allows people to fly and you administer it to 1 random person and they shoot off into the sky? Despite sample size 1 that's actually pretty strong evidence :D
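For a rough sense of the numbers, here's a sketch using a standard two-sample power calculation (the Cohen's d values are the usual small/medium/large benchmarks, not from any particular study):

    # Required subjects per group for 80% power at alpha = 0.05, by effect size.
    from statsmodels.stats.power import TTestIndPower

    power = TTestIndPower()
    for d in (0.2, 0.5, 0.8):
        n = power.solve_power(effect_size=d, alpha=0.05, power=0.8)
        print(f"Cohen's d = {d}: ~{n:.0f} subjects per group")
    # d = 0.2 -> ~394 per group, d = 0.5 -> ~64, d = 0.8 -> ~26.
    # The "flying pill" is the limiting case: an effect so enormous that a
    # single subject is already overwhelming evidence.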
That being said (and I have some level of agreement with it), I have adopted chocolate in this way.
I started drinking coffee because I had heard that it was the primary source of antioxidants in the U.S., and I decided to start adding unsweetened cocoa powder to every cup.
I get all the health benefits of chocolate this way, but none of the fat or the sugar.
Is it also easy to tell if the author is a fraud? Or if the study was designed specifically to get a specific outcome?
How do you tell from scihub if an article was funded by a party with affiliated interests?
What you do is not trust a single article. You download about a dozen in the field, including review articles if you can. You look at journal titles and journal rankings, and check whether the authors come from well-known research institutions. Click on the "cited by" link and check the papers that cited them for any refutations. Looking through 30-40 papers sounds like a lot, but it only takes a couple of weeks at most, and it's not a lot of work at all if the issue is really important to you.
I remember growing up in Canada, there were commercials about "house hippos" [1] and the first time I saw the commercial, I actually believed what was being presented.
Thinking back, I think the campaign actually worked on me because it made me realize how obviously fake things can be made to seem real if it's presented by a seemingly authoritative figure.
Ha, the Association of Professional Liars ("Concerned Children's Advertisers") put out an ad warning you about obvious lies, so they can deflect blame to the victim for believing their less obvious lies.
I really DETEST people like this. All they do is rob people of their trust and leave the world worse off. They are no different from the likes of Tucker Carlson who chuckle that they got a rise out of their critics. "We weren't being serious. We are entertainment, not news", when caught out.
Then they have their usual defenders, including in this thread, who take the stance that it serves the uninformed reader right, that one should do one's own research and not trust everything that is written or said. What does one base one's research on? What sources does one trust? What does one do when one is not an expert on that topic, but merely interested in it?
Trust is a beautiful thing. Let the trust-breakers not get a free ride.
I'm thankful for people like this. They make apparent the systemic flaws that we--the people of the earth who depend on those systems--may otherwise be unaware of.
I see this as a form of white-hat hacking. Telling off white-hat hackers leads to fewer white-hat hackers, which leads to fewer vulnerabilities discovered by white-hats, which leaves you open and vulnerable to grey- and black-hat hackers (e.g., a student of dietary studies hoping to receive acclaim for their research, even if the outcome of that research being published is overall harmful).
This is one white-hat publishing one paper. I assure you there are far more grey- and black-hats out there producing many more papers.
The people of the groups that we have delegated authority to (the governments, megacorps, and supporting companies/associations/schools of the world) are doing a rather mediocre job of managing all this stuff, so I don't mind an individual stepping up and trying to bring attention to something like this that has such negative effects on so many people.
I think if you hacked a bunch of people and then were like, "see, look, you're vulnerable," it would be seen as a violation of privacy and professional ethics. You'd be in serious legal jeopardy.
When I was young and foolish I did similar things, and I regard them as mistakes. On one occasion I accidentally endangered an untold number of people, likely thousands, and I'll never know what the consequences were; they could be nothing, or I could've gotten someone killed, and I'll never know. (Please note that I have grown & matured, and don't engage in similar behavior anymore.)
The same goes for spreading bullshit. It's not proving a point about a potential harm, it is knowingly perpetrating that harm. And you have no way of knowing what the consequences are; what people do with the bullshit you've fed them.
You don't need to create bullshit to illustrate this point, there's plenty out there in the wild to choose from and write about.
This definitely resembles black-hat much more than white-hat hacking. At the end of the day, the author is still spreading fake news, getting people trying to lose weight to eat more chocolate, and giving tabloids more advertising dollars. A white hat hacker doesn't cause any harm.
No, we don't need more ugliness in the world to point out ugliness. Let them spend their time and effort shaming those who mislead, or gently correcting bad interpretations of science, as the case may be. Those who are lazy or who mislead must pay a price in society, through social and financial disincentives. Otherwise we have the Alex Joneses and Tucker Carlsons and dictatorial politicians of the world running riot.
Then the scientists should spend their time naming and shaming other articles that those very journalists and papers have written, not contribute more bilge that others have to factcheck. This is not equivalent to ethical white hacking.
We need more people like Ben Goldacre ("Bad Science").
Bad Science was published in 2008. Whatever strategy it advocated has clearly failed to gain traction, as the problem is even worse than when it was published. Might be time to be open to some other, more bombastic, strategies. You seem to be saying the ends did not justify the means in this case -- I'm not so sure of that. We're living in a world where some of the most educated people among us put signs in their front yard that say "in this house we believe... science is real". Yikes!
This debunking of fake news didn't go viral. Fake news goes viral. Now what? Planting more fake news as a troll doesn't help. 4chan-style "spam lies for lulz" makes everything worse, not better.
In the general case, I agree. In this case, science research/publication, there should be no trust necessary. The issue here isn't that your average Joe reading an article was fooled. The issue is that the news journalists reporting on this were fooled into writing about bad science. These are the people/organizations whose job it is to curate what to report, and whose trust is lost.
I agree that journalists _should_ do due diligence. But society is full of lazy and evil people, who will not do it, or worse, will manufacture a false trail of thought unsupported by evidence.
But "scientists" like this who think of themselves as doing the public a service shouldn't have to pile on more shit on the world. If they want to do good, they should do exposés on lazy journalism.
In both cases they should pay for their misdeeds, at the very least by being shamed in public, or losing govt funding for some time, etc depending on the scale.
I don’t think that’s fair at all; he’s pointing out a critical flaw: the trust being given out today is given blindly. He’s showing how trivially anyone can participate in that abuse, and with such a low barrier to entry, we should assume there are many malicious actors already executing on it.
Fine, he hasn’t given a solution beyond find better experts, but you’re basically just blaming the messenger; the trust has already been broken, you just don’t know it yet. And the trust-breakers are those experts and verification systems, that don’t actually do any verification. Your advisors aren’t doing their jobs.
And you’re cursing him for showing it to you? Or because the sham should continue uninterrupted? We don’t have a system of trust but verify; we just pretend to.
No, he's not a messenger. He's the source of a fresh new mistruth. I don't need more ugliness to know that there is ugliness in the world, so yes, I am cursing him. Bild does enough trashy journalism that it doesn't need help.
I think if it were obvious to all onlookers that Bild can’t be trusted anyway, then him acting as a new source would be a complete non-issue; no one trusts Bild anyway. He’s not really a source of mistruth any more than a raving lunatic babbling away on the outskirts of town.
If that’s not the case, and people do trust Bild, then acting as a source is problematic; but it seems to me that exposing the lack of trustworthiness is then important and unambiguously a net positive, at least if it reaches the same people who’ve been tricked in the first place.
That's not happening. The people who spread fake news (on purpose, or ignorantly as A/B automatons) have no interest in publishing proof that they are misbehaving.
A little while back someone posted in a thread that people who couldn’t identify “obvious fake” currency deserved to be defrauded.
I am frequently appalled at the lack of compassion that some have for their fellow humans.
I can only guess that it’s a sort of survivors-bias, where “it hasn’t happened to me, so I must be doing what is right”
It seems to be a prevailing theme in the financial sector to blame the people you con for being stupid and say they deserve nothing better.
I think the mindset stems from short term stocktrading, where you offload at the top to some sucker, or buy from some sucker at the bottom. There is no cooperation just zero sum.
And this mindset spreads from there to other areas where it is not a very nice thing to do.
Before I checked the article, I agreed, but this is a bit more about calling out institutions (science reporters) who really ought to be doing at least the least amount of legwork before publishing.
Like, sample size 15. You can check that before you publish. You do have some responsibility to if you are a reporter, no?
I dunno, compare it with someone releasing "left-pad-2", with the code actually printing that you should really audit your deps.
So yes, I agree, and I might have hoped the plug was pulled a little faster, and that the author really rides the newspapers into writing retractions, as it's really testing in prod. But please, gatekeepers, publishers, trust the fad diet stuff a little less.
Bizarre that this is the top comment on this thread right now. I agree they've done some damage, but you seem to have dismissed their justifications without any explanation. Do you really believe we should trust random tabloids just because "trust is a beautiful thing"?
I was already plenty aware of fake news and p-hacking, but this article still proved useful to me by demonstrating exactly how far you can go with it and which sources to not trust.
I can offer you a thousand examples of "How far can fake news travel". Pizzagate, anti-vax, moon conspiracies ...
"_Falsehood flies, and the Truth comes limping after it_" (Jonathan Swift, 1710)
Not only don't I want anyone to add to a pile of mistruths, I want purveyors of misinformation to face social and financial disincentives.
> Do you really believe we should trust random tabloids
Of course not. "Tabloid" === trashy journalism, so no, no trust. Those of us who believe in being good and not misleading people should join efforts to make sure that misinformation does not get out of hand; at the very least we should not be manufacturing trash of our own to prove to others how badly it can go.
Trust is a wonderful thing... in a world where most people are trustworthy.
In a world where the replication crisis is a thing and only about a third of psychology studies replicate?
In a world where the vast majority of "science" stories published in news sites are grossly misleading or wrong?
It's important to destroy that trust.
Though I do have one problem with this article. It implies the main problem in the reality->public pipeline is the scientists themselves, when it's actually journalists, university PR teams, and the general public, in that order. (Though blaming the last is in poor taste, as they're time-poor compared to the others and have no training in how PR teams and journalists mislead.)
I saw this specific instance somewhat differently. These were all people with inside knowledge of a system that they saw as broken, and they wanted an effective way to communicate that fact to the world. They accomplished their goal.
So in a slight twist on your example, it'd be like if Tucker Carlson had actually been spending the past few decades putting together a documentary about how Fox News and the Republican party sold out democracy in exchange for political power and money.
You certainly might not be a fan, but it would absolutely change how you look at him.
> So in a slight twist on your example, it'd be like if Tucker Carlson had actually been spending the past few decades putting together a documentary about how Fox News and the Republican party sold out democracy in exchange for political power and money.
> You certainly might not be a fan, but it would absolutely change how you look at him
Um, no. You are one hundred percent wrong. I'd be white-hot with rage at the destruction of society that his large-scale deceit had wrought in the time he was conducting his experiment.
Yeah, so without trust this research would take a long time.
First of all, you'd need to know what to learn, which is pretty hard.
In this case that would be what? At least a basic, and some advanced, understanding of human anatomy, metabolic processes, maybe neurotransmitters, some pharmacology, nutrition; that's too much for most people.
It would be easier to just learn how to tell honest people from scammers/liars/propagandists and then trust some of them.
If they had falsified the design or results of the study, this would just be a hoax. But the design of the study was obviously flawed in a way that does not require specialized knowledge to detect. What knowledge it does require should be a basic requirement of the job of science journalist.
One of the first things that an elderly friend of mine noticed when he visited me in the US is that all the currency notes were of uniform size. He said, "That's the sign of a trusting/trustworthy society. Visually impaired people trust others to not cheat them."
There are countless things that you trust by default. Think of all the Bayesian priors. It is too tiring to be distrusting and cynical all the time.
The trust-breakers are already getting a free ride by way of people not knowing. The point is that "trust-breakers are in much greater supply than one might think, so think/learn critically instead of simply trusting". Or, one can continue burying one's head in the sand.
I am thankful for this study. I haven't trusted diet news articles about studies for at least a decade, and since the "Venus has life" story, I am very sceptical of any paper claiming "novel" things (I am a researcher myself).
So what do you do when someone breaks that trust for money/control/nefarious purposes? Like, say, hiding the fact that sugar is killing more people today than cigarettes ever did?
Where I live, a chocolate bar is ~600Cal. I would definitely lose weight living on 600Cal a day (it won't be healthy/sustainable but the weight is going to be lost for sure).
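Back-of-envelope version of that, using the common rough rule of thumb that ~7,700 kcal of deficit is about 1 kg of fat (the maintenance figure below is an assumption, not anyone's measured TDEE):

    maintenance = 2200                     # assumed kcal/day to maintain weight
    intake = 600                           # "one chocolate bar a day"
    deficit_per_day = maintenance - intake
    kg_per_week = deficit_per_day * 7 / 7700
    print(f"~{kg_per_week:.1f} kg lost per week (very crude model)")   # ~1.5 kg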
Donate the lost money to the charity of your choice.
It's simple supply and demand. The modern NPC requires constant firmware updates, the content of which is immaterial. If the NPC goes for too long on the same firmware version, s/he runs the risk of "having a think" and corrupting the execution.
Would you please stop posting unsubstantive and/or flamebait comments to HN? You've been doing it repeatedly, unfortunately. It's not what this site is for, and it destroys what it is for, so we have to ban accounts that do this, so please stop doing this.
It's fine not to use your real name on HN but it's not fine to break the site guidelines. If you'd please review https://news.ycombinator.com/newsguidelines.html and stick to the rules when posting here, we'd appreciate it.
People act like "Oh my god this guy did a thing that no real scientist would do". Yet, my cousin was in a biotech lab where she was unable to replicate the result she was relying on to do her next work. The reason? Original result photoshopped the outcome. She quit and went elsewhere.
Meanwhile, I'm sure some online expert has gone and linked that study saying something like "There's evidence that this gene does that" and whatnot.
There's wholesale fraud in science. And if you guys want to act like it's not then fine. Go ahead and believe what you want. It's your funeral.
This was weird to read personally. I used chocolate in all kinds of ways for extreme weight loss, so "fooled millions into thinking" certainly has the subjectively-not-correct feel to it. But the headline also does a great job of leaving this open question of, "but in the end chocolate is really not a weight loss tool, right?"
"Right?" (Anakin stare)
When I was wrapping up my first big cut, I was feasting on boxes of chocolate bars that I found on clearance at local stores. There is nothing quite like watching the scale numbers go down while you eat junk food. My BMI entered the "normal" zone while I was literally eating like a kid in a candy store.
These days, I still buy chocolate bars and try to keep them stocked up. I also buy cacao powder and add it to my coffee with some salt and low-calorie sweetener, in lieu of breakfast. It can provide a really awesome calorie-efficient uplift for those of us who are sensitive to cacao.
So, stuff like this really stood out:
> Testing bitter chocolate as a dietary supplement was his idea. When I asked him why, Frank said it was a favorite of the “whole food” fanatics. “Bitter chocolate tastes bad, therefore it must be good for you,” he said. “It’s like a religion.”
It just seems like the kind of anecdote you'd want to avoid. Like why not use some science to identify a population & a practice? Or is part of the follow-up also going to point out "plus we just ran with somebody's anecdote to get the ball rolling lol"?
This is the first time I'm hearing about this...and I have so many questions.
What did this do to your body aside from the weight loss? Did you have anything else to supplement? How did you poop (there's almost no fiber in junk food)? How did you manage your blood sugar? Did the caffeine in the chocolate have an effect?
IMO it's a fun topic. Look up dirty cuts on /r/loseit, you'll probably find some funny pictures. I remember a photo of an absolute bean pole holding up his food for the day, IIRC something like a king-size oreo chocolate bar.
(Eating all the king size stuff like it's a "meal in a bar" is an amazing feeling btw)
The worst effect I had from any food was from salad. This was at a different point, when I was cutting down on fat and lining up my macros. I ate so much spinach (along with my almonds) that I over-oxalated myself and went to the ER with massive pain from my fresh new kidney stones.
Pooping is very nicely taken care of by hydration. This is a surprising one to a lot of people. It's not a problem. I drank over 100 oz a day. BTW never do that with mainly pickle juice. Funny experience.
I also sprinkled some other foods in there for variety. But maybe at least 2/3 of my meals were junk food for a substantial period. And for some solid period of time it was all junk. As much junk as possible. It's a very extreme state of mind when you are cutting that much so you need the emotional support of a pretty candy wrapper and child-like attitude sometimes.
Blood sugar is not a problem for me or in my family history--it's a safe zone. Subjective factors like this are very, very important btw. I was also exercising lots, like 2 hours hiking every day and the BWF routine from reddit, gobs of pull-ups since I could never do them before, etc. Resting pulse was in low 40s, always got grilled on that one when I went to the doctor.
> And just how much chocolate are we talking here?
IDK, how would you measure it going beyond the powder mix-ins? I consume at least 5 tbsp of raw cacao powder a day, plus all the other times when I'm eating chocolate, but that's less straight up chocolate.
(I also have a weird theory that chocolate and sweets are a very good match for my subjective psychology and it's kinda related in general to people who eat like Warren Buffett btw.)
I've done extended fasts and can see how they can be healthy if you don't push yourself too far and eat properly before and after. But why eat a chocolate bar in the middle? Once you get a fast going, you go into ketosis and start to lose your appetite.
Losing weight is a matter of consuming fewer calories than you expend, but weight is only one indicator of health. The problem with junk food (like high-sugar candy and soda) is that it is sparse in micronutrients and fiber. A person can lose weight with a diet high in junk food and low in micronutrient-dense food if they restrict their calorie intake, but that diet will result in nutritional deficiencies that adversely impact health in ways not related to body weight.
If by "periodicity", you mean temporarily switching to a junk food diet before reverting to a balanced diet, your health still suffers during the time that you are on the junk food diet. It would be healthier to restrict calorie intake while sticking with the balanced diet, which would not cause nutritional deficiencies.
Supplements can help make up for some of the deficiencies in a nutrient-poor diet, but no set of supplements will include all of the micronutrients, antioxidants, and other compounds found in a balanced diet of nutrient-dense foods. Most people are better off eating a balanced diet instead of eating junk food and then taking pills to try to fill in some of the gaps.
> Food is a complex mix of vitamins, minerals and phytochemicals (plant chemicals). Phytochemicals are an important component of food and help to reduce the risk of conditions such as heart disease, type 2 diabetes and some cancers. Vitamin and mineral supplements do not provide the benefits of phytochemicals and other components found in food, such as fibre.
> Whole foods usually contain vitamins and minerals in different forms – for example, vitamin E occurs in nature in eight different forms – but supplements contain just one of these forms.
> Vitamin and mineral supplements can’t replace a healthy diet, but a general multivitamin may help if your diet is inadequate or where there is already a well-supported rationale for you to take one. If you feel you could be lacking in certain vitamins and minerals, it is better to look at changing your diet and lifestyle first, rather than reaching for supplements.
“CICO”, so yes, you can lose weight (fat and/or muscle) on any food as long as you leverage the first law of thermodynamics and don’t eat too much. But I find chocolate kinda addictive and craving-inducing, so I would make it a rare treat and eat it on an already full stomach.
Yes, CICO for sure. Plus a bunch of other models. I had never realized how much variance there was between BMR & TDEE estimation models, etc.
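For anyone curious, a quick sketch of just how much two common published BMR formulas can disagree, before you even pick an activity multiplier for TDEE (the person below is made up for illustration):

    # Mifflin-St Jeor vs. revised Harris-Benedict BMR estimates (kcal/day).
    def mifflin_st_jeor(weight_kg, height_cm, age, male=True):
        return 10 * weight_kg + 6.25 * height_cm - 5 * age + (5 if male else -161)

    def harris_benedict(weight_kg, height_cm, age, male=True):
        if male:
            return 88.362 + 13.397 * weight_kg + 4.799 * height_cm - 5.677 * age
        return 447.593 + 9.247 * weight_kg + 3.098 * height_cm - 4.330 * age

    w, h, a = 90.0, 180.0, 35   # hypothetical 90 kg, 180 cm, 35-year-old male
    for name, formula in (("Mifflin-St Jeor", mifflin_st_jeor),
                          ("Harris-Benedict", harris_benedict)):
        bmr = formula(w, h, a)
        print(f"{name}: BMR ≈ {bmr:.0f} kcal, sedentary TDEE ≈ {bmr * 1.2:.0f} kcal")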
I ate it enough that I didn't crave it anymore and that was how I knew I had enough ;-) I had weird/off-the-beaten-path theories on "normal" food by this time though so that concept wasn't interesting to me anymore. There was no competition for what to eat and no set "healthy food" model at all.
I think a lot of it is each person is different. You have to experiment. I find low carb (possible because it basically becomes high protein for me) reduces hunger and cravings and the non-sugar sweeteners keep me sane by letting me have treats.
For real. I admit I'm a simp for nearly anything Latin-American with chocolate in it. From Milo to yeah, homebrew-style copycat recipes of something my neighbor's grandma brought over to try.
It has been well established in a wide variety of studies that cocoa (chocolate) significantly reduces blood pressure. While this is unrelated to weight loss, one might come away from this article with the idea that studies that proved the health benefits of cocoa were all nonsense.
> The key is to exploit journalists’ incredible laziness. If you lay out the information just right, you can shape the story that emerges in the media almost like you were writing those stories yourself. In fact, that’s literally what you’re doing, since many reporters just copied and pasted our text.
Did you really fool them? One common theme that I have found when looking into bitter/pungent/spicy substances was that they improved insulin response and helped blood sugar regulation. Cinnamon, sumac, pepper, chillies, etc. My intuition tells me that cocoa probably has some effect in that department.
“Bitter chocolate tastes bad, therefore it must be good for you,” he said. “It’s like a religion.”
This is exactly what I think too.
If I read the fake article, I would easily fall for his trap.
Fast food tastes good, and it's bad for health, because a huge amount of money has been spent on figuring out how to make it addictive: use addictive and unhealthy ingredients like sugar, salt, and fat, as much as regulation allows. So it's tasty and unhealthy. But this doesn't necessarily mean that food that isn't tasty is healthy.
Trader Joe’s 72% Cacao Dark Chocolate is about right for me. Per the label, a square is 170 calories, of which 44 is due to carb (added sugar), and the rest is fat and protein.
This guy doesn't really know what he's talking about. You can take any valid result, add in 20 totally irrelevant measurements, and then claim that the valid result is "p-hacking".
There's an obvious reason to suspect that a low calorie tasty snack would help with appetite control and weight loss.
I don't think it's particularly fair to push the responsibility of peer review onto journalists. Given that the state of diet science is in bad shape, I suppose journalists should be conscious of this fact and do their due diligence, but ultimately this is a failing of the academic world more so than the journalistic one.
This is sort of like saying “the problem is not that our security is bad, it’s that people post malware on the Internet. Someone should put a stop to that.”
There is no way to prevent anyone from starting a fake science journal. There is no worldwide registry of which academic journals are legitimate. They are just different groups of people who do their own thing. “Academia” is not a sufficiently coherent group of people to set standards in this way.
Maybe there should be a worldwide registry for which journals are real, but it wouldn’t be any easier to set up than a registry of which newspaper websites are real.
Journalists operate in a world where people try to trick them. That’s just how it is.
> There is no worldwide registry of which academic journals are legitimate.
I never thought about that before, and it genuinely surprises me.
Educational institutions like universities need to be accredited in order to be able to attract decent students, investment, etc.
It seems genuinely odd that there's no equivalent accreditation for journals and academic publishers, one that ensures a certain level of qualification for board members, rigor in the review process, peer review, etc.
The academic world is so much about gatekeeping... that there's no gatekeeping around journals?
Has anyone fact-checked this story? Did the author actually do a bogus study, or did he save time and money by photoshopping some fake news articles and write a fun story about not believing things at face value?
“Bitter chocolate tastes bad, therefore it must be good for you,” he said. “It’s like a religion.”
I am pretty sure that it was the opposite idea: feeling bad (stressed, depressed, angry, apathetic) is unhealthy, and chocolate can make you feel a bit better. (Same with coffee, wine, and such.)
I feel like a lot of people are missing a big point. This isn't some "annoying person" or someone causing harm. If it's this easy to do it _intentionally_, imagine how easy it is to do something with subtle, unconscious (or conscious) bias.
With every fad there are people like this behind it who benefit from selling not-too-toxic and pretty much inactive material to gullible people. Take turmeric, raspberry lactones, maca, lion's mane; the list is long...
Curcumin is actually a mild anti-inflammatory. Not as effective as aspirin, but still useful for many things.
I found either of those, as well as acetaminophen, significantly more effective than the opiates my doctor kept pushing on me for post-surgical pain relief.
Unless I'm missing some systematic flaws, this seems credible enough to say 'probably a good low-side-effect NSAID similar in effectiveness to aspirin':
Imagine thinking that eating chocolate daily in limited quantities as the most carbsy snack of the day (90% chocolate might have 8% of sugars) equates to eating any chocolate snack daily on top of a shitty diet.