Food companies distort nutrition science by funding studies. That's why we have very specific titles like "Concord grape juice, cognitive function, and driving performance," or, "Walnut ingestion in adults at risk for diabetes." Usually the results are positive.
True and important. But this particular study doesn't seem to be an instance of that phenomenon. So far as I can tell, no one involved was funded by the food industry, and what they did was to look broadly for connections between diet and cognitive function and this turned up.
Now, there are other problems with that sort of data-mining and determining whether they've successfully avoided them all is highly nontrivial (and I haven't tried). So this might still be bad science. But it doesn't appear to be funded-by-vested-interests bad.
Self-reported chocolate intake? There are so many possible confounders here. It seems quite implausible that chocolate can acutely enhance cognitive ability. Alternative explanations are more likely, although not newsworthy.
I also think that nutrition science, which is deficient enough as it is, should not be mixed with epidemiology, which seems designed to detect spurious associations.
I think the point here is not which prior to use for effect of chocolate, but how to account for the differences between actual and self-reported chocolate consumption, which itself might be correlated with cognitive ability.
Maybe people who ate chocolate but did not report it (because "eating too much chocolate is bad for you") tended to have lower cognitive ability - they did not realise that by reporting incorrect intake they introduced bias in the study.
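That reporting-bias story is easy to check with a toy simulation (every number below is invented purely for illustration): if under-reporting is more common among low scorers, reported intake correlates with the score even when true intake has no effect at all.

```python
import random

random.seed(0)

# Hypothetical setup: true chocolate intake has NO effect on a "cognition"
# score, but under-reporting intake is more common among low scorers.
n = 10_000
true_intake = [random.random() for _ in range(n)]
score = [random.gauss(100, 15) for _ in range(n)]  # independent of intake

reported = []
for intake, s in zip(true_intake, score):
    # low scorers under-report ("eating too much chocolate is bad for you")
    shrink = 0.5 if s < 100 else 1.0
    reported.append(intake * shrink)

def corr(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

print(round(corr(true_intake, score), 3))  # near 0: no real effect
print(round(corr(reported, score), 3))     # clearly positive: pure reporting bias
```

The association between *reported* intake and the score is entirely an artifact of who misreports.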
There's caffeine and sugar in it, seems reasonable, plus the pleasure of it means more endorphins. That's like three direct up-regulators of brain function.
Well the researchers say in the article that sugar should impair cognition.
Caffeine and feeling good are confounders, as they aren't unique to chocolate. The researchers are proposing that it is some kind of special ingredient in chocolate you can't really get anywhere else easily that enhances cognitive function.
From personal experience, every bite of chocolate provides me a moment of bliss and happiness. I think happiness is the real reason for improved brain function. I find myself most productive and most capable when I am happy.
That's called dopamine. The chocolate you eat probably has sugar in it (otherwise it tastes horrible; try raw cacao), and sugar is a drug associated w/ the release of dopamine in the brain.
Every luxury food or drug tastes horrible until you've learnt to enjoy it. Remember your first beer or coffee? My guess is sugar would initially taste somewhat foul ('sickly sweet') if one was brought up without it.
EDIT: I'm not recommending that people start snorting or imbibing cocoa powder. However I think that if this were the only form of chocolate then one would quickly learn to enjoy it.
From my personal experience, I have to agree with wapapaloobop. I put a spoonful of raw cacao in my morning coffee because I love the taste. However, I drink and eat no sugar whatsoever, and I found the taste disgusting (in the sense of spitting it out) when I accidentally drank from my girlfriend's coffee some time ago.
I agree with you. I enjoy straight espresso and good black coffee. If I want to 'sweeten' it I add some milk. When getting coffee at shops I have mistakenly received one of those super sweet caramel, sugar whatever they are called and almost had to spit out the sip.
I wonder if it is learned or if people who like sweets have different taste buds than those who do not. I also like really hoppy beer and bitter candy.
It is partially learned, partially a built-up tolerance.
Anecdata: About three years ago I drastically reduced my intake of carbohydrates, including all forms of sugar and starch. After around two months, I tried tasting some things that I had stopped eating -- fruit juice, a bit of pie -- and discovered that they were horribly sweet to my tongue.
The amount of sugar going into foods has increased. I remember there being around 20 grams of sugar in most sodas. Now, there is typically about twice that amount, and Honest Teas are "lower in sugar" with 19 grams. The US food industry's tactic is this: foist lots of sugar on people when they're kids and don't know any better. Then, when they're adults, you can get away with plying them with lots of high fructose corn syrup, which is super cheap due to subsidies.
It's actually delicious, and each bean has its own flavor. If you live near Silicon Valley, head up to Dandelion chocolate and you can try some there (work up to it by getting used to the bitter chocolate first).
There are 90% and 95% cocoa chocolates that taste really good; you don't need a lot of sugar to take the bitterness off. Look for shops making their own dark chocolate, or smaller brands. The most well-known brand people associate with dark chocolate tastes bitter and powdery (I suspect they use too many cocoa solids and not enough fat).
Note that cocoa itself makes the brain release large amounts of dopamine.
It's better to wait for multiple, large randomized controlled trials to be performed before making claims about improved brain function.
But maybe, for now, we can eat more chocolate for another ~proven benefit: "Flavanol-rich chocolate and cocoa products may have a small but statistically significant effect in lowering blood pressure..."
This old chestnut has been published in mainstream media for decades; the Daily Mail in the UK must feature it at least once a year (on the odd day a cancer scare doesn't exist). I remember being given chocolate to take into every exam, and we're going back to the early/mid 90s here.
There is a radio program in the UK called More or Less, whose motto is "Correlation is not causation". Admittedly, I have neither read the paper, nor did I fully understand the section on proving the direction of influence, but I can't help but feel this research makes pretty poor use of statistics and logic, however much I wish to believe the result. Is all nutritional research this bad?
You haven't read the original paper, don't even understand the methodology they used to reach their conclusions and yet you "can't help but feel this research makes pretty poor use of statistics and logic"?
It seems something is bad here, but I don't think it's the nutritional research. If you can't be bothered to actually read the paper, why do you think you're able to critique it?
That wasn't quite what I said. I read and understood the first part, but I didn't understand how they tried to justify it in the second half without a causal link.
Hmm, yes, that is true. But "science" doesn't pull explanations from the air (that would be religion or superstition, ya know).
Science first finds unlikely or unexpected correlations, unlikely or unexpected measurements, and seeks to understand from whence they came. Are they artifacts of the observation process? Tainting of the measurement process? Or spurious correlations, such as the one between height and spelling ability (both of which are strongly correlated with age, up to a certain point)?
This particular research announces an unexpected correlation and goes to pains to show that it is real, and not spurious. This is the observation phase, the oh, didn't expect that phase. Explanation will come later, likely after someone designs a more accurate and precise experiment to better correlate chocolate consumption and cognitive ability.
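The height-and-spelling case is easy to reproduce: simulate children whose height and spelling score are both driven by age, with no causal link between the two, and they still correlate strongly (a sketch with made-up coefficients):

```python
import random

random.seed(1)

# Hypothetical sketch: height and spelling ability are both driven by age
# (in children), so they correlate even though neither causes the other.
n = 5_000
age = [random.uniform(5, 15) for _ in range(n)]
height = [80 + 6 * a + random.gauss(0, 5) for a in age]  # cm
spelling = [2 * a + random.gauss(0, 2) for a in age]     # test score

def corr(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

print(round(corr(height, spelling), 2))  # strongly positive, no causal link
```

Conditioning on age (the common cause) would make the correlation vanish.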
If you go outside with a measuring tape and a stopwatch you'll discover there's an unlimited number of measurements you can make. So that can't be how scientists operate. They are guided by prior theories.
>[science] doesn't pull explanations from the air
Well, people think about problems and then ideas sometimes pop into their heads. No one understands how yet. So explanations do come out of the air, metaphorically speaking! That doesn't make them false. It's what happens afterwards that counts. Most ideas are rejected by criticism and a few go on to be tested.
you'll discover there's an unlimited number of measurements you can make
Absolutely correct.
So that can't be how scientists operate
I'm torn between writing "citation needed" and "unwarranted conclusion from stated premises" so I'll go with both.
The head of the department where I did my physics undergrad spent years, decades, measuring everything there was to measure about plasma: Energy input, energy output, temperature of the phase change (plasma is a distinct phase of matter), spectral distribution, etc., etc., etc.
He wrote many papers, primarily on his measurements, and also on how he refined and improved the measurement process. I don't recall all that well, but I don't believe there were many hypotheses, let alone theories. He was about data, not models.
There are theorists who spend the bulk of their time with pens and imagination, occasionally - for some, rarely - checking in with the empirically real world.
A very few of these have names we celebrate for changing the shape of the world - Darwin, Newton, Einstein. And these particular luminaries knew the shape of the real world, knew existing theory didn't fit. (Newton is famous for saying he stood on the shoulders of giants. Kepler was one such. Kepler measured and measured and measured and devised a relation. No theory, just a relation to link his measurements.)
Without the many scientists who do nothing but measure, measure, measure, scientists many of us would consider tedious drudges, these luminaries could never have known that theory didn't fit data. Or that data didn't fit intuition. Or that data was just plain weird.
Science does indeed proceed this way, bottom up, from measuring anything and everything of interest (to someone - maybe not to most of us, but to someone - I invite you to read The Map that Changed the World).
Science also proceeds from the top down, from theory to measurement. But often because there was another theory that failed to align with existing measurement. Think general relativity and its better prediction of the motion of Mercury, which failed to align with Newtonian gravitation, which itself would never have arisen had Kepler not spent so many hours measuring, measuring, measuring. (And general relativity was a generalization of special relativity, which was created to explain a particular vexing measurement, the constancy of c. Einstein did not set out to reinvent gravity; he started by simply explaining how c might be constant. He got to gravitation when he generalized relativity to non-inertial reference frames. That's what the "special" meant: inertial reference frames only.)
We measure first. Just because we are that curious.
I have to take this with a grain of salt considering the fake research on how eating chocolate can make you lose weight was just revealed to be a deliberate hoax:
I remember there was a similar headline to this — comparing the high ratio of chocolate-eating to non-chocolate-eating Nobel laureates — in the Fall of 2012.
My Quantitative Methods lecturer used it as a perfect example of the correlation/causation fallacy; it was certainly a good way of lightening the mood at a 9am Friday lecture :)
The study is still 'correlation-digging'. Bonferroni won't fix the problem that these kinds of studies will throw up statistically significant associations that provide no insight into anything meaningful, due to confounders, biases, poor design etc. There is no equivalent xkcd however.
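To illustrate: a Bonferroni-style correction does rein in chance hits when many associations are tested in one dataset, but it says nothing about confounding or design. A sketch with simulated, deliberately unrelated exposures (1.96 and ~3.48 are the approximate two-sided z cutoffs for alpha = 0.05 and 0.05/100):

```python
import math
import random

random.seed(4)

# Hypothetical sketch: test 100 "food exposures" against one outcome.
# Every exposure is unrelated by construction, so every hit is a false one.
n, n_tests = 200, 100
outcome = [random.gauss(0, 1) for _ in range(n)]

def corr(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

uncorrected = bonferroni = 0
for _ in range(n_tests):
    exposure = [random.gauss(0, 1) for _ in range(n)]
    # Fisher z-transform of the sample correlation
    z = abs(math.atanh(corr(exposure, outcome))) * math.sqrt(n - 3)
    uncorrected += z > 1.96  # per-test alpha = 0.05
    bonferroni += z > 3.48   # approx cutoff for alpha = 0.05 / 100

print(uncorrected, bonferroni)  # Bonferroni prunes most chance hits
```

A confounded association, unlike these chance hits, would sail through the stricter threshold too.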
There's more to it than just the things you mentioned, and I'd argue that it's an even bigger problem. The fact that studies are adaptively chosen and performed based on the results of past studies on the same (or related) data leads to increased instances of false discovery. This is something that is directly addressed in some very recent work (not coincidentally making reference to this exact XKCD comic):
Freedman's paradox refers to adaptive data analysis within one dataset. The reusable holdout is designed to permit adaptive analysis on the same dataset without inflating the false discovery rate. The problem doesn't apply if you are collecting new data and testing the hypothesis, that's the ideal scenario.
Yeah, Freedman's paradox is definitely part of what I'm alluding to. The other aspect is that for widely studied phenomena, some groups are likely to come to wrong conclusions just due to statistical error. E.g., if 100 groups independently investigate whether "Eating chocolate is associated with improved brain function" (and we'll assume it's false, just to make my point), a few are likely to conclude that it's true. Even worse, some studies with "expected" results may not be published (either by choice of the investigator or by the reviewers); instead, the "interesting" or "exciting" results get published.
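That arithmetic is easy to demonstrate: simulate 100 groups each running a null study at alpha = 0.05, and a handful come out "significant" by chance (the group sizes and test below are invented for illustration):

```python
import random
import statistics

random.seed(2)

# Hypothetical sketch: 100 independent groups each test a true null
# (chocolate has no effect) at alpha = 0.05. Some still "find" an effect;
# if only those get published, the literature looks positive.
false_positives = 0
for study in range(100):
    chocolate = [random.gauss(100, 15) for _ in range(50)]
    control = [random.gauss(100, 15) for _ in range(50)]
    # z test on the difference in means, using sample variances
    diff = statistics.mean(chocolate) - statistics.mean(control)
    se = (statistics.variance(chocolate) / 50
          + statistics.variance(control) / 50) ** 0.5
    if abs(diff / se) > 1.96:  # two-sided test at alpha = 0.05
        false_positives += 1

print(false_positives)  # expect roughly 5 of the 100 null studies to "succeed"
```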
I don't think that anyone would take the results of this article as a statement of absolute fact. In the discussion they mention some more (better controlled) studies investigating the role of cocoa flavanols in "cognitive tasks". Any longitudinal study with humans suffers from confounders, biases, and poor design because we can't do controlled, long-term research on humans. You just have to take these limitations into account when interpreting the results and use any findings as areas for possible further investigation.
Yes, that's fine, I was pointing out that correcting for multiple hypothesis testing does not deal with the issue cited (albeit rather abruptly) by yxitcti.
Correlation-digging studies, absurd or otherwise, have led to an awful lot of peer-reviewed literature detailing a raft of "molecular mechanisms" by which flavonoids may be implicated in improved brain function.
To pick one at random: "Flavonoids exert a multiplicity of neuroprotective actions within the brain, including a potential to protect neurons against injury induced by neurotoxins, an ability to suppress neuroinflammation, and the potential to promote memory, learning and cognitive function. These effects appear to be underpinned by two common processes . . ."
There seems to be a genetic component to how we perceive the taste of some of these artificial sweeteners [1]. Personally, I also think stevia is gross -- bitter and metallic aftertaste.
The best artificial sweetener for me is sucralose (Splenda). I can't tell it apart from sugar. I've also never seen any results about it that would concern me, and it's been studied more than almost anything else we eat.
"Holt SHA, Brand Miller JC, Petocz P. An insulin index of foods: the insulin demand generated by 1000-kJ portions of common foods. Am J Clin Nutr 66, 1264-1276"
I highly recommend it as a source for glycemic response of different foods; some things are entirely counterintuitive.
I am not on a low sugar/flour/starch diet right now, but I'm trying to switch to one. I have a family history of diabetes. I've mostly cut out sugar. But cutting out flour/starch has been a bit harder.
Those with 'improved brain function' generally eat more chocolate. Now I can say with impunity "I prefer chocolate because I have improved brain function."
Chocolate improves brain function. Now I can say with impunity "Chocolate is improving my brain function."
I won't even argue about whether the study is biased or not. Let's just say chocolate is good for the brain. But is it advisable for people to consume it in a certain amount daily?
Well, back in the day, there were numerous studies about the benefits of tobacco too.
It would be good to run a longer study that takes into account that cocoa contains cadmium, which is quite toxic and might increase the chance of cancer.
Cheeseburger + chocolate milk shake.
Quinoa with kale.
What would you say is more healthy?
The comparison is idiotic.
If a person suffers anorexia their whole diet is unhealthy and has to become healthy. Their relationship with food is also unhealthy.
The former is a much more suitable meal for someone having difficulty meeting their caloric needs. If they have it a couple of times a day, every day, then we know the diet is unhealthy.
The latter can be a nice addition to a healthy diet.
The number of downvotes on my first comment suggests that quite a lot of the community here has an unhealthy relationship with food.
Everything in moderation. I read a report a while back about the people who provide food for post-operative care (IIRC it was mostly elderly people) where they threw out the normal diet guidelines and went for "stodge": pasta, custard, pies, cake, basically anything that was easy to digest and high in calories, as they found that gave the best outcome post-surgery.
It wouldn't be the diet you'd eat in other circumstances but context matters.
In my personal experience, eating chocolate pushed me into IBS trouble; I only realized such a condition existed after 15 years. I stopped eating chocolate five years ago. Now I am happy.
The brain is a sugar sucker. Eat food with natural sugar (like fruit) and you'll still be happy. Chocolate is just bitter junk with sugar added to please your brain (which is the trick). Try eating chocolate without sugar and see if it still improves brain function.
It's kind of cliché to target excess consumption and claim it's terrible when you took it to the extreme before going cold turkey. All pleasures are best enjoyed in moderation. It's an adage as old as good and evil itself.
A series of follow-ups might look like this:
1: GWAS to look for genetic association with chocolate consumption.
2: Mendelian randomization to assess whether an inherent predisposition for increased chocolate consumption leads to increased cognitive function.
The same approach has been used to assess whether there is a dose-response of alcohol on mortality (the answer appears to be "yes," and there is no U-shaped curve).[2; non-paywalled]
This is mostly just to say that I'm not convinced by such tortured data in an association study, but if there is enough interest in the hypothesis, then the path forward to assess causality is clear.
1 = http://www.sciencedirect.com/science/article/pii/S0195666316...
2 = http://www.bmj.com/content/349/bmj.g4164
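For intuition, the Mendelian-randomization idea in follow-up 2 can be sketched with a simulated genetic instrument and the simplest possible estimator, the Wald ratio. The variant, effect sizes, and confounder below are all invented; real MR analyses (as in [2]) involve far more care about instrument validity:

```python
import random

random.seed(3)

# Hypothetical sketch of Mendelian randomization via the Wald ratio.
# A genetic variant G raises chocolate intake; an unobserved confounder U
# raises both intake and cognition. The naive regression is biased, but the
# instrumented estimate recovers the true causal effect (set to 0 here).
n = 20_000
g = [random.randint(0, 2) for _ in range(n)]  # allele count 0/1/2
u = [random.gauss(0, 1) for _ in range(n)]    # unobserved confounder
intake = [0.5 * gi + 1.0 * ui + random.gauss(0, 1) for gi, ui in zip(g, u)]
true_effect = 0.0                             # chocolate does nothing
cognition = [true_effect * x + 2.0 * ui + random.gauss(0, 1)
             for x, ui in zip(intake, u)]

def slope(xs, ys):  # simple regression slope: cov(x, y) / var(x)
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    var = sum((x - mx) ** 2 for x in xs) / len(xs)
    return cov / var

naive = slope(intake, cognition)               # biased upward by the confounder
wald = slope(g, cognition) / slope(g, intake)  # instrumented estimate, near 0

print(round(naive, 2), round(wald, 2))
```

Because the variant is (by assumption) randomized at conception and affects cognition only through intake, the Wald ratio sidesteps the confounder that distorts the naive estimate.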