Eating Too Much Rice Almost Sank the Japanese Navy in 1882 (medium.com)
152 points by vinnyglennon on Apr 16, 2015 | 61 comments



Pcrh has already linked it below, but it's worth comparing this to the fascinating "Scott and Scurvy"[1], a similar article about scurvy, and how the widely-known citrus cure became "lost knowledge" for a while.

The general thrust of "Here's an empirical cure that people won't use because they have a differing theory (that this deficiency disease is actually due to infectious agents)" is of course similar. But the beriberi story seems more like people foolishly ignoring an empirical cure due to, well, having a contradicting theory; it doesn't seem to have the same component of "here's a series of coincidences that seems to undermine the empirical evidence for the cure in the first place".

Also, note that while Takaki correctly identified beriberi as a deficiency disease, he misidentified it as a protein deficiency! Yet his cure still worked well enough, when he could get people to follow it -- which, note, was on top of the difficulty of getting it officially implemented. That's another difference -- if there was any problem getting individual sailors to drink their lemon juice once the British navy started serving it, Cegłowski doesn't mention it.

[1] http://idlewords.com/2010/03/scott_and_scurvy.htm


The OP article mentioned that the British doped the sailors' booze with lemon juice, at least at first.


> Limes eventually replaced the lemons

The switch to limes resulted in many more cases of scurvy; the Brits switched to limes because they were cheaper than lemons. The citrus cure for scurvy was then lost until the discovery of vitamin C. I recall an article about this being on HN.


Perhaps it was this article [1]. It's fascinating how the cure for scurvy had to be invented twice.

[1] http://idlewords.com/2010/03/scott_and_scurvy.htm


I have used this wonderful article for years now as a very good example of how science can arrive at a conclusion that is very, very wrong. In the case of scurvy, dangerously wrong. Science is still the best method of investigation we have, but the history of vitamin C and scurvy is a powerful reminder that in spite of our best efforts and intentions, nature can still surprise us.


That's one hell of a post. Thanks for sharing it.


A similar, little-known disease caused by a lack of dietary diversity is pellagra. It's caused by a lack of vitamin B3 (niacin). In the early 1900s it became an epidemic in the southern United States, due to (largely poor, rural) people's diets consisting almost exclusively of corn.

The most interesting part of the disease is that corn actually contains vitamin B3, but it cannot be absorbed by humans when the corn is eaten by itself. Native people always treated corn meal with lime, which allowed the vitamin B3 to be digested. When Europeans came to the Americas and learned about corn from the native people, they thought it was stupid to put lime in their food, so they left that part out, thus causing a disease that afflicted 3 million people and killed over 100,000.

http://en.wikipedia.org/wiki/Pellagra#History


This is true. A few clarifications:

· the “lime” we’re talking about here is not the fruit, but the strong alkali slaked lime, i.e. whitewash or mortar, made by roasting limestone. You can apparently also use lye (Drano), and I wouldn’t be surprised if fermented urine (lant) also worked, but I really recommend that you know what the fuck you’re doing before you start trying to cook with Drano or construction materials. This may help explain the Europeans’ reluctance.

· nixtamalization is done on the whole corn, not on the corn meal. Nixtamalized corn is also known as “hominy”, and if you’re from the US you’ve probably heard of it under that name. It’s been common in the US since colonial times.

· When you nixtamalize corn, you don’t leave the lime in your food. You have to rinse it out after the long, slow nixtamalization process. Otherwise, the result is really bitter.

· Nixtamalization is indeed a traditional practice with a broad pre-Columbian distribution, but it is almost unknown in South America, even though the native cultures here in South America are much more intact and in many ways less assimilated than in North America; this suggests to me that it may not be true that “Native people always treated corn [...] with lime”. I mean, maybe Tawantinsuyu universally nixtamalized its corn, and then the Spanish stamped out the practice so thoroughly that you can’t find nixtamalized corn in the grocery stores there today, even as a special holiday product like chuño; but it seems much more likely to me that nixtamalization just was not universal or even common there in pre-Columbian times.

The pellagra epidemic is one of my favorite examples of how ignorance kills people.


This 1964 patent claims that you can nixtamalize corn with ammonia, so sufficiently concentrated lant should work: https://www.google.com/patents/US3117868


Why couldn't the cold, laboratory-based, German "empirical" approach have led to the same cure? A commitment to the notion that the disorder was bacterial/viral in nature could cause problems. But empiricism shouldn't, right?

They could just run a cold, hard study where they isolate 100 people in a lab: half only get white rice, while half get rice and barley. Then see what happens.


How do you get the idea of isolating 100 people in a lab, half with only white rice, and half with rice and barley? You could try hundreds of other kinds of experiments, so why this one in particular? Especially if you think the cause is bacterial/viral in nature...

This is where epidemiology gets very useful: you look at the data for correlations, and from there you can design experiments to check for causation.
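
A minimal sketch of that two-step workflow in Python, with made-up numbers (the group sizes and case counts are hypothetical, not Takaki's actual data): first tabulate disease incidence by diet, then ask how surprising the gap would be if diet had no effect.

    # Toy version of the epidemiological step: compare beriberi incidence
    # across two diets and test whether the gap exceeds chance.
    from scipy.stats import chi2_contingency

    #                  [cases, no cases]  (hypothetical counts)
    white_rice_only = [30, 70]   # 30 of 100 sailors fell ill
    rice_and_barley = [3, 97]    # 3 of 100 sailors fell ill

    chi2, p, dof, expected = chi2_contingency([white_rice_only, rice_and_barley])
    print(f"chi2 = {chi2:.1f}, p = {p:.2g}")
    # A tiny p-value says the correlation is real; only a controlled
    # experiment (randomizing who gets which diet) can establish causation.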


I think what you say makes perfect sense. And I agree completely with the question of "how do you decide what to test?" But in that respect, are the two approaches outlined in the article really at odds with one another? Would it have contradicted the "German" approach to use epidemiology to lead you towards a hypothesis?

My main point is that it seems the real problem wasn't so much one approach over another, as one side having an unjustified commitment to bacteria/virus as a root cause.


The "German" approach could have got there in the end but slowly and with huge costs while a bit of common sense could have been quicker, cheaper and saved lives. It reminds me a bit of the obesity epidemic in the US which is probably down to too much consumption of sugary stuff but where the medical establishment only wants to look at FDA style lab studies which you can't actually do - you can't bring up a bunch of kids in a lab, one on sugary stuff, one on a more traditional diet. So millions die unpleasantly of something not much harder to fix than fixing beriberi by putting barley in the diet. At least not much harder to fix from a science point of view. Quite hard economically and politically as there are massive financial interests in pushing sugary junk on kids. I guess that's a bit like the situation in the article where politics were in play.


This is off-topic, but for anyone interested, the manga series "Jin" by Motoka Murakami has a very nice depiction of some of the health issues in Tokyo towards the end of the Edo period. The protagonist is a doctor trained in modern medicine who gets sent back in time, and we see his treatment efforts without modern equipment or medicines.


I wonder if Takaki would have had more success if he marketed his solution in the terminology of the dominant paradigm. For example: "Barley seems to contain some kind of chemical that kills the germs that cause beriberi."


That's not just a change in terminology, that would be lying. Please don't do that when marketing anything, least of all medical things.


It would be lying if Takaki knew that beriberi wasn't caused by germs.

But back in the 1870s, he didn't know that. He only knew that something in barley was helping to prevent beriberi, and he was in fact mistaken about what that something was. (He thought it was proteins when it was actually vitamin B1.) So the Antibiotic Barley Theory could have been a reasonable hypothesis.

Vitamin C is named "ascorbic" (i.e. anti-scurvy) acid because the people who discovered it were looking for a substance that prevented scurvy, no matter what the mechanism of action. The mechanism of action was only known for sure after the substance was identified.


I wonder why barley was/is considered a downscale grain compared to rice?


It's because polished white rice took more effort to make, so it was more expensive, so it became associated with the upper classes and success.

http://ask.metafilter.com/149004/History-of-white-not-wholeg...


Plus white rice tastes better.


Roman legionaries also disliked being forced to eat barley and meat in lieu of wheat.

A lot of it must be to do with taste and quality, but also a desire to not eat like a poor person or barbarian.


Try making a "rice bowl" with 100% barley and you will understand. It will taste horrible.

(A little bit of mixed barley (and other grains) is fine, but it doesn't take a huge leap to see why people would prefer pure rice.)


Hah. It's funny how cultural taste is. Me? I strongly prefer barley to rice, and brown rice to white rice. White rice is... comparatively quite plain, to my palate. Sure, I'll eat some, especially if there is some sauce or something to put on it... it's not offensive, but there isn't a lot of flavour, and the texture is comparatively uninteresting. Barley, on the other hand, has a very interesting, rich texture, and a nice nutty flavour. I will eat a bunch of barley.

As a kid, my parents would make me mochi. They were hippies, so it was the brown rice mochi. Lately, I've been craving the stuff, and every time I ask my Asian friends, they get me white-rice-based mochi, a very different (and in my opinion, much less flavorful) food.

I eventually figured out where I could mail-order the brown rice mochi from. So good! But, again, it's made for hippies, so unlike the white rice mochi, which comes in individual plastic packs, this stuff comes in big sheets you need to cut up. Hippies. Grumble.

But yeah, I think a lot of this is just cultural factors. I can't imagine eating tofu with nutritional yeast if I didn't have it for breakfast nearly every day of my childhood. Or nutritional yeast at all, for that matter. That stuff looks like fish food, and really, what else do you put in your mouth that is that color? But it's so good on popcorn.


Whoah, hold up: barley being considered horrible seems like a cultural thing. Certainly in Scotland and Czechia, barley/kroupy is considered a damn tasty ingredient. I even use it in place of risotto rice (I acknowledge that it's not the same thing as a rice bowl).


Well, it's cultural in the sense that how you cook rice or barley is cultural, but if you cook 100% barley the way you cook rice in Japan (or Korea), most people will agree that it objectively tastes worse.

(Well, I guess the way some people refuse to have even 5% barley is probably cultural...)

I can't speak for Japan, but in case of Korea, generally people don't consider barley horrible. They just find it a bad substitute for rice.


I think your last sentence is the one you might have led with in your previous comment. Barley is perfectly good, prepared plain or in other dishes. It just isn't rice and anyone expecting rice and receiving barley would likely be horrifically disappointed. I know I would (and I love barley and almost never eat rice)!


What? I occasionally make barley in the same way I would rice, and it tastes fine. I add some salt, butter and other flavorings, but I do that for rice (or noodles, or potatoes) too.


Historically, barley was always considered animal fodder, not human food.


Does anyone have a source for the comparison between white rice and barley vitamin B1 content? I'm just wondering how great the difference is, that it can help with the B1 deficiency.


War is Boring has some really great writing, both current military journalism and historical pieces. If those interest you at all, the blog is a must-read.


Summary:

- a diet consisting of white rice lacks vitamin B1, which causes a debilitating disease called "beriberi"

- a naval officer named Kanehiro Takaki discovered this (indirectly) using epidemiological techniques, determining that those who ate more balanced diets did not develop the disease

- mixing barley in with rice basically eliminated the disease from the Japanese navy

- due to some combination of politics and worry that the solution was "superstition" (i.e., it was similar to a folk remedy and was not laboratory-tested; many Japanese academics of the time did not trust epidemiological/statistical techniques), the Japanese army rejected the solution and continued to have serious beriberi outbreaks, up until 1905, when they finally decided to give it a chance


One underlying reason for the split between the Imperial Japanese Army and Navy was that the Army was generally made up of members of the Choshu clan, who were rivals of the Satsuma clan that was mostly responsible for the Navy.

The two clans put their rivalries aside temporarily to overthrow the Tokugawa Shogunate, but from the Meiji Era onward the rivalry was responsible for the lack of coordination between the Army and Navy through the Second World War. [1]

[1] http://en.wikipedia.org/wiki/Gunbatsu#Hambatsu


> the Japanese army rejected the solution

The article had a one-sentence paragraph about this attitude:

   Stubborn and blind to the truth, the army was marching towards its biggest beriberi disaster ever.

I love those first words:

   Stubborn and blind to the truth

Those words so perfectly describe so much of what's happening in the world today. Not 100+ years ago. Today. It reminds me of a quote attributed to Max Planck:

   Science advances one funeral at a time.


Very similar story to the discovery and loss of the scurvy prevention/cure: http://idlewords.com/2010/03/scott_and_scurvy.htm


All the way down this article I was expecting it to be about scurvy, as it hits many of the same issues. The only difference seems to be that scurvy is a vitamin C deficiency and beriberi a vitamin B1 deficiency. But it's the same overall concept.


Greatly appreciate the summary. Thank you.


Thanks! TLDRs are looked down upon by some people. But there are so many new articles you could read each day that a short abstract you can use to decide whether to continue in depth is, in my opinion, essential.


Here's how I previously explained it: "HN has a strong bias toward allowing people to allocate their time based on their preferences. The tl;dr convention is tremendously useful because it allows people to make an informed decision as to whether or not to read the article, based on a summary which is (typically) more accurate than the headline."

It's clear HN appreciates this sort of thing -- the last time I had a comment as heavily upvoted, it was another summary. [0]

[0] https://news.ycombinator.com/item?id=8724105


Yeah, people equate TLDR with laziness but there's an infinity of new stuff to read so I want to be able to quickly evaluate if I want to read something before spending too much time on it.

I find that articles with a vague, dramatic opening that gets you wondering where it's going don't work for me anymore.


Same here. If I see an in media res opening on a supposedly informative article, I skip down a few paragraphs. If I still can't see the meat of the matter, I skip it.

Inform me, don't try to "entice" me towards your embellishments. Ain't nobody got time for that.


I, for one, don't really equate TLDR with laziness. I equate it with "get rid of all that unnecessary bloat and clickbait headlines".

The headline of this article, in my opinion, should have been about "too monotonous a diet", not "too much rice".


Summaries are great, and extremely useful. But, to me, “TL;DR” just inspires a feeling of contempt: “You think a 1700-word article is too long for you to read? What the fuck is wrong with you? Do you have the attention span of a goldfish? Have you ever seen a book?” (using, as my example, https://news.ycombinator.com/item?id=9359426.)


A 1700-word article isn't too long to read. But there are dozens of 1700-word articles (and longer) that make the HN front page every day, and most of us want to know which of those are worth reading for us -- whether the article has the kind of "meat" we're going for, whether it's mostly fluff, whether the technical details are relevant to our particular interests, etc.

One reason we see so many tl;dr comments even on a very technically inclined place like HN is that articles very often bury their content. We want a thesis out front; we don't want to be strung along for 1600 words and then find the interesting bit between 1601 and 1700.


So, I agree that summarization is very useful, precisely because it allows you to focus your attention on interesting things.

However, I also think that the phrase “too long; didn’t read” represents a dysfunctional attitude that turns this blessing that is the abundance of information into a curse: if you spend three hours reading 100-word abstracts (let’s say, 300 of them), at the end you will probably have learned almost nothing. If you instead spend three hours reading 1000-word articles (let’s say, 40 of them), you will probably have learned a small amount, which is an improvement. If, by contrast, you spend three hours reading 50 000 words (say, a few chapters of a Murakami book), or a quarter of Olin Shivers’s dissertation, you will probably acquire some new ideas that will stay with you for the rest of your life. There’s a point at which this becomes counterproductive; if you read a book in a day, it won’t stay with you as well as if you read it over the course of a week, and the week also gives you time to reflect on other connections.

My thesis: being able to focus your attention on interesting things is only valuable if you then do actually focus your attention on them. The attitude expressed in “tl;dr” is that focusing your attention is itself undesirable.


There are places where tl;dr implies a lack of focused attention. On HN, it often implies "I couldn't get the basic idea quickly enough to be sure this is worth focusing my attention on; can someone summarize for me so I can make a better-informed decision?"


What do you think of academic papers that all start with an abstract/tldr?

It's not about low attention spans, it's about there only being so many hours in a day. It's about helping people find what they want to read because there is just too much content being produced every day.


The first six words of the comment to which you are replying are, "Summaries are great, and extremely useful." That's what I think about the abstracts of academic papers. Academic paper summaries are not always exemplary, but they are usually very good. The only genre of writing I know that regularly has better summarization than the academic paper is the legal brief.


There's a lot of emphasis in this article on how that era's ideas of what constituted science and medicine blinded them to the truth and cost them lives.

What analogous contemporary ideas will people be laughing about in the 22nd century? What parts of how we view science and medicine are costing lives?


The thing that springs immediately to mind for me is the anti-vaccination movement. I really do think it'll be looked back on in a century as a monumentally stupid backlash based on poor 'science' and scaremongering.

I think a lot of what is taken as gospel in the field of nutrition will probably be looked back on as barmy or the result of ignorance. The low-fat thing already seems to be going that way.


I thought the anti-vaccination movement was already considered a monumentally stupid idea...


It would be more analogous to the story if through a sheer number of improbabilities the antivaxxers are vindicated in the 22nd century.


Actually, the anti-vaccination movement has great relevance here. Much like Takaki's effective treatment was lampooned because of its similarity to archaic traditional medicine and surely got wrapped up in the politics of the time, the anti-vaccination movement has similarly become wrapped up in politics and can no longer sanely be discussed.

Takaki most likely had to contend with people thinking his cure for beriberi was similarly "monumentally stupid" and "based on poor 'science' and scaremongering", which prevented an objective examination of his experiments.


There was a study done on survival rates of acute heart patients, and the effectiveness of cardiologists. The data showed that when cardiologists were away at conferences, mortality rates went down 7 percent or so.

Yet the American Heart Association treated this study as a curiosity, and doctors deemed it only worthy of snide comments about using it to justify conference trips to hospital administrators.

http://freakonomics.com/2015/04/09/how-many-doctors-does-it-...

As for science, the two examples I'll trot out yet again are Pluto and kibibytes. Pluto is, of course, not a planet, yet many cling to the old nomenclature simply because that's what they learned growing up. More relevant for HN is the usage of kibibytes for 1024 bytes and kilobytes for 1000 bytes; uptake on replacing the more traditional usage is similarly slow, for some reason.
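
For concreteness, here's the numeric gap the two conventions encode, which grows with the prefix (a quick Python sketch, nothing more):

    # Decimal (SI) vs. binary (IEC) size prefixes diverge as units grow.
    kB, MB, GB = 10**3, 10**6, 10**9        # SI: kilo-, mega-, gigabyte
    KiB, MiB, GiB = 2**10, 2**20, 2**30     # IEC: kibi-, mebi-, gibibyte

    for si, iec, name in [(kB, KiB, "kilo/kibi"), (MB, MiB, "mega/mebi"),
                          (GB, GiB, "giga/gibi")]:
        print(f"{name}: binary unit is {100 * (iec - si) / si:.1f}% larger")
    # kilo/kibi: binary unit is 2.4% larger
    # mega/mebi: binary unit is 4.9% larger
    # giga/gibi: binary unit is 7.4% larger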


> There was a study done on survival rates of acute heart patients, and the effectiveness of cardiologists. The data showed that when cardiologists were away at conferences, mortality rates went down 7 percent or so.

The ramifications of that aren't clear, though, as it could be something as simple as the risky/difficult procedures being delayed when the best docs are out of the hospital.


The difference with kibibytes is that they weren't discovered scientifically; someone (or some group) just decided to force SI prefix normalization onto inherently binary units. Most people I know don't use power-of-two units anyway, though, so those two factors alone would definitely slow uptake.

Pluto is sort of the same, though at least there's an empirical justification in that there are other similarly sized bodies that shouldn't qualify as planets, so a consistent definition was chosen to exclude them and Pluto. But because it's a definitional change rather than a new discovery, people can be slow to adopt it.


I believe the point of this (and the highly-recommended article linked above about Scott/Vitamin C/scurvy), is that those involved are, as you say, blind to important truths. By definition, we cannot know which of our contemporary conclusions will be seen by future scientists as backwards or dangerously misleading.

In my opinion, the unsolvable nature of this kind of problem is a lesson in the importance of humility. After a lot of hard work and rigorous investigation, it is easy to conclude that some conclusion must be correct. Unfortunately, science can only work within the range of known observations, and new evidence can completely undermine supposedly "certain" conclusions. Much to the annoyance of the people reporting on scientific discoveries, or the people who demand certainty in any report so they can rely on it (e.g. for military purposes), there is no "final" conclusion in science.


I think it will be our ignorance of the harm caused by the new chemicals we've invented or put into widespread use over the last 50 years.


Over-use of p-values without understanding them? https://xkcd.com/882/
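
A quick simulation of the comic's jelly-bean scenario (a sketch assuming numpy and scipy; the sample sizes are arbitrary): run twenty tests where no real effect exists, and on average about one will still clear p < 0.05.

    # xkcd 882 in miniature: test 20 true-null hypotheses at alpha = 0.05
    # and, on average, about one will look "significant" by chance alone.
    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(0)
    false_positives = 0
    for _ in range(20):                  # 20 jelly-bean colors
        control = rng.normal(size=100)   # neither group has a real effect
        treated = rng.normal(size=100)
        if ttest_ind(control, treated).pvalue < 0.05:
            false_positives += 1
    print(f"{false_positives} of 20 null comparisons came out 'significant'")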


A good question.

I assumed the OP was submitted as a parallel to the (possible) connection of some mental illnesses to inflammation and infection, just seen on HN: https://news.ycombinator.com/item?id=9388176

Both feature doctors noticing weird chains of causation that accidentally cure mysterious ailments.


Personally, I suspect that psychology will get overhauled. It already has been to a significant degree, actually.


Really good read, thanks for posting.



