These individuals probably exhibit diminished cognitive function as well. Only recently has it been recognized that the cerebellum is also involved in cognition. It's interesting that you don't strictly need a cerebellum to move or think, but losing it impairs both. Contrast this with damage to the motor cortex, which can result in paralysis.
Glickstein, M. (1994). Cerebellar agenesis. Brain, 117, 1209–1212.
I wonder how much of that is "the cerebellum is also involved in cognition" and how much of it is "with the rest of the brain picking up the additional load, there's less 'processing power' available for other things".
I'm using wishy-washy words here, sorry. I don't know how better to explain it.
Did your twins get ultrasound checks?
For me, some of the way I think things through seems very physical to me. The sort of thing you do when you pick up lunch table objects and say, "Ok, this salt shaker is the web server, this fork is the firewall, and..." Except I'm more likely to do it with just gestures, or just thinking about placing things in an imaginary space.
I wonder how much of that is really enlisting the cerebellum versus it being an output-only thing. Perhaps nominally output-only devices do more. As in rubber-duck debugging or Flannery O'Connor's line, "I write because I don't know what I think until I read what I say."
"Problems in the cerebellum can lead to severe mental impairment, movement disorders, epilepsy [...]"
However, this seems potentially much worse than the symptoms this woman without a cerebellum is experiencing. Is it theoretically possible for people with a damaged but still present cerebellum to be identified at birth so that the cerebellum can be removed completely? My thinking is that if you do it early enough, plasticity might allow other parts of the brain to take over and do the job better than the damaged cerebellum would be able to.
This is probably one of the reasons why I am not allowed to perform surgery without a license...
Indeed :) A damaged cerebellum at birth might still be useful because it can fulfill some, if not all, of the tasks it is supposed to. We don't know enough about this yet to really make any kind of call about it. For all we know, many people are born with malformed cerebellums but never experience any problems, so we just don't know about them.
To tease you a little bit, I read this as
> This condition is known as being born without a cerebellum.
I don't know why we need a Latin name for everything!
Plus it's more convenient to say. Not a big deal for us, but to people who deal with crazy medical conditions all day long, describing each one in natural language would be imprecise and time consuming.
Any living language would have accidental matches (even quoted) where it's just the obvious thing to say, à la "born without a cerebellum".
They keep having to invent new Latin words and phrases so they can discuss things like hotpants, which are brevíssimae bracae femíneae apparently.
Have a look here - http://usvsth3m.com/post/95991771713/hotpants-flirt-and-othe...
The cashpoint with Latin in Comic Sans is awesome.
Edit: translating from the Latin, Comic Sans is a pretty accurate font name.
It's better to have a shorter noun form for referencing cerebellar agenesis. It's especially convenient for researchers who write about it and have to refer to it multiple times in a single paragraph.
Also, "agenesis" is Greek, like most medical terminology ;)
Here are the titles of the wikipedia article "Cerebellum" in some other languages:
Parenkephalida (Greek - Παρεγκεφαλίδα, if you can read Greek)
Otak kecil (Indonesian)
Xiaonao (Chinese - 小脑)
They also use the word amongst themselves. Googling for 'cerebellum ugeskrift for læger' gives plenty of hits. Likewise for 'cerebellar ugeskrift for læger'.
They might write 'agenese' instead of 'agenesis', though.
This strikes me as akin to saying that you don't need a GPU to do graphics processing, but not having one impairs your graphics capability. The brain wires itself throughout a human's development to take advantage of the specialization of its components and their parallelism.
Damaging an adult's cerebellum once those connections are in place would be the equivalent of removing a GPU before trying to play a game that has been developed to rely on it.
At least that's my simplified analogy drawn from my admittedly imperfect understanding of how the human brain and computers work.
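To stretch that analogy into code: here's a minimal sketch (Python; using CuPy/NumPy as stand-ins for "GPU present" vs. "GPU absent" — my choice of libraries, not anything from the article) of software that wires itself around whatever hardware it finds at startup, versus software hard-wired to rely on the GPU:

    # Adaptive: probe for a GPU library at startup and degrade gracefully,
    # like a brain that develops around the hardware it actually has.
    try:
        import cupy as xp   # GPU-backed arrays, if CuPy is available
        backend = "gpu"
    except ImportError:
        import numpy as xp  # CPU fallback: slower, but fully functional
        backend = "cpu"

    def render(pixels):
        # Identical logic either way; only the speed differs.
        return xp.sqrt(xp.asarray(pixels))

    # Hard-wired: a program that does a bare `import cupy` with no fallback
    # simply crashes when the GPU disappears -- the adult-lesion case.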
"Studies have found no significant long-term effects on memory, personality, or humor, and minimal changes in cognitive function overall."
If you don't _really_ need half of the brain and you don't _really_ need the cerebellum, I wonder how little (and what part) of the brain we actually do _really_ need. And then there are so many people living just fine with lesions in so many parts of the brain.
It's just amazing. Imagine going into our code bases and tearing out entire classes or modules; that wouldn't go down well.
Ripping out code would be like... radiation. Mucking with DNA. And that can have really adverse effects, just like ripping out/changing code.
Edit: Steven Pinker, in "The Better Angels of our Nature: Why Violence Has Declined", makes it out to be 15% average death by violence, so, pretty common. Explaining his graphs, he says, “The topmost cluster shows the rate of violent death for skeletons dug out of archaeological sites.” “The death rates range from 0 to 60 percent, with an average of 15 percent.”
Maybe we need the software. The cloud provider sometimes has more available instances, other times fewer of them.
So, naive question: could it be correct to think that if half the brain is "enough" to have a full life, then the other half is utilized (in normal brains) but wasted on unnecessary work? I.e., could it be that the brain is truly under-utilized, under-performing all the time? Like, it's lazy?
Again, I recommend the movie Gattaca: http://www.imdb.com/title/tt0119177/
Don't let medical science try to dictate your potential based on gender, race or anything about your DNA. They're only right until they find out they're wrong.
Most of the issues you're alluding to are not because medical science says something is impossible but because the media hears "X group has slightly elevated probability of Y" and reports it as "X group has Y".
Thing is, we don't know what we don't know.
>My answer to him was, "John, when people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together."
>The basic trouble, you see, is that people think that "right" and "wrong" are absolute; that everything that isn't perfectly and completely right is totally and equally wrong.
This is one of the reasons I want to find more ways to get technology to poor cities and third world countries. Not just for their potential as programmers or IT people, but to give them the same access to information that we have right now.
The person who could cure cancer might be sitting in the poorest slums of America, or in a village in Africa or Asia. The sooner people realize that, the more inclined they may be to help.
Sorry to divert so much it's just that your comment rings true in so many ways.
Gender and, even more so, race are lousy predictors for most things, but they're not useless.
The main reason it's not more common is that few people are willing to pay for it vs. getting redundancy via multiple cheaper servers, especially once you've got enough load that you need to scale out anyway.
So, sort of.
It's easy to continue working, at increased cost, when you don't have a cache. That's... that's what a cache is for; it's not critical to the infrastructure, it's there to reduce costs.
Not to rain on any parade of course, I'm just pointing out I'd like to see more cases of actual plasticity of an email server that puts all jobs on hold while it temporarily takes over for a database server that just stopped responding.
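For what it's worth, the cache point is easy to sketch in code. Here's a toy read-through cache in Python (`fetch_from_db` and the key are made up for illustration): if the cache store is emptied or bypassed, every read just falls through to the slower backing fetch, so losing the cache raises costs without breaking correctness.

    import time

    def fetch_from_db(key):
        # Hypothetical expensive lookup standing in for the real database.
        time.sleep(0.05)
        return "value-for-" + key

    class ReadThroughCache:
        def __init__(self, backing_fetch):
            self.fetch = backing_fetch
            self.store = {}  # in-memory stand-in for memcached/redis

        def get(self, key):
            try:
                return self.store[key]   # cheap hit
            except KeyError:
                value = self.fetch(key)  # miss: pay the full cost
                self.store[key] = value
                return value

    cache = ReadThroughCache(fetch_from_db)
    print(cache.get("user:42"))  # slow: goes to the "database"
    print(cache.get("user:42"))  # fast: served from the cache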
The motor cortex is still doing its original job, it just has to work harder.
However, such an architecture arguably means that any single part is not particularly important.
(NOTE: disconnected, not destroyed; the woman's cerebellum didn't develop; things would be much different if you tried to surgically remove it)
Losing the cerebellum would be like losing a DC for which you have no backup, which then would require you to attempt to duplicate the original functionality in other DCs.
Compare a favorite computer robustness technique: triple-modular redundancy (TMR).
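For the unfamiliar: TMR runs the same computation on three independent modules and majority-votes the outputs, so a single faulty module is simply outvoted. A toy sketch in Python (the "faulty" module is an injected fault of my own, purely for illustration):

    from collections import Counter

    def tmr(module_a, module_b, module_c, x):
        # Run three redundant copies of the computation and return the
        # majority answer; any single faulty module is masked.
        votes = Counter([module_a(x), module_b(x), module_c(x)])
        answer, count = votes.most_common(1)[0]
        if count < 2:
            raise RuntimeError("no majority: more than one module failed")
        return answer

    ok = lambda x: x * x
    faulty = lambda x: x * x + 1  # injected fault

    print(tmr(ok, ok, faulty, 5))  # -> 25; the faulty module is outvoted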
The reptilian brain is responsible for basic motor functions, heart rate, temperature regulation, and balance, and evolutionarily seems to be the part of the brain that is most connected to that of ancient fish and reptiles, as the name implies.
A person who is missing a portion of this rigid subsystem should still be able to think, process new information, and remember it, but might suffer from imbalance and other basic health issues, as in fact this woman does. Yet she can do lots of stuff. Apparently the surviving portions of her reptilian brain are able to compensate for the loss of the cerebellum.
It sheds a whole new light on a phrase like "my cold reptilian hindbrain tells me to ruthlessly proceed". We think of ourselves having this sort of emotionless hindbrain that is moderated by the more modern brain centers for sympathy, empathy, emotion, and higher reasoning. But what if in fact there is no such thing as a ruthless, primitive hindbrain and we are all completely in charge of our behavior, ethically and emotionally speaking?
Is this woman the only one of the nine to have lived this long? Incredible given how critical the cerebellum is.
"Then I guess I'll have to tell 'em / That I've got no cerebellum."
Understatement of the century? I wonder, then, what parts of the brain (if any) truly are essential for conscious thought?
But if people can live missing massive chunks of their brain, is it really believable that tiny differences can cause such massive societal outcomes?
Congratulations, you are today's demonstration of 'proving too much': you have also just proven that things like lesions and scars cannot affect cognition, warp personalities, create agnosias and aphasias, or result in bizarre conditions like those Oliver Sacks has so memorably documented, because lesions are so tiny and such small parts of the brain: 'if people can live missing massive chunks of their brain, is it really believable that tiny differences can cause such massive societal outcomes?'
Do you believe that neuroplasticity remains constant throughout a person's lifetime? Unless you believe that, your statement is incoherent.
Lesions and scars are acute changes in the brain that happen after birth. That's different from starting out missing a massive chunk of your brain.
OP: "Look at that person who was born with no arms due to a birth defect. They're able to live a fairly normal, happy life. Maybe arms aren't essential to human happiness."
You: "So you're saying if you got your hand mutilated in a garbage disposal, that wouldn't make you unhappy? A hand is much less than a whole arm."
(I'll leave what this makes you a demonstration of as an exercise for the reader.)
And what, pray tell, is the 'nature' of disorders like Cotard's syndrome?
> OP: "Look at that person who was born with no arms due to a birth defect. They're able to live a fairly normal, happy life. Maybe arms aren't essential to human happiness."
Try going back and reading what was said. Your paraphrase is incorrect. Here's a correct paraphrase:
"OP: look at that person born with no legs. They're able to lead a somewhat normal life. This shows that anyone claiming that there might be differences between the fingered and the fingerless such as in fine motor control is a moron - because a leg is so much larger than a finger!"
> I'll leave what this makes you a demonstration of as an exercise for the reader.
Well, it demonstrates you can't paraphrase or follow the logical structure of an argument. I'm not sure what I'm demonstrating; hopefully something good.
But good job on sticking it to THE MAN. I'm sure that your social media activism is revolutionizing the world.
However, I think you can agree that you are being a bit extreme by saying that my post "contributed nothing to the conversation". Just because you happen to dislike what I said or how I made the point doesn't invalidate it.
I'm sorry that you find my post distasteful, but given the initial post I think that it was fully appropriate.
This is probably why you got downvotes here. HN doesn't tolerate this negativity. Please keep it where it belongs, on Reddit or 4chan. Thanks.
Yes, it is believable, as evidenced by the numerous people who work on, get funding for, and continue to do research on a large slew of topics that involve "tiny differences", such as <insert any sort of engineering>.
EDIT: (Hint: the notion of "tiny differences" is fundamental to calculus.)
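To spell that hint out: a derivative is literally the limiting effect of a tiny difference in the input. A quick numerical sketch in Python:

    def derivative(f, x, h):
        # Forward difference: the effect of a tiny change h in the input.
        return (f(x + h) - f(x)) / h

    f = lambda x: x ** 2
    for h in (1.0, 0.1, 0.001, 0.000001):
        print(h, derivative(f, 2.0, h))  # converges to f'(2) = 4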
As an illustration, think of your computer. If you open it up and remove one RAM module, it will most likely run fine, albeit slower. Were you to introduce a "tiny difference", say, swap two pins on a chip, you risk anything from the machine not booting up, to crashing constantly, to running fine but spewing out nonsense and corrupting data every now and then. Were you to introduce a slightly bigger change, say, saw the chip in half, you'd likely fry the whole machine.
She's missing a huge portion of her brain and has managed fairly well. Wouldn't that lead you to believe that it would take quite a bit of structural difference to cause noticeable societal outcomes?
On one hand, this article thrills me with the thought that at some point my cousin's daughter might lead a relatively normal life. It's heartbreaking to see the way she suffers right now: it's like there's more going on in her head than she can tell us, and you can see the frustration on her face when she tries to do things or get her point across. On the other hand, I'm hesitant to send the article to my cousin, because I know everything related to her daughter's problem is deeply depressing to her as she deals with a frustrated child who makes very little progress from day to day.
Perhaps a more tasteful lead-in was in order.
> The space where it should be was empty of tissue. Instead it was filled with cerebrospinal fluid, which cushions the brain and provides defence against disease.
Quote: "Empirical evidence (also empirical data, sense experience, empirical knowledge, or the a posteriori) is a source of knowledge acquired by means of observation or experimentation. The term comes from the Greek word for experience, Εμπειρία (empeiría)."
The definition goes on to contrast empirical evidence with reasoning and other ways of approaching analysis -- all the non-materialist approaches.
Empirical evidence is experiential by definition - it can be material evidence if it relates to claims about matter, but it can also be evidence about other domains.
Ex. If God exists, then religious experience is empirical evidence of this. It is probably not material evidence, however.
Yes, perhaps I am to some extent.
> Empirical evidence is experiential by definition
I would have said it relies on tangible evidence, material evidence. Its status as an experience by an observer, if present, is secondary. I say this because evidence can be gathered without anyone experiencing it directly. Consider Curiosity on Mars. If we read a mass spectrometer's results radioed back to Earth and draw conclusions on that basis, it's a stretch to assert that we've experienced the evidence. Its interpretation certainly involves an observer, but not the evidence gathering itself -- that is often automated, even here on earth.
> Ex. If God exists, then religious experience is empirical evidence of this.
No, I think a spiritual experience contradicts the direct, physical sense of empirical. I usually regard empirical evidence as that kind of physical evidence that forces different, similarly equipped observers into agreement on its meaning.
Example -- when the CMB was confirmed in the mid-1960s, it killed off the last hope for a steady-state universe. Until then the Big Bang's critics were theorizing that the universe created new matter between the galaxies, so even though the universe was clearly expanding, this didn't mean it had a beginning or an end. The CMB detection, which wasn't really anyone's direct experience, falsified this alternative to the Big Bang. And it's objective in the sense that anyone can set up and detect the same evidence using indirect means -- not by direct experience.
I think most people will agree with me that we have the sensation of control over our own actions and yet it's not something that's directly testable per se.
This is quite a big assumption, and has nothing to do with describing "the world as it is".
Let's say that science doesn't try to analyze those parts of the world not accessible to empirical observation. That could be described as modesty or reticence.
> This is quite a big assumption, and has nothing to do with describing "the world as it is".
Those who do try to describe the non-material world have a pretty terrible record for reliable results.
By "rest", I assume you are a Scientologist talking about thetans.
"This is a quite big assumption, and has none to do with describing "the world as it is"."
I disagree; my favorite thing about the scientific method is that it does its best to drop assumptions. I am not a scientist, but I resort to this kind of thinking when I have to debug code. Suggest a better way and I will try it out.