I used to work at a lab as an undergrad that worked on extremely similar topics, though the cancer link was more implied: http://research.fhcrc.org/shou/en.html
We had four yeast systems: two cooperative ones (each exchanged a metabolite the other needed to survive) and two cheater ones (which needed those same metabolites to survive but just took from the system and never gave back).
One experiment I assisted with was basically putting them all together at differing ratios and seeing what happened over time - the different systems released different fluorescent proteins, so we could measure their relative populations.
Common sense might dictate that the cheaters would rapidly outgrow the cooperators and eventually crash the whole system, but that actually didn't happen until very high ratios of cheaters.
The cooperative yeast systems somehow managed to extinguish the cheaters pretty consistently (all the experiments were repeated and done in triplicate). This happened even at fairly absurd ratios like 3:1 cheaters:cooperators. Unfortunately, I left before any of the follow-up work to determine why that was happening.
All that to say, it's fascinating how much cooperative systems really are favored in nature. Glad to see someone explicitly applying this idea to cancer.
The article seemed to me a zero information fluff piece for a researcher. Your comment does a much better job of explaining their work than the author of the article.
Slime molds are another excellent model for studying cheating. They're typically single-celled organisms, except when starved: then they chemotax into an aggregate, forming a multicellular organism. At that point they undergo sporulation, producing a stalk and a fruiting body. Cells in the stalk die, while those in the fruit are hopefully whisked to greener pastures. When the slimes are homogeneous this works just fine; in a heterogeneous setting, all bets are off.
Cheater mutants have been widely observed and fall into two categories: obligatory and opportunistic cheaters. The obligatory cheaters refuse to participate in the stalk, while the opportunistic cheaters preferentially segregate themselves into the fruit. It's easy to see how these behaviors might be beneficial. There's a double edge to obligatory cheating, though: on their own, they cannot undergo sporulation.
Maree and Hogeweg did some really cool research on Dictyostelium (one of those slime molds). For those interested: http://www.pnas.org/content/98/7/3879
Look at the function call stack. It's a slice of the linear address space that's actively being used, and it's strikingly similar to the promoter regions of the genome. Like some pathogens, hackers randomly try to hit that sweet spot that will jump to the special program.
Operating systems are like simple biological organisms that evolve in response to various influences (cybersecurity, it seems, these days). My favorite is OpenBSD; they invent the cybersecurity. They were among the first to implement address randomization for kernel and library loading. This is strikingly similar to what goes on in the genome with retrotransposons during the early embryonic stages.
People really like to repeat-mask genomes (masking SINEs, introns, and the like) because it makes BLAST go faster. Ironically, arithmetic/statistical compression algorithms work really well on repeats (I don't think the compressed state can be examined directly, but compression algorithms are seriously underappreciated).
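To make the compression point concrete, here's a minimal sketch (the SINE-like repeat element is made up for illustration) showing how much better a repeat-heavy sequence compresses than a random one of the same length:

```python
import random
import zlib

random.seed(0)

# A repeat-heavy "genome": one made-up SINE-like element repeated many times.
repeat = "GGCCGGGCGCGGTGGCTCACGCCTGTAATCCCAGCA" * 300
# A random sequence of the same length, for comparison.
rand = "".join(random.choice("ACGT") for _ in range(len(repeat)))

ratio_repeat = len(zlib.compress(repeat.encode())) / len(repeat)
ratio_rand = len(zlib.compress(rand.encode())) / len(rand)

print(f"repetitive: {ratio_repeat:.3f}")  # compresses to a tiny fraction
print(f"random:     {ratio_rand:.3f}")    # stays near the ~2 bits/base floor
```

The repeats that get masked to speed up BLAST are exactly what a dictionary coder exploits.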
Not sure how much ASLR would be relevant w.r.t. genomes and cancer...
Malicious hackers are on the one side very smart (the cancer genesis process is not smart at all), but on the other side they make poor use of evolutionary strategies, mainly because human hackers can't easily spawn "an army of clones of their mind" (cancer otoh is all about evolutionary strategy).
Since the information about "what mutating a codon in a gene does" is not really available to "the attacker", it's as if the whole "address space" is already randomized by default...
The "cancer problems" sound like it's more like being attacked by a really really really dumb hacker, who has access to an utterly insane amount (relatively speaking) of computing power!
(...kind of hard to defend against such an attacker unless you invent modern crypto stuff like hash-signatures, blockchains etc. And our "DNA computer" is not at all designed for these types of math, recoding it to do this would probably be harder than developing mind upload capabilities or superhuman-AI :P Hence not likely to happen.)
At one point during the progression process, the cancer is characterized by genomic instability (a high mutation rate) and extremely high metabolism (a high replication rate). The immune system actually attacks the cancer cells successfully (I'm generalizing here with regard to cell types), and this can lead to an optimization process in which the immune system selects for cancer cells able to evade its defenses. This interplay creates a microenvironment that is, as you described, evolutionary in nature.
As for cryptography, I would argue that biology contains hash signatures and other configurations common in cybersecurity (blockchain just turns one problem into many, IMO). While there may not be asymmetric cryptography, there is also no pseudorandom business.
IDK about papers per se, but as a computer scientist who took classes in bioinformatics, there are parallels. At the base level, cells are machines for constructing proteins, and the information for that process is encoded in DNA. Silicon uses 2 symbols (1 bit each); DNA has 4 bases (2 bits each). Those bases group in sets of 3, called a codon, kinda similar to a byte. Word size is normally a multiple of 8 for computers, but the 6-bit word in DNA is more than sufficient to cover the 20 amino acids, leaving room for both stop sequences and some rudimentary error correction: a mistake in copying DNA somewhere might not end up changing the resulting protein.
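A toy illustration of that redundancy (this uses only a partial codon table; the real one has 64 entries): third-position "wobble" mutations are often silent, which is the rudimentary error tolerance mentioned above.

```python
# Partial codon table (DNA codons -> amino acids); the full table has 64 entries.
CODONS = {
    "CTT": "Leu", "CTC": "Leu", "CTA": "Leu", "CTG": "Leu",
    "GGT": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",
    "TGG": "Trp",               # tryptophan has only a single codon
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def mutate(codon: str, pos: int, base: str) -> str:
    """Apply a point mutation at the given position of a codon."""
    return codon[:pos] + base + codon[pos + 1:]

# A third-position mutation is often silent:
assert CODONS[mutate("GGT", 2, "C")] == CODONS["GGT"]  # GGT -> GGC, still Gly
# A mutation in a non-redundant codon is not:
print(CODONS["TGG"], "->", CODONS[mutate("TGG", 2, "A")])  # Trp -> STOP
```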
There's also a level of programmatic structure in terms of transcription promoters and gotos, and probably some level of conditionals. DNA is also special in that it can be read forwards or backwards, and the base pairing means you always have two potentially transcribable sequences, each readable from one of 3 offsets. Imagine if busybox implemented all of coreutils in one binary by shifting the word alignment 1 bit to produce a completely different program.
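The "two strands, three offsets" point can be sketched directly; this assumes nothing beyond standard base pairing:

```python
def reverse_complement(seq: str) -> str:
    """The paired strand, read in the opposite direction."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq))

def reading_frames(seq: str):
    """Yield all six reading frames: 3 offsets on each of the 2 strands."""
    for strand in (seq, reverse_complement(seq)):
        for offset in range(3):
            yield [strand[i:i + 3] for i in range(offset, len(strand) - 2, 3)]

# The same 12 bases yield six different codon sequences:
for frame in reading_frames("ATGGCCATTGTA"):
    print(frame)
```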
Finally, at the intercellular level, we have a whole host of message passing between cells using hormones and other signaling molecules, and organs like the pancreas that try to regulate the system using them.
As far as 'computation' I don't think the system is Turing complete, and the systems aren't designed to answer decision problems, just construct things, possibly in response to an environmental factor.
Doubtful. You can do computing with DNA (http://dna.caltech.edu), but it is still debated whether the actual role of DNA is to be a code (some recently weasel out of that stance by saying it's an "app", like there's a difference, machine code being a code). Coding theory applied to DNA yields inconclusive results. Galois theory usually has power over any kind of information encoding, cryptographic, computable or not; the one constructed for DNA convinced mathematicians it's not the way to go at all. If it's computation, it's nothing like what we mean by any model of computation; you may as well say it's magic instead of making stretched analogies.
About OpenBSD and evolution, that's so fetch... these folks invented the attack in the first place; they didn't evolve a response to some market force in the early aughties. Broadly, I don't think designed systems are in the business of evolving; otherwise living organisms could perhaps have evolved an electro-hydrostatic power system instead of a hydraulic one with a pump as a single point of failure.
Linking biology to specific concepts in computer architecture or worse, with OS design, is indeed a stretched analogy but biology is rich with computation. I'll try to be short.
A cell must sense and respond to its environment. One aspect of this is the regulation of gene activity by interactions of, for example, transcription factors. When the interaction types can be approximated well enough as either inhibition or activation, you can model them with boolean networks; when levels matter, some have found moderate success with recurrent neural networks, under a restriction on what the NN nodes represent to ease interpretability.
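As a rough sketch of what such a boolean-network model looks like (the three-gene circuit here is invented purely for illustration: a delayed negative-feedback loop), each gene is on or off, and the network's long-run behavior is characterized by its attractors:

```python
from itertools import product

# Toy 3-gene boolean network (genes invented for illustration):
# A is repressed by C; B is activated by A; C is activated by B.
def step(state):
    a, b, c = state
    return (int(not c), a, b)

def attractor(state):
    """Iterate until a state repeats, then return the cycle (canonically rotated)."""
    seen = []
    while state not in seen:
        seen.append(state)
        state = step(state)
    cycle = tuple(seen[seen.index(state):])
    i = cycle.index(min(cycle))
    return cycle[i:] + cycle[:i]

# Every initial condition falls into one of the network's attractors.
cycles = {attractor(s) for s in product((0, 1), repeat=3)}
print(f"{len(cycles)} attractors, of lengths {sorted(len(c) for c in cycles)}")
```

Even this tiny negative-feedback loop oscillates, which is the kind of qualitative behavior these models are used to recover.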
That is not the same thing as saying your genome unrolls neural networks. What it is actually saying is that the complexity of the best-performing neural network indicates the richness of the underlying computation, and the predictive accuracy of the model captures the functional equivalence of the respective computations (in the cell and in the neural network). The learned model is the instance and neural networks are the class; it is a mistake to place the emphasis on the neural network, which is merely a way of packaging the computational model of interest.
There is a precise correspondence between that formulation and one done in terms of online sequential prediction, the so-called experts algorithm. All of these are models, and still quite limited at that, but biology is complex and the future lies in developing and leveraging these rich connections.
Can you explain how the analogy is stretched? It's my personal belief that biologists tend to be fairly implementation-focused, given how they learn about the systems involved.
In biology there is two-factor authentication to protect against parasitic spoofing, intrusion detection systems, protected kernel memory, a memory hierarchy, and policy-based permissions. It's a very nice operating system, IMO.
Analogies can help, but taken too far they provide only illusory knowledge from superficial similarities. Consider: if you trace out the hierarchies induced by interactions in biological networks and then do the same for an OS's call graph, you find quite different topologies, reflecting their different priorities. In computers, in part, efficiency and reuse. In biology, robustness, minimized interdependencies, developmental stability, and more.
There are things that concern our electronics that do not matter for biology, and there are many things biology must allow for that our hardware cannot. Tying things down too tightly to the peculiarities of our systems narrows the scope of applicable models and understanding. Sometimes it is more useful to think in terms of stochastic differential equations than in terms of operating systems.
Well, all info can be encoded as bits (Shannon's theory of information), and the genome isn't very far from that encoding. You can divide with bitshifts on computers; it's possible something similar is occurring in biology (this area was overlooked in analyses until recently).
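For what it's worth, "divide with bitshifts" only works for divisors that are powers of two; a minimal illustration:

```python
# For non-negative integers, a right shift by k divides by 2**k:
n = 1804
assert n >> 2 == n // 4        # shift by 2 = divide by 2**2
assert 255 >> 4 == 255 // 16   # == 15
print(n >> 2)  # 451
```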
The properties of the genome are similar to storage memory in a computer in that malicious code is inserted into the path of the biological interpreter. Early during human embryonic development (and only during this time in humans) you get a sudden increase in retrotransposon activity. The DNA itself is like a stored polymorphic computer virus, with a compiler and code to interface with the environment. The timing is very unusual, and the genes carried around by this circular process are vital for life. That is my basis for the comparison to ASLR. There is also two-factor authentication, signature-based intrusion detection, and intrusion prevention systems.
Here is an interesting paper on an organism with high rates of this DNA recombination... read the motivation section and decide for yourself whether this is computation.
> (I don’t think this compressed state can be examined, but compression algorithms are seriously under appreciated)
I once attended a lecture at a physics conference by a guy who basically used gzip to do sequence alignment (using compressibility as a way to measure Kolmogorov complexity). Might be interesting to you. The title was: Blasting and Zipping: Sequence Alignment and Mutual Information
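That compressibility idea is usually formalized as the normalized compression distance (NCD); here's a rough sketch using zlib as a stand-in for gzip, on synthetic sequences invented for the example:

```python
import random
import zlib

def C(s: bytes) -> int:
    """Compressed size, a crude proxy for Kolmogorov complexity."""
    return len(zlib.compress(s, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: near 0 for similar, near 1 for unrelated."""
    cx, cy = C(x), C(y)
    return (C(x + y) - min(cx, cy)) / max(cx, cy)

random.seed(1)
a = bytes(random.choice(b"ACGT") for _ in range(2000))
b = bytearray(a)  # a close relative: a with ~20 point mutations
for i in random.sample(range(len(b)), 20):
    b[i] = random.choice(b"ACGT")
b = bytes(b)
c = bytes(random.choice(b"ACGT") for _ in range(2000))  # unrelated sequence

print(f"related:   {ncd(a, b):.2f}")
print(f"unrelated: {ncd(a, c):.2f}")
```

The compressor "aligns" the related pair by reusing long back-references, so concatenating them adds little to the compressed size.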
A future where cancer is controlled but not cured is just as good and might be easier to achieve. I hope science gets there before I get cancer, which is inevitable unless I die from a heart attack first.
Agreed. My wife is going through this now, and the primary issue is the uncertainty around it all. Nobody can tell her what her future holds, just that there's a very high likelihood that 'success' will be fleeting and this won't be our last series of chemo.
I may be misunderstanding what the author proposes, but if the threat of cancer was reduced to the physiological impacts of dormant tumors, I'm pretty sure we would consider that a win.
Methinks the future lies in between. Many people say that we all have cancer, at different scales and with different reach, so just avoiding massive tumors and metastasis is probably the goal.
How accurate is the comparison with plant fasciation, for anyone who knows more? My experience with owning/growing fasciated succulents (usually referred to as "crested" by growers and collectors) has been that they're more challenging to grow in terms of risks of things like overwatering, but as far as I know that behavior doesn't necessarily impact their overall lifespan. Sometimes a crested specimen can also revert back to normal growth. Maybe that's tangential to the article itself, but it's something I'd be curious to find more information about.
Regardless of whether you think the specifics presented in this article (e.g., NSAIDS-esophageal cancer) are right, it’s great to see this kind of philosophical approach to biomedical research. We need more people like this professor.
That she started with a plant, and that it made her think about humans, was a great insight. She rightly recognizes the key role of cell autonomous vs. cell non-autonomous processes in how multicellular organisms work.
That she is a woman and someone not at an Ivy League school is also very refreshing to see getting publicized by Harvard. There are so many great scientists around the world doing great stuff. Let's keep finding them and reporting on them!
Please media, more of these types of stories and less clickbait please! I loved this article. My faith in humanity was restored just a touch here.
> She rightly recognizes the key role of cell autonomous vs. cell non-autonomous processes in how multicellular organisms work.
This is an odd comment: this realisation is quite literally at the core of cancer research, and every researcher in the field recognises it. Heck, it's implied if not mentioned outright in the seminal Hallmarks of Cancer paper [1] that every cancer researcher (and probably most human biologists generally) has read.
Don’t get me wrong: it’s a well-written article explaining the biology to a lay audience. But it doesn’t hold any striking new insights. All of this is not merely known but in fact fundamental knowledge.
> Don’t get me wrong: it’s a well-written article explaining the biology to a lay audience. But it doesn’t hold any striking new insights.
This "article" in the Harvard Gazette is basically a press release about an invited lecture that happened at Harvard. If it manages to clearly and correctly explain any science, then it's already going above and beyond its main purpose.
> But it doesn’t hold any striking new insights. All of this is not merely known but in fact fundamental knowledge.
A lot of "fundamental knowledge" in medicine is wrong. The genetic theory of cancer has always been a dead-end, because it reverses cause and effect. The cancer industry has encouraged "kill all the cancer cells" treatments because it's convenient for their business model. The cancer treatment situation is similar to how the use of bloodletting persisted for hundreds of years [0], in spite of the low rates of success.
Your comment is misleading. Certain cancers are indeed 'caused' by genetic changes, and researchers are actively looking for ways to prevent those genetic changes from occurring in the first place. But do you really tell someone with a familial mutation that we know causes thyroid cancer that they should be on a ketogenic diet?
Most therapeutics today are developed knowing that they will not have a substantially more beneficial effect than current therapies. We still develop them because, even if they are not better, they could reduce the adverse events caused by our current best meds.
As far as I can tell, in the last 80 years the Warburg effect has made PET scanning a viable way to detect the progression of many cancers.
To suggest that increased research into the Warburg effect is going to be the thing that leads to an explosion in novel therapeutics for cancer is wishful thinking, as far as I'm concerned. Cancer is complex, and while we could stand to learn more about metabolic changes in cancer, there's not going to be a magic pill for this one.
Your comment history suggests that you've been through medical school.
I like to say that sometimes doctors do good work, and sometimes they make work for themselves. Please take this response as a gentle and well-intended effort to protest the standard practices that hurt patients by 'missing the forest for the trees', such as I've observed in my own family (who themselves are employed in the conventional approach to medicine), and my girlfriend (who is being harmed by allopathic psychiatry).
Did you know that the term 'allopathy' was coined by a homeopath? At the time bloodletting, calomel, and blister agents were the standard of care. Homeopathic medicine held that the body can fix itself when properly supported (diet, clean environment, etc). Allopathic medical practitioners engage in the heroic struggle against disease [0].
Allopathic medicine never went away, it just changed form. While conventional medical practitioners are doing much better than 200 years ago (w.r.t. not killing their patients), from my perspective, conventional psychiatry and conventional cancer treatments leave much to be desired.
> To suggest that increased research into the Warburg effect is going to be the thing that leads to an explosion in novel therapeutics for cancer is wishful thinking, as far as I'm concerned.
I have some great anecdotes about cheap metabolic therapies. But they're just my non-rigorous anecdotes... The one person who I'd most like to experiment on, trusts her doctors because they wear the halo of 'conventionality'.
> Cancer is complex, and while we could stand to learn more about metabolic changes in cancer, there's not going to be a magic pill for this one.
Consider the possibility that cancer is much simpler than the geneticists have led us to believe. Sometimes 'we must unlearn, what we have learned'.
Edit: google scholar turns up some papers about 'spontaneous remission' of cancer. To figure out how to induce spontaneous remission of cancer cells would be a revolutionary advance.
> This work is being done in "allopathic" medical schools and hospitals and in biology labs that are the source of most allopathic medicine.
'Allopath' is considered a derogatory term by the modern doctors who are aware of its origin.
> If this work succeeds, it might lead to a cancer vaccine: the kind of treatment sometimes opposed by homeopaths and naturopaths.
The core of the resistance to vaccines is an opposition to the idea that a disease's context doesn't matter. Improved sanitation, refrigeration, insights into adequate nutrition, and other improvements in technology have done just as much as vaccination to vanquish the diseases of antiquity.
> Improved sanitation, refrigeration, insights into adequate nutrition, and other improvements in technology have done just as much as vaccination to vanquish the diseases of antiquity.
I don't understand the relevance. Will sanitation, refrigeration, or nutrition protect you from polio, measles, or other diseases commonly vaccinated against?
I always notice that when people talk about homeopathy they love to pick and choose which parts most appeal to them and pretend the rest doesn't exist. The core of homeopathy is the idea of "water memory," which is, to put it bluntly, horseshit sympathetic magic. The idea that water can retain a "memory" of a substance that's been totally diluted out of it is ridiculous, so people tend to lead with some New Age, wishy-washy "let the body heal itself" crap.
Homeopathy is pure, unadulterated garbage, and proof that people will repackage old ideas such as sympathetic magic in new forms. You might as well rub a toad on your head. Homeopathy capitalizes on the same simple fact that most medical scams do, which is that most illness will indeed resolve with time. If you then take credit with placebo interventions for the body’s natural processes, you can fool a lot of people.
It’s funny, until someone decides to forgo treatment for a curable illness in favor of that magical thinking.
If one thinks that their ideas on alternative therapies are worth investigating, I would strongly urge them to connect with a scientist that shares similar views, and apply for funding through the NIH. There is a significant amount of funding for complementary and alternative medicine. It's not well known that such funding exists, so I try and let others know about it.
I have been tangentially involved with pain researchers that teamed up with chiropractors to study some of their maneuvers in a rigorous manner.
I also have some conventional/non-controversial ideas that ought to be verified - interventions that I have personally verified work as expected. I will look into the NIH's programs for funding these investigations.
Uh, correct me if I'm wrong, but "kill all cancer cells" is less lucrative than "put you on a maintenance regime that you need to stay alive," and that is a major part of the ethical debate surrounding Gleevec.
Reading between your lines, perhaps incorrectly, it seems you are in the camp that cancer treatment is ineffective and won't get better because it is making people money the way it is.
But the fact is there have been great strides in the survival rate of many kinds of cancer over the past 40 years, even though some are more stubborn than others. It hasn't been dramatic advances for most of it; most of the gains have been slow, incremental improvements that add up over decades.
Improved cancer survival rates may be related to shifting diagnostic patterns. The "pre-cancer" might never have deteriorated into full-blown cancer, but since it was caught, it was treated and marked as a success.
> most of the gains have been slow, incremental improvements that add up over decades.
I read something 1 to 2 years ago about how breast surgeons aren't recommending prophylactic mastectomies anymore. The message hasn't spread very far. The official approach to breast health is at odds with what's actually been discovered by physiologists.
I know a lady who had "colon cancer". I think her actual problem was emotional stress about her son... But she had her colon shortened, and is probably counted as a success by the official medical statisticians.
I was talking to a friend in med school, and I feel the gap between biology as we currently understand it and what is clinically relevant is gigantic. That paper definitely educated a generation.
> This is what cancer cells do: Proliferate without limit, avoid cell death, monopolize resources, co-opt the labor of other cells, and destroy the surrounding environment.
I was thinking the same thing. Those locust plagues are hugely devastating...
Thinking beyond the snark for a bit, though, locusts might be an example of a species-level cancer cheating the “rules” of a habitat.
Humans? We just throw the rules away altogether and have historically written new ones that only apply to us. The difference between us and locusts, though, is that many of us fight back against our own short term interests and argue “hey, we’re a tumor right now. We should consider not being one of those.”
There's an important difference between a tumor and an organism: one can sustain itself, the other is a parasite. So what would obviously put humans over the line? Establishing self-sufficient colonies off Earth seems to be beyond the reach of any swarm of locusts, but possibly viable in a 1000-2000 year timespan for people.
And if nature/world/universe was sentient, it would be angry at us and it would want to fight us. And if tumors were sentient, they would assert their rights to live and write their own rules.
Alas, we're the only sentient beings in this comparison, which makes tumors bad and universe a resource, by our standards - i.e. the only ones that matter.
Corporations are more multicellular organisms than cancer. They are capable of achieving far more than a single cell or person, for better or for worse.
Only under certain circumstances. It's not inevitable that we be this way. Our most powerful societies cultivate a condition where our economic activity is this way.