Interesting publication by Sinclair back in 2021: Reprogramming to recover youthful epigenetic information and restore vision
> Using the eye as a model CNS tissue, here we show that ectopic expression of Oct4 (also known as Pou5f1), Sox2 and Klf4 genes (OSK) in mouse retinal ganglion cells restores youthful DNA methylation patterns and transcriptomes, promotes axon regeneration after injury, and reverses vision loss in a mouse model of glaucoma and in aged mice.
This doesn't test the "epigenetic information theory of aging."
First, there is no survival analysis. How is the mouse younger if it doesn't live longer? Similarly, the OSK "rejuvenated" mice display lower lean muscle mass.
Second, the causality is (willfully?) misinterpreted. The endonuclease used to cause DNA double-strand breaks does NOT directly alter the epigenome. Instead, it induces DNA-damage repair stress. One consequence, of many, is epigenetic (chromatin) dysregulation. DNA damage stress is well known to accelerate aging phenotypes. In fact, David published on how p53 stress from repeated DNA damage - using the same endonuclease setup - initiates a DNA damage response that in turn promotes cell-cycle exit and cell elimination [0].
Third, cutting "non-coding" DNA in this case means cutting specific ribosomal DNA loci (the genes encoding the cell's translation machinery). Given that this pressure is constitutive, it's likely that these loci evolve resistance to the nuclease by mutating functional sequences. However, the authors never assessed the mutation and function of these ribosomal genes.
Lastly, the in vivo AAV transduction efficiency isn't measured, which makes the OSK "rejuvenation" result hard to interpret. All cells get DNA damage (it's a germline edit), but only transduced cells (<10% at best in the whole organism) get some OSK exposure. Yet the whole organism is "rejuvenated"? Is there a positive spillover from OSK expression?
All the core claims about epigenetic information are either incorrect or grossly misleading. The perturbation - site-specific DNA damage - does not only cause loss of whole-cell epigenetic information. Hard to imagine how this got into Cell. I guess a big name and 20+ figures are all you need these days?
I'm interested to see where this line of research goes. Given that this paper was published in Cell, and the preceding one was published in Nature, I think we can rule out gross negligence in the writeup. I can't speak to all your points, but I can speak to a couple.
There was no claim that the whole mouse was rejuvenated, so far as I can tell. The only metrics they presented on actual rejuvenation were some chemical markers in a couple of organs - heart and liver, IIRC. In a CNN interview published ~5 days ago, Sinclair points out that he hasn't yet figured out how to deliver the OSK to the whole organism - which is presumably why he's only demonstrated rejuvenation at very localized sites. He also mentions that another team has figured it out, and did actually manage to extend a mouse's lifespan (see my other post with the CNN link). Thus,
> How is the mouse younger if it doesn't live longer?
It isn't, because it doesn't, because the study didn't aim to show that.
> Yet, the whole organism is "rejuvenated"?
Again, no. That was not the claim, according to the actual published article.
So I think you missed a couple things. Probably not "willfully". I may have the advantage of you, though, because I did manage to find a copy of the Cell article itself (and I did a bit of additional digging).
Well, what you need is more like an easily accessible source of low entropy.
You can substitute a large concentration of energy (the concentration part is important) to some extent. However, the energy usage of a computer (the simplest example of a device that lowers entropy in a flexible way) is many orders of magnitude higher than the theoretical limit.
So I'm not entirely sure how many bits of entropy it takes to reverse ageing, but just the computational power required may release enough heat to melt a few cities. And that's before figuring out how to practically do anything (though arguably with enough computing power you can probably just do some advanced version of percussive maintenance).
Edit: Huh, turns out the Landauer limit at room temperature (in kJ per bit) is more or less the reciprocal of Avogadro's constant, so the theoretical limits could be within the realm of what's humanly possible. Still, it means you probably need a couple of kilojoules for each mole of molecules you flip a bit in, divided by however efficient the system processing the hundreds of zettabytes actually is.
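For anyone who wants to sanity-check that edit, here's a quick back-of-envelope in Python (the 100 ZB figure is just my placeholder for "hundreds of zettabytes"):

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
N_A = 6.02214076e23   # Avogadro's number, 1/mol
T   = 298.0           # room temperature, K

# Landauer limit: minimum energy to erase one bit of information.
e_bit = k_B * T * math.log(2)                  # ~2.9e-21 J
print(f"per bit: {e_bit:.2e} J = {e_bit/1e3:.2e} kJ  (1/N_A = {1/N_A:.2e})")

# Erasing one bit per molecule, for a whole mole of molecules.
e_mole = e_bit * N_A                           # ~1.7 kJ
print(f"per mole of molecules (1 bit each): {e_mole/1e3:.1f} kJ")

# "Hundreds of zettabytes": take 100 ZB = 100e21 bytes = 8e23 bits.
bits = 100e21 * 8
print(f"100 ZB at the Landauer limit: {bits * e_bit / 1e3:.1f} kJ")
```

So at the theoretical limit the energy is tiny (a few kJ); the "melt a few cities" part comes from real hardware being many orders of magnitude less efficient than that limit.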
> Increasing entropy in a system can be avoided with an external energy source.
There was a study, probably cited by Sinclair in his Lifespan book, with mice genetically engineered to have a body temperature 0.5-1 degree lower than normal.
And they lived something like 30% or more longer.
Such hypergeneralised physical thought inevitably leads to necrology in the end-point measurements, rather than biology.
That implies Schrödinger's thermodynamic theory of negative entropy, which involves the holding off of death by means of a metabolic burning of food.
At stake in this paper is the cybernetic theory of negative entropy, whereby it is information that is negative entropy (Wiener's rather than Shannon's definition): information that sustains life, its lack meaning death.
Inevitably increasing entropy is a red herring in the presence of free energy, no? What you point out, though, is very interesting: what would it take to inject energy into conserving this information? And what would its effects actually be? Could you pause aging, or reverse it? Now this gets hairy: what happens to society? I dare say not everyone can live forever. What are the consequences of having a select few continuing to live youthfully and reproducing? Could we wind up with a loss of genetic diversity in our species? Entropy strikes again!
Since urbanized populations have fewer kids, the world is facing a huge demographic crash that will probably last through the rest of the century. So there's plenty of room to apply this technology broadly.
Also plenty of incentive: stock markets won't do well with shrinking populations, and governments will save a lot of money if they don't have to spend as much on treating the expensive diseases of the aged.
Isn’t aging engineered into an organism? This seems self-evident because rats and humans are made of essentially the same biological stuff, but rats live for 2-3 years (and then die of old age) and humans live for 25x that. Maybe this theory explains how organisms “create” aging. It also implies that aging can be slowed or reversed, and the mechanism for doing that is already present in any organism.
It’s better explained by the reverse. Human evolution needed to fight aging harder in order to be successful.
Also, we aren’t made from the same biological stuff, or we would be rats. It’s easy to think of biology in overly simplified terms but cells aren’t legos.
I think it's a cool theory, and it seems the most reasonable of all the theories of aging. However, I am a bit disappointed that I only see David's name on every paper published on this topic. Why don't other institutions start similar studies to verify this claim?
So he's saying they didn't fix aging, they just fixed some simpler damage they inflicted themselves. But in a previous experiment, the same treatment improved age-related damage to the vision of genuinely old mice:
> Sinclair and colleagues injected the reprogramming-factor genes into the eyes of 1-year-old healthy mice, roughly the mouse equivalent of middle-age. By this stage, the animals had visual acuity scores about 15% lower than their 5-month-old counterparts. Four weeks after treatment, older mice had similar acuity scores to younger ones.
This is exactly the technology Altos Labs is working on, and they have 4 Nobel laureates on their staff. There are other companies too. There's going to be an explosion of progress in the next few years.
All the top postdocs would rather work at the well-known lab with lots of grant money and experience.
I'm sure someone else is trying, but there's an agglomerative effect here, where a competing lab would be starting at a disadvantage and playing with a B-team.
It seems like it should be possible to build a drop-in replacement system for DNA that adds a more robust error detection/correction capability. Each gene gets a checksum at the end and the transcription/translation processes are amended to validate these prior to progressing to building proteins.
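Just to make the idea concrete, here's a toy sketch of "gene + checksum, validated before translation"; the sequence, the 8-base tag, and all the function names are invented for illustration, not anything biology actually does:

```python
import hashlib

# Toy model only: real genes carry no checksum; the scheme and sequences are made up.
CHECK_LEN = 8  # bases reserved at the end of each "gene" for its checksum

def checksum(seq: str) -> str:
    """Map a coding sequence to a short tag written in the same A/C/G/T alphabet."""
    digest = hashlib.sha256(seq.encode()).digest()
    return "".join("ACGT"[b % 4] for b in digest[:CHECK_LEN])

def encode_gene(coding_seq: str) -> str:
    """Append the checksum, as the hypothetical replacement system would store it."""
    return coding_seq + checksum(coding_seq)

def translate_if_valid(gene: str):
    """Refuse to 'translate' a gene whose checksum no longer matches (i.e. it mutated)."""
    coding, tag = gene[:-CHECK_LEN], gene[-CHECK_LEN:]
    if checksum(coding) != tag:
        return None  # damaged gene: skip it rather than build a broken protein
    return f"protein({coding[:9]}...)"

gene = encode_gene("ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGA")
print(translate_if_valid(gene))        # intact gene -> translated
mutated = gene[:3] + "T" + gene[4:]    # single point mutation in the coding region
print(translate_if_valid(mutated))     # checksum mismatch -> None
```

Note this only detects mutations and refuses to translate; actual correction would need redundancy on top of the tag.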
Obviously it would be more complex than just that, but it would be interesting to see how it affects biology. Evolution would now proceed primarily through gene mixing rather than random mutation; it also seems that things like ionizing radiation could be much more directly harmful, but cancer and autoimmune diseases would seem to be substantially diminished.
No idea how it would affect aging. Seems like it would slow it down but I’m sure it’s more complicated than that.
Who says you want more robust DNA transfer? I can imagine there was a time before the LUCA when life was a competition between these gene-transfer mechanisms. DNA exists for a reason, imo; gene mutation is a feature, not a bug.
From what I remember of The Selfish Gene, the individual's genes are always trying to maximize copying fidelity. Genes are trying to preserve themselves and take up a larger share of the gene pool; "intentionally" mutating into a different gene seems like a terrible "plan".
Also, the number of mutations that are actively beneficial is low. Even at the level of a single human, the whole organism prefers that its genes stay intact.
Isn't the term "epigenetic" about non-DNA information, like in mitochondria etc., i.e. not something recoverable by just scanning DNA more precisely? If so, it doesn't seem reversible?
Different genes are activated in different cells. That "gene expression" is done by the placement of particular molecules on different parts of the DNA. That's the epigenetics.
Those molecules can get out of place, and not line up correctly anymore. This research shows that that's a major cause of aging, and that it's possible to reverse it.
All your cells have the same DNA, but each cell type deactivates certain DNA regions that don't apply to it.
The theory here is that as we age the cells' epigenetics fuzz out, so that skin cells start to look a bit more like neurons and other cell types, and vice versa.
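A toy numerical picture of that "fuzzing out" (made-up gene sets and drift rate, nothing taken from the paper): each cell type is a set of on/off switches, and random flips slowly blur one identity toward the others.

```python
import random

random.seed(0)

GENES = [f"gene_{i}" for i in range(20)]             # made-up gene names
# Each cell type keeps a different subset of genes switched on -- its "identity".
SKIN   = {g: (i % 3 == 0) for i, g in enumerate(GENES)}
NEURON = {g: (i % 3 == 1) for i, g in enumerate(GENES)}

def drift(profile, flips):
    """Randomly flip the on/off state of a few genes -- crude stand-in for epigenetic noise."""
    noisy = dict(profile)
    for g in random.sample(GENES, flips):
        noisy[g] = not noisy[g]
    return noisy

def similarity(a, b):
    """Fraction of genes whose on/off state agrees between two profiles."""
    return sum(a[g] == b[g] for g in GENES) / len(GENES)

aged_skin = SKIN
for rounds in range(1, 6):
    aged_skin = drift(aged_skin, flips=2)
    print(f"after {rounds} rounds of drift: "
          f"skin-like {similarity(aged_skin, SKIN):.2f}, "
          f"neuron-like {similarity(aged_skin, NEURON):.2f}")
```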
Why would that make it less reversible? Honestly it's not like it was ever easy to reverse DNA damage. Reversing other kinds of damage might be harder, but unless you have more information it's not clear that it _must_ be.
If you have a skin cell that incorrectly differentiated into a mole, how do you reverse it?
You can't just fix the DNA; the cell must also figure out whether it should become a hair follicle or one of the many subtypes of cells that make up your skin layers. We know that this differentiation seems to be controlled by ion/electrical signals early in life.
So a key question is: Why does differentiation accuracy seem to degrade with aging, and is there anything we can do to stop it?
CRISPR is pretty good at fixing DNA; we definitely need to optimize our use of that tool, but at least there's a path. We really don't have a clear path to fixing the differentiation/epigenome problem.
It's theoretically more reversible, but dependent on identifying epigenetic factors that can practically be counteracted or at least lessened (and don't have significantly adverse side effects.)
In reality, it will probably just mean a bunch of snake oil 'epigenetic health supplements' on the shelves that don't actually do anything.
Not an expert here, but it seems like they introduced a specific flaw and saw effects similar to aging, then they corrected for that flaw and saw those effects reversed. This is an interesting result, but it doesn't, imho, prove that actual aging can be reversed in this same manner.
Yamanaka factors seem to be a reinit mechanism; however, that's not enough.
Cell differentiation seems to be similar to Conway's Game of Life, where cells differentiate based on neighboring cells. Significant ordered complexity can emerge from simple rules. Now, if you go and reset random cells, it is often either a no-op or they pick the correct state back up from their neighbors; but if you keep doing it, you eventually reset an important cell and break the functionality.
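To play with that analogy (and only the analogy, this is not a model of the biology), here's a minimal Game of Life where a couple of live cells get "reset" every few generations; small knocks to the still-life block tend to heal themselves, but repeated resets eventually kill the pattern:

```python
import random

random.seed(1)
SIZE = 12  # grid wraps around (a torus), purely for simplicity

def step(live):
    """One Game of Life generation: a cell lives with 3 neighbours, or 2 if already alive."""
    counts = {}
    for (x, y) in live:
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if dx or dy:
                    c = ((x + dx) % SIZE, (y + dy) % SIZE)
                    counts[c] = counts.get(c, 0) + 1
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

def reset_random_cells(live, k):
    """Crude stand-in for 'resetting' random cells: force k live cells to the dead state."""
    victims = random.sample(sorted(live), min(k, len(live)))
    return live - set(victims)

# A blinker (oscillator) plus a block (still life) -- a tiny, stable "tissue".
live = {(2, 1), (2, 2), (2, 3), (6, 6), (6, 7), (7, 6), (7, 7)}

for gen in range(1, 21):
    live = step(live)
    if gen % 5 == 0:                 # every few generations, knock out a couple of cells
        live = reset_random_cells(live, 2)
    print(f"gen {gen:2d}: {len(live)} live cells")
```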
A study in primates/human glaucoma patients is underway (~next 2 years).
Randomized clinical trials are not as golden a standard as they seem; methodologically, they are designed for human cattle who do not choose what they consume and do not care to read even a Wikipedia page about the drug they're taking or the receptors it targets, let alone any research on it.
Thus, in an instruction label for an approved drug you'll see dozens of adverse effects, established in randomized clinical trials, yet close to zero information about the probability of you personally being affected by these adverse effects.
Likewise, there are thousands of "biohackers" taking the relatively safe metformin off-label for "anti-aging" purposes. And any interested individual can choose an experimental treatment, taking on all the risks, etc.
As per the Hippocratic Oath [1]:
1) I swear to fulfill, to the best of my ability and judgment, this covenant:
2) I will apply, for the benefit of the sick, all measures [that] are required
3) I will prevent disease whenever I can, for prevention is preferable to cure.
How would you ensure a physician is employing the best of their ability? By their brain oxygenation level? Or by dopamine receptor occupancy in the prefrontal "executive" cortex?
Thus, ~90% of physicians regularly violate at least statements 2) and 3) of the oath by not offering possible experimental treatments in the relevant set and setting.
[0] https://www.nature.com/articles/s41586-020-2975-4