For one of my friends, metformin was the key to finally losing weight. It got their insulin resistance under control, which had been causing them to crave food constantly throughout the day, way more than someone with normal insulin sensitivity. They never realized most people are not constantly hungry, and thought people who were thinner just had way better impulse control or something, and that giving in to that craving was some kind of personal failure of willpower, etc. etc. In reality it would've taken drastically more willpower for them to eat the same number of calories a day that I do, because of this underlying biological situation.
The thing that gets me about all this, though, is how long it took for them to find a doctor that recognized what they needed in order to lose weight was treatment for the insulin sensitivity, rather than admonishments to "eat less." There must be so many other people with a treatable condition being written off in the same way.
When someone "eats less", insulin sensitivity also improves, in theory. But the catch is that this can turn into a vicious cycle: if you just eat less, you also get fewer nutrients, and it may not be sustainable if you're eating highly processed foods that lack those nutrients in the first place.
So a better idea is to eat less + consume more vitamins + eat more natural foods (to get those vitamins). A junk food + metformin combo works to some degree, but in the long run it's more of a hack than a proper fix IMHO.
They can't because some people have a block in their thinking process that prevents them from rationally contemplating any thought that goes against their own prejudices.
They'll scream CALORIES IN CALORIES OUT until they die from apoplectic rage rather than accept any concept that considers that there are more fine-grained factors at play in weight loss.
Saying CICO is the only thing that matters is like saying "Apples fall from trees because of gravity" is the grand unified theory of the universe.
Learning to eat healthily is a skill. It's a separate skill to learning to eat less.
I don't think any amount of junk food would get you adequate nutrition. So reducing the amount of junk food is a great first step.
You can take a multivitamin; it'll be enough to survive. Later, once you've established a decent calorie baseline, you can start to optimize your food for nutrition.
Re: longevity, ITP didn't find any significant impact from metformin. Afaik they're the most rigorous testers of this sort of thing, so that's pretty significant to me.
>Metformin (0.1%) combined with rapamycin (14 ppm) robustly extended lifespan, suggestive of an added benefit, based on historical comparison with earlier studies of rapamycin given alone.
Tried metformin as part of a Hims package and cravings went away quickly.
I’ve wondered about ongoing use, since they push other drugs as part of their cocktail of weight loss drugs, but I haven’t gone down the rabbit hole on dosing and maintenance plans.
I also did not understand why it would lower blood glucose, but that is due to my limited understanding of cell biology. I asked ChatGPT for help and got this info:
"Inhibiting a mitochondrial complex (e.g. one of the electron transport chain complexes) would decrease a cell’s ability to generate ATP through oxidative phosphorylation. As a result, cells would rely more heavily on glycolysis to meet their energy needs, which increases their consumption of glucose. This heightened glycolytic flux leads to higher glucose uptake from the bloodstream and a corresponding drop in blood glucose levels.
[...]
Cells generally prefer using oxidative phosphorylation (the mitochondrial pathway) over glycolysis to generate ATP because it is more energy-efficient. Oxidative phosphorylation can produce around 30-36 ATP per glucose molecule, while glycolysis alone only nets about 2 ATP per glucose."
(Standard disclaimers as to LLM hallucinations apply)
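For what it's worth, here is the back-of-the-envelope math on those quoted yields (just a rough sketch using the approximate textbook numbers above):

    # Approximate ATP yields per glucose molecule, as quoted above
    atp_per_glucose_oxphos = 30      # oxidative phosphorylation, low end of ~30-36
    atp_per_glucose_glycolysis = 2   # net yield from glycolysis alone

    # Glucose a cell would need to burn through glycolysis alone
    # to match the ATP output of oxidative phosphorylation
    ratio = atp_per_glucose_oxphos / atp_per_glucose_glycolysis
    print(f"~{ratio:.0f}x more glucose for the same ATP")  # ~15x

So if a cell really is pushed toward glycolysis, it has to pull in far more glucose per unit of ATP, which is the logic behind the "heightened glycolytic flux leads to higher glucose uptake" claim.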
> "This heightened glycolytic flux leads to higher glucose uptake"
This is a dubious statement. While glycolysis does indeed consume glucose, the amount consumed that way is expected to be significantly lower than through oxidative phosphorylation.
For example, if you deprive cells of oxygen, oxidative phosphorylation gets inhibited and glycolysis kicks in as an alternative metabolic pathway. As a result, the blood glucose level goes through the roof. This is what can be seen in patients with acute respiratory distress syndrome.
But it is more complicated than that - when oxygen level drops, the nervous system starts gluconeogenesis as an attempt to compensate for the lack of oxygen by increasing the levels of glucose in the bloodstream. So we have multiple parallel effects going on: lower glucose consumption by oxidative phosphorylation due to the lack of oxygen + higher glucose consumption by glycolysis + higher glucose injection via gluconeogenesis. The net result of that formula is that blood glucose level goes up for almost all patients with hypoxemia.
Still, glycolysis alone cannot explain the effect of metformin. If metformin really drove glycolysis at that amplitude, people would start to develop lactic acidosis, air hunger, cellular damage, neuropathy, dementia, cancer.
Honestly speaking, the only viable explanation so far is that metformin may cause mitochondrial training by mildly and temporarily putting a strain on the ETC, much like mild physical activity would. Anything more impactful than "mild" would lead to excessive oxidative stress, cellular damage, air hunger, suffocation, and tons of dangerous consequences.
Some papers also claim that metformin inhibits gluconeogenesis, thus lowering glucose levels, which is another argument in favor of the "biochemical training by metformin" theory. Mitochondria seem to become more robust after the mild stress environment created by metformin, as if they had visited a gym.
Sounds like it's exactly the same "magic" as exercise and calorie restriction. If the body has more energy available than it needs for normal function, it'll use as much as it can and eventually that use goes to things like general inflammation, cancer, and autoimmune disease. If you use available energy to exercise, simply eat less, or disrupt your body's ability to actually use all the excess food you feed it with drugs, then it won't be able to do those bad things and will only maintain essential function.
Obviously, if you restrict too much, you starve. Animals in a state of nature seem to automatically find the right balance and eat roughly what they need. Animals placed into situations in which food is nearly costless and effectively infinite, and in which they don't need to be active except to the extent they do it freely for recreation, seem to struggle. Humans suffering from diseases of civilization are one such example, but human pets and livestock seem to have the same problems. Presumably, lab rats are basically like that, too. Life in a plastic cage with no predators and food given to you directly by gods is not very similar to life in the wild.
It would indeed be bad if it was systemic. The link seems to suggest it's only certain cells that have influence over blood sugar (the gut, liver, etc).
It lowers the level of glucose in the blood. For certain conditions such as some cancers mentioned in the article the low glucose harms the cancer more than the regular cells.
> metformin blocks a specific part of the cell's energy-making machinery called mitochondrial complex I
Keep in mind that any mitochondrial complex I inhibitor that is sufficiently strong is going to cause PSP (progressive supranuclear palsy), basically a fatal and untreatable version of Parkinson's disease, if you take it for long enough. I haven't seen much research on this being a risk specifically for metformin, but I'd be careful about trying to use it prophylactically for longevity purposes rather than for a specific medical condition.
This comment may unnecessarily discourage people from using metformin.
- There is no evidence linking metformin to PSP, let alone a causal relationship.
- PSP is also very rare, prevalence ~ 7 per 100,000 [1].
- Metformin is used by 100+ million people [2] (quick back-of-the-envelope below).
- It has been safely prescribed for type 2 diabetes since the late 1950s [2].
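Putting those two numbers together (rough illustration only, assuming PSP prevalence among metformin users simply matches the general population):

    psp_prevalence = 7 / 100_000       # ~7 per 100,000 [1]
    metformin_users = 100_000_000      # 100+ million people [2]

    # PSP cases expected among metformin users from the background rate alone,
    # i.e. with zero causal contribution from the drug
    expected_background_cases = psp_prevalence * metformin_users
    print(expected_background_cases)   # ~7,000

In other words, thousands of PSP cases among metformin users would be expected purely by chance, which is why the absence of any reported association carries more weight than a mechanistic argument.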
Metformin is a highly effective and widely used medication. It would be unfortunate for people to avoid it based on speculative claims. If there’s specific evidence suggesting metformin as a risk factor for PSP, I’d be interested, but the leap from "mitochondrial complex I inhibition is associated with PSP" to "metformin causes PSP" is unwarranted.
As for the prophylactic use comment: we are all going to decay and die; trying to mitigate those risks with metformin is not unreasonable. There is evidence supporting its potential benefits, though some of these may reflect its established role in managing diabetes. (talk to your doctor, etc.)
> the leap from "mitochondrial complex I inhibition is associated with PSP" to "metformin causes PSP" is unwarranted
The mechanism of action is relatively straightforward: the inhibition is caused by a build-up of oxides within the mitochondria, which makes them less efficient at producing energy. If the mitochondria go long enough without a mitochondrial antioxidant clearing out the oxides, they eventually die. And if enough of the mitochondria within your brain die, you get PSP.
My best guess as to why we haven't seen an association between PSP and metformin is that metformin is actually a mitochondrial complex I adaptogen rather than a complex I inhibitor. I'm definitely not telling people not to take it, but if you are taking it then I think setting a couple of Google Scholar alerts would be prudent.
This seems unlikely but possible. Metformin is widely prescribed worldwide and, as far as I know, no correlation has been shown. Granted, what it's prescribed for - diabetes and prediabetes - are themselves risk factors for neurodegenerative diseases.
The hypothesis of developing something like PSP after consumption of a mitochondrial inhibitor is plausible, but in reality metformin is not strong enough to cause such damage.
Before something like that even starts to develop, a person who consumed the substance would feel a pronounced air hunger. That does not happen with metformin, indicating that it is too mild to lead to negative consequences like that.
Cyanide, on the other hand, is a strong mitochondrial inhibitor and it causes serious consequences (neurological, metabolic), including death.
That's very common actually! The mechanism of action for many drugs is unknown. And then it gets even weirder with psychoactive drugs: eventually the answer is "this drug affects this neurotransmitter, which gives you this subjective first-person experience, and we don't really know why". It brings up a lot of philosophical and existential questions!
In the case of metformin, the direct line runs from guanidine in French lilac, which had been used since the Middle Ages for diabetes. Guanidine is effective, but liver-toxic in the long term.
Once actual chemistry broke open our ability to analyze that drug, related chemicals could be synthesized and studied, which led to the biguanides, of which metformin is one, and which were tested in animals.
Many drugs are found this way by looking at chemicals related to ones we already know about, and then testing on animals and humans. Self testing of new drugs by researchers used to be very common.
How did folk medicine come to know the effects of guanidine? Dunno.
> How did folk medicine come to know the effects of guanidine? Dunno.
The same way we know a lot about other things in the chemistry domain - somebody tasted it, just to see what happens. "Mendeleyev's Dream" is a great book that contains many fascinating anecdotes of people going "let me just lick this, just to try". (Or, if you want to go more modern, the history of hallucinogens is pretty fun too)
Humans seem to have a strong tendency to stuff unknown things into their mouth, just because they can.
Indeed. I would assume that this is one of the traits that survives due to group selection [0] - it would be bad if everyone in the population was such a risk taker, but groups that have a small percentage of these people will adapt better.
That's my belief on why ADHD has such similar levels of prevalence (around 5%) around the globe despite drastically different cultures and varying genetics. It's great for the group as a whole, as the ADHD members tend to be natural entrepreneurs, inventors, etc. However it often sucks for the individual ADHD folks and leads to higher rates of, well, everything. So the equilibrium settles to a stable fraction of the overall population.
A fabulous book on this is Phenethylamines I Have Known and Loved (PiHKAL) which can be found as a PDF online (the link I had has died but try Google and filetype:pdf).
Summary: a chemist is interested in psychedelics so he starts creating candidate psychoactive substances, and he then tests them on himself and friends. He wrote a lovely book about his journey.
Sometimes by accident. The original serotonergic antidepressants were antibiotics that happened to have MAO inhibition as a side effect. Initially developed to treat tuberculosis, doctors noticed the drugs had a stimulating side effect. A year later, a psychiatrist decided to trial them for depression, and found some success.
It's actually the other way around. They didn't have very serious theories about the etiology of depression back then; the only clue they had was from the pharmacology of these drugs, which led to the situation where for many years the leading theory on depression was that it was caused by low levels of monoamine neurotransmitters (serotonin, norepinephrine and dopamine). The main evidence for this initially was the fact that these drugs seemed to work, and that they increased the levels or effects of monoamines. This theory has since become discredited as we've learned more, although the connection between monoamines and depression is still strong. It's just not the whole picture.
I think it’s more often «we know that this drug lowers X in blood results, cause we saw that when we trialed it for something a few years ago that didn’t work out, now we’ve found out that disease B shows high values of X…»
I've submitted just such PRs before, usually with a comment along the line of "this hack is dumb and I do not yet understand why it is necessary, but it works now, tests prove it works, and I have more urgent things to do." Sometimes this is unacceptable but often it is just what the doctor ordered.
Of course, unlike with biology, it is usually not beyond my skills to eventually understand the whole system, but it may be beyond my time availability. With biology, the whole thing is just too complex to grok.
Biology is a result of billions of years of constant evolutionary struggle on a planetary scale, without rhyme or reason, very much not intelligently designed. Whatever random mutation worked, worked.
We IT people love to complain about the esoteric interplay of hardware, OS, apps and network. There are a lot more phenomena in constant interplay even in mere amoebas, not to speak of human bodies.
Not to mention that you just cannot put breakpoints into a living organism and read its current status on screen.
I find it positively crazy that we know something about biology at all. The intrinsic obstacles are just so much higher than in software.
We have some clues but not much real, deep understanding of how general anesthetics work. Process that for a second.
They've been in use since at least the early 19th century. We have had a bunch of them (though in humans, at least, we pretty much only use 4 or 5 in developed countries these days). We do this every day in surgical suites around the world. People expect to be unconscious during surgery. But we don't really know how they do it.
Or take antipsychotics: the companies were looking for antihistamines and noticed psychotic patients got better on some of them. If it works... you keep using it until something better shows up.
It's the same with neurology. We pretend we understand how the brain works, but much of that "understanding" is just our current conceptual model, so it's theoretical.
We work with phase-targeted auditory stimulation to enhance deep sleep (https://affectablesleep.com). We know we can stimulate the brain during sleep and measure the increase in electrical activity that results from increased synchronous firing of neurons, but why we are able to create this result is just a theory. However, the behaviour itself is consistent and replicated.
It's completely rational, just inverted: our understanding of biology is based mostly on testing hypotheses of how drugs work. Some of the best practical understanding comes from "phase 4 clinical trials", i.e. those performed informally by the market.
Does it sound crazy? Most startups don’t understand why their product gains traction. They have less comprehension by this standard than the scientists with drugs. One of the beautiful things about knowledge is this: you don’t need knowledge of components to know the whole.
The process for approving a new drug essentially requires proving that they are safe and also have sufficient efficacy, and usually also that the balance of the two (aka benefit:risk) is an improvement over a reasonable existing comparator.
There are plenty of drugs I can think of (even relatively modern, specialised drugs) for which we kinda understand some parts of the mechanism of action, but not other parts.
Also, given how difficult and competitive developing new drugs is, the incentive is to do the minimum (as above) for approval in the shortest time possible.
Yes! I am a "glaucoma suspect", meaning that, short of going blind, I have the eye and eyesight degeneration characteristics that suggest I have the disease - no known causal factors. So I am treated with an expensive eye-drop drug, Rhopressa. In the fine print of the paper that comes with the drops is, and I paraphrase, a disclaimer stating the drug's active mechanism is "unknown."
But my ocular pressure is definitely 30% lower than when not using Rhopressa! (shrug)
What's the alternative? Never develop or deliver a life-improving medication until we can fully understand it from first principles? Gonna be waiting a long time IMO.
If your PR deploys a machine learning model, then that would be a suitable comment. I suppose the idea is similar: we can determine that a drug helps to a (hopefully) statistically significant degree, so even if we don’t know how it works we want to use it.
Welcome to medicine and biology. We have no idea how a lot of stuff works (ask an anesthesiologist how anesthesia actually works, if you don't mind being freaked out). We just know it seems to reliably work.
This extends all the way to surgical procedures, which often amount to "should fix your issues, IDK" especially when it comes to soft tissue.
I like to think of it as a stochastic discipline. (Which means that you want doctors who understand probabilities. Many don't. Filter well)
Tell me you've never worked on a huge production codebase without telling me you've never worked on a huge production codebase. Sometimes you need to fix a problem and you just don't have days to spend finding out the root cause.
Yes, I remember learning about this issue with psychiatric medications and being similarly dismayed.
There is a sub-field called Computational Psychiatry [1] trying to do better. And interestingly, a person could argue that randomized controlled trials for medicine are only really necessary because we don't have good enough theory and/or good enough measurement devices. If we did, we could reliably predict the effects of a medication without the trial.
I take a 1000mg metformin tablet twice a day. I wouldn’t call it huge. It isn’t significantly larger than my daily Allegra and is much smaller than the vitamin capsules I take daily.