But I think this image - that pharmaceutical companies are constantly risking their financial future pouring resources into research for new drugs which will finally cure some disease that has plagued humanity - is also a mischaracterization of reality.
For instance, drug companies also spend a large portion of their research budget developing "novel compounds" which are based on small chemical changes to existing drugs, just for the purpose of extending their patent and increasing the profitability (read: cost to consumer) of an existing treatment. So it's not as if all the resources they put into research represent some altruistic good.
To be honest, I think the incentive structure for drug companies, like many things in healthcare, is just not ideal for producing the best outcomes. Their ideal business model is to own expensive drugs which sick people have to take for their entire lives. If they produce truly curative treatments, that's actually a net loss for them.
As some people have suggested, it would make a lot of sense to nationalize drug research in the US. The idea is you bring the top pharmaceutical researchers into the NIH, and while this would require a big up-front investment, you might even be able to operate at a profit by licensing the drugs they develop to the rest of the world.
Also, in a single-payer world, you would have the healthcare provider under the same financial umbrella as the people creating the drugs. As a result, the incentive would be to drive treatment costs down.
This recently happened to a prof in my department. He's an organic chemist, devised a way to prevent plaque buildup, and had the rights to the compound contracted out to an unknown pharmaceutical company.
My boss also had a meeting with Pfizer about a thing
In this case the research would have been publicly funded and been obtained by a private company who bypassed the need to spend money on development
That's a shady technique, IMHO, because it's our tax money directly enriching a corporation
That would be an incredible rate of success, depending on what 'first level' means. If you mean "reached human clinical trials", maybe, but dunno.
> It also seems that academia is not any better than pharmaceuticals at either developing the first step or pushing something to the end.
"it also seems" lacks rigour, but let that pass for now.
If academia develops 85% of the drugs that are released (I'll try to find a source for that, it's from memory) then the vast majority of failure costs (85% of them) are soaked up by the taxpayer. The few that are bought up by the drugs companies, are then sold privately. Very profitable.
"According to BBC News, in 2013 pharmaceutical companies enjoyed higher average profit margins than carmakers, oil and gas companies and media companies. Only banks had a profit margin comparable to Big Pharma, but Pfizer’s 42% profit margin blew every other company out of the water, prompting even a member of its own industry to say, “I wouldn’t be able to justify [those types of margins].”
In 2013, five pharmaceutical companies exceeded 20% profit margins: Pfizer, Hoffman-LaRoche, AbbVie, GSK and Eli Lilly."
It is misleading to say that the pharma industry is hugely profitable because 5 companies have high profit margins. There are thousands of pharma companies. Most will never make a profit -- they just do research. Most of these fail, and the ones that succeed sell their drugs to those big pharma companies because they can't commercialize them on their own.
What about all the companies that no longer exist because their drug failed and they went bankrupt?
It's disingenuous to limit your analysis of profit to the most successful companies.
Not remotely surprised, because you're misrepresenting my quote. That's not what it said. Read it again.
Obviously rats aren't people, but a company obviously benefits when discovering the molecule, optimizing its synthesis, and establishing its method of action/LD50 are all paid for by the public, before every cent of profit is privatized.
Even if they don't market it, they own something that the public paid for.
If the study/compound structure has been made public in some form, I think the public has gotten its fair share. The process that will take the compound from works-in-mice to works-in-people is quite difficult and risky. Pharma companies are very good at administering those things and (usually but unfortunately not always, hello Novartis) run those programs in a manner that ensures safety as much as possible down the line.
The Pharma company will spend its billion (roughly the cost per approved drug these days) and will recover many more times that if successful, but eventually the compound will become generic, and all of us will be able to get it for cheap.
I think there are many, many problems with pharma, and I say this as someone who is trying to start a biotech. But the fundamental process above doesn't strike me as grossly out of line; it's playing to each player's strengths. The reforms I would make would be around some of the pricing excesses, some of the patent/generics loophole abuses, and finding ways to address areas that are hard to make work financially in a private setting (antibiotics for example) through some sort of public/private partnership/prize system.
The most expensive and most risky part of drug development is getting a promising drug candidate all the way to market. 99% of those fail, yet can cost hundreds of millions.
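The "roughly a billion per approved drug" figure quoted elsewhere in the thread comes from amortizing the cost of all the failures over the few winners. A toy phase-gated pipeline makes the arithmetic concrete (every number below is an illustrative assumption, not industry data):

```python
# Illustrative phase-gated pipeline. Cost is incurred per candidate
# entering each phase; attrition means failures' costs are amortized
# over the candidates that make it all the way through.
phases = [
    # (name, assumed cost per candidate entering, assumed P(advance))
    ("preclinical", 5e6,   0.30),
    ("phase I",     15e6,  0.60),
    ("phase II",    40e6,  0.35),
    ("phase III",   200e6, 0.60),
]

n = 1000.0            # candidates entering preclinical
total_spend = 0.0
for name, cost, p_advance in phases:
    total_spend += n * cost   # everyone entering this phase costs money
    n *= p_advance            # only a fraction advances to the next phase

approved = n
print(f"approved: {approved:.0f}, "
      f"spend per approval: ${total_spend / approved / 1e9:.2f}B")
```

With these made-up attrition numbers, ~38 of 1000 candidates are approved, and the cost per approval lands in the high hundreds of millions even though no single candidate came close to that — which is the point: the winners carry the losers.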
And not sure what country you’re in, but if it’s the US, your prof would likely stand to make a ton of money if the drug is successful. The company doesn’t get it for free, they buy it from the patent owners.
Finding a prospective molecule is a huge part of the process - this is why computer binding models are such a hot topic
The fact that we paid for discovery, synthetic development, and PoC is a pretty big deal whether or not it ends up being a marketed drug
I know the guy who sold the patent was paid very well (after the university cut, of course) ... but who paid the bills to make it possible?
So if this drug you discovered does make it to market, the drug company itself is paying for the vast majority of the costs and all the risk that goes along with spending hundreds of millions and having nothing to show for it. In return, we dangle the carrot of profit in front of them.
You are correct that the public paid for that initial discovery, but the ownership is given to the lab that found it. The other alternative is for the gov't to keep ownership of the patent.
He found something, and a drug company willing to put up the other 99.99% of the money required to try to take the drug to market (even though the odds of that happening are minuscule) paid both the university and the researcher for the rights to use that discovery (and, as I understand it, usually to get a share of the profits should the drug be successful).
So for a nominal tax-based fee (funding research, not bringing drugs to market), private companies get access to research results they may not have even pursued on their own, and the institutions involved get a funding bump. And maybe some malady cured as a result. Unless you're a serious "screw you I got mine" libertarian, that seems like how tax-supported basic research should work.
I don’t understand this point, though I’ve heard it before. If the original patents expire, the original drugs can be manufactured as generics right? If the new patents provide no new benefit, don’t use them?
> The idea is you bring the top pharmaceutical researchers into the NIH...
Projects I’ve seen suggest that technology development carried out in an academic context often requires significant refinement (or complete reworking tbh) when being brought to market. So, it’s not clear to me how well the model you suggest would work.
Perhaps it would be best to have a public organization that manufactures generic drugs first and build from that. But... I think that may also have issues.
Have a look at Avastin (Bevacizumab) and Lucentis (ranibizumab).
AMD is a leading cause of blindness.
Avastin is a cancer medication that was found to be effective to treat Age-related macular degeneration (AMD), although it wasn't licenced for that usage.
The makers reformulated and came out with Lucentis which they licenced for use with AMD.
Lucentis is considerably more expensive than Avastin.
> Lucentis typically costs about £700 for an injection, but the price for Avastin is about £70.
In the UK the drug companies not only forbade off-label use of Avastin, but they went to court to prevent this use. They lost.
Also, the billion dollar number is for developing a new medicine, and includes the costs of all the attempts that failed along the way, not for running tests to get an already approved drug certified for a new usage.
Doctors thought to themselves, “if the target is the same, why not use Avastin for AMD?” There was no clinical data to support it, so the docs had to do it themselves. There was a ton of use even before any conclusive data.
The company said “we don’t recommend using Avastin in the eye as we didn’t manufacture it for that use”. Legally, they couldn’t support it otherwise (it would be off-label).
Eventually some large scale trials were run by doctors that showed they were basically equivalent, but there were also cases of blindness due to improper injections of Avastin.
They didn't have to support it. But they actively opposed it by taking the NHS to court.
So it's because of a loophole created in the Hatch-Waxman Act, which basically stipulates that if a drug company creates a "new chemical version" of a name-brand drug, then it's possible to extend the patent for 5 years and block the FDA approval of any generics.
> Projects I’ve seen suggest that technology development carried out in an academic context often requires significant refinement (or complete reworking tbh) when being brought to market. So, it’s not clear to me how well the model you suggest would work.
So the problem, as I see it, is that the people who profit from the production and sale of drugs also have an essential monopoly over the talent pool which is capable of developing new drugs. As a result, the incentive is for them to direct their R&D resources in ways which maximize profitability rather than maximize health outcomes.
In such a system as I have described, there would still be room for drug companies to do the work of producing drugs, but the incentives for the R&D process could be aligned with the general population rather than drug company shareholders. The current system represents something of a conflict of interest.
That’s not a small tweak, that’s an entirely new drug. And the older drug can still be sold by generics companies.
It may come as a surprise, but most pharmaceutical advances come as incremental changes. I can't find it, but there was an interesting chart showing life expectancy for colon cancer patients over time.
Each of the new advances only added 2-3 months of life. However, after a decade, life expectancy had grown by almost 5 years.
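The implicit arithmetic is that "2-3 months each" only sums to "almost 5 years" if there were a couple of dozen incremental advances over the decade. A quick sketch (the per-advance and per-decade counts are assumptions for illustration):

```python
# Toy illustration of incremental gains compounding over a decade.
months_gained_per_advance = 2.5   # "2-3 months of life" per new advance
advances_per_decade = 24          # assumed ~2-3 new advances per year

total_months = months_gained_per_advance * advances_per_decade
print(f"{total_months / 12:.0f} years gained")  # -> 5 years gained
```

So the "only 2-3 months" framing and the "almost 5 years" framing describe the same history; the difference is just whether you count one advance or a decade's worth.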
So my comment for you is that they wouldn't be putting R&D money into something unless there was enough demand for it.
What they found is that the strategies used by drug companies to "evergreen" their patented drugs resulted in massive unnecessary expenditures with negligible differences in outcomes.
And I take issue with their claim that pregabalin is an evergreen strategy for gabapentin. Those are two very different drugs.
The only way “making small tweaks” works to extend a patent is if you can convince doctors and insurers it’s worth it.
Not really. Drug development is so difficult that you're not going to be presented with a scenario where you have one drug candidate that cures the disease and another drug candidate that merely abates it. The bias that happens a lot is towards common diseases and away from rarer diseases or diseases that occur largely in poorer countries, although that's still no guarantee of success (hello, Alzheimer's).
On the other hand, something like Sovaldi (an actual cure for HepC in 90 days) really is a miracle cure for a chronic problem, and cost effective even at its very high price point (1k/pill, 90k total was its initial list price).
Your Sovaldi example shows that - a one-time cure can command a price that is bigger than a chronic treatment.
And the reason nobody invests in antibiotics (although there are several very profitable ones) is because any new drug is reserved for last line. Doctors don't want to bring out the big guns until all the others have failed.
Antibiotics are special and should be incentivized using a different system. I've heard ideas like an international government compact that awards a large, multi-billion $ prize in exchange for the patent and full generic production immediately. This would allow a consortium/company to do very well for itself but the whole world to benefit in the aggregate for very little cost.
Historically, the sort of structural change you suggest is what gets you branded a communist and hanged.
Apparently most treatments for depression fall in this range, which may be why new treatments arrive so rarely. Depression is a "wastebasket" diagnosis, with many, many causes. Any given treatment only helps with one or a few. Which variety you have can only be discovered by seeing which medications fix it, so assembling a trial cohort whose depression is the particular variety one treatment would fix is impossible without actually trying it on them first.
If fibroses have this character, the only medication we can hope will get past the gauntlet will be palliative treatments, that reduce the body's common response to all the causes, without addressing any of the causes.
It's as if we were treating "ague", and not differentiating flu from malaria or bacterial pneumonia. Since a candidate antibiotic may only help the few with gram-negative bacterial infection, it looks not sufficiently effective, and is dropped. We are left with aspirin for fever and decongestants, while the infection rages unchecked.
This seems problematic to me. Wouldn't it be generally preferable to publish everything about their work so others can learn from it? Obviously big pharma would be against that, but in the general pursuit of advancing medicine it would help, so federal regulation?
Someone could potentially make a killing managing a marketplace for buying and selling failed industrial results. We learn from mistakes. If it were cheaper to learn from others’ mistakes than make them anew, this could have huge benefits for everyone. But it would have to be very enticing for the established players.
> word has just come out that the study has been terminated with only about a third of its patients enrolled.
Was the study terminated because they couldn't find sufficient guinea-pigs? Or was the study terminated for other reasons before recruitment was complete?
They might be looking for very specific types of patients that are just really hard to find.
Or, there may be plenty of patients out there, but the consensus is that the likelihood of the drug working is so low that doctors are recommending to patients they don’t enroll.
I’m guessing it’s the 2nd and it’s just another nail in the coffin for this program as clinical trial delays add a ton of costs.