-- J. J. Thomson
There was a point where this made inherent sense - no one but research experts could educate new students, and students were attempting to quickly reach the state of the art. That's long past - a complex analysis expert with no training in education is a terrible candidate to teach basic multivariable calculus.
And yet the whole thing makes a sick sort of sense, because paying and tenuring a professor to teach classes justifies their presence without demanding publications. They'll muddle through intro classes without causing any real problems, and can go back to being brilliant on no particular schedule in between lectures.
Or, on the flip side, set up something like the Institute for Advanced Study. It only took already-proven brilliance, so it could justify paying people for years without any breakthroughs to justify it.
"Nothing happens because there's not enough real activity and challenge: You're not in contact with the experimental guys. You don't have to think how to answer questions from the students. Nothing!"
-- Richard Feynman, "The Dignified Professor"
The tenure-and-teaching model is a better one than it was ever given credit for; the seemingly irrational connection between elementary teaching and cutting-edge research has been a powerful one.
Even so, I think the IAS model is better than a lot of what we're seeing today. Now, introductory classes have been offloaded onto (poorly paid) lecturers, and professors are expected to publish consistently, even without funding. The results are hideous: not just inaction but counter-productive action.
The replication crisis is, to a real degree, a function of this paradigm. "No results" is not an acceptable justification for "no publications", so we see people using statistical trickery that has been bad practice for decades to keep up their publication count. I'd really like to see them go back to teaching and researching, but even a movement to sinecures that don't demand any results at all would be progress.
Expecting ideas to appear no matter what you do is not a good plan. One difficulty is that it is very hard to estimate the probability of rare events. Perhaps having more free time increases the probability of having new deep ideas, but that is not easy to measure. Another key ingredient is communication with people full of energy, like P. Erdős; perhaps some people can serve as hubs of ideas.
Edit: A little more money on the table, not in the hands of politicians but directed to those who can do real research, would be a real innovation.
The full context there is that he spent a semester exempt from teaching to have more time for research. He produced very little and was deeply embarrassed, until he eventually realized that many of his insights came from rehearsing fundamentals and talking to people in other subfields.
So he returned to teaching, and when the IAS came knocking he refused on the grounds that it was the worst environment for him to work. Certainly I think he's overgeneralizing (the IAS has seen some pretty good results since that time), but this wasn't about jealousy!
Tenure's more important because it means you can say naughty but true things about the powerful without losing your job. As structured right now, in the physical sciences, a base salary won't support research.
As for tenure, it qualifies one for all kinds of service. So job security is nice, no questions about it, but one has even less time for serious work (unless you are willing to say no to requests to serve -- and some of these committees do matter).
Math is not a physical science -- and you can get stunning breakthroughs without anything except a quiet room, a pad of paper, a pencil, and a trash can. (old joke: /s/math/philosophy and remove the trash can!)
This is not true if you're a microbiologist.
Money vastly expands the scope of work you can do, but breaking new ground is often less expensive than making minor refinements to well understood areas.
PS: The real risk is that you will likely go 20+ years without finding anything, but if you have tenure that should not be a problem.
Kitting out a lab to do publishable work in a lot of fields right now is just jaw-droppingly expensive.
Further, used equipment can often be vastly cheaper.
Honestly, time and consumables really are the biggest issues.
You will not publish anything good enough to get tenure if all you have is widefield and a lamp in basically any biomedical field.
Further, the assumption is that this is someone who has tenure but not funding and is thus able to take long-shot risks. Think designing an artificial intestine for microbiome research, not yet another paper using E. coli.
Fluorescence microscopes start in the low six figures, and others are much more expensive. A confocal microscope typically runs several hundred thousand dollars, potentially approaching $1M with all the bells and whistles. Super-resolution and high-throughput imaging can also consume essentially infinite amounts of money.
You could probably get a usable brightfield microscope for around $1,000, but realistically you're not publishing much with that alone.
Just ask Peter Higgs
You can always do some variant of "we tried this, it didn't work; in the future, you no longer have to spend money to see if this will work." Right?
Removing the triple negative: it should always be possible to prove the money accomplished something.
[Of course, some may disagree with this assessment; just throwing it in for your consideration.]
When they were discovered, there was nothing particularly monetizable about quaternions, and they were not found as a consequence of any business enterprise looking for new tools -- but Hamilton didn't starve to death because he failed to capitalize on them. Eighty years later, we suddenly rediscovered that they solve gimbal lock and can compactly describe quantum states.
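To make the gimbal-lock point concrete, here's a minimal sketch (my own illustration, not from the thread) of unit-quaternion rotation in plain Python: rotations compose by Hamilton multiplication, with no Euler angles and hence no singular configurations.

```python
import math

def qmul(a, b):
    # Hamilton product of two quaternions (w, x, y, z)
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def axis_angle(axis, theta):
    # Unit quaternion for a rotation by theta about a unit axis
    s = math.sin(theta / 2)
    return (math.cos(theta / 2), axis[0]*s, axis[1]*s, axis[2]*s)

def rotate(q, v):
    # Rotate vector v by unit quaternion q: q * v * conj(q)
    w, x, y, z = q
    p = qmul(qmul(q, (0.0,) + tuple(v)), (w, -x, -y, -z))
    return p[1:]

# Compose a 90-degree rotation about x, then 90 degrees about z
q = qmul(axis_angle((0, 0, 1), math.pi / 2),
         axis_angle((1, 0, 0), math.pi / 2))
print(rotate(q, (1.0, 0.0, 0.0)))  # approximately (0, 1, 0)
```

Because the composed orientation is a single unit quaternion rather than three chained Euler angles, no axis alignment can collapse a degree of freedom, which is the gimbal-lock failure mode.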
You were saying?