I think we also overrate the significance of the corporate labs. There are not a lot of successful examples where the host company actually profited from the invention. Just a lot that bungled them or prematurely killed them (like when AT&T almost invented the internet), or inventions that snuck out accidentally (like the Xerox Alto).
Imagine if Alphabet were broken up: Google Search could still afford to run a research lab, and so could YouTube.
The problem, I think, is how easy it is for large companies to acquire smaller companies. It's how they expand or enter a market; they can't be bothered to bootstrap a new org unit. They just devour smaller, more innovative and creative companies. Look at Google: they couldn't be creative and patient enough to compete with YouTube, so they gulped up YouTube. It's the bigcorp M.O.
So why bother with R&D when you can just buy a smaller company that does R&D, tests the market, and builds a brand for you? My answer: you will suffer from brain drain and reputation loss. When you buy a smaller company, consumers assume that brand is now dead. You become a cemetery of dreams and ideas. You become an IBM, HP, Xerox, or AT&T. Once the damage is done it becomes nearly impossible to recover from. I like IBM as the best example: they are doing a superb amount of innovation even today, but look at how their initiatives lack any traction or competitive edge. They have a ton of smart people working on brand-new areas of tech like quantum computing, but their reputation and overall culture have not been great, and they've been declining consistently. Look at Yahoo, Yahoo!! They had legitimate means to compete with Google toe-to-toe, but they relied too much on acquisitions, as did Verizon, which recently acquired them for a meager $4B.
In the end I blame all this on how publicly traded companies prioritize quarterly profits over multi-year growth. Acquiring bumps up the stock value for a while; spending billions to compete from scratch or develop a new concept is risky, so the stock goes down.
It is far more likely that YouTube would go bankrupt, since running a free video service where the videos are nearly infinite and cache hit rates are much lower is very costly. Couple that with all the other advantages of being part of Google, like access to talent and infrastructure, and I don't see YouTube surviving without Google. It's possible that the tab would be picked up by Facebook video, which is an even worse and more closed platform.
I don't see a cheap way of solving the YouTube problem. (Yes, P2P exists, but it has too many problems, like battery consumption on phones and NAT.)
Now that might be fantastic depending on where you come from. If you have disposable income, it is a huge win to have a non-ad-supported video platform, which might even be more privacy-protecting. If you are a poor kid using YouTube to watch MIT OCW, Khan Academy, and innumerable other resources, well, you are screwed.
MIT and Khan Academy could easily host their content using BitTorrent and reduce their bandwidth usage by 95% ( https://nrkbeta.no/2008/03/02/thoughts-on-bittorrent-distrib... )
What YouTube gives the poor kid is instant and free access to a huge audience. Yes, it comes with an (admittedly not very high) risk of a false positive and blocking for copyright reasons. I suppose the kid is smart enough to also keep copies of the performance on other services, and safely stored locally.
But it's simple: video is more versatile, and by removing YouTube's monolithic dominance, you remove the single point of failure that copyright cartels have been able to attack.
As an example, imagine a separate YouTube clone dedicated to education that was actually willing to fight for fair use! Or a YouTube that didn't automatically demonetize you for swearing.
The internet already gives these kids (and the rest of us) both access and an audience; we have just been stupid enough to lock large parts of our cultural heritage into corporate silos, protected by “intellectual property” laws. Competition wouldn't solve this completely, but it would make it harder to distort the market.
YouTube can afford to offer a lot of value to the customer, much more than it would have been able to sustain by itself without Google's support.
Without offering all that value, by definition, YouTube would not be as attractive to consumers.
Consumers would therefore be more likely to choose alternatives like PeerTube.
Projects like PeerTube would then have more support and would develop more quickly.
A company unrelated to their business model would be something like Waymo.
They have got problems, yes, but there is far too much good content on YouTube, with a very, very long tail. In lockdown, I have tried dozens of different cooking channels for different things. It is the magic of YouTube that so many people are able to take their content to so many viewers seamlessly, at no cost or loss of convenience to them.
That pays for a lot of storage. And storage is very cheap at that scale.
This isn't exclusive to Google though - most large bandwidth users have peering agreements.
b) Google does have edge caches. It's true that the long tail of YouTube videos is longer than Netflix, but since Google doesn't pay for traffic (see (a)) this only affects speed, not cost.
Would Apple be a counter-example? Multitouch, PA Semi, Siri, TouchID, etc. A lot of Apple technology was acquired, but nobody laments the death of those “brands”.
But honestly, who besides geeks cares what the general population thinks about who invented it?
Umm, this is exactly the problem the article talks about.
The fact that you had to pay obeisance to Apple to get access to private APIs and data is a failure of the anti-trust mechanisms.
The fact that Apple could just buy up Siri without anti-trust mechanisms kicking in meant they could just sit in their castle and let somebody else do the hard work.
VCs don't invest in startups to help them build low-profit “lifestyle businesses”. They fully expect them to be acquired or, rarely, to go public. Just as an anecdote that we are all familiar with, only two YC companies have gone public, PagerDuty and Dropbox, and Dropbox is still not GAAP profitable.
If the government takes away the ability of companies to get acquired, investment in startups will dry up and $BigTech will be the only ones with money to do research.
This doesn't make sense to me. Google search and youtube are just front-ends for Alphabet to make money through advertising.
If Alphabet truly was broken up, Google search and youtube would not be able to use google ads jointly in the same way, and they don't make money otherwise.
Not so sure about that. Maybe if you read HN a lot, but I think most users won't even notice.
Fun example: Star Wars fans and the Disney acquisition.
> they couldn't be creative and patient enough to compete with YouTube, so they gulped up YouTube
So what can they do? If they had launched "Google Videos" they would have surely been accused of squashing small companies... What option do they have?
Is YouTube even profitable?
Doesn't Google make 90%+ revenue from ads?
Google (and most large bandwidth users) doesn't pay for bandwidth because they have peering agreements. Storage is cheap.
Revenue != Profit.
Hence my comments pointing out that the things that most people seem to think stop YouTube from being profitable (bandwidth and storage) aren't really factors for Google.
I don't think YouTube would have survived without Google.
A big reason Bell Labs was created and perpetuated was because AT&T feared being broken up. Bell Labs was effectively a PR vehicle they used to show the US government that they were giving back to the community, and an excuse to continue operating as a de-facto state-sanctioned monopoly. Today's toothless FTC and DoJ don't really inspire that kind of fear.
Source (sort of, paraphrased) - This is a big theme of The Idea Factory which describes the heyday of Bell Labs.
(Edited to further develop my thought)
It's difficult for academic labs to keep up, actually.
While modern economics and the startup model make it easier for companies to let startups do the actual idea testing (and then buy up the successful experiments), there are a ton of companies that are still driving entire industries with their corporate-funded research.
We should also probably note that a lot of the funding for some very successful academic labs comes from these large corporations as well.
That's probably more an intrinsic problem with academia (lower wages, limited career opportunities, limited resources compared to companies).
What makes you think the advisor's projects were genuinely more speculative or more fundamental, versus just dead ends that nobody cared about? It's not like corp labs don't fund fundamental work. Google has been funding quantum computers for a long time now.
Further, I feel that if it's paid for by the public, it belongs to the public.
Due to the resource and scaling requirements (as well as the inherent skill needed), replicating efforts for expansion purposes is difficult, and that's one reason (not the only reason) these sorts of modeling approaches are being pursued.
A lot of technologies are being chosen because of an inherently high barrier to entry, which limits competition risk.
Google was founded, and is mostly run, by people with a pretty broad academic interest in science and technology. Pioneering AI breakthroughs is what they wanted, and they found ways to make it logical for Google to do. Both the existence and the nature of their R&D reflect that. Apple and Microsoft were not; they're interested in products. What rocks Zuck's boat is reach. I don't think he'd be that excited about purely technological breakthroughs; he'd only be excited about ways these can extend FB's footprint.
Aside from the issue you mentioned, the article seems to be making the claim that we have less innovation today. Which just seems like a ridiculous claim to me.
It also mentions “pervasive short-termism” as an obstacle. It might be in large public companies, but it's certainly not in the private equity markets. Innovative companies have access to huge amounts of capital, and can stay unprofitable for such a long period of time that many people think we're in a bubble. Multi-billion dollar valuation and "may never be profitable" is perfectly ordinary these days, and certainly doesn't lend any credibility to the author's argument.
Looking at economic growth, it's extremely hard to deny. The years ca. 1945-1975 constitute an exceptional period in innovation, far surpassing anything today, both qualitatively and quantitatively. (New marginal innovations in tech or pharmaceuticals today tend to be about 10-50x more expensive than just a few decades ago.)
Relevant reading: The Rise and Fall of American Growth: The U.S. Standard of Living Since the Civil War by Robert J. Gordon.
In 2020 you need 10,000 scientists and $9bn to build a hadron collider, and 99.9% of people won't even understand if your result is important or not.
Are we less good at science now, given that we're spending far more resources for far less impact? Or rather, are we better, but the lowest hanging and juiciest fruit got picked first?
And yet those discoveries weren't made in the previous thousands of years during which people had been standing around in thunderstorms. The enlightenment represented a paradigm shift where a new innovation - the scientific method - was applied en masse for the first time and it led to revolutions in science, technology, and society. Whenever a paradigm shift happens, be it the practical steam engine or long range electricity transmission, you invariably have a lot of suddenly accessible low hanging fruit. The mid 20th century had a surprisingly large number of such paradigm shifting innovations such as solid state transistors, integrated circuits, jet engines, nuclear reactors, antibiotics, solar cells, the internet, lasers, credit cards, the list goes on. Entire new fields of science and sectors of industry popped up in a very short period of time.
We continue to innovate, but our innovations are mostly evolutionary and there have been comparatively few paradigm altering inventions in the past 30 years, especially when one considers how much larger the population of sufficiently educated and affluent individuals who could produce such innovations has become and how much easier it is for such innovations to be communicated. It is impressive how good we've gotten at picking higher fruit, but we should still be asking why we are struggling so hard to find low hanging fruit.
There's plenty of room for innovation in green energy tech, housing materials, ag science, etc. We're objectively spending more to get those than we used to need to. And that's WITH Google and Sci-Hub! Something is rotten.
The idea that the "big problems" can be solved by small teams pontificating is missing all but that last step.
My point (which I understand is controversial) is that there's no requirement for billions of dollars and thousands of scientists to make the next big breakthrough. There's still low-hanging fruit. Less than there used to be, but it's still there.
The LHC and big $$ funding is _one_ way to do it. So far it's been unable to solve the biggest problems we have in physics (agreed that it's helped disprove some).
We're at a point now that is similar to what people were saying in 1900. Some people say "we've almost dusted this physics thing off and figured it all out". I'm fairly confident something is going to come along from some small corner of physics that will knock all our socks off the same way Quantum Mechanics and Relativity did... it may arrive this year, or it may take 120 more years to get there.
My point is that the big problems in science can still be solved by much smaller teams. Sure, it will take an army to completely work out the implications of the breakthrough. But the breakthrough "aha" can still come from an individual or a small team.
In case anyone wants a quick link to the LHC dark matter program, check the ATLAS  and CMS  summaries.
TL;DR: The LHC has ruled out a good chunk of a few popular dark matter variants already. There are a lot more searches in the pipeline. They haven't found anything yet, but of course nature doesn't just drop particles in the first place you look.
There are also some crazier plans in the works  which use the existing beams to look in other places.
What’s rotten exactly? Take a look at this lovely graph  showing the cost of solar cells going from $76.67/watt in 1977, to $0.74/watt in 2013 (the price of installation is obviously more than the cell production costs, at about $2.5-$3.5/watt today).
Which one fits your definition of important scientific question that we should answer?
What you're describing, though, is a small number of companies making significant technological leaps, not the quantity of innovation. People also argue that the invention of the internet, and the subsequent global proliferation of high-speed internet, represents the same sort of revolutionary leap forward (one that we're going through right now).
One key difference is that inventions like the microprocessor or the steam engine are deeply fundamental. They're at the 'backend' of the chain. Internet-based digital services are at the consumer end, which has produced some growth, but could be argued to mostly fuel hedonistic consumption. (It's hard to figure out what deep growth watching hours of TikTok creates.)
In his book, Gordon points to the thought experiment of going to sleep in 1870 and waking up in 1970 in NY. You'd be in an entirely transformed world: cars, buildings reaching into the sky everywhere, electrified subway stations, computers, drugs that save countless millions of lives, modern agriculture, and so forth. Go to sleep in 1970 and wake up in 2020, and what's changed, other than people staring at tiny screens?
It also never crossed the Iron Curtain into Warsaw Pact countries.
In reality (aside from tremendous change of political landscape) the 80s and the 00s weren't particularly different to live in.
"tech startups" seems to be synonymous with what the author call the "successors of corporate labs", and that seems to be why VC:s seems to be OK with pouring moneys for a semi-long term payback.
People/companies that play on this expectation (such as WeWork and Theranos) get moral shaming, mostly from tech people, while the rest don't seem to get the point of the hate. (Not much different from why some people do or don't see the point of a corporate lab as more than a cost sink.)
Come to think of it, companies/CEOs that overly focus on the "tech" image might be a warning sign that they might just be snake oil (such as the Theranos CEO mimicking Jobs).
Even if you were only concerned with technological innovation, referencing well funded non-technical companies doesn’t negate the existence of any of the well funded technology-focused companies.
Also, Theranos is a perfect example of private equity funding technological research. The fact that it was a scam doesn't take away from the interest its investors had in new technology.
A lot of the technology I use at work is funded innovation. Linux (largely Intel), React (Facebook), Rust (Mozilla - kinda), Golang (Google), Objective-C, Kotlin, Typescript... There's also a huge number of AI and some very well-funded self-driving-car startups. For hardware, it's unlikely that a startup is going to disrupt companies like Intel, AMD, nVidia, Samsung... but that's only because they're already innovating at such an impressive pace. But even those companies don't come up with all their own innovations. For instance, the technology in the new displays Samsung recently announced comes from a company called Nanosys, which has been funding-reliant since 2001. Theranos also comes to mind, which, while it was a scam, was a very well-funded one.
Wasn't MSFT the biggest $$ donor, and Red Hat the one with more devs committing?
I think we (I, maybe) default to the mistake of thinking in terms of "one big systemic explanation." Ultimately, something like investment in or success at innovation doesn't follow strict rules. The factors that go into it tend to be local. Any practicable "Theory of Innovation" is likely to be true locally, at best.
A few years back Neil Degrasse Tyson, advocating for a re-funding of the NASA space shuttle, made the argument that private companies will never pioneer space exploration. Mars missions and such. He had Elon in his sights, but also others who were starting to "invest in space" at the time.
It was a good argument. It was logical, consistent with 50 years of space exploration experience and parsimonious with analogies from other industries, history, etc.
Neil was also wrong, and I think he'd agree to moderate that argument significantly today.
There is a huge difference between a 1.4% failure rate and a 40% failure rate.
NASA got a whole lot of science done with those missions.
Also, it's not as though the Russians and Soviets didn't have their share of in-flight fatalities as well.
The shuttle was capable of missions that wouldn't have otherwise been possible. It also ferried more people at once.
That said, the shuttle was severely underutilized and hence cost a lot more than it should have.
I don't think so, really.
Columbia didn't explode in the sense of a massive detonation of something onboard. It had a hole punched in the wing during takeoff. During re-entry, hot gases penetrated the wing and led to failure of the control surfaces, and the resultant loss of control (it was gliding at this point) caused heating and dynamic pressures that eventually led to vehicle break-up.
Strictly speaking, the Challenger also wasn't destroyed by an explosion. The failure of an O-ring on one of the Solid Rocket Boosters led to pressurised burning gas destroying the attachment hardware that held the SRB in place. The SRB ripped away, causing the entire stack to tumble. Challenger ended up at the top and had its back broken by aerodynamic forces at the same time as the thin skin of the external fuel tank shredded, leading to a sudden massive burn of the fuel it contained (which caused the 'explosion' effect so visible from the ground). The crew compartment 'survived' all this and continued ballistically upward before falling back to the sea, where it was destroyed on impact, killing the crew.
This stereotype of the corporate executive who only cares about increasing profits is based on a kernel of truth, but only because the world is full of mature companies whose leadership are devoid of ideas for where they go next .. so why not focus on optimisation ("shareholder returns")? But then again the world is full of NGOs with no vision, it's full of academic departments churning out low quality grant applications just so they can expand their labs. Lack of vision and a focus on money is hardly unique to any one kind of group. At least shareholder returns are about making money for other people rather than yourselves!
In a monopoly position they are no longer concerned about competition, so they don't need to worry about their position anymore. Unlike startups, which are constantly in survival mode, so basic research with long-term benefits is about the last thing they would spend money on.
Long-term basic research in academic settings is also not very viable. It seems to favor vocal people with grant-hunting skills instead of true researchers. The academic environment also seems to be diverging from the practice of science toward something more like the practice of religion, studying and promoting topics that are either completely ungrounded in reality or outright politicized.
Monopoly companies on the other hand have a good track record of amazing research which is still practical but with long term vision which can have great benefits to the entire society.
So monopoly companies should not be broken up but forced to allocate percentage of revenue to this research like the examples of Xerox PARC and AT&T Bell Labs. If the companies have the privilege of no competition they should pay heavily for it.
Also, since monopolies often fall into the category of too-big-to-fail companies necessary for the economy, they should be paying a bailout tax - because you know they will need bailout money in a crisis, so better to fund it upfront instead of being so surprised that we need bailouts every time the market wobbles.
Monopolies are not very well studied in political and economic circles - everybody fears them and thinks of them as bad. They are inevitable anyway - infrastructure networks can't form a competitive market unless a new technology appears. Maybe monopolies have a natural role in society.
Embrace, not fear, the monopoly. But understand it is a monopoly and you grant it expensive privileges that they would otherwise use to extract rents from the society. Let them pay for the privileges appropriately and you can form a nice symbiotic relationship with monopolies.
Sure startups are in survival mode, but they are trying to survive while getting someone else to pay for testing an idea. Aren't startups like VC funded research in a way?
>Arora et al point out that the rise and fall of the labs coincided with the rise and fall of anti-trust enforcement:
Historically, many large labs were set up partly because antitrust pressures constrained large firms’ ability to grow through mergers and acquisitions. In the 1930s, if a leading firm wanted to grow, it needed to develop new markets. With growth through mergers and acquisitions constrained by anti-trust pressures, and with little on offer from universities and independent inventors, it often had no choice but to invest in internal R&D. The more relaxed antitrust environment in the 1980s, however, changed this status quo. Growth through acquisitions became a more viable alternative to internal research, and hence the need to invest in internal research was reduced.
A good question would be: although there was a decline in these kinds of research labs, did the market make up for that decline through the current strategy of investing in entrepreneurs and startups?
Maybe we should not hold on to the bureaucracy patents bring, especially considering that in some countries international patents are used as a recipe for copying innovations.
Tesla has open-sourced their patents because they are of the opinion that competition is won by attracting the best engineers.
> Elon Musk: "After Zip2, when I realized that receiving a patent really just meant that you bought a lottery ticket to a lawsuit, I avoided them whenever possible."
Could it be the other way around? That government-supported research in the universities had (at least to some degree) the effect of crowding-out the corporate labs? I mean, if the government is going to foot the bill, then why would I want to spend my own money on it?
Only on HN would someone argue that we've overrated the significance of the invention of GUIs because it didn't make the company that invented it enough money.
I worked for a large R&D lab a while back after completing my PhD, but the organization turned out to be completely directionless. Funding went to snake-oil salesmen who charmed executives with flashy proposals that they could never deliver on. I don't think there were any major successes during the time I was there, or since.
They also paid their Bay Area researchers about half the salary of FAANG senior engineers, so they really couldn't retain top talent. This is the downside of being a corporate R&D lab that's not funded by a near-monopoly.
The silver lining is that since doing R&D is necessary for progress and innovation, it will happen elsewhere. I foresee that the majority of R&D exercises will move to universities, and industry-sponsored research will be the norm rather than the anomaly. This is also fueled by the fact that graduate students' stipends are much, much lower. You did mention that in the Bay Area the researchers are paid half the salary of FAANG senior engineers. In most developed countries, the salary of university graduate students is probably a quarter (4x lower) of what company researchers make. Developing countries have it worst, with pay 10x lower than Bay Area researchers.
Anecdata: My team of 12 or so at Google had (I think) 4 PhDs, and what we did was the usual "turn one proto into another" Google work that barely required a CS undergrad degree, to say nothing of a PhD. My wife has a PhD, works as a software engineer, and also does pretty routine data plumbing work.
Under different leaders my lab flip-flopped from "blue sky, do what interests you" phases to "we need to focus on value" phases.
Neither produced anything revolutionary. But the blue skies phases did produce some useful work.
Research is a risk. Funding it is like gambling. Don't bet more than you are prepared to lose.
I think this is part of it. But also, there were just so few companies doing serious R&D for technology. Breakthroughs were new and unique. These days there are so many different companies constantly innovating that breakthroughs are the expectation/norm.
Many of these companies have beneficial monopolies/oligopolies for a while, but they know that if they don't innovate, others still will, even if there are no major competitors at the time. They know when timing and technology capability, hardware, software, design, or whatever else are at a point where they must progress.
Amazon is a great example of this with how much they reinvest in research and development. It would be hard for a company without engineering/product innovation to compete. Right now Amazon's market-leading position is beneficial; in some cases they overstep, but mostly it is beneficial. Google might be another. Both have flashes of abusing their position, but mostly they are still innovating and pushing forward. These companies also inspire small startups to make products, acting as extended research and development divisions from which the best outcomes are selected. While some acquisitions are bad, mostly the fact that startups can be purchased by one of the larger companies shows those companies are into R&D, and it leads to more of it.
ISPs and banks, for instance, are two areas where monopolies/oligopolies are stifling innovation and growth; with ISPs in particular, our networks suffer due to the grip they have on this needed utility.
Bell Labs also existed in a time that had more engineer/product/creative people in leadership roles with the ability to influence the power structures. So direction has changed quite a bit with that. R&D is very hard to justify to the value extractors, even if the value creation is clear or maybe isn't as obvious yet.
The mere fear of missing out on technology/timing and potential competitors is the only thing that drives innovation at all. Once monopolies/oligopolies start to use their power position for holding others back by stifling competition rather than them moving forward and using their power position as a booster for product/innovation value creation, that is when anti-trust is needed.
Microsoft for instance in the 90s started to abuse their position, so the anti-trust started. What the world got out of that was Apple resurgence (even got a loan from Microsoft at zero hour of $100 million to stay afloat), Google, Amazon, etc. It even turned out good for Microsoft as they are a much better company today, recognizing innovation over limiting competition is the way forward.
Without the mere fear of being broken up, Microsoft slowed.
With the anti-trust case, it slowed them down just enough to allow competition to get closer.
Anti-trust is the blue shell in Mario Kart.
The anti-trust blue shell is very much needed if the main player gets too far ahead and abuses their position; that game is no fun. Anti-trust is the rubber-band AI system that keeps the game competitive.
Additionally, monopoly/oligopoly are bad when the value creators (engineering, product, creative) lose power to the value extractors (business, finance, marketing) in a company.
It really isn't the fault of value extractors to extract the most value from the created value, but if there is no competition or balance between creation/extraction, that leads to stagnation on value creation and eventually more power plays that abuse market leading positions to stifle competitors. Ultimately we all lose when that state is entered.
Here's a great quick point by Steve Jobs about product stagnation and the managers/business side: how they can run amok if not controlled to allow value creation to continue, and the problems that arise when only the business/managers are in charge.
> It turns out the same thing can happen in technology companies that get monopolies, like IBM or Xerox. If you were a product person at IBM or Xerox, so you make a better copier or computer. So what? When you have monopoly market share, the company's not any more successful.
> So the people that can make the company more successful are sales and marketing people, and they end up running the companies. And the product people get driven out of the decision making forums, and the companies forget what it means to make great products. The product sensibility and the product genius that brought them to that monopolistic position gets rotted out by people running these companies that have no conception of a good product versus a bad product.
> They have no conception of the craftsmanship that's required to take a good idea and turn it into a good product. And they really have no feeling in their hearts, usually, about wanting to really help the customers.
Market leaders should always fear the anti-trust blue shell, when that fear is gone the game is not competitive and we all lose.
Of course, it's also possible to run a research lab poorly and do nothing of value, naturally I don't know anything about your experience.
It's certainly true for PARC, which owes its success in large part to the ARPA research community that preceded it. Many of the same people from that community came to PARC at the beginning, as government funding was drying up elsewhere.
On the one hand, they are kind of _forced_ to do research in one form or another to power the pipelines, so clearly corporate research is still alive there.
However, given the pathetic track record the industry seems to have, and an arguably complete lack of any real innovation (almost all of the drugs in most pipelines are just antibodies or their variants), some form of corporate research rot seems evident here as well.
One problem I often see on the pharma side is revealed when the CEO of the org is not a technical person. GSK recruited the CEO of L'Oreal as its new CEO. TBH, I can't for the life of me figure out how someone who sold lipsticks can make decisions on which preclinical trial has the highest chance of success in a human being. If the CEO of a company is not versed in the fundamental technology the company works on, can the company actually be successful?
L’Oreal also does research in things like methods to regrow human skin/hair, which aren’t on the same level as cancer treatments, but they do involve clinical trials and medical research. Hell, even something like launching a new facial moisturizer requires testing akin to clinical trials.
Aside from that, Emma Walmsley wasn’t CEO at L’Oreal, and she left there in 2010 to join GSK, where she worked for 7 years before being promoted to CEO in 2017. It’s not as if she went from being some business bean counter straight into being the head of GSK; she had 7 years to build GSK domain knowledge before taking the helm.
The point still stands though. Even most of my PhD friends often lack a fundamental understanding of how biology works, and I find it hard to believe an MBA can ever catch up no matter how much training they get. I'd argue that in general it also shows - biotechs run by scientists seem to do the actual path-breaking research, just as in technology.
In the end, I personally believe that the CEO needs to know enough of the underlying technology their company is working on to smell bullshit. Otherwise their execs have a very high likelihood of taking advantage of them. I've seen it happen on a smaller scale repeatedly in my lab: if our professors are not well versed in one field of science, the postdocs and students will take advantage of it in every corner.
This is especially true in biology. It doesn't matter how good your nanoparticle drug is; if you don't know that there are fundamental problems with immune recognition, half-life, biodistribution, and non-specific binding, you would not know not to invest further. This evaluation is not something a CEO should outsource to another C-level exec.
Real world experience trumps a degree in almost all cases. It certainly helps, but you don’t need a PhD to be a scientist, and you definitely don’t need one to lead scientists.
Since the market size/fit/price sensitivity is extremely predictable and there are so few customers to negotiate with (insurance companies, nationalized healthcare, medical purchasing groups, etc.), this is far more efficient for pharmaceutical companies than having a broad and risky drug discovery pipeline. Instead they acquire early and mid stage biotech companies with promising pre-clinical trial results or phase 1/2 successes - in essence externalizing the risk of early R&D even further to academia, venture capital, private equity, and retail investors. With the increasing availability of easy capital of the last few decades, this has become even more of a win-win for investors and the pharma giants recently.
Merck's CEO is a lawyer.
> they are kind of _forced_ to do research in one form or another
Modern pharma companies spend more on legal and marketing than they do on R&D. Valeant and Michael Pearson are emblematic of that horrifying trend.
MBA types and McKinsey alums have very different priorities than most hackers/researchers/builders, and it is their priorities that are driving investment decisions these days. That really needs to change.
It attacks a study that no one finds credible, which labeled a company's entire SG&A as "marketing" and found that "marketing" so measured was 10x R&D. It also predates the BBC link.
So, what you meant to say is "calling SG&A marketing is not really useful," which is true. But that's not what the linked BBC article did at all.
I'd say a CEO's job is to hire someone who is qualified to make those decisions, and to make sure there's enough money to keep the lights on so they can do that job.
In that respect, running a company like L'Oreal isn't so irrelevant.
Not saying your broader point is wrong though. Someone needs to have a vision for the future, and I'm not sure you can really have a strong vision without having been through the trenches yourself.
The fundamental technology of modern pharma is the same as that of a cosmetics company - marketing. To convince people that this lipstick or anti-cholesterol drug matches their lifestyle best, or even more, to sell people on the lifestyle of that lipstick, that drug, etc.
> ... firms in the life sciences such as Pharmacia, Lilly, Bristol Myers Squibb, Pfizer, and Amgen significantly increased publications. In the case of Pfizer and Amgen in the 2000s, this increase in publishing kept up with changes in R&D expenditures. One key feature of the pharmaceutical industry during this time period was the strong merger activity. However, comparisons with other sectors that also experienced strong merger activity suggests that the publishing behavior of firms in the life sciences was not simply an artifact of merger activity.
 https://static1.squarespace.com/static/593d9b08be65945a2e878... Page 29
That's why I put "product" in quotes. The CEO's concern is the business. Domain experts handle the tactical decisions.
a CEO cannot just outsource it to some other experts
I'm sorry but that is precisely what happens in big business all over the world. Call it "expert guidance" if you want, but if you really think the CEO is solely responsible for making domain decisions, as well as all the business development ones, you're misinformed. They will have views, but those will almost certainly come from the business end, not the domain end.
Apart from anything else, domain knowledge can change so fast, a CEO can't be expected to stay at the bleeding edge and manage the business.
Further reading: https://duckduckgo.com/?q=ceos+dont+matter&t=h_&ia=web
We know how to search, thank you very much. If you want to help us, you can link a specific book/article, and call that further reading.
I vaguely recall reading something on HN a while back, but also didn’t want to get accused here of linking to a biased source as I’m not real familiar with the leanings or any of the ones I found in that search.
Bit lazy, but had my reasons. Will try to do better next time.
I'd love for a billionaire to offer 18-25 year-olds $25k for a summer to explore new projects.
You could fund 1000 kids with promising projects for $25 million.
If the next great innovation will start as a toy, we need to encourage people to make more toys.
- There are three Allen Institutes (Brain Science, Cell Science, and AI), funded by Microsoft's Paul Allen in Seattle.
- The Gates Foundation (also funded by Microsoft, I guess) funds research, though mostly at existing institutes.
- The Howard Hughes Medical Institute funds a bunch of research at its Janelia Farm campus, plus provides lavish support to investigators at universities.
- Eli and Edythe Broad put up about $700M to endow the Broad Institute at Harvard/MIT.
- The Michael J Fox foundation funds a lot of Parkinson's Disease research.
- Chan Zuckerberg Initiative
- Simons Foundation
- Gordon and Betty Moore Foundation
- Sloan Foundation
You can even put companies like Numenta on this list, which is pretty much blue sky research, even if nominally for profit
I was actually just looking at a job posting at the Simons Foundation, which makes its omission that much more embarrassing. The Schwartz Foundation also paid for some stuff (and a lot of pizza) in grad school, so I should give them a nod! CZI and Simons/Flatiron have their own buildings. I think GBM and Sloan are more grant-making organizations.
There's also DE Shaw Research, which is particularly interesting since the billionaire in question works there himself.
More of a mashup of the MacArthur and Thiel Fellowships.
Of course there are lots of problems with academic culture, like the overemphasis on maximizing citation count. But the setup you describe will also require some form of simple metric to track progress, to ensure that the $25k isn't going down the drain.
Proper staff scientist jobs would help break out of this mould and might even be more cost-effective by reducing churn in the lab and providing people with more guidance.
Maybe. I don't think MacArthur tracks the spending of their Fellowship (a much larger amount).
==According to the foundation's website, "the fellowship is not a reward for past accomplishment, but rather an investment in a person's originality, insight, and potential". The current prize is $625,000 paid over five years in quarterly installments. This figure was increased from $500,000 in 2013 with the release of a review of the MacArthur Fellows Program. Since 1981, 942 people have been named MacArthur Fellows, ranging in age from 18 to 82. The award has been called "one of the most significant awards that is truly 'no strings attached'".==
This sounds like a form of survivorship bias. If you knew which research would pan out, it wouldn't be research. You need to also include the cost of research done by other companies that led to nothing or something less impactful.
Why? Time horizon.
Companies will not fund research that has a more-than-20-year expected time to product. Usually, they won't fund things that will take more than 10 years to go from R&D to product. That's because of how investment works: think about startups; what LP wants to put money in a fund for more than 20 years?
On the other hand, academic labs, funded by governments, often do work that pays off more than 20 years later. Think about Watson and Crick - their work on DNA in the 40s and 50s led to antibody drugs in the 1980s that are now being used widely today.
Bell Labs was the exception that tried to do long-term, basic research work, and its failure proves that corporate funding will eventually dry up for any long-term research. There's just no business case.
The US has the biggest tech and biotech economy in the world because the US government (that is, US citizens and US society) was smart to fund longterm basic research at the highest level in the world, building universities that attract some of the best talent in the world. Corporate research labs do more short-term work.
Why did it fade, though? AT&T was able to fund Bell Labs when it still had a monopoly over the telephone market. The article claims that antitrust laws were relaxed in the 80s, but AT&T was broken up in 1982 (https://en.wikipedia.org/wiki/Breakup_of_the_Bell_System). This break-up led to increased competition and innovation in telecom, but it did come at the cost of Bell Labs losing funding until eventually being spun off into Lucent.
Peter Thiel argued in the book Zero to One that only monopoly (or pseudo monopoly) businesses can afford to fund surplus salaries and corporate research labs that cost billions of dollars with potentially little or no gain for many years. I would argue that if a corporation exists in a competitive industry, their margins are driven too low to fund basic research.
Also note: even in the U.S. corporations fund a good chunk of basic research in academia: https://www.sciencemag.org/news/2017/03/data-check-us-govern...
The reason it appears rare is because funding research on the assumption it might be useful in more than 20 years from now is extremely wasteful. Companies don't work so far ahead because the risk of just going down a dead-end for half a lifetime is very, very high. In academia that doesn't matter because people are rewarded merely for researching novel things, in the real world people are rewarded for doing things that are useful.
It matters. For every DNA you can cite, others can cite dead-end branches that, despite decades of research, have gone nowhere and probably never will. In CS, how many programmers are using Haskell every day? It's been in development for 35 years yet virtually nobody uses it - the new languages that gain traction (Rust, Swift, Go, Kotlin, etc.) invariably come from corporate R&D labs, and outside of a few bits of useful syntax borrow little from academic research languages. Even Haskellers have now admitted that laziness was a dead end; new FP langs like Idris don't have it. Practically the entire field of PL research was swallowed up by FP and continues to be dominated by that paradigm (e.g. dependent types), despite the vast majority of PL users being uninterested in them.
Computational epidemiology. It's been researched for >20 years yet the models are always wrong. There are private sector epi models, but not surprisingly little work is done on them because there's obviously a missing piece, and developing huge and insanely complex simulations (e.g. the 15,000 LOC monster Ferguson produced) is obviously a dead end.
String theory. How much time has been sunk into that? Not a single testable prediction.
And in biology? You know, one biotech firm found 9 in 10 papers don't replicate. Papers professional labs can't replicate are not "useful in 20 years"; that's "not useful today and never". People were selectively breeding for many centuries. Agritech firms would have eventually figured out the structure of DNA if Watson and Crick hadn't.
Meanwhile we tend to take for granted all the long term R&D projects the private sector does because it's much better at finding useful outcomes quicker. It doesn't need to wait 20 years to find products out of the research it does, and that's good!
I couldn't disagree with this more. I'm biased because I really like PL research, but when I look at modern languages like Rust, Haskell's shadow is plain to see. ADTs, immutability and parametric polymorphism for instance.
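To make that concrete, here's a minimal Rust sketch (my own illustration, not from the thread) of the features usually credited to the ML/Haskell lineage: an algebraic data type with parametric polymorphism, exhaustive pattern matching, and immutable-by-default bindings.

    // A generic algebraic data type (a sum type): a list is either Empty
    // or a Node holding a value of type T plus the rest of the list.
    enum List<T> {
        Empty,
        Node(T, Box<List<T>>),
    }

    // Exhaustive pattern matching: the compiler checks every variant is handled.
    fn length<T>(list: &List<T>) -> usize {
        match list {
            List::Empty => 0,
            List::Node(_, rest) => 1 + length(rest),
        }
    }

    fn main() {
        // Bindings are immutable by default; mutation requires an explicit `mut`.
        let xs = List::Node(1, Box::new(List::Node(2, Box::new(List::Empty))));
        println!("length = {}", length(&xs)); // prints "length = 2"
    }

Whether those ideas "came from" academic PL research or were independently rediscovered is exactly what the reply below disputes.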
These days I think there's a lot of wishful thinking along these lines, as if nobody would have noticed without Haskell that re-using lots of global variables leads to frequent bugs. C++ got the const keyword in 1985, the same year Haskell was born. Templates were proposed in 1986. And would nobody have developed the notion of first-class functions without academic PL research? Given that even C has the notion of function pointers, it's hard to argue that.
Rust is hardly related to Haskell. If there's a shadow, it's a very small one. Rust's primary research idea is adding linear types to an imperative type system. There's no laziness, it's not pure, and the syntax is obviously C-based, not ML/Haskell-based. The similarities to C++ are much stronger than the similarities to Haskell.
When I look at the huge quantity of taxpayer money sunk into this line of programming languages, and how much impact it's had, I can't really support it. Academia/Haskell supporters like to lay claim to ideas and argue they "came" from academic PL research, but when you look into the histories and timelines that's just clearly not true. The ideas were either already in development a long time ago, or they were trivial and easily thought of.
Meanwhile, like I said, laziness is now a dead end. I remember one of my CS lecturers who worked on Haskell-related DT research when I was an undergrad. He sang the praises of laziness and how much better it made everything. They don't think that anymore, and new son-of-Haskell langs don't have it. That entire line of PL theory was born, lived, and died entirely within the taxpayer-funded public sector.
I’m surprised you are the only one mentioning this. You change the rules of the game, you change the game.
Tensions like these might very well be the reason corporate research has declined.
R&D spending has been rising globally for over 70 years with no major declines. Globally, $1.7 trillion USD a year is currently spent on R&D, roughly 2% of global GDP. There is more research going on right now than at any point in history, at any scale.
R&D, similar to startups, is full of broken dreams, survivor bias, etc.: those complaining their source of funding dried up, that they never got lucky, and so on.
There is no magic recipe for success in R&D — and anyone that tells you they are able to outperform the market at scale as it relates to R&D outcomes is lying.
This is complete nonsense. This logic applies well to stock and bond valuation, where you have armies of traders optimizing from public information. It's a pretty frictionless market.
It's like telling a parent their kids can't outperform the market at scale in science. Of course they can. They just can't outperform the market at scale in math, athletics, leadership, foreign languages, science, and everything else all at the same time.
In virtually all other domains, you have better organizations and worse organizations. I've been in organizations that do R&D brilliantly, and ones that do it horribly. Did the ones that do it horribly die because of market forces? No. They did other things better.
An organization has many pieces: R&D, marketing, branding, advertising, legal, engineering, sales, strategy, finance, logistics, etc. Most organizations I've worked at were really good at maybe one or two of those, in most areas, followed industry best-practices, and were pretty bad in a few.
I can promise you that MANY people can outperform the market at scale in relationship to R&D outcomes. We just can't outperform the market at scale in ALL of those areas at the same time. Organizations and individuals have areas of focus.
Honestly, I love research and would be happy to be wrong, but all I hear you saying is that someone did it, so it must be possible. That's called survivor bias; it's not a recipe, and it would not double the global output of R&D for the next 10-1000 years.
Excellence requires focus, dedication, thinking things through from first principles, having the right people in place, etc.
The closest I can offer to a recipe is to hire a CEO / President / co-founder early on who has a track record of having R&D successes in former positions, who has a great depth of knowledge, who thinks deeply, and have them focus the organization on R&D, and to do this before the culture is set.
Of course, that's not always a winning strategy. If you do that, that same person is unlikely to have that same depth in, for example, customer engagement, negotiations, or legal.
Most hard things don't have recipes ("What's the recipe for an effective fighter jet?"). If they did have simple recipes, they usually wouldn't be hard. But that doesn't make them impossible (we have a whole fleet of effective fighter jets).
Your question was never about doubling global output of R&D. That's not a point one can even argue meaningfully; there's no way to offer more than an opinion there.
Your question was about being "able to outperform the market at scale." It was nonsense. Plenty of people and organizations can and do outperform the market, consistently, over many decades. That's not a survivorship bias, any more than weightlifters beating the general population at lifting weights is survivorship bias, or that Stanford CS majors have stronger technical skills than the general market is survivorship bias. It's a counterexample. Survivorship bias would be there if these were one-offs (company or individual makes ONE breakthrough, at random).
I'm signing off this thread. This is dumb.
- Some great researchers are not very good at convincing.
- From a purely financial standpoint some research does not make sense considering the risk (more ground breaking and longer usually means more risk).
I'm not sure that the alternative is not much better though.
Reminds me of this.
This article mentions a lot of research labs shutting down in the 90s. The 90s was also when the current 30 year period of <8% nominal interest rates started. And other easy money policies.
Any company that invested heavily in the future would have been a loser versus people who worked on credit. It isn't surprising that none of the big corporations are investing in research. The investment framework levers have been set to 'short term' for a very long time now - it makes more sense to buy up innovative competitors. It isn't surprising that long-term investment in research vanished from the corporate world.
Low interest rates manifest as people starting Uber and Tesla rather than big companies finding budget for a research lab. Ford is competing with a company that doesn't feel a real need to have profit margins - that is a threat in the present. Big, cash burning machines with potential multi-billion dollar payoffs are where the credit goes and where the winners live - not boring people doing long term research in corporate labs.
It isn't like R&D is a losing proposition in this age - look at Apple's peak for example - but the resources are being directed to people who own assets or are shooting to control entire markets. Research labs aren't making companies winners.
* I'm sure there is somebody, but there won't be many.
This is better for a lot of reasons, obviously, it makes it easier to justify funding the projects. But it also ensures that the researchers truly understand the nature of the problem they're solving. Seeing actual translation queries on Google Translate helps direct the next piece of MT research towards what kind of things people really use the tech for. Building actual big data systems helps train the researchers in how to work with huge data sets and puts them on research paths they otherwise would have been unable to explore.
Basically it recognises that a hard division between research and practice doesn't really make sense when working at the cutting edge of technology. You want the constant feedback from real world practice to guide the theory.
Disclaimer: I work at FB, but my employment there does not change my opinion. I worked at Uber and my opinion of Uber's "research" was different ;)
The main issue with corporate research is that it's not easy for them to capture the output; it very quickly leaks into the public domain, despite patents. So why bother?
Eventually, though, once a company gets large enough, it needs to diversify in order for it to not be totally dependent on one revenue stream, lest it find that stream disrupted in the future. Apple used to be strictly a personal computer company; now it sells not only computers, but it sells smartphones, tablets, headphones, smartwatches, and other consumer electronics, as well as provides services for a fee. Google has a lot of activities, and Facebook has been branching out into VR through its Oculus acquisition.
Having a research lab could be helpful with exploring other areas that the company could branch out into.
I say that tongue-in-cheek, but only partly. There actually is a ton of research that goes into following things like cultural trends and human psychology to ensure maximum ad consumption. Advertising in the middle of a mobile game or on Netflix requires an entirely different school of thought than advertising on traditional cable TV or news websites.
A notable “innovation” (though many may not respect it very much) is the new paradigm of advertising stuff via Instagram influencers and the functionality built into the app to facilitate it. Someone had to research and design that, just like such companies now are probably trying to find the best way to advertise in VR/AR platforms or in rideshares.
To research new infrastructure technology to reduce expenses or simplify operations (making outages less likely, feature development easier, etc.)
To get better at targeting ads by investing in data science/machine learning/AI
To create new products through which to serve ads, reduce CAC, or build goodwill or diversify
To generate publicity and press for recruiting purposes, public perception, etc.
To acquire patents or to genuinely just try something new and figure out how to use it later
Even disregarding the product reasons, research often makes these companies look good to both a technical audience and sometimes to politicians who value "innovation".
They could stop working on their products but they wouldn't last long. The consumer tech world changes rapidly and is fiercely competitive.
When you don't have a monopoly, the investor mindset is that the company should be laser-focused on "core competencies" (buzzword, but important) and return excess capital to shareholders - who then provide it to other companies that will innovate in the field. Keep in mind, the universe of alternative investments go beyond the stock market/PE/VC.
Capital is tied to shareholder value. When you can't point to something creating value, there isn't a reason for capital to stay. For a company to maintain a research lab, it needs to be perceived as something other than a cost center. In contrast, Bell could sustain full-fledged R&D labs because, as the monopoly, its shareholders expected the labs themselves to create new avenues of profit.
But at the same time, we've seen a decline in corporate R+D since the start of the neoliberal era - the article mentions that this started at around Nixon's time. This is the period of time where Milton Friedman's ideas started to gain widespread acceptance:
>“there is one and only one social responsibility of business– to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game,”
Many organisations have taken the idea of "profits over everything", and interpreted it as "quarterly profits over everything". R+D labs don't result in quarterly profits. Much of the research ends up being profitable years down the track. So corporate R+D is killed off.
Thus, RISC was developed at IBM in the mid-'70s (the 801) and shelved until the '90s, as it would have outperformed the 360/370 series. Xerox PARC created the key components of modern personal computing, but Xerox management saw no reason to bring any of it to market.
I work in an “Applied Research” group now. While not a classic corporate research lab, we do get plenty of freedom to experiment with new methods.
Perhaps there are fewer fundamental improvements because we don’t have a fundamentally new medium? So currently most advances are interactive or building off fundamental advances.
Later in that decade, with the success of Facebook, investors turned away from these investments and were more interested in finding the next Facebook. Essentially the 2000s were a kind of lost decade in fundamental technology development. Most of what was built during the decade was the beneficiary of cheap computing enabled by 50 years and trillions of dollars in semiconductor investment. Now that we are hitting the edge of what’s possible, it’s time for big, long-term, hard tech investment.
It’s only been more recently that people like Elon Musk and, to a lesser extent, Larry Page at Alphabet have led the way by being willing to take big bets on technology development. In the last several years, we are finally starting to see some venture capital follow, with investors taking a long view and betting on a few “moonshots.” In the last couple of years there have been big investments in computing with companies like Cerebras doing wafer-scale computing, PsiQuantum and Rigetti in quantum computing, and various optical computing companies, to name a few. There has also been considerable investment in the AV space, which will need lower-power, cheaper solutions that you won’t be able to simply buy off the shelf and slap together the way almost all of the AV companies are doing today.
It's not unheard of for them to employ people whose primary role is grant writing, to try to get (for example) SBIR funding.
In my country, if you land a job in such a lab, you know you can bullshit around and play with some stuff. But nothing will ever come out of it. For profit, the corporation will acquire some product or startup.
If you really want to innovate freely, found your own startup. With the additional benefit of reaping the money yourself.
I've noticed a profound shift in the past decade away from corporate research labs such as IBM Labs and HP Labs where they worked on medium-term projects developing research prototypes that were sometimes passed onto product teams. In their place, companies such as Google have pioneered a different model of research (https://research.google/pubs/pub38149/), where PhDs are hired as software engineers who solve research problems and write production code, focusing more on shipping production code rather than writing papers (although there have been many great papers that have come out of Google, most notably the MapReduce and Spanner papers). I'm noticing that the vast majority of my PhD-holding friends in computer science are hired as software engineers rather than as researchers. The ones who started out at places like IBM Labs or HP Labs with the titles "Members of Technical Staff" would often end up taking positions at other companies as software engineers.
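To make the MapReduce reference concrete, here is a minimal single-process sketch of that programming model in Python - a toy word count, purely illustrative, not Google's distributed implementation, and the function names are mine:

    from collections import defaultdict

    # Toy sketch of the MapReduce programming model: a map step emits
    # (key, value) pairs, a shuffle groups values by key, and a reduce
    # step aggregates each group. Single-process only.

    def map_phase(documents):
        """Emit (word, 1) for every word in every document."""
        for doc in documents:
            for word in doc.split():
                yield word, 1

    def shuffle(pairs):
        """Group intermediate values by key, as the framework would."""
        grouped = defaultdict(list)
        for key, value in pairs:
            grouped[key].append(value)
        return grouped

    def reduce_phase(grouped):
        """Sum the counts for each word."""
        return {word: sum(counts) for word, counts in grouped.items()}

    docs = ["research in production", "production research at scale"]
    print(reduce_phase(shuffle(map_phase(docs))))
    # {'research': 2, 'in': 1, 'production': 2, 'at': 1, 'scale': 1}

The real system distributes the map and reduce phases across thousands of machines and handles partitioning and fault tolerance; the point of the sketch is just how small the programmer-facing model is, which is part of why that research translated so directly into production code.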
This development may be fine for researchers who want to work on production code and who don't mind de-emphasizing publishing in exchange for product development. However, what about researchers who want to focus on solving research problems that cannot immediately be applied to products? I'm finding that there's decreasing room for these types of researchers in this economy. More companies have a short-term mindset these days, partly due to changes in management style (e.g., the rise of Carly Fiorina-style CEOs), but also due to the fact that the computer industry has shown repeatedly that large 800-pound gorillas can be taken down by smaller companies. "Why invest in long-term research and long-term planning if there is no guarantee of a long-term future" is the logic of many companies, big and small. The alternative to industry is academia, but there are only so many professorships available, and for those professors, there is only so much NSF grant money to go around, which is highly competitive to earn. Professors at research universities spend a lot of time fundraising; it costs a lot of money to build and maintain a lab with enough resources to perform the research and publish the results necessary to gain tenure.
Short of a major cultural change where companies are encouraged to invest in research at the levels that Xerox and AT&T did back in the 1970s, and where we see an expansion of academia similar to the post-WWII boom (which is unlikely in the United States), the future I see for those wanting to work on problems that don't lead to immediate productization is independent research done in a researcher's spare time when not engaged in "money-making" activity. After all, Einstein did brilliant work while he was a patent examiner, and Yitang Zhang did amazing research while employed as an untenured lecturer. I would advise today's computer science PhD students of this current reality of research employment. If one wants to work on self-directed research projects, then that person must be willing to have a self-funded research career; all researchers need to be concerned with funding, whether that funding comes in the form of a direct salary, a grant, or indirectly through the salary of an unrelated job.
The challenge I'm talking about is papers, specifically the game of publishing. I feel like PhD students spend more time optimizing for publications than doing actual research. There's an obsession with the number of papers, so everyone is trying to eke out a paper from every semi-failed experiment, overfitted model, or unfinished prototype. No one wants to throw away effort on something that basically failed, so they're trying to find the perfect narrative, frame their results so they look good, or slice and combine them into something submittable. Every deadline is worth submitting to, the number of selective conferences is growing, and there's an incentive to "get on" another paper as a co-author (which means building collaborations, helping out, editing).
For each paper, there's months spent writing, giving and receiving feedback, making figures and formatting. Each submission usually requires some change to the format and language, so upon rejection, the paper is edited and targeted towards the next conference. Then there's the submission game of proposing reviewers, choosing the right track or subcommittee, interpreting reviews, writing multi-page rebuttals, editing and getting feedback from co-authors about the rebuttals, and in the best case a month later, preparing the camera-ready version, the back-and-forth with the publisher, and finally preparing and practicing the conference presentation.
So before great research can truly come out of universities, I think publications need to be deemphasized. This could be a simple norm like judging researchers on their best 3 papers for faculty hiring and research awards. In turn, that would reduce paper submissions, increase paper acceptance rates, and finally -- leave more time for actual research at universities.
Companies like Intel and Nvidia and the pharma companies do that kind of research because you can reasonably guarantee that someone will want a faster chip or a better cure for a disease.
But beyond that, what does the market want next? If you can answer that, you can operate a lab. If you can't, then you are just blindly stumbling around. University research is probably also a heck of a lot cheaper as people will work for a lot less if they get a degree at the end of it.
I think they would be better off creating a prize system for problems they want to solve (the return on investment for XPrize is amazing) and letting the university researchers figure it out.
The Manhattan Project, Space Race, etc. had program managers telling scientists what problems to solve and had other scientists building the architecture, identifying gaps, and running parallel experiments. It wasn't about the individual scientist, it was about the program - so we had Feynman running computer simulations.
There was an end goal and people took less interesting, but necessary, jobs to move the project forward. Also, results were expected - not papers, but actual results to problems.
Drucker wrote about this in (I think) the '90s. It's actually a much more sensible way to do this work.
EDIT: I'm looking for a reference for this. I saw an article that says it was funded with 1% of company-wide revenues. This is undoubtedly true, but a graduate student I know who knows about such things says that a substantial fraction, or maybe even all of it, was reimbursed.
I left shortly afterwards.
You're partially right if the argument is "basic research" = less applied fields / "purer" math, but even there Microsoft research has been a significant player in the TCS and optimization community.
A lot of the current research done at universities is also funded by these organizations, either via grants (monetary, or equipment like from Nvidia) to research labs, or scholarships and financial support to students. A large part of AI advancement in the past few years has been precisely because of the amount of effort these firms have put in (along with a lot of others; they are just the most public).
Alas, it's likely that nobody in any position to do anything about this will bother.