
The death of corporate research labs - fanf2
https://blog.dshr.org/2020/05/the-death-of-corporate-research-labs.html
======
legitster
Hold up. Lack of anti-trust enforcement is blamed (among other things) for the
end of corporate R&D, but monopoly breakup is exactly what killed Bell Labs! 9
smaller companies weren't going to fund their own lab, and the only reason the
lab existed was to find new markets to explore. In fact, nearly all of the
examples of successful R&D labs came from corporations that so dominated their
industry that they put money into finding new investments: Xerox, Eastman
Kodak. Google might meet this definition.

I think we also overrate the significance of the corporate labs. There are not
a lot of successful examples where the host company actually profited from the
invention. There are just a lot that bungled or prematurely killed their
inventions (like when AT&T almost invented the internet), or let them sneak
out accidentally (like the Xerox Alto).

~~~
badrabbit
It wasn't being too large that forced the breakup, but their anti-competitive
practices and position. The regional Bells were still big enough to run their
own research labs.

Imagine if Alphabet were broken up: Google Search could still afford to run a
research lab, as could YouTube.

The problem, I think, is how easy it is for large companies to acquire smaller
ones. It's how they expand or enter a market; they can't be bothered to
bootstrap a new org unit, so they just devour smaller, innovative, and
creative companies. Look at Google: they couldn't be creative and patient
enough to compete with YouTube, so they gulped it up. It's the big-corp M.O.

So why bother with R&D when you can just buy a smaller company that does the
R&D, tests the market, and builds a brand for you? My answer: you will suffer
from brain drain and reputation loss. When you buy a smaller company,
consumers assume that brand is now dead. You become a cemetery of dreams and
ideas. You become an IBM, HP, Xerox, or AT&T, and once the damage is done it
becomes nearly impossible to recover from. I like IBM as the best example:
they are doing a superb amount of innovation even today, but all their
initiatives lack traction or a competitive edge. They have a ton of smart
people working on brand new areas of tech like quantum computing, but their
reputation and overall culture have not been great, and they've been declining
consistently. Look at Yahoo, Yahoo!! They had legitimate means to compete with
Google toe-to-toe, but they relied too much on acquisitions, as did Verizon,
which recently acquired them for a meager $4B.

In the end, I blame all this on how publicly traded companies prioritize
quarterly profits over multi-year growth. Acquiring bumps up the stock value
for a while; spending billions to compete from scratch or develop a new
concept is risky, so the stock goes down.

~~~
enitihas
> Imagine if Alphabet were broken up: Google Search could still afford to run
> a research lab, as could YouTube.

It is far more likely that YouTube would go bankrupt, since running a free
video service where the videos are nearly infinite and cache hit rates much
lower is very costly. Couple that with all the other advantages of being with
Google, like access to talent and infra, and I don't see YouTube surviving
without Google. It's possible that the tab would be picked up by Facebook
video, which is an even worse and more closed platform.

I don't see a cheap way of solving the YouTube problem. (Yes, P2P exists, but
it has too many problems, like battery consumption on phones and NAT
traversal.)

~~~
cannabis_sam
That would be fantastic, though: we would have an influx of startups competing
in the video space, since they wouldn’t have to compete with a YouTube
artificially propped up by a completely unrelated business model.

~~~
enitihas
Yeah, startups aren't going to magically make things cheaper than YouTube. So
if any startups arise, they would most likely follow a paid model.

Now, that might be fantastic depending on where you come from. If you have
disposable income, it is a huge win to have a non-ad-supported video platform,
which might even be more privacy-protecting. If you are a poor kid using
YouTube to watch MIT OCW, Khan Academy, and innumerable other resources, well,
you are screwed.

~~~
cannabis_sam
If you are a poor kid, you upload your interpretation of a Bach piece and get
slammed by an unjust, extralegal pseudo-copyright mechanism that is, in
essence, an attack on human culture.

MIT and Khan Academy could easily host their content using BitTorrent and
reduce their bandwidth usage by 95% ( [https://nrkbeta.no/2008/03/02/thoughts-
on-bittorrent-distrib...](https://nrkbeta.no/2008/03/02/thoughts-on-
bittorrent-distribution-for-a-public-broadcaster/) )

~~~
nine_k
Why do you think that the dozen startups remaining in the space after a
possible YouTube demise wouldn't be burdened by the same policies? I think
the copyright lobby would try their best to tighten their control at a time
when a huge corp stopped throwing its weight against them.

What YouTube gives the poor kid is instant, free access to a huge audience.
Yes, it comes with an (admittedly not really high) risk of a false copyright
positive and a resulting block. I suppose the kid is smart enough to also keep
copies of the performance on other services, and safely stored locally.

~~~
cannabis_sam
Oh, they absolutely would! Every music streaming service is already beholden
to a tiny record industry cartel, even though there are a number of music
streaming services.

But it’s simple: video is more versatile, and by removing YouTube’s monolithic
dominance, you remove the single point of failure that copyright cartels have
been able to attack.

As an example, imagine a separate YouTube clone dedicated to education that
was actually willing to fight for fair use! Or a YouTube that didn’t
automatically demonetize you for swearing.

The internet already gives these kids (and the rest of us) both access and an
audience; we have just been stupid enough to lock large parts of our cultural
heritage into corporate silos, protected by “intellectual property” laws.
Competition wouldn’t solve this completely, but it would make it harder to
distort the market.

------
freyr
Bell Labs and Xerox PARC are the poster children for successful corporate R&D
labs, but I wonder if, to some extent, they were in the right place at the
right time? The advent of modern computing and digital telephony provided rich
soil for impactful research.

I worked for a large R&D lab a while back after completing my PhD, but the
organization turned out to be completely directionless. Funding went to snake-
oil salesmen who charmed executives with flashy proposals that they could
never deliver on. I don't think there were any major successes during the time
I was there, or since.

They also paid their Bay Area researchers about half the salary of FAANG
senior engineers, so they really couldn't retain top talent. This is the
downside of being a corporate R&D lab that's not funded by a near-monopoly.

~~~
teleforce
I think the largest corporate R&D lab at the moment must be Huawei's. The last
I heard from someone working there, they have more than 10K PhDs; even if only
half of them work on R&D, that is really massive. It also helps that the
founder is an ex-military R&D engineer.

The silver lining is that since R&D is necessary for progress and innovation,
it will happen elsewhere. I foresee that the majority of R&D will move to
universities, and that industry-sponsored research will be the norm rather
than the anomaly. This is also fueled by the fact that graduate students'
stipends are much, much lower. You mentioned that Bay Area researchers are
paid half what FAANG senior engineers make; in most developed countries, a
university graduate student's salary is probably a quarter (4x lower) of a
company researcher's, and developing countries have it worst of all, 10x lower
than the Bay Area researchers'.

~~~
jdm2212
Google has a ton of PhDs, too, but the vast majority don't do R&D. I'm
guessing the same is true at Huawei.

Anecdata: My team of 12 or so at Google had (I think) 4 PhDs, and what we did
was the usual "turn one proto into another" Google work that barely required a
CS undergrad degree, to say nothing of a PhD. My wife has a PhD, works as a
software engineer, and also does pretty routine data plumbing work.

------
ramraj07
I am curious what the author thinks about the same problem in pharma/biotech.

On the one hand, they are kind of _forced_ to do research in one form or
another to power the pipelines, so clearly corporate research is still alive
there.

However, given the pathetic track record the industry seems to have, and the
arguably complete lack of any real innovation (almost all of the drugs in most
pipelines are just antibodies or their variants), some form of corporate
research rot seems evident here as well.

One problem I often see on the pharma side is revealed when the CEO of the org
is not a technical person. GSK recruited the CEO of L'Oreal as its new CEO.
TBH, I can't for the life of me figure out how someone who sold lipsticks can
make decisions on which preclinical trial has the highest chance of success in
a human being. If the CEO of a company is not versed in the fundamental
technology it makes, can the company actually be successful?

~~~
txcwpalpha
I think you’re really not giving L’Oreal the credit it deserves. It’s much
more than just a lipstick vendor. Sanofi, Europe’s largest pharma company, has
its roots in L’Oreal (L’Oreal previously was the parent company of Synthelabo,
which used to be the #3 pharma company in Europe until it merged with Sanofi).

L’Oreal also does research in things like methods to regrow human skin/hair,
which aren’t on the same level as cancer treatments, but they do involve
clinical trials and medical research. Hell, even something like launching a
new facial moisturizer requires testing akin to clinical trials.

Aside from that, Emma Walmsley wasn’t CEO at L’Oreal, and she left there in
2010 to join GSK, where she worked for 7 years before being promoted to CEO in
2017. It’s not as if she went from being some business bean counter straight
into being the head of GSK; she had 7 years to build GSK domain knowledge
before taking the helm.

~~~
ramraj07
Thanks for the context, but I suppose I just couldn't pass up the poetic
phrasing of "selling lipstick", a la Steve Jobs' "selling sugar water".

The point still stands, though. Even most of my PhD friends often lack a
fundamental understanding of how biology works, and I find it hard to believe
an MBA can ever catch up, no matter how much training they get. I'd argue that
in general it also shows: biotechs run by scientists seem to do the actual
path-breaking research, just as in tech.

~~~
t_serpico
You don't need to have a deep understanding of biology to have a decent grasp
of drug development. You're operating at a higher level of abstraction. You
could argue such a CEO wouldn't be able to independently gauge the value of a
new innovative medicine or approach, but that's where the CSO and others come
into play.

~~~
ramraj07
The issue is, what if the CSO is actually not competent? Can you tell if
they're bullshitting or not? What if they are competent but giving incorrect
advice because it's their pet project and invalidating it now would make them
look bad? How can you tell?

In the end, I personally believe the CEO needs to know enough of the
underlying technology their company is working on to smell bullshit.
Otherwise, their execs have a very high likelihood of taking advantage of
them. I've seen it happen on a smaller scale repeatedly in my lab: if our
professors are not well versed in one field of science, the postdocs and
students will take advantage of it at every corner.

This is especially true in biology. It doesn't matter how good your
nanoparticle drug is; if you don't know that there are fundamental problems
with immune recognition, half-life, biodistribution, and non-specific binding,
you would not know not to invest further. This evaluation is not something a
CEO should outsource to another C-level exec.

------
mkl
Why aren't there blue sky research labs funded by curious billionaires? Xerox
PARC's key computing inventions cost a total of US$48 million in today's
dollars [1]. There may not be so much big cheap low-hanging fruit now, but
there is still plenty within reach. Bezos could be doing this instead of or as
well as Blue Origin (which he's been funding at US$1 billion/year [2]).

[1] [https://www.forbes.com/sites/chunkamui/2012/08/01/the-
lesson...](https://www.forbes.com/sites/chunkamui/2012/08/01/the-lesson-that-
market-leaders-are-failing-to-learn-from-xerox-parc/#2a065e868296)

[2]
[https://en.wikipedia.org/wiki/Blue_Origin](https://en.wikipedia.org/wiki/Blue_Origin)

~~~
take_a_breath
Think of what the Vision Fund could have achieved with this mindset.

I’d love for a billionaire to offer 18-25 year-olds $25k for a summer to
explore new projects.

You could fund 1000 kids with promising projects for $25 million.

If the next great innovation will start as a toy, we need to encourage people
to make more toys.

~~~
captain_price7
Isn't this basically what universities do with graduate students? They tend to
be a bit older and the pay is maybe a little less, but universities do allow a
pretty high level of freedom.

Of course, there are lots of problems with academic culture, like the
overemphasis on maximizing citation count. But the setup you describe would
also require some form of simple metric to track progress, to ensure that the
$25k isn't going down the drain.

~~~
mattkrause
A lot of academic research is locked into a particular model where every
project has to be doable by 1-3 main people, most of whom are "trainees", and
produce a (positive) result in 1-3 years. It's not totally impossible to do
other things, but the career incentives push pretty hard in this direction:
trainees need 1st author papers, it's harder to fund postdocs after a few
years, etc.

Proper staff scientist jobs would help break out of this mould and might even
be more cost-effective by reducing churn in the lab and providing people with
more guidance.

------
mhneu
Hold on. Corporate research labs are _fundamentally different_ from academic
research labs.

Why? Time horizon.

Companies will not fund research that has a more-than-20-year expected time to
product. Usually, they won't fund things that will take more than 10 years to
go from R&D to product. That's because of how investment works: think about
startups; what LP wants to put money into a fund for more than 20 years?

On the other hand, academic labs, funded by governments, often do work that
pays off more than 20 years later. Think about Watson and Crick: their work
on DNA in the 40s and 50s led to the antibody drugs of the 1980s that are
widely used today.

Bell Labs was the exception that tried to do long-term, basic research work,
and its failure proves that corporate funding will eventually dry up for any
long-term research. There's just no business case.

The US has the biggest tech and biotech economy in the world because the _US
government_ (that is, US citizens and US society) was smart enough to fund
long-term basic research at the highest level in the world, building
universities that attract some of the best talent in the world. Corporate
research labs do more short-term work.

~~~
thu2111
Corporate labs do fund very long-term research. Google was interested in and
working on AI more or less from its early years (anyone remember Google
Sets?), and it's still funding fundamental AI research more than 20 years
later. It has now been funding self-driving cars for more than 10 years and
still does.

The reason it appears rare is that funding research on the assumption it
might be useful more than 20 years from now is extremely wasteful. Companies
don't work so far ahead because the risk of just going down a dead end for
half a lifetime is very, very high. In academia that doesn't matter, because
people are rewarded merely for researching novel things; in the real world,
people are rewarded for doing things that are useful.

It matters. For every DNA you can cite, others can cite dead-end branches that
despite decades of research have gone nowhere and probably never will. In CS,
how many programmers are using Haskell every day? It's been in development for
35 years, yet virtually nobody uses it; the new languages that gain traction
(Rust, Swift, Go, Kotlin, etc.) invariably come from corporate R&D labs, and
outside of a few bits of useful syntax they borrow little from academic
research languages. Even Haskellers have now admitted that laziness was a dead
end; new FP langs like Idris don't have it. Practically the entire field of PL
research was swallowed up by FP and continues to be dominated by that paradigm
(e.g. dependent types), despite the vast majority of PL users being
uninterested in them.

Computational epidemiology. It's been researched for >20 years, yet the models
are always wrong. There are private-sector epi models, but not surprisingly
little work is done on them, because there's obviously a missing piece, and
developing huge and insanely complex simulations (e.g. the 15,000-LOC monster
Ferguson produced) is obviously a dead end.

String theory. How much time has been sunk into that? Not a single testable
prediction.

And in biology: one biotech firm found that 9 in 10 papers don't replicate.
Papers that professional labs can't replicate are not "useful in 20 years";
they're "not useful today and never will be". People were selectively breeding
for many centuries; agritech firms would have eventually figured out the
structure of DNA if Watson and Crick hadn't.

Meanwhile we tend to take for granted all the long term R&D projects the
private sector does because it's much better at finding useful outcomes
quicker. It doesn't _need_ to wait 20 years to find products out of the
research it does, and that's good!

~~~
anchpop
> Practically the entire field of PL research was swallowed up by FP and
> continues to be dominated by that paradigm (e.g. dependent types), despite
> the vast majority of PL users being uninterested in them.

I couldn't disagree with this more. I'm biased because I really like PL
research, but when I look at modern languages like Rust, Haskell's shadow is
plain to see: ADTs, immutability, and parametric polymorphism, for instance.
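To make that lineage concrete, here is a minimal Rust sketch (illustrative, not from the thread) of two of those features: a generic algebraic data type, matched exhaustively, in the style the ML/Haskell family popularized:

```rust
// A generic (parametrically polymorphic) algebraic data type: a sum type
// with two variants, directly analogous to a Haskell `data` declaration.
enum Shape<T> {
    Circle(T),
    Rect(T, T),
}

// Exhaustive pattern matching: the compiler rejects a `match` that misses
// a variant, the same guarantee Haskell's `case` gives.
fn area(s: &Shape<f64>) -> f64 {
    match s {
        Shape::Circle(r) => std::f64::consts::PI * r * r,
        Shape::Rect(w, h) => w * h,
    }
}

fn main() {
    let shapes = vec![Shape::Circle(1.0), Shape::Rect(2.0, 3.0)];
    let total: f64 = shapes.iter().map(area).sum();
    println!("total area: {:.2}", total);
}
```

Immutability shows up here too: every `let` binding above is immutable by default.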

~~~
thu2111
I used to agree.

These days I think there's a lot of wishful thinking along these lines, as if
nobody would have noticed without Haskell that reusing lots of global
variables leads to frequent bugs. C++ got the const keyword in 1985, years
before Haskell 1.0 appeared, and templates were proposed in 1986. And would
nobody have developed the notion of first-class functions without academic PL
research? Given that even C has function pointers, that's hard to argue.

Rust is hardly related to Haskell. If there's a shadow, it's a very small one.
Rust's primary research idea is adding linear types to an imperative type
system. There's no laziness, it's not pure, and the syntax is obviously
C-based, not ML/Haskell-based. The similarities to C++ are much stronger than
the similarities to Haskell.

When I look at the huge quantity of taxpayer money sunk into this line of
programming languages, and how little impact it's had, I can't really support
it. Academia/Haskell supporters like to lay claim to ideas and argue they
"came" from academic PL research, but when you look into the histories and
timelines that's just clearly not true. The ideas were either already in
development long before, or they were trivial and easily thought of.

Meanwhile, like I said, laziness is now a dead end. I remember one of my CS
lecturers who worked on Haskell-related DT research when I was an undergrad.
He sang the praises of laziness and how much better it made everything. They
don't think that anymore, and the new son-of-Haskell langs don't have it. That
entire line of PL theory was born, lived, and died entirely within the
taxpayer-funded public sector.

------
deng
I know it's not a popular topic around here, but I'm confused that corporate
taxes are not mentioned at all. Wasn't one major incentive for creating
research labs that such investments in R&D were tax deductible, and the
drastic lowering of these taxes since the 80s made that point moot?

~~~
gahikr
[https://slate.com/business/2012/07/xerox-parc-and-bell-
labs-...](https://slate.com/business/2012/07/xerox-parc-and-bell-labs-brought-
to-you-by-high-taxes.html)

I’m surprised you are the only one mentioning this. You change the rules of
the game, you change the game.

------
lamchob
I actually work in a corporate research lab, with 1000+ researchers working on
all kinds of things. One thing that proves tricky is the "mission" these kinds
of labs have. There is always a tension between direct work for business units
on next year's product and research into novel ideas and methods. The former
is what keeps the lab afloat financially. The latter is what is hoped to keep
the company as a whole afloat in the future. But the latter includes investing
money in projects that will not pan out and just burn money. Finding this
balance is hard.

Tensions like these might very well be the reason corporate research has
declined.

------
billme
R&D is not dead.

R&D has been rising globally for over 70 years with no major declines.
Globally, $1.7 trillion USD a year is currently spent on R&D, roughly 2% of
global GDP. There is more research going on right now than at any point in
history, at any scale.

R&D, similar to startups, is full of broken dreams, survivor bias, etc. —
those complaining their source of funding dried up, that they never got lucky,
etc.

There is no magic recipe for success in R&D — and anyone that tells you they
are able to outperform the market at scale as it relates to R&D outcomes is
lying.

~~~
wegs
> There is no magic recipe for success in R&D — and anyone that tells you they
> are able to outperform the market at scale as it relates to R&D outcomes is
> lying.

This is complete nonsense. This logic applies well to stock and bond
valuation, where you have armies of traders optimizing from public
information; it's a pretty frictionless market.

It's like telling a parent their kids can't outperform the market at scale in
science. Of course they can. They just can't outperform the market at scale in
math, athletics, leadership, foreign languages, science, and everything else
all at the same time.

In virtually all other domains, you have better organizations and worse
organizations. I've been in organizations that do R&D brilliantly, and ones
that do it horribly. Did the ones that do it horribly die because of market
forces? No. They did other things better.

An organization has many pieces: R&D, marketing, branding, advertising, legal,
engineering, sales, strategy, finance, logistics, etc. Most organizations I've
worked at were really good at maybe one or two of those, in most areas,
followed industry best-practices, and were pretty bad in a few.

I can promise you that MANY people can outperform the market at scale in
relationship to R&D outcomes. We just can't outperform the market at scale in
ALL of those areas at the same time. Organizations and individuals have areas
of focus.

~~~
billme
What is the recipe? (You have written a lot of words, but no recipe.)

Honestly, I love research and would be happy to be wrong, but all I hear you
saying is "someone did it, so it must be possible". That's called survivor
bias; it's not a recipe, and it would not double the global output of R&D for
the next 10-1000 years.

~~~
wegs
There isn't a simple recipe someone can follow. A recipe is what brings you up
to "industry best practices," and about where a typical business might
perform.

Excellence requires focus, dedication, thinking things through from first
principles, having the right people in place, etc.

The closest I can offer to a recipe is to hire a CEO / President / co-founder
early on who has a track record of having R&D successes in former positions,
who has a great depth of knowledge, who thinks deeply, and have them focus the
organization on R&D, and to do this before the culture is set.

Of course, that's not always a winning strategy. If you do that, that same
person is unlikely to have that same depth in, for example, customer
engagement, negotiations, or legal.

Most hard things don't have recipes ("What's the recipe for an effective
fighter jet?"). If they did have simple recipes, they usually wouldn't be
hard. But that doesn't make them impossible (we have a whole fleet of
effective fighter jets).

~~~
billme
To be direct: you failed to acknowledge the core point of my prior comment,
which is that what you're describing is survivor bias. Then, in the comment I
am currently responding to, you went on to say the recipe is to fund, hire, or
co-found with the survivors. The recipe you provided is still based on
survivor bias, at best an optimization strategy, and it would never double the
global output of R&D for any meaningful amount of time.

~~~
wegs
To be direct: you made a nonsense statement, and now you're changing your
claims. Your statement was: "anyone that tells you they are able to outperform
the market at scale as it relates to R&D outcomes is lying." That is a false
statement.

Your question was never about doubling global output of R&D. That's not a
point one can even argue meaningfully; there's no way to offer more than an
opinion there.

Your question was about being "able to outperform the market at scale". It was
nonsense. Plenty of people and organizations can and do outperform the market,
consistently, over many decades. That's no more survivorship bias than
weightlifters beating the general population at lifting weights, or Stanford
CS majors having stronger technical skills than the general market. It's a
counterexample. Survivorship bias would apply if these were one-offs (a
company or individual making ONE breakthrough, at random).

I'm signing off this thread. This is dumb.

------
amitport
I've worked in a couple of corporate research labs. Yes, you'll be expected to
do more convincing about the value of your research for the company. That
raises two problems:

\- Some great researchers are not very good at convincing.

\- From a purely financial standpoint, some research does not make sense
given the risk (more groundbreaking and longer-term usually means more risk).

I'm not sure the alternative is much better, though.

~~~
NalNezumi
> \- Some great researchers are not very good at convincing.

Reminds me of this. [https://www.smbc-
comics.com/comics/20101209.gif](https://www.smbc-
comics.com/comics/20101209.gif)

------
roenxi
I've only got one speed; sorry. Compare this to interest rate policy.

This article mentions a lot of research labs shutting down in the 90s. The 90s
were also when the current 30-year period of <8% nominal interest rates
started, along with other easy-money policies.

Any company that invested heavily in the future would have been a loser vs.
people who worked on credit. It isn't surprising that none of the big
corporations are investing in research: the investment framework levers have
been set to 'short term' for a very long time now, and it makes more sense to
buy up innovative competitors. It isn't surprising that long-term investment
in research vanished from the corporate world.

~~~
dannyw
The lower the discount rate is, the more the long term matters.

~~~
roenxi
Is that an NPV reference? NPVs are for comparing options, and low discount
rates theoretically mean that people are more willing to burn money up front
for potential long-term gains. But research then starts losing out to other,
riskier decisions. Nobody* borrows money to fund steady long-term corporate
research.
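To sketch the discount-rate mechanics (a toy calculation with made-up numbers, not anyone's actual figures): the present value of a payoff t years out is payoff / (1 + r)^t, so cheap money does raise the paper value of distant research payoffs, but it raises the value of the riskier moonshots too:

```rust
// Present value of a single future payoff: pv = payoff / (1 + r)^t.
// Hypothetical numbers: a $100M research payoff expected 20 years out.
fn present_value(payoff: f64, rate: f64, years: i32) -> f64 {
    payoff / (1.0 + rate).powi(years)
}

fn main() {
    let payoff = 100.0; // $100M, made up for illustration
    // At a 2% discount rate the distant payoff keeps most of its value today...
    println!("at 2%:  ${:.1}M", present_value(payoff, 0.02, 20)); // ~$67.3M
    // ...while at 10% it is worth less than a quarter of that.
    println!("at 10%: ${:.1}M", present_value(payoff, 0.10, 20)); // ~$14.9M
}
```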

Low interest rates manifest as people starting Uber and Tesla rather than big
companies finding budget for a research lab. Ford is competing with a company
that doesn't feel a real need to have profit margins; that is a threat in the
present. Big, cash-burning machines with potential multi-billion-dollar
payoffs are where the credit goes and where the winners live, not boring
people doing long-term research in corporate labs.

It isn't like R&D is a losing proposition in this age - look at Apple's peak
for example - but the resources are being directed to people who own assets or
are shooting to control entire markets. Research labs aren't making companies
winners.

* I'm sure there is somebody, but there won't be many.

------
glangdale
I have often found it baffling that tech giants, with near unlimited resources
to fund all sorts of things, don't establish these kinds of labs. I'm aware
that there are many things called something like "{Google, Facebook, Twitter,
...} Labs" but from what I understand they are significantly different from
the corporate research labs of old.

~~~
swyx
Microsoft Research is a notable exception

~~~
bjz_
Also Mozilla Research

------
hitekker
The paper that this article is based on:
[https://static1.squarespace.com/static/593d9b08be65945a2e878...](https://static1.squarespace.com/static/593d9b08be65945a2e878544/t/5d31ac9b33ae9b0001d88216/1563536539717/c14259.pdf)

------
durnygbur
Why would ad companies need research labs?

~~~
KKKKkkkk1
You're sadly downvoted, but you're making a good point. The digital ad
companies are at this point operating mature stable businesses in monopoly
markets. Do such companies really need to invest in R&D?

~~~
txcwpalpha
How do you think those companies got to be mature monopolies? Someone had to
do the research to find out which “hot singles near you” banner ad has the
highest conversion rates.

I say that tongue-in-cheek, but only partly. There actually is a ton of
research that goes into following things like cultural trends and human
psychology to ensure maximum ad consumption. Advertising in the middle of a
mobile game or on Netflix requires an entirely different school of thought
than advertising on traditional cable TV or news websites.

A notable “innovation” (though many may not respect it very much) is the new
paradigm of advertising stuff via Instagram influencers and the functionality
built into the app to facilitate it. Someone had to research and design that,
just like such companies now are probably trying to find the best way to
advertise in VR/AR platforms or in rideshares.

~~~
sunstone
Digital ad tech may be mature, but search is not. If it were, the experience
of using any search engine would be much the same, but in my experience that
is not close to being true yet.

~~~
TeMPOraL
But isn't the experience pretty much identical? Google, Bing, and DDG look
essentially the same; the two main differences are the colour of the lipstick
and the search index. The main experience (type in words, get suggestions
while typing, confirm, and get relevant webpages plus various
info/image/video boxes) is universal now.

~~~
joshuamorton
Right, but the visual experience isn't as much of a differentiator as the
"does this answer my question" experience, which relies on fairly
state-of-the-art methods in NLP, clustering, etc.

------
mmmBacon
I’m not sure about the premise of the article, but I do concur that startups
were largely unable to develop new technology and commercialize it until
relatively recently. During the 2000’s, if you were doing a “hard” tech
company, investors did not have the patience to invest in these companies and
allow them to grow. I worked at several of these startups, many with big VC
names you’d recognize, and they killed these companies with their short-term
focus.

Later in that decade, with the success of Facebook, investors turned away from
these investments and became more interested in finding the next Facebook.
Essentially the 2000s were a kind of lost decade in fundamental technology
development. Most of what was built during the decade was the beneficiary of
cheap computing enabled by 50 years and trillions of dollars in semiconductor
investment. Now that we are hitting the edge of what’s possible, it’s time for
big, long term, hard tech investment.

It’s only been more recently where people like Elon Musk and to a lesser
extent Larry Page at Alphabet have led the way by being willing to take big
bets on technology development. In the last several years, we are finally
starting to see some venture capital follow where investors are taking a long
view and betting on a few “moonshots.” In the last couple of years there have
been big investments in computing, with companies like Cerebras doing
wafer-scale computing, PsiQuantum and Rigetti in quantum computing, and
various optical computing companies, to name a few. There has also been
considerable investment in the AV space, which will need lower-power, cheaper
solutions that you won’t be able to simply buy off the shelf and slap
together, as almost all of the AV companies are doing today.

~~~
stainforth
We could also embrace the idea of government being the investor of first
resort, and not leave it to the whims of VC.

[https://www.nytimes.com/2019/11/26/business/mariana-mazzucat...](https://www.nytimes.com/2019/11/26/business/mariana-mazzucato.html)

~~~
nravic
If you're looking at aerospace/defense startups, this is usually the case.

It's not unheard of for them to employ people whose primary role is grant
writing, to try to get (for example) SBIR funding.

------
andiamo
I have a different take on this: corporate research labs died because we
aggressively clamped down on monopolies.

When you don't have a monopoly, the investor mindset is that the company
should be laser-focused on "core competencies" (buzzword, but important) and
return excess capital to shareholders - who then provide it to other companies
that will innovate in the field. Keep in mind, the universe of alternative
investments goes beyond the stock market/PE/VC.

Capital is tied to shareholder value. When you can't point to something
creating value, there isn't a reason for capital to stay. For a company to
maintain a research lab, it needs to be perceived as something other than a
cost center. In contrast, Bell was able to maintain its own full-fledged R&D
labs: precisely because it was the monopoly, shareholders expected the labs
themselves to create new avenues of profit.

~~~
foxrob92
I'm half with you - when you have a money-printing machine (a monopoly), it's
easier to justify spending money on R+D.

But at the same time, we've seen a decline in corporate R+D since the start of
the neoliberal era - the article mentions that this started at around Nixon's
time. This is the period of time where Milton Friedman's ideas started to gain
widespread acceptance:

>“there is one and only one social responsibility of business– to use its
resources and engage in activities designed to increase its profits so long as
it stays within the rules of the game,”

Many organisations have taken the idea of "profits over everything", and
interpreted it as "quarterly profits over everything". R+D labs don't result
in quarterly profits. Much of the research ends up being profitable years down
the track. So corporate R+D is killed off.

------
PeterStuer
Can confirm. I have seen this happen from very up close in the European
research arena. The big corporate labs were either completely gutted, or
severely constrained and reoriented towards very short-term practicalities,
becoming basically just development centers fulfilling contract research.

~~~
crocal
Europe's suckiness at R&D is well documented. I am European. It hurts.

[https://ec.europa.eu/eurostat/statistics-explained/index.php...](https://ec.europa.eu/eurostat/statistics-explained/index.php/Europe_2020_indicators_-_R%26D_and_innovation)

------
dantheman
This article misses the fundamental way in which research is funded. The US
Government funds research through grants, and each scientist runs their own
little shop doing independent research... The problem is this isn't a directed
research program.

The Manhattan Project, Space Race, etc had program managers telling scientists
what problems to solve and had other scientists building the architecture,
identifying gaps, and running parallel experiments. It wasn't about the
individual scientist; it was about the program - so we had Feynman running
computer simulations.

There was an end goal and people took less interesting, but necessary, jobs to
move the project forward. Also, results were expected - not papers, but actual
results to problems.

~~~
mrshu
Thanks for sharing this point of view: it does indeed seem to be sorely
missing in the article as well as in this thread.

------
Causality1
The sheer scale of the short-sighted profit-seeking tunnel vision pandemic
among American corporations is terrifying to anyone whose life doesn't revolve
around spreadsheets. It results in businesses looking absolutely amazing right
up until they run off a cliff.

------
atemerev
While I fully agree with the outline of the change in scientific organization
(move from private labs to public science in the universities), I think they
have misidentified the reasoning. It is not anti-trust (de)regulation, which
has waxed and waned many times. It is the fall of the USSR, the end of the
Cold War, and the consequent reorganization of scientific financing. Public
universities were somewhat anti-military, and didn’t receive much funding with
potential military applications; all of it went to private labs. After the end
of the Cold War, this situation changed.

------
ncmncm
The latitude offered researchers at the labs is missed, but the main corporate
role of labs was to draw in and divert the attention of the most innovative
researchers from activities that might upset markets. Innovative results were
collected, patented and systematically shelved.

Thus, RISC was developed at IBM in the '70s (the 801) and shelved until the
'90s, as it would have outperformed the 360/370 series. Xerox PARC created the
key components of modern personal computing, but Xerox management saw no
reason to bring any of it to market.

------
lettergram
What would you call OpenAI (I know it’s not exactly corporate, but...), Google
Brain, any of the labs competing in quantum computing research, SRI
International, etc.?

I work in an “Applied Research” group now. While not a classic corporate
research lab, we do get plenty of freedom to experiment with new methods.

Perhaps there are fewer fundamental improvements because we don’t have a
fundamental new medium? So currently most advances are interactive or building
off fundamental advances.

------
the_gipsy
Have they ever been alive, though? They seem more of a PR stunt, to show
shareholders that the company is invested in the future.

In my country, if you land a job in such a lab, you know you can bullshit
around and play with some stuff. But nothing will ever come out of it.
For profit, the corporation will acquire some product or startup.

If you really want to innovate freely, found your own startup. With the
additional benefit of reaping the money yourself.

~~~
OkayPhysicist
Bell Labs basically re-invented electricity. The invention of the transistor
was one of the most important inventions in history, on the scale of the
dynamo, gunpowder, or fucking metal tools.

~~~
marcosdumay
To be fair, there were plenty of people working on creating a transistor, so
if Bell Labs hadn't been there, somebody else would have done it. It is not
even clear that they were really the first.

------
linguae
I work for one of the last traditional corporate research labs in Silicon
Valley, and I consider myself very fortunate to have my position, especially
in the time of COVID-19. I love doing research. But I fear losing my job,
partly because I don't know where I'd find another research position in light
of the 30-year decline of industrial research combined with the impending
budget cuts that will hit academia starting with this upcoming school year.

I've noticed a profound shift in the past decade away from corporate research
labs such as IBM Labs and HP Labs where they worked on medium-term projects
developing research prototypes that were sometimes passed onto product teams.
In their place, companies such as Google have pioneered a different model of
research
([https://research.google/pubs/pub38149/](https://research.google/pubs/pub38149/)),
where PhDs are hired as software engineers who solve research problems and
write production code, focusing more on shipping production code rather than
writing papers (although there have been many great papers that have come out
of Google, most notably the MapReduce and Spanner papers). I'm noticing that
the vast majority of my PhD-holding friends in computer science are hired as
software engineers rather than as researchers. The ones who started out at
places like IBM Labs or HP Labs with the title "Member of Technical Staff"
would often end up taking positions at other companies as software engineers.

This development may be fine for researchers who want to work on production
code and who don't mind de-emphasizing publishing in exchange for product
development. However, what about researchers who want to focus on solving
research problems that cannot immediately be applied to products? I'm finding
that there's decreasing room for these types of researchers in this economy.
More companies have a short-term mindset these days, partly due to changes in
management style (e.g., the rise of Carly Fiorina-style CEOs), but also due to
the fact that the computer industry has shown repeatedly that large 800-pound
gorillas can be taken down by smaller companies. "Why invest in long-term
research and long-term planning if there is no guarantee of a long-term
future" is the logic of many companies, big and small. The alternative to
industry is academia, but there are only so many professorships available, and
for those professors, there is only so much NSF grant money to go around,
which is highly competitive to earn. Professors at research universities spend
a lot of time fundraising; it costs a lot of money building and maintaining a
lab that is resourceful enough to perform the research and publish the results
necessary to gain tenure.

Short of a major cultural change where companies are encouraged to invest in
research at the levels that Xerox and AT&T did back in the 1970s and where we
see an expansion of academia similar to the post-WWII boom (which is unlikely
in the United States), the future I see for those wanting to work on problems
that don't lead to immediate productization is independent research done in a
researcher's spare time when not engaged in "money-making" activity.
After all, Einstein did brilliant work while he was a patent examiner, and
Yitang Zhang did amazing research while being employed as an untenured
lecturer. I would advise today's computer science PhD students of this current
reality of research employment. If one wants to work on self-directed research
projects, then that person must be willing to have a self-funded research
career; all researchers need to be concerned with funding, whether that funding
comes in the form of a direct salary, a grant, or indirectly through the
salary of an unrelated job.

~~~
lazyjeff
In general I agree with the warning in your comment, but I think there's a
second challenge besides funding at universities. I mean, funding is always
hard, but right now it's a decent time for computer science funding. There's
the DoD (for things more about AI and security), NIH (for things with some
health relevance), industry (ML and topical problems), or NSF which is more
generous towards CS than any other field today. But funding isn't as much of a
problem for PhD students who should be supported by their advisors, so they
have a good several years of solid focused research time.

The challenge I'm talking about is papers, specifically the game of
publishing. I feel like PhD students spend more time in optimizing for
publications than doing actual research. There's an obsession with the number
of papers, so everyone is trying to eke out a paper for every semi-failed
experiment, overfitted model, or unfinished prototype. No one wants to throw
away effort on something that basically failed, so they're trying to find the
perfect narrative, frame their results so they look good, or slice and
combine them into something submittable. Every deadline is worth
submitting to, the number of selective conferences is growing, and there's
incentive to "get on" another paper as a co-author (which means building
collaborations, helping out, editing).

For each paper, there's months spent writing, giving and receiving feedback,
making figures and formatting. Each submission usually requires some change to
the format and language, so upon rejection, the paper is edited and targeted
towards the next conference. Then there's the submission game of proposing
reviewers, choosing the right track or subcommittee, interpreting reviews,
writing multi-page rebuttals, editing and getting feedback from co-authors
about the rebuttals, and in the best case a month later, preparing the camera-
ready version, the back-and-forth with the publisher, and finally preparing
and practicing the conference presentation.

So before great research can truly come out of universities, I think
publications need to be deemphasized. This could be a simple norm like judging
researchers on their best 3 papers for faculty hiring and research awards. In
turn, that would reduce paper submissions, increase paper acceptance rates,
and finally -- leave more time for actual research at universities.

------
MattGaiser
The challenge is that companies have poor track records of determining what
the market will want, so how can you know what to research unless it is a
better version of what you already do?

Companies like Intel and Nvidia and the pharma companies do that kind of
research because you can reasonably guarantee that someone will want a faster
chip or a better cure for a disease.

But beyond that, what does the market want next? If you can answer that, you
can operate a lab. If you can't, then you are just blindly stumbling around.
University research is probably also a heck of a lot cheaper as people will
work for a lot less if they get a degree at the end of it.

I think they would be better off creating a prize system for problems they
want to solve (the return on investment for XPrize is amazing) and letting the
university researchers figure it out.

~~~
thinkingkong
Startups are more or less externalized R&D facilities and the acquisitions
happen only after a team has eliminated sufficient risk. Sometimes that's in
the form of revenue, but sometimes it's research or some other signal.

------
known
Something like
[https://en.wikipedia.org/wiki/Corporate_social_responsibilit...](https://en.wikipedia.org/wiki/Corporate_social_responsibility)
for R&D

------
LatteLazy
Research is risky; it's easy to cut for short-term gains; it's very hard to
quantify before a POC, which makes it very, very hard to justify keeping; few
companies are good at the whole pipeline needed to turn research into profits;
and research is best done by people with equity. Those are just some of the
reasons that the modern world has moved to a model where small, new companies
solely do research and then either sell/license it or sell themselves to
bigger companies.

Drucker wrote about this in (I think) the 90s. It's actually a much more
sensible way to do this work.

------
tehjoker
It's worth keeping in mind that Bell Labs existed because the government paid
Bell to keep it open. I forget the precise agreement, but simply having the
lab earned Bell money; it wasn't a cost center.

EDIT: I'm looking for a reference for this. I saw an article that says it was
funded with 1% of company wide revenues. This is undoubtedly true, but a
graduate student I know who knows about such things says that a substantial
fraction, or maybe even all, of it was reimbursed.

~~~
tehjoker
I asked some people that were there and they said that Bell was a regulated
monopoly and the government allowed them to charge an additional 10% to
customers for investment. After competition and deregulation was introduced,
things got increasingly more profit focused (as in research had to be
justified on that basis). After the company was broken up, there was some kind
of cooperative lab shared between the seven child companies, which funded it
equally. However, that only worked well while they were prevented from competing
with each other. Once competition was instituted, the companies started asking
the lab to sell results to individual companies, and it went downhill from
there.

------
sbisson
I worked at GEC Hirst in the early 90s. It was a fascinating place, and I
learned a lot in my research into local loop technologies in the early days of
the commercial internet. However, it was clear that the writing was on the wall
when the on-site library was closed down.

I left shortly afterwards.

------
the-dude
As a Dutchie, I want to point out Philips-Natlab, which invented the CD(ROM).

------
sharker8
I worked at a corporate research lab at a well known tech co. It was mostly
sales and marketing to be honest.

------
ComodoHacker
Why so many identical links to the cited paper? I would understand if they had
anchors, but they don't.

------
KKKKkkkk1
The corporate research labs have been replaced with the likes of DeepMind
(hell bent on doing research only in neural networks) and Google X (personal
playground of the company's founder with no known scientific or business
output). The fact that big tech does not invest in basic research anymore does
not bode well for the future of big tech.

~~~
govg
What an ill-informed comment - Google, Microsoft, and Facebook are among the
largest producers of original research in CS, especially in machine learning
these days. Look at any major conference in the last few years, and it will
not be surprising to see one or two of these names at the top in terms of
papers contributed.

You're partially right if the argument is "basic research" = less applied
fields / "purer" math, but even there Microsoft research has been a
significant player in the TCS and optimization community.

A lot of the current research done at universities is also funded by these
organizations, either via grants (monetary or equipment like from Nvidia) to
research labs, or scholarships and financial support to students. A large part
of AI advancement in the past few years has been precisely because of the
amount of effort these firms have put in (along with a lot of others, they are
just the most public).

------
naringas
fantastic summary of some pretty interesting research from 2019...

alas, it's likely that nobody in any position to do anything about this will
bother

