
Innovation and the Bell Labs Miracle - gkanai
https://www.nytimes.com/2012/02/26/opinion/sunday/innovation-and-the-bell-labs-miracle.html
======
larsberg
I was recently talking with Prof. Dave MacQueen about this (he ran a group at
Bell Labs for around 20 years before the Lucent debacle). The most amazing
thing was the management style of that lab. Once a year, you had to write an
"I am smart" report (yes, the actual name) where you told management why you
were doing good, smart things. Management would then meet on two separate
days, once to ensure everyone was meeting the bar of doing good things and
once to figure out how the money thing should be handled.

That's it. No assessment of "high-impact publications this year"; no
assessment up the chain of how many $100MM businesses had been created by your
group (thanks, Lucent!); no demo days to see if some product group is going to
give you the "leave research and work on this or work on something else"
ultimatum...

~~~
jorleif
> The most amazing thing was the management style of that lab

I've worked for some years in academic research, in a position where the
pioneering research of one guy led to a big laboratory with plenty of
researchers in the lucky situation that they can work very freely on what
they want. Coming from industry, I was completely baffled by this. This is
probably the style in which the great ones work best. No corporate BS, just
the work. What has been very puzzling is that in this setting, the people
there are not that great at many things. Incompetence can be quite rampant.
Was there some kind of "quality control" for the work at Bell Labs? I now see
the circles I'm working in moving towards a more classical "publish or
perish" mindset, which produces safe and unambitious research. Sure, the
incompetence needs to be purged from the system, but direct measurement very
easily kills the long term for the short term. I wonder if there is any other
alternative.

------
mattquinn
"Regrettably, we now use ["innovation"] to describe almost anything. It can
describe a smartphone app or a social media tool; or it can describe the
transistor or the blueprint for a cellphone system. The differences are
immense."

I cannot agree with this more. We may not be able to re-create the
environment of Bell Labs, but I'm hoping to see more actual science (and it
will be at the nano-scale) in the future, rather than so many people creating
yet another social app and claiming that it's "revolutionizing" an industry.

~~~
pron
Exactly. Also, the word "technology" is now sometimes used simply to describe
a certain piece of software middleware.

While some fascinating Bell-Labs-like research is taking place at IBM and
Microsoft, newer tech giants have opted to foster the creation of lots of
competing startups to do research for them, so that they're able to spend
money (through acquisition) only on the successful ideas, rather than develop
their own large and expensive research departments. The result has been that
startups often focus (in fact, they are strongly advised to focus - for
example in numerous blog posts that are widely popular here on HN) on ideas
that can generate market value within a couple of years. They have no real
choice, because otherwise they lose their chances of investment and/or a
lucrative exit.

This is not to say that most people working in startups are the same type of
people who could have worked at Bell Labs - they aren't. Most of them are
"simple" engineers well versed in current "technologies" but are uninterested
or unable to break new frontiers (this is not meant as a negative statement).
However, quite a few of them are capable and interested, but the money and the
Silicon Valley game are simply too enticing.

I often feel angry at Google in particular, which has taken quite a few
bright and inquisitive minds from more innovative companies, like Sun
Microsystems (RIP), and turned them into application builders.

But the game has changed: money, and a lot of it, can be made from technology
much faster now than before, and many minds who could have been used for true
innovation are now working on designing social networks (which is an
interesting research topic, but not THAT interesting, and there are certainly
other, less explored avenues that lead to truer innovation).

~~~
mattquinn
> "Most of them are "simple" engineers well versed in current "technologies"
> but are uninterested or unable to break new frontiers".

I'm glad you said this because I feel as if it needs to be reiterated much
more often. I'm a CS student right now, and a lot of the time I look around
and see my university's CS program as a factory designed to turn out
by-the-book software engineers fit for corporate consumption.

I really enjoy CS, but here's my anecdote: the other day I got a chance to
tour a microfabrication lab, with tons of expensive equipment and a lot of
knowledgeable people milling about (who I'm sure were nervous about us being
there). That really instilled in me an appreciation of the complexities
involved in real, true innovation. I love writing software, but I won't do it
forever, because the vast majority of _true innovation_ really does require a
deep understanding of scientific foundations.

Also: the guy who showed us around worked for Bell Labs for a while before
coming to the university where I'm studying. His one precondition for
accepting the university job: the $2 million, room-size, laser-equipped
system that he built for detecting flaws in silicon wafers had to come with
him.

------
pessimist
In defense of Google, I think their machine translation is highly underrated;
it's simply incredible how good it is, at least for European languages. As is
their contribution to building large data centers and manipulating large
amounts of data (MapReduce; Bigtable was the first major NoSQL store).

Facebook is too young to tell.

Still, IBM is the gold standard for CS research - Microsoft appears to have
many productive researchers, but I'm not sure they have made fundamental
advances like virtual memory, hard disks, or relational databases (all of
which came from IBM).

~~~
gruseom
I thought virtual memory came from Burroughs.

~~~
bane
I think it was the first commercial computer to support it, but much of the
theoretical groundwork wasn't really figured out until 1968 and 1970:

<http://dl.acm.org/citation.cfm?doid=363095.363141>

<http://dl.acm.org/citation.cfm?doid=356571.356573>

~~~
gruseom
That sounds like the old joke: "I can see that it works in practice, but will
it work in theory?" If the Burroughs guys both invented it and commercialized
it, then surely they should get the credit for it.

The story is in this wonderful memoir:
<http://news.ycombinator.com/item?id=2856567> (with the relevant passage
quoted here: <http://news.ycombinator.com/item?id=2928672>).

~~~
bane
I agree; I hadn't heard of the Burroughs guys until this post. The two papers
I linked to were always the earliest ones I was aware of.

There's a surprising amount of stuff that gets built without understanding
all of the theory behind it, though. So it doesn't surprise me if a
"pre-theory" example of VM was built.

After all, we had the wheel for thousands of years before figuring out pi.

------
bsb
Having worked at Bell Labs, I gotta say the culture of innovation was
paramount, best described by this statement from an old supervisor: "You
should do what you feel is the right thing; it's my job to align that to the
needs of the business." So you might spend weeks researching highly available
protocols and data distribution techniques to get the one feature correct. It
should be no surprise, then, that the old refrigerator-looking phone systems
pioneered in the late '70s and '80s had a well-known five nines of
reliability.
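
For a rough sense of scale (my own back-of-the-envelope sketch, not a figure
from the Labs), five nines of availability budgets only a few minutes of
downtime per year:

    # Back-of-the-envelope downtime budgets for common availability targets.
    # "Five nines" (99.999% uptime) leaves roughly 5.3 minutes of downtime a year.
    MINUTES_PER_YEAR = 365.25 * 24 * 60

    for nines in range(2, 6):
        availability = 1 - 10 ** -nines          # e.g. 5 nines -> 0.99999
        downtime_minutes = (1 - availability) * MINUTES_PER_YEAR
        print(f"{availability:.3%} uptime -> {downtime_minutes:7.2f} min/year down")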

------
gkanai
I thought this op-ed was important because I agree with the author that
companies like Google, Apple, and Facebook (etc.) get the bulk of the media's
attention when none of what they do could have been done without the work
that came out of Bell Labs. Real innovation is not done at Google or Facebook
or Apple, for that matter. Those companies are too tied to meeting quarterly
results to invest in real innovation.

I think the other important implication is that the US doesn't have an entity
like Bell Labs innovating today. And so we rest on the work that was done
back then (cramming more transistors onto a smaller chip, etc.) rather than
pursuing new innovation that would provide the platform for decades of future
growth.

~~~
ComputerGuru
IBM has turned/is turning into the Bell Labs and PARC of yesteryear. They've
all but vanished from the consumer's eye, yet they're still _huge_, doing and
licensing all kinds of research and technology across the spectrum, from
transistors to chips to software.

~~~
spitfire
Do you have any real evidence to back that up? When Lou Gerstner came to IBM
he all but said their innovating days were over. Since the '90s IBM hasn't
come up with any new products, instead focusing on acquisitions. Their patent
flow is full of business-process patents.

And their technology products are aimed clearly at tying themselves into
large govt/business service contracts (see the "Smarter Cities" stuff).

Watson is a neat tech demo, but not much more. I've built parts of the
technology in Watson independently.

~~~
com
IBM apparently still does a lot of interesting fundamental research. From
their PR it looks like they've got a focus on materials science for storage
and medicine, and they do interesting data-driven work around health,
population genetics (the out-of-Africa hypothesis as one example), etc.

[http://www-03.ibm.com/press/us/en/pressreleases/finder.wss?t...](http://www-03.ibm.com/press/us/en/pressreleases/finder.wss?topic=8)

It might not be Xerox PARC stuff, and a lot of the PR looks like fluff, but
it looks like there's real stuff going on there...

------
kevinalexbrown
At first, I really agreed with this piece. I think it's an awesome point that
the overuse of the word "innovative" deflates its meaning, just like calling
every programmer a "hacker" or saying "let's do this 'the hacker way!'" makes
it less meaningful, or how calling every member of the military a 'national
hero' is maybe a little degrading to Purple Heart recipients.

But on a second reading, this article really uses a straw man. Comparing
Facebook with a few cherry-picked innovations from Bell Labs is a little
disingenuous, for a few reasons.

The most obvious comes from the fact that there were a _lot_ of people
working at Bell Labs. See that first picture? Halls so long they ended at the
vanishing point? There were a _lot_ of ideas that never quite made a splash
in the 70 years of Bell Labs' heyday.

The second part is that the innovations of Bell Labs may not have been lost,
but transferred. There's still quite a lot of research done in this country,
although most of it has slowly been shifting to universities. I'm not sure if
this is better, but no one can say there isn't tons of cool stuff being done
for the "understanding" and not the short-term profit. I think there should
be more, but that's not the point. While we're cherry-picking, I might point
out brain-machine interfaces that let monkeys (and soon humans) control
prosthetic limbs _directly from their brains_, nano-scale machines a few
molecules large, and quantum computers. These things are anything but focused
on short-term profit, and everything but falsely innovative.

Finally, the real straw man lies in the fact that Bell Labs has the benefit
of hindsight. We now know the transistor was an incredibly useful invention.
Before we knew what we could really do with computers, it wasn't quite so
obvious. It might end up that Facebook doesn't turn out anything more
innovative than extremely well-executed social networking, but that's going
to take more than 8 years to find out.

Don't overuse "innovation", but don't become so narrow-mindedly awed by the
past that you don't take part in the tumult of things that might (or might
not) share the same sentences as the transistor when someone writes an
article in 30 years wondering why we don't have the same pizzazz as those
Silicon Valley entrepreneurs at Google.

~~~
jpdoctor
> _The most obvious comes from the fact that there were a lot of people
> working at Bell Labs._

There are a number of issues I'd raise with the piece, but the above is not
one of them. There were a lot of people in a number of R&D labs, but
measuring the output per person (examples: Nobels/papers/patents/citations
per employee), I'd guess there was something special about the place.

Full disclosure - I'm a Bell alum, so perhaps I'm glorifying the past.

~~~
jmares
Could you tell us about project genesis at Bell Labs? How did projects
originate, grow, and get killed or morphed?

Thank you.

~~~
jpdoctor
> _project genesis_

It was a big company: I pretty much saw the entire scale, from formal defn to
skunk project.

~~~
jmares
Thanks jpdoctor. Could you elaborate on what you mean by formal defn? Who
defined them, and based on which principles or goals? (I understand that this
might have multiple answers.)

------
balsam
Guess who. Ok, when it was still in NY.

"I went to Princeton to do graduate work, and in the spring I went once again
to the Bell Labs in New York to apply for a summer job. I loved to tour the
Bell Labs. Bill Shockley, the guy who invented transistors, would show me
around. I remember somebody’s room where they had marked a window: The George
Washington Bridge was being built, and these guys in the lab were watching its
progress. They had plotted the original curve when the main cable was first
put up, and they could measure the small differences as the bridge was being
suspended from it, as the curve turned into a parabola. It was just the kind
of thing I would like to be able to think of doing. I admired those guys; I
was always hoping I could work with them one day."
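
(A brief aside on the physics, my gloss rather than anything in the quote: a
bare cable supporting only its own weight hangs as a catenary, while a cable
supporting a deck of uniform weight per horizontal span becomes a parabola,
which is why the curve changed shape as the roadway was hung from the main
cable. With H the horizontal tension, w_c the cable's weight per unit arc
length, and w_d the deck's weight per unit horizontal length:)

    H\,y'' = w_c \sqrt{1 + (y')^2} \;\Rightarrow\; y = a \cosh(x/a), \quad a = H/w_c \qquad \text{(bare cable: catenary)}

    H\,y'' = w_d \;\Rightarrow\; y = \frac{w_d}{2H}\,x^2 + c_1 x + c_0 \qquad \text{(loaded cable: parabola)}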

~~~
chernevik
The fact that they had Shockley tour-guiding Richard Feynman shows that Bell
Labs was more than just a big pile of patient capital.

------
luriel
I have watched this more than a dozen times; I'm still awe-struck every time
I watch it, and it barely scratches the surface of all that was created at
such a magical place:

<http://doc.cat-v.org/bell_labs/innovations_song/>

~~~
palish
So what happened to Bell Labs?

What were the reasons why it started to lose that magic?

~~~
luriel
A very good book could be written on the topic; it is one of the saddest
tragedies of the 20th century.

All my information is second-hand, and it is a very complex topic, but
basically when AT&T spun off Lucent and they tried to monetize the
innovations they were making more directly, the whole thing started to break
down: management sucked, researchers started to bail out, and it became a
vicious circle.

~~~
palish
Ah. Thank you.

If anyone has any more information about the decline of Bell Labs, it would be
deeply appreciated.

~~~
aswanson
<http://www.goodreads.com/book/show/438022.Optical_Illusions>

------
got2surf
My dad worked at Bell Labs in the 80s, and he's always telling me about the
quality of the experience there.

I'm not sure if it was the politics or the people, but he said there was an
Apple- or Google-like quality of innovation and inspiration. What's amazing,
though, is that this innovation extended to basic research as well, not just
consumer products. He always tells me that when his team was using an early
implementation of C++, a member simply called up Bjarne (the creator of C++)
and got an answer within seconds.

I agree with others about shifting research locations, but I think we have
some fundamental problems with attitudes toward research. I've researched
heavily outside of school for the past 4 years (thousands of hours, multiple
publications) and the thing I've learned is that there's gotta be a better
way to do this. Labs aren't the best place for industry and academia to talk
to each other, because we'll always have conflicting motives.

Perhaps a move towards more technology incubation from university research -
a Y Combinator for universities - is the future.

------
cli
"Two of its researchers were awarded the first patent for a laser, and
colleagues built a host of early prototypes."

I hear that most companies own the things that their employees create while
working there. The above quote seems to imply that the patents were owned by
the individual researchers. Did Bell Labs have a policy of researchers owning
their inventions?

~~~
jpdoctor
_Did Bell Labs have a policy of researchers owning their inventions?_

No, Bell retained the ownership (and revenue stream).

------
ColinWright
Single page:

[https://www.nytimes.com/2012/02/26/opinion/sunday/innovation...](https://www.nytimes.com/2012/02/26/opinion/sunday/innovation-and-the-bell-labs-miracle.html?pagewanted=all)

------
psykotic
For insight into the computing side of Bell Labs, Doug McIlroy's retirement
lecture is a great read: <http://research.swtch.com/bell-labs>.

A paper covering the earlier period is A History of Computing Research at Bell
Laboratories (1937-1975): <http://cm.bell-labs.com/cm/cs/cstr/99.pdf>

------
jnazario
a book on bell labs' innovations already exists, "three degrees above zero"
by jeremy bernstein. highly recommended. i have a copy i got used off of
amazon, but here it is for preview in google books.

[http://books.google.com/books?id=N6s8AAAAIAAJ&dq=three+d...](http://books.google.com/books?id=N6s8AAAAIAAJ&dq=three+degrees+above+zero&source=gbs_navlinks_s)

------
gluejar
I'm another Bell Labs alum. This article seemed banal and backward-looking.

Think about what a "hothouse of innovation" is and was. Back then, you had to
physically gather people together to get ideas flowing. Today, we can do it
here, in chat rooms, on listservs, IRC channels, and blogs.

------
jbarham
For all the genuine innovations that came out of Bell Labs, it's also the case
that they killed inventions that would have threatened their
telecommunications monopoly, such as the magnetic answering machine that was
invented at Bell Labs in _1934_!

[http://gizmodo.com/5691604/how-ma-bell-shelved-the-future-fo...](http://gizmodo.com/5691604/how-ma-bell-shelved-the-future-for-60-years)

------
yabai
The article never mentioned the creation of Unix. How could any tech-loving
journalist miss this point?!!

~~~
groovy2shoes
It's mentioned on the first page, second paragraph from the bottom: "Its
computer scientists developed Unix and C, which form the basis for today’s
most essential operating systems and computer languages."

