
You just went to the Google homepage. What actually happened? - jmonegro
https://plus.google.com/112218872649456413744/posts/dfydM2Cnepe?
======
stephengillie
You just pressed a key on your keyboard.

Simple, isn't it?

What just actually happened?

You engaged a cascade of motor neurons to coordinate the contraction of
thousands of muscle cells, which pull a lever attached to your calcium
crystalline framework, grinding across a glucosamine joint. This forces your
calcium crystalline frame-member to depress, compressing your saline-filled
lipid-polymer foam skin against the keyboard. As you do this, you constantly
measure the pressure against the lipid-polymer walls to ensure you are not
deforming your muscle cells too much or too little.

---

Reality has inordinate complexity. When humans build roads or build narratives
or build websites, we are simplifying reality for ourselves and others,
including other animals.

~~~
criley
At least with technology, _someone somewhere_ understood what is going on.
With biology/biochem/physics all the way down... there is no expert who
designed the system originally!

Can you imagine computer science if we had no prior knowledge of computers and
had to research the entire process starting at the end point? What a task.

~~~
sanderjd
> _someone somewhere_ understood what is going on.

Not quite. A huge group of someones in aggregate understood what is going on.
There is still no single expert who designed the system originally.
Eventually, or perhaps already in many cases, some portion of the system has
no _living_ expert who designed it.

I can imagine a future where expertise on some deep components of the system
has been lost to the sands of time, and people have to study them blindly to
determine their function in just the same way that we currently study nature.

Neat.

edit: I think raelshark and Carl Sagan said it better than I did.

~~~
tadfisher
In Vernor Vinge's _A Fire Upon the Deep_ he describes the role of the
programmer-archaeologist, whose job is to sift through mountains of
abstraction layers and black-box code modules written in dead languages
thousands of years prior, determine their function, and stitch them together
to perform a desired function.

In the prequel, _A Deepness in the Sky_ , one of these modules, buried beneath
said layers, contains a booby-trap set by an ancient civilization that only
the protagonist knows about. I believe the Unix epoch is mentioned briefly.

~~~
kiba
Why don't they just code new software from scratch?

~~~
tadfisher
It's a logical continuation of the current state of affairs. Even in the early
90s when these books were written, few people wrote entire applications in
assembly language, or punched machine code into cards; they wrote in higher-
level languages that were eventually compiled into machine code, or
interpreted by other tools whose internal workings they knew nothing about.

Hell, we are inventing entire languages today that compile to Javascript,
which in turn is interpreted or JIT-compiled by other tools. Or we write
advanced HTML5 frontends that talk to advanced Node.js backends that in turn
talk to legacy systems written in COBOL running on IBM mainframes. Programmer-
archaeologists _already exist_.
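
To make the layering concrete, here's a minimal sketch (Python rather than
Javascript, but the point is the same): even a one-line function is compiled
to bytecode for a virtual machine, and that virtual machine is itself a C
program compiled to machine code by yet another toolchain.

    import dis

    def add(a, b):
        return a + b

    # Show the bytecode one layer below the source. The CPython VM that
    # executes these opcodes is itself written in C and compiled to machine
    # code by yet another toolchain -- layers all the way down.
    dis.dis(add)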

------
kevinalexbrown
When I got to the last paragraph or two, I realized this was not just a
"computers are complicated" post. He identifies a specific _social_ problem
that the complexity of the technology has created.

This is the best way to frame the "tech laws problem" I've seen in a long
time. I'm curious: what is the best way to approach the bikeshedding issue?

On the one hand, the people who recognize the issue tend to be technical. On
the other, the solution will inevitably be a social one, unless something
comes along that makes patents and technological laws moot.

Here are three social avenues I could see being helpful, but none of them
seems to solve the problem. I'd love to know what people are doing in this
area.

a) Improve technical education for the general public so that they can call
BS, or make reasonable decisions.

b) Improve technical education of public servants that make crucial decisions
regarding technology. I'm not competent to rule in a legal case about
pollution, so why should we assume judges are competent to rule in a legal
case about code? (How do you measure that? Certifications? - egh).

c) Improve social outreach for technical people. Most technical people
probably want to build cool things instead of sit in Congress, knock on doors,
or otherwise get involved. I've talked with engineers who despise legal
proceedings so much they started trolling the lawyers in depositions. Honestly
I'd rather build something cool than think for five hours about how to get
people to care about patent law. Maybe that should change.

I'd love feedback on this, because the bikeshedding issue is the scariest
social problem I can't think of a solution to. It doesn't just affect a
specific patent, it affects the way we rule on them in general.

If you are both a lawyer and technical, I would really love your feedback,
here, or via email.

~~~
rayiner
So before I was a lawyer I was an engineer, and most of my friends are still
engineers. One of the things that I noticed then and still notice now is that
tech people think they're special. They think that their problems are somehow
qualitatively different from all the other problems that society has ever
faced. They are simultaneously optimists (we can build anything!) and
pessimists (we have no hope of influencing the system!). They are both self-
centered (our things are the most important things!) and convinced of their
own impotence (nobody cares about the things we do!). Indeed, this article
_exudes_ "we're special!"

If tech people want a world with sensible tech laws, the first thing they have
to do is internalize one simple fact: computer tech isn't special. It's no
different, in the grand scheme of things, than petrochemical refining or
agriculture. It's just one specialized problem domain within a larger society.

That realization is simultaneously humbling and empowering. If tech is the
same as everything else, then that means the same social tools that work for
everything else can be leveraged to work for tech! And that means (c), lots
and lots of (c).

But not just (c), even though (c) is the starting step. Ultimately, through
(c) you can do (b). For example, a judge isn't a domain expert in petro-
chemical refining either, but they make rulings on petro-chemical refining all
the time and it works more or less well. That's because the system is
structured so as to not require judges to be experts in everything. It is
structured so that people versed in a specific problem domain, be it petro-
chemical refining or code, can explain in plain terms the moving parts of
their case, and
the judge, generally a highly intelligent person, can make decisions based on
those explanations. And ultimately, through (c) you get to (a). Ultimately,
the burden is on tech people to convince the public at large to care about the
things that they care about.

I've used this example elsewhere, but I think it's a really important one. The
tech industry complains up and down about its inability to fight the "big
money" of the media companies. Yet, the entire U.S. movies and music industry
put together are about $50 billion in domestic revenue per year, or equivalent
to just Apple's revenues in just one quarter. You're telling me that the tech
industry can't fight the "big money" of an industry that's a fraction of its
size? Please! Another example: Apple's revenues and profits are about the same
as Goldman Sachs, Morgan Stanley, and JP Morgan Chase _combined_. Tech isn't
the skinny schoolboy getting picked on by the big kids--it's the behemoth. The
only sector that can compare is the petro-chemical sector.

We live in a democracy. In a democracy, you can't just sit around waiting for
everyone else to realize how wonderful and special you are and legislate to
further your interests. You have to _integrate_. You have to participate in
the political process. You have to explain to policymakers the moving parts of
your industry, and you have to convince the public to care about the things
you care about. And you have to accept that the policy makers sometimes will
not agree with you (because they're balancing a broader array of interests
than just your own), and you'll have to accept that the public won't
necessarily buy into your worldview. But when that happens it's not an excuse
to take your marbles and go home.

For a contra-example, look at environmental legislation. Environmentalists
have been incredibly successful considering there is very little money behind
the movement, and that the people on the opposing side of the table are petro-
chemical giants, each of which is 2-10x as large as the entire domestic media
industry that tech people think are too monied to be overcome. Yet they have
been remarkably successful given those odds! Why? Because they don't hole
themselves up in ivory towers. They participate in the political process. They
translate their value systems into things that perk up the ears of politicians
(this environmental bill might cost a few jobs, but it will be more than made
up for by the avoided health costs from the reduction in pollution!) Jobs,
costs, etc. Those are things politicians care about, and indeed those are the
things they're elected to care about! Sometimes, they even fight dirty. They
participate in the _war_ that is living in a democratic society with competing
factions.

~~~
dfj225
I'd argue that software and digital technology are different from everything
that the legal system has legislated on in the past because this is the first
time we've been able to make 100% accurate copies for zero cost. (Ok, maybe
not zero with the cost of electricity and storage, but essentially zero.) I
also feel that the legal protections of the patent system, while perhaps not
completely broken, are certainly not tuned to the realities of software
development. Software is different. It's sui generis and I believe our laws
should be adjusted to reflect that.

That said, I agree completely with your point about integration. The worst
thing we can do as a community is step aside and allow others to create
legislation that is not in the interest of technology or the good of the
people at large.

Edit: minor grammar.

~~~
rayiner
We have had, for hundreds of years, technology that makes the cost of each
marginal copy of a creative work some tiny percentage of what people are
willing to pay for that work. Digital technology making that percentage even
smaller doesn't fundamentally change anything. There is nothing magic about
"essentially zero cost" copies versus "negligible cost" copies.

This is largely because our whole system of property is structured so that
marginal cost is broadly irrelevant. We have a pervasive notion in our system
that people are entitled to the "benefit of the bargain." That is to say,
people are entitled to profit from the difference in price people are willing
to pay for something, based on supply and demand, and the marginal cost of
producing that something. That's why Apple can sell iPhones for $600 that
cost only $207 to produce, or why Louis Vuitton can sell handbags for
thousands of dollars that cost maybe $100 to produce. The marginal cost of
production is irrelevant, from the buyer's viewpoint, in anything we buy. So
why should it be
different for digital goods?

~~~
czr80
You're right, the marginal cost argument is wrong, and yet there is something
different going on here and I think I know what it is: The fundamental change
here is that _customers_ own the means of reproducing the product and
reproduction costs are equal to the marginal cost. How much could Apple sell
an iPhone for if the same was true? What would Apple do to remain in business?

~~~
msellout
I think that's worth repeating: "the ownership of the means of production has
changed". Intentionally rephrased to allude to a certain economist.

------
guptaneil
Wow, it's easy to forget how much complexity there is, even for a
technologist. This post reminds me to step back and appreciate everything
that's going on under the hood. For a similar effect on non-technologists, see
Louis C.K.'s bit about how even the shittiest technologies are a miracle[1].

I disagree with the conclusion though. I think the reason Steve Jobs' death
impacted people more than Dennis Ritchie's is that Jobs was taken in his
prime. Who knows what the world lost by his premature death.

1: <http://www.youtube.com/watch?v=KpUNA2nutbk>

~~~
hoprocker
"Any sufficiently advanced technology is indistinguishable from magic."
-Arthur C Clarke[1]

This article has been posted on HN at least once before, and I liked reading it
then, too. One of the first major insights I remember from my CS curriculum
was the concept of abstraction -- in CS it's applied to code-as-data, the OSI
model, etc, but it exists everywhere, including all engineering, large
bureaucracies, etc. Thank you Prof Harvey!

1: <http://en.wikipedia.org/wiki/Clarkes_three_laws>

~~~
gdickie
I prefer the corollary, as motivation:

"Any technology distinguishable from magic is insufficiently advanced"

------
davidkatz
Beautiful. One thing though, I believe the OP is mistaken when he implies that
Steve Jobs didn't affect how computers work on the inside:

"On the one hand, I can imagine where the computing world would be without the
work that Jobs did and the people he inspired: probably a bit less shiny, a
bit more beige, a bit more square. Deep inside, though, our devices would
still work the same way and do the same things."

Ultimately, computer architectures serve real world use cases. Innovation in
use cases results in innovation in architectures. There are countless new
technologies that exist because of the products that Apple invented.

~~~
solarexplorer
Example?

~~~
guptaneil
The entire ultrabook category can be credited to the Macbook Air.

The iPhone drove the creation of the modern smartphone market. Before 2007,
only businessmen carried Blackberries. Now pre-teens have smartphones. There's
a really powerful image NBC posted comparing the Papal Conclave in 2005 and
2013[1] that tells the whole story. Similarly, the iPad is driving the tablet
market.

And that's just 3 recent consumer-facing categories. The list goes on as you
dig deeper or farther back. If you're using Chrome, guess who started Webkit?

1: <https://twitter.com/neilgupta/status/312317589432971264>

Edit: I should have said, guess who popularized Webkit. We can debate
semantics on whether forking KHTML to Webkit makes Apple the author of Webkit
or not, but I think we can agree Apple popularized it.

~~~
ajross
The macbook air is just a small laptop. In what way is making things smaller
an Apple technology? Laptops have been getting steadily smaller for decades
now.

The iPhone may have created a market, but that was because of how it worked.
The question was whether anything Jobs did influenced the _technology_ of the
device. The iPhone was an early (but not the first) user of capacitive
touchscreens, but virtually everything else in the device was an off-the-shelf
part.

And FWIW: I know who started WebKit. It wasn't Apple. :)

~~~
guptaneil
Apple pressured Intel to design the smaller chipsets that made the Macbook
Air, and subsequently other ultrabooks, possible. Without Apple's influence,
we might not have ultrabooks.

Similarly, Apple pressured Corning to make Gorilla Glass. I would also argue
the iPhone's UI has heavily influenced modern smartphone OS's to be more user
friendly.

I'll cede the WebKit point, though I do think we can credit Apple for
popularizing it.

~~~
cooldeal
How does Gorilla Glass even come close to enhancing the state of computing
technology like Dennis Ritchie did?

~~~
Retric
Unlike C, it was a net benefit, even if a small one. Pascal was often the
direct competitor to C, and for low-level tasks it's a much better language;
it lost out due to UNIX / toolchain support.

------
Mithrandir
Previous discussion: <https://news.ycombinator.com/item?id=3115951>

~~~
joecurry
Thank you - I'm constantly surprised by how frequently the same information
is churned on this site.

------
lazyjones
To add some cynical flavor: when you went to the Google homepage, you expended
electricity, CPU cycles, network traffic, gave up some privacy and waited some
time just to see something that could have been shown to you by a static HTML
file(²) on your disk (or even something built in the browser), because some
corporation sees some benefit in all this extra work. We're not exactly trying
to do things efficiently these days, since it's no longer a priority.

(²) not valid for search results of course, but even they could be sent to you
in a more efficient, less privacy-destroying way if only some corporation's
interests weren't more important than your own

------
pacaro
The best bottom up approach to this that I have seen is Charles Petzold's
"Code: The Hidden Language of Computer Hardware and Software" [1] which starts
with using a flashlight to send messages and walks up the abstraction chain
(switch, relay, alu, memory, cpu...) to most of the components of a modern
computer. It's very accessible.

[1] [http://www.amazon.com/Code-Language-Computer-Hardware-
Softwa...](http://www.amazon.com/Code-Language-Computer-Hardware-
Software/dp/0735611319)
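
As a taste of that bottom-up construction, here's a minimal sketch (in
Python, not from the book) of the step where logic gates get combined into
arithmetic: everything below is derived from a single NAND primitive, ending
in a half adder.

    # Illustrative only: every gate derived from one primitive, NAND,
    # echoing the book's relay-based construction.
    def nand(a, b): return 1 - (a & b)

    def inv(a):     return nand(a, a)
    def and_(a, b): return inv(nand(a, b))
    def or_(a, b):  return nand(inv(a), inv(b))
    def xor_(a, b): return and_(or_(a, b), nand(a, b))

    def half_adder(a, b):
        # One-bit sum and carry: the first rung on the ladder to an ALU.
        return xor_(a, b), and_(a, b)

    for a in (0, 1):
        for b in (0, 1):
            s, c = half_adder(a, b)
            print(a, "+", b, "-> sum", s, "carry", c)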

------
tiger10guy
What's even more cool?

Before long computers will be able to conceptualize the whole complex mess.

Consider how we acquire knowledge (perhaps like
<http://matt.might.net/articles/phd-school-in-pictures/>). The more we've
learned, the more we need to know about how to learn. At each level of
knowledge we gain knowledge by accessing new knowledge and combining it with
what we know. Eventually the supply of new knowledge dwindles, and the only
tool we can rely on is learning itself: the ability to combine knowledge.

This is much harder than taking in new knowledge; especially for computers!
However, computers are getting better and better at it. Whereas many of us are
out of college, computers are still in middle school, but they're getting
better and better at both large scale and high complexity learning, so they'll
move on to high school and college soon. Moreover, they're moving at an
exponential pace! (see Ray Kurzweil and his ideas on the exponential growth of
technology) Eventually, no... soon, computers will be able to conceptualize
and intuit the scale and complexity of something like google.com. No person
can come close to this, so we have no idea what that ability will bring.

------
joshaidan
When it comes to complexity, I think it's a miracle that spinning hard disk
drives actually work.

~~~
bstpierre
I have told people before, "every time you make a phone call on your cell
phone, a miracle happens". The little bit I know about the complexity behind a
mobile network is mind boggling -- even just at the physical layer, dealing
with reflections, power levels, etc.

Though I guess I can sort of understand how that works. What I _can't_
understand is that I can understand _anything_... I simply can't fathom how
the human brain works...

~~~
javert
In terms of epistemology (e.g., how can we form valid concepts and principles
and be sure they're right), or in terms of neuro-biology/neuro-chemistry?

~~~
bstpierre
More the epistemology than the biology. The biology makes some sense on some
level -- it's all "just chemistry" at the bottom.

~~~
javert
I'm with you. "It's all just chemistry" is sufficient for me for most things,
but I worry a lot more about epistemology, which is why I have spent some time
studying it.

Until recently, there were two basic camps on "how we know stuff." The
Rationalists and the Empiricists.

The Rationalists thought that knowledge came from mental abstractions, not
from reality. For example, Plato believed that we could all access the "ideal"
world of "forms" by just introspecting. The medieval scholastic philosophers
tended to be very rationalistic - they made arguments about abstractions not
connected to real experience, like how many angels could fit on the head of a
pin. Descartes was a rationalist - "I think, therefore I am" is rooted in the
idea that we look inward to perceive reality.

The Empiricists, who are somewhat more modern, were a reaction to
Rationalism, and rejected all of that. They said that all we can really "know"
or "trust" is sensory experience. We can't build up complex mental models, and
we can't have principles - we just have to be pragmatic all the time and do
what seems best, because we can't really "know."

Kant tried to improve on both of these approaches by saying that we can't
really know anything about "true" reality, because reality has to be filtered
through the senses. There is only subjective "reality."

Ayn Rand, of whom I am a big fan, went in the other direction from Kant, but
also rejected Rationalism and Empiricism. She said that you can build complex
models of reality, but they have to be based on actual observations of reality
(i.e., sense data). To do this, she proposed a new theory of concept formation
(i.e., how to form abstractions based on observation of reality). Her approach
is very Aristotelian, which is interesting because Aristotle was the main guy
who disagreed with Plato at the very beginning. I'd highly recommend
"Introduction to Objectivist Epistemology" if you're interested in this
approach (although you may be better served by starting at a less advanced
level... still, the book is short and very accessible, albeit kind of a mind
trip).

------
jeremyarussell
So I like how he ends with that bit on patents. When I did my presentation at
the USPTO Silicon Valley roundtable event a month ago, a couple of the
presenters made the case that absolutely nothing needs to change with software
patents, because computers should be treated the same as any other kind of
machine, and so software should be considered the same as every other type of
patent. The fact of the matter is that this is simply not true: computer
software cannot be equated to physical items, and barely equates to business
flows and methods, especially given all the complexity, and the fact that
trying out a new version of software takes seconds where making a prototype
of some machine or object takes days, weeks or months. It seemed they either
didn't fully understand the difference, or they understood and do not want
the system to change, since it works greatly in their favor.

Anyone who has an opinion on patents, especially software patents, should be
keeping up with the roundtable events. And I'm not just saying that because I
went: stuff is being talked about at these events that will either be ignored
or will shape the patent system one way or another. In either case, it's in
our best interest to stay involved in the process.

Edit: Spelling

~~~
pbhjpbhj
> _to try out a new version of software happens in seconds where making a
> prototype of some machine or object takes days, weeks or months_ //

Aren't most prototypes "made" in software nowadays, with only final products
being fully produced?

In general I think I agree with what you're saying. In Europe software
patents, as such [!], are not allowed, but patents on software that has a
technical effect, i.e. performs a real physical change to a system, have
always been allowed. It's very hard to pin down the boundary, but I think
that this is something the board got right.

That said I think personally that all manufacturing rates have increased
greatly since patent terms were set and that the terms should be decreased to
compensate for this change in the rate of development.

~~~
jeremyarussell
That's a valid observation I hadn't made yet, concerning manufacturing in
general. You're right about prototypes being made with software as well; I
mostly meant that physically producing a prototype is different from
producing a prototype of software, given the click-and-compile nature of it.
All in all, though, I agree that the terms really need to be reconsidered,
but I don't see our legislative branch making any real changes to our
governmental system in general; it works out pretty well for them right now.

------
_red
From another perspective this is a modern view of the classic "I, Pencil" -
which Friedman gave a great overview of:

<http://www.youtube.com/watch?v=OlTRau_XgGs>

~~~
frozenport
Except this one isn't well written, and smacks of postmodern impossibility
instead of structuralist awe.

------
ultimoo
It's a scary thought: if all the computers in the world were destroyed in an
instant, how long would it take us to build a Core i7 processor? Decades?

~~~
dntrkv
I was just talking to another developer about this, except we took it one step
further: if all man-made objects disappeared in an instant, how long would it
take us to build a modern computer? The best we could come up with was: a
really long time. Anyone have any input or recommended reads that pertain to
this?

~~~
elwin
From observing reenactments of early American life, I think 18th-century
agricultural technology is self-sustaining and directly bootstrappable, except
for the problem of mining and working iron. So my guess is not much longer
than 300 years.

~~~
PeterisP
There exists a viewpoint that in case of a cataclysm (which would involve man-
made objects disappearing) we would never, ever progress past 18th century
tech again.

The argument is that getting from animal-powered devices to
solar/nuclear/whatever powered devices while at the same time switching from
90%-agricultural workforce to anything more progressive can happen only if
there is a cheap source of energy available - and we already have mined and
spent all of easily available fossil fuels.

Even if all kinds of fancy devices are available and constructed by rich
enthusiasts, the lack of _cheap_ steam power ensures a lack of cheap steel
etc., so the technologies never get the mass adoption required for their
improvement; there are almost no advantages to industrialization, and the
world gets stuck in feudal-agriculture systems as the local optimum.

~~~
jodrellblank
I wonder how far you'd get with, say, bamboo and bamboo-charcoal as an
abundant and renewable fuel?

[http://opinionator.blogs.nytimes.com/2012/03/13/in-
africas-v...](http://opinionator.blogs.nytimes.com/2012/03/13/in-africas-
vanishing-forests-the-benefits-of-bamboo/)

It wouldn't have quite the pop of surface-minable coal and oil, but if you
were pushing for high technology it would boil water, drive steam engines -
but maybe it wouldn't happen in England. Where you'd get the metal from is
another question.

------
dzhiurgis
Brilliant stuff that I have been looking for for quite some time. I work in
support and sometimes I really want to show this to our users, yet I would
love to see something more human. Like: Jennie from Sales Support just used
mail merge to print a letter. Simple, isn't it? What just actually happened?
The letter will go to a post box, where a postman will take it to a sorting
center, where it will be distributed and sent to the customer, who will read
the letter, discuss it with the family, and take a number of steps to decide
how to proceed further.

Or a guy from sales just received a call, but what just happened? The client
was recommended by a friend after a splendid experience with the product; the
client spent a bunch of time reading a number of sites and reviews, and he
was just about to purchase the product when the sales guy's computer crashed
and he lost the client. We always try to make people understand IT more, and
the other guys will try to make us understand their processes more, yet it is
going to be a never-ending dialogue, process, fight..

------
mfenniak
I've previously used this as an interview pre-screen question for job
candidates. It's a great question to ask; there are so many levels of details
involved.

As a web company, you generally want potential employees to at least mention
"HTTP" in the response. DNS is great. TCP/IP too. You'll definitely weed out
some people who don't have a clue.
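
For what it's worth, the skeleton of a decent answer fits in a few lines.
Here's a minimal sketch in Python (www.google.com is just the stand-in target
from the article): resolve the name via DNS, open a TCP connection, then
speak HTTP over it.

    import socket

    host = "www.google.com"  # stand-in target

    # DNS: translate the hostname into an IP address.
    ip = socket.gethostbyname(host)

    # TCP: open a reliable byte-stream connection to port 80.
    with socket.create_connection((ip, 80)) as s:
        # HTTP: send a minimal GET request and peek at the reply.
        s.sendall(b"GET / HTTP/1.1\r\nHost: www.google.com\r\n"
                  b"Connection: close\r\n\r\n")
        print(s.recv(200))

Every one of those three lines of "what happened" rests, per the article, on
its own dizzying stack.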

------
mryan
This question is one of my favourites when interviewing front/back-end
developers and sysadmins. It gives candidates the chance to talk about the
part of the stack they find interesting, so I can see what they are most
knowledgeable/passionate about as well as judging their overall level of
background knowledge.

------
zobzu
What if I don't use DOCSIS? :P

Also, we understand quite well how chips are automatically laid out by other
chips. It's not because we don't understand it that computers are "building"
computers; it's because they're way faster at those repetitive tasks (and
yes, I'm talking about automatic chip layout, for example).

Even though it's not exactly TFA's point, since TFA goes on for pages and
pages about complexity to point out that people care about what they see, not
what it really does, I thought that was worth mentioning.

------
savrajsingh
My favorite one-sentence version of this, said by a comedian (whose name I
forget): If you're left in the woods with nothing, how long before you can
send an email?

------
hereonbusiness
I'm just glad I understand technology at least to some extent. I mean
technology is often so complex I could never put together most of it myself or
even understand it down to intricate detail, but at least I have a grasp of
how it all fits together to make a pc, cpu, memory, television, radio, phone,
remote control, ...

I can't imagine going through this life without having the faintest idea of
how a lot of the stuff I use every day actually works.

------
fractalsea
The problem is not only communication between end-users and management; it's
also communication between teams, for the same reason.

Because of the huge amount of complexity described, it becomes impossible for
one developer -- or one group of developers working on the same project -- to
understand at one time much more than their current specialization. This makes
it hard to talk to peers working on other projects.

------
devindotcom
Great post. I wrote something similar a while back -
<http://techcrunch.com/2012/06/16/the-way-things-work/>

But he takes it to another level. There's a lot to be said on this, and
education is super important, but ultimately one has to sort of ... surrender,
at least to some degree.

------
Vlaix
I've long contemplated the depths of technological stacks (and knowledge as a
whole; this pertains to epistemology as well), and my opinion is that knowing
their intricacies isn't that important, as long as we manage to archive their
fundamental principles somehow (be it in our heads through study and
transmission, or in data).

------
ambrop7
Hypothetical question: if all the knowledge in the world was available to you,
how long would it take to build a modern electronic device from raw materials
and without using any existing machine in the build process? I mean the actual
time spent building something, excluding any mental work.

~~~
sjwright
Forget modern electronics, consider how long it would take to build a simple
electric toaster from scratch:

<http://www.thetoasterproject.org/>

------
zenbowman
Another Abelson/Sussman gem comes to mind: "The secret to engineering is
knowing what NOT to think about"

------
sidcool
I don't have access to Google Plus in office. Can someone post it here? I am
very curious.

~~~
superkarn
<https://plus.google.com/112218872649456413744/posts/dfydM2Cnepe>

Dizzying but invisible depth

You just went to the Google home page.

Simple, isn't it?

What just actually happened?

Well, when you know a bit about how browsers work, it's not quite that
simple. You've just put into play HTTP, HTML, CSS, ECMAscript, and more. Those
are actually such incredibly complex technologies that they'll make any
engineer dizzy if they think about them too much, and such that no single
company can deal with that entire complexity.

Let's simplify.

You just connected your computer to www.google.com.

Simple, isn't it?

What just actually happened?

Well, when you know a bit about how networks work, it's not quite that simple.
You've just put into play DNS, TCP, UDP, IP, Wifi, Ethernet, DOCSIS, OC,
SONET, and more. Those are actually such incredibly complex technologies that
they'll make any engineer dizzy if they think about them too much, and such
that no single company can deal with that entire complexity.

Let's simplify.

You just typed www.google.com in the location bar of your browser.

Simple, isn't it?

What just actually happened?

Well, when you know a bit about how operating systems work, it's not quite
that simple. You've just put into play a kernel, a USB host stack, an input
dispatcher, an event handler, a font hinter, a sub-pixel rasterizer, a
windowing system, a graphics driver, and more, all of those written in high-
level languages that get processed by compilers, linkers, optimizers,
interpreters, and more. Those are actually such incredibly complex
technologies that they'll make any engineer dizzy if they think about them too
much, and such that no single company can deal with that entire complexity.

Let's simplify.

You just pressed a key on your keyboard.

Simple, isn't it?

What just actually happened?

Well, when you know a bit about how input peripherals work, it's not quite
that simple. You've just put into play a power regulator, a debouncer, an
input multiplexer, a USB device stack, a USB hub stack, all of that
implemented in a single chip. That chip is built around thinly sliced wafers
of highly purified single-crystal silicon ingot, doped with minute quantities
of other atoms that are blasted into the crystal structure, interconnected
with multiple layers of aluminum or copper, that are deposited according to
patterns of high-energy ultraviolet light that are focused to a precision of a
fraction of a micron, connected to the outside world via thin gold wires, all
inside a packaging made of a dimensionally and thermally stable resin. The
doping patterns and the interconnects implement transistors, which are grouped
together to create logic gates. In some parts of the chip, logic gates are
combined to create arithmetic and bitwise functions, which are combined to
create an ALU. In another part of the chip, logic gates are combined into
bistable loops, which are lined up into rows, which are combined with
selectors to create a register bank. In another part of the chip, logic gates
are combined into bus controllers and instruction decoders and microcode to
create an execution scheduler. In another part of the chip, they're combined
into address and data multiplexers and timing circuitry to create a memory
controller. There's even more. Those are actually such incredibly complex
technologies that they'll make any engineer dizzy if they think about them too
much, and such that no single company can deal with that entire complexity.

Can we simplify further?

In fact, very scarily, no, we can't. We can barely comprehend the complexity
of a single chip in a computer keyboard, and yet there's no simpler level. The
next step takes us to the software that is used to design the chip's logic,
and that software itself has a level of complexity that requires going back
to the top of the loop.

Today's computers are so complex that they can only be designed and
manufactured with slightly less complex computers. In turn the computers used
for the design and manufacture are so complex that they themselves can only be
designed and manufactured with slightly less complex computers. You'd have to
go through many such loops to get back to a level that could possibly be re-
built from scratch.

Once you start to understand how our modern devices work and how they're
created, it's impossible to not be dizzy about the depth of everything that's
involved, and to not be in awe about the fact that they work at all, when
Murphy's law says that they simply shouldn't possibly work.

For non-technologists, this is all a black box. That is a great success of
technology: all those layers of complexity are entirely hidden and people can
use them without even knowing that they exist at all. That is the reason why
many people can find computers so frustrating to use: there are so many things
that can possibly go wrong that some of them inevitably will, but the
complexity goes so deep that it's impossible for most users to be able to do
anything about any error.

That is also why it's so hard for technologists and non-technologists to
communicate together: technologists know too much about too many layers and
non-technologists know too little about too few layers to be able to establish
effective direct communication. The gap is so large that it's not even
possible any more to have a single person be an intermediate between those two
groups, and that's why e.g. we end up with those convoluted technical support
call centers and their multiple tiers. Without such deep support structures,
you end up with the frustrating situation that we see when end users have
access to a bug database that is directly used by engineers: neither the end
users nor the engineers get the information that they need to accomplish their
goals.

That is why the mainstream press and the general population have talked so much
about Steve Jobs' death and comparatively so little about Dennis Ritchie's:
Steve's influence was at a layer that most people could see, while Dennis' was
much deeper. On the one hand, I can imagine where the computing world would be
without the work that Jobs did and the people he inspired: probably a bit less
shiny, a bit more beige, a bit more square. Deep inside, though, our devices
would still work the same way and do the same things. On the other hand, I
literally can't imagine where the computing world would be without the work
that Ritchie did and the people he inspired. By the mid 80s, Ritchie's
influence had taken over, and even back then very little remained of the pre-
Ritchie world.

Finally, last but not least, that is why our patent system is broken:
technology has done such an amazing job at hiding its complexity that the
people regulating and running the patent system are barely even aware of the
complexity of what they're regulating and running. That's the ultimate
bikeshedding: just like the proverbial discussions in the town hall about a
nuclear power plant end up being about the paint color for the plant's bike
shed, the patent discussions about modern computing systems end up being about
screen sizes and icon ordering, because in both cases those are the only
aspects that the people involved in the discussion are capable of discussing,
even though they are irrelevant to the actual function of the overall system
being discussed.

CC:BY 3.0

~~~
sidcool
Thanks!!

------
joezhou
This will definitely be my interview question, and I expect the exact detailed
answer!

------
chris_mahan
uh, "The proxy server is refusing connections" to plus.google.com (corporate
desktop) so, I don't know.

------
eric970
He didn't explain how DOCSIS works.

------
jacobmarble
Technologist. What a stupid word.

------
gonzo
Just another (linux) fanboi.

------
jbverschoor
JBQ BEOS! WOOOOOOOO!

Sorry, couldn't resist

