
A Complete Understanding is No Longer Possible - angrycoder
http://prog21.dadgum.com/129.html
======
ilaksh
Article makes a good point. However, no one mentioned Google.

I think people are afraid to admit how often they Google things.

I started teaching myself programming when I was 7. I am 34 now. There is
plenty of stuff that I remember, but a lot more that I don't and I don't try
to. And I find out about a new piece of software, service or whatever every
day or every few days.

I am proud of the fact that I Google so much.

I think it's actually more important and more useful to be able to Google
things and quickly get the gist of a new thing (or look up some old syntax)
and drill down into a solution than it is to know lots of stuff off the top of
your head.

Mainly because there are so many tools, frameworks, and programming languages
coming out that if you don't Google first you will usually end up wasting a
lot of time.

I think the main thing holding science and technology (including information
technology of course) back right now is the limited ability to integrate
knowledge and thought and its application. Google is the closest we have come
to a solution to that problem.

But I think we may see something like a metasystem transition resulting in
some higher-level integration and organization for knowledge, technology, and
its application to problem solving. It could come through some kind of
artificial general intelligence, or maybe just some type of advanced
knowledge/semantic engineering/collaboration system. Or maybe high bandwidth
brain-computer interfaces will allow brains to be networked and even
integrated wirelessly.

------
bwarp
A complete understanding, if you ask me, is one which requires no impenetrable
"black-box abstractions" to be substituted into the abstraction hierarchy. In
the case of a CPU, you would have to have a full gate-level understanding of
it and an understanding of how the gates themselves operate.
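
To make "gate level" concrete, here is a minimal illustrative sketch (mine,
not part of the original comment) of a 1-bit full adder expressed entirely in
terms of a NAND primitive, the sort of bottom layer a "no black boxes"
understanding would have to include:

    # Illustrative only: a 1-bit full adder built from nothing but NAND.
    def nand(a, b):
        return 0 if (a and b) else 1

    def xor(a, b):
        t = nand(a, b)              # XOR from four NANDs
        return nand(nand(a, t), nand(b, t))

    def full_adder(a, b, cin):
        s1 = xor(a, b)
        total = xor(s1, cin)
        carry = nand(nand(a, b), nand(s1, cin))   # OR of the two partial carries
        return total, carry

    # sanity check against ordinary integer addition
    for a in (0, 1):
        for b in (0, 1):
            for cin in (0, 1):
                s, c = full_adder(a, b, cin)
                assert 2 * c + s == a + b + cin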

To be honest, the "complete understanding" died way before the original post.
It was the moment that they started building computers with integrated
circuits inside them (before 1970) and complexity rocketed exponentially. It
died doubly so the moment that electrical engineering and the practical side
of computer science diverged into "them" and "us".

If you want to use something which you understand, you will probably need to
buy a crate of transistors, resistors and diodes and wirewrap yourself an
entire computer from scratch PDP-7 style.

This fact is a warning: we really are building things with abstraction
hierarchies so deep that knowledge is being divided. One day we will have no
hope of comprehending anything in a lifetime.

~~~
gaius
I do all my "fun" computing these days on a BBC Micro, in assembly language. I
don't have total understanding of the ICs, it's true. But in its 64k of
addressable memory (32k RAM), I've a pretty good idea of where everything is
and what happens when. Very satisfying.

~~~
jgw
Very cool. I still have the old Osborne 1 on which I learned to program.

Interesting historical point: the BBC Micro was designed by Acorn Computers,
the company that created the ARM processors that are so ubiquitous today.

~~~
gaius
Indeed, the original ARM was designed on a BBC Micro + 6502 Co-Pro. Amazing
what "real work" you can get done on one :-)

<http://en.wikipedia.org/wiki/ARM_architecture#History>

~~~
bwarp
I was the proud owner of an ARM copro [1] many years ago (I still have the
Master it was plugged into) and the first Acorn RISC machine (an A310 with
512k RAM if I remember correctly).

They were and still are extremely powerful and productive machines.

[1]
[http://en.wikipedia.org/wiki/BBC_Micro_expansion_unit#ARM_Ev...](http://en.wikipedia.org/wiki/BBC_Micro_expansion_unit#ARM_Evaluation_System)

~~~
gaius
Returning to your original post,

 _We really are building things with abstraction hierarchies so deep that
knowledge is being divided_

Abstraction is necessary, true, but it's not clear to me what the abstraction
level we have now really gets us. In other words, say we had a BBC Micro with
a 2GHz 6502 in it. What productive computing tasks that we do now could it
_not_ do? Or let's imagine an Atari ST with a 2GHz 68000, to get us a little
more memory. What could it _not_ do, that we need to do now? I'm struggling to
think of anything.

~~~
william42
The main thing is that you'd need a new set of abstractions for security, and
then you'd need to implement HTML5 on it anyway to do all the things we can
do on a computer now.

~~~
gaius
_to do all the things we can do on a computer now_

Such as what? 99% of websites are a) screens where you enter something into
predefined fields, to be stored in a database, and/or b) screens that nicely
format for display the things that you or other people have previously
entered. They were doing that in the 1970s. Only the buzzwords change.

------
suprgeek
A complete understanding is no longer necessary. It never was. Think about any
time in history when a complete understanding WAS necessary.

Now we come to the question of when a complete understanding was possible. As
soon as the age of computing dawned with ICs (integrated circuits) rather than
vacuum tubes, the age of complete understanding was dead. Fewer and fewer
people knew the internals of the most complex invention of humanity
(computers). I think at some point in the IBM PC age, the number of people
with a complete understanding dwindled to zero.

~~~
jacquesm
> As soon as the age of computing dawned with ICs (Integrated circuits) rather
> than vacuum tubes, the age of complete understanding was dead.

That's complete nonsense.

Until the second half of the 16-bit era (before the 68K series) it was
possible to completely grok the schematic of a computer and _all_ of the
associated parts, including disk drives and the guts of each of the chips, and
to know each and every byte of software that ran on it. It was a lot of work
but it could be done.

Be it at the block level or for the die-hard fanatics all the way down to
gates (and the transistors that made up those gates).

From there forward it got harder and harder to keep the whole stack from
software to the lowest hardware layers in mind when writing code. And it
wasn't necessary anyway! After all, code is already a very high level of
abstraction. But even today it helps to know the guts of compilers, caching,
branch prediction, register optimization, the difference between various buses
and their typical latencies when writing high performance software.

So even if it is no longer possible to know each and every detail there still
is a _degree_ of necessity.

And there always will be. But now we will have to be satisfied with an
understanding limited by the resources that are at our disposal. The result is
that we specialize. Which happens in every branch of science that is
sufficiently advanced.

A single second of cycles on a modern computer is more than you could
conceivably analyze in a lifetime because of the volume. But _in principle_,
given enough patience it could be done. In principle, given enough time and
resources you _could_ understand your computer at the same level as before.
But our lives are too short and our brains too small to hold these quantities
of information so we deal with abstractions instead.
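
As a rough illustration of the volume (my numbers, assuming a hypothetical
3 GHz single core, not anything stated in the comment above):

    # Back-of-the-envelope: one second of execution, analyzed one cycle per second.
    cycles_per_second = 3_000_000_000           # ~3 GHz
    seconds_per_year = 60 * 60 * 24 * 365
    years = cycles_per_second / seconds_per_year
    print(round(years, 1))                      # ~95.1 years for ONE second of cycles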

Just like you can't have a complete understanding of every atom in the banana
that you eat there is a degree of useful knowledge to which you can abstract
it: ripeness, flavor, color, general quality and so on.

Plenty of people have opened up VLSI chips and put them under a microscope
just as plenty of people have looked at a thin slice of banana. And in both
cases they came away with an increased understanding. It's all about the
investment in resources, not about the possibility.

The moment we can no longer completely understand our machines is the moment
they cease to be deterministic. And by that time chances are that we'll call
them biology.

~~~
astrodust
The specifications for the original IBM PC were published and could be mostly
understood by a high-school student. Things get a bit crazy when the
unfettered prominence of the CPU starts to wane and the GPU and bridge
chipsets take over more and more responsibility for making things work.

Even now it's not like you _can't_ understand it, but the amount of time you'd
have to invest in understanding it is well beyond the commitment level of
most.

------
obeattie
Relevance: I, Pencil <http://en.wikisource.org/wiki/I,_Pencil>

------
js2
I learned on an Apple II. It came with the manuals necessary to almost fully
understand it, including a hardware manual with a schematic of the
motherboard. As I recall, the ICs were treated as black boxes by that manual,
but barely so. By the Apple IIe, I think this level of documentation had been
dropped by Apple, but they were still at least making an effort to explain the
inner workings of the machine:

[http://download.info.apple.com/Apple_Support_Area/Manuals/de...](http://download.info.apple.com/Apple_Support_Area/Manuals/desktops/AppleIIeUG.PDF)

 _Ch 4: The Inside Story.

When you connected your disk drive to your Apple IIe, you popped the top and
got your first look at the integrated circuits, or chips, inside the computer.
In this chapter, you'll learn what some of the chips do, and how they are used
to process your data.

..._

It's interesting that the more complicated the computer got, the sparser the
end-user documentation became.

------
sherwin
Yes, the author's observation is true, but I don't think it's particularly
interesting for two reasons:

1) This isn't anything new. A complete understanding is not only _no longer_
possible, it was _never_ possible. What level of detail counts as a "complete"
understanding? Even when computers were much simpler, you probably wouldn't
have a thorough understanding of every layer of abstraction, from software to
hardware to the physics of the underlying electrical components.

2) This sort of complete understanding is unnecessary anyway; these references
exist as references for a reason. To do your job, you need familiarity and
layered understandings of the relevant processes, and when you absolutely need
to know the fine details of something, you can simply look it up.

~~~
Spearchucker
Respectfully, I disagree.

Such a complete understanding is eminently possible. It is as easy to
understand as any other complex system. You work at a high level of
abstraction to understand the breadth of complexity, and then work down into
the component level to understand the depth of complexity. Rinse, and repeat
for every subsequent level.

The caveat is that it takes time and dedication.

I can't comment on whether it's necessary or not. There might be scenarios
where it is. There might be other scenarios where you're better off studying
kung fu, because that might be a quicker route to enlightenment.

~~~
DanBC
Clifford Stoll, in "The Cuckoo's Egg"[1] describes the hardest interview he
ever had. The interviewer asked "Why is the sky blue?" and then, after Stoll
gave his answer, asked "Why?" This process repeated several times, until Stoll
was describing in great detail some advanced physics and chemistry and math.

In that spirit:

Please could you describe electron and hole flow in semiconductors? Why? Why?
Why?

There are lots of bits of computing that work because they work, even though
we don't really know why they work.

~~~
gtani
<http://news.ycombinator.com/item?id=3606653>

------
DodgyEggplant
As usual, it's already written in the bible:
<http://www.joelonsoftware.com/articles/LordPalmerston.html>

"There was a time when if you read one book by Peter Norton, you literally
knew everything there was to know about programming the IBM-PC"

~~~
JulianMorrison
That's mainly because DOS couldn't _do_ anything. If you want to reproduce the
capabilities of DOS - single-tasking, single-threaded file and screen I/O in
text or simple non-windowed graphics - you really wouldn't need to learn much
more nowadays.

~~~
gaius
A lot of people did a lot of real work on DOS (or CP/M, or whatever) systems.
And they knew a lot more than today's Ruby On Rails For Dummies crowd.

~~~
JulianMorrison
Not comparable. The then-contemporary equivalent to "today's Ruby On Rails For
Dummies crowd" was hobbyist BASIC programmers.

The modern day equivalent to people making DOS jump hoops would be
professional C++ games programmers.

------
lhnz
A 'complete' understanding including every single tiny detail has never been
possible in any complex subject _ever_, and this isn't as bad as you seem to
think. Certain knowledge has more utility in your craft than other knowledge
[0].

Also, and more importantly, some knowledge, when held alongside other
knowledge, creates emergent understanding. You are able to intuit from core
principles. The ability to understand is far more important than the ability
to recall.

[0] <http://en.wikipedia.org/wiki/Pareto_principle>

------
eterps
That's why we need systems to be more like this:
<http://www.vpri.org/pdf/tr2011004_steps11.pdf>

~~~
eterps
And even though the STEPS project above is about software, The Elements of
Computing Systems [ <http://amzn.to/xMTkWX> ] proves that hardware can be
understandable as well. Of course that architecture is far too simple to be
useful today, but there are still a lot of opportunities that we are missing
today: <http://bit.ly/ySQf25>

------
erwan574
I agree with the point of this article, and I find this quite annoying.

The problem is not that a single individual cannot master everything, but that
the reward for learning a part of the book pile becomes lower and lower in
proportion to the required effort. This trend is clearly discouraging students
from "jumping" into some of these fundamental reference books. And teachers
probably find it very difficult to provide applied exercises for their
theoretical teaching.

"Which do you choose ? How do you know which one will be useful in 5 years ?"

My suggestion is that academia standardize on a virtual machine, like what
Knuth started to do with MIX and MMIX. These VMs can provide a stable, simpler
target to help students and professionals enjoy a "more complete
understanding". I expect this to make learning a lot more fun.

When there is a clear breakthrough in technology (like multicore systems),
then a new version of the VM can be specified. This can help to improve
communication between academia and the real world: by the way, MMIX was
specified with input from experienced chip architecture designers.

~~~
tluyben2
If you learn the basics well (logic, hardware, functional programming,
imperative programming, etc.), and by well I do not mean a bit of practical
tinkering but a few years of studying theoretical foundations in depth and
applying them to several non-trivial practical cases, then this will be useful
in 5 years. It will be useful for the rest of your life.

My theoretical foundations in CS, AI, math (especially discrete) and physics
from the late 80s/early 90s allow me to read cutting-edge papers (memristors,
spintronics), learn new languages fast, read 'modern' source code and
understand modern hardware architectures. This is 'knowledge' from 20 years
ago; not that much has changed, tbh, at the low levels.

Actually, my dad enjoyed his education under Dijkstra in Eindhoven (among
others, but this one I now find cool :) and he also understands hardware and
software at a low level after almost 50 years.

I don't see this panicky 'what you learn is obsolete when you finish uni'; it
might be true for some studies, but for math/physics/CS it definitely isn't
(again, IFF the education is not flaky; there are plenty of unis I know of
which teach CS courses that are WAY too practical; students coming from there
'know' C# and find it very hard to do anything else. That's not a solid
foundation at all. Example: the university of Dubai).

------
erwan574
A few books exist that try to restore some sort of "complete understanding":

Software programming starting from assembly language under Linux:
[http://www.amazon.com/Programming-Ground-Up-Jonathan-
Bartlet...](http://www.amazon.com/Programming-Ground-Up-Jonathan-
Bartlett/dp/0975283847)

A study going from chip design to high-level programming:

[http://www.amazon.com/Elements-Computing-Systems-Building-
Pr...](http://www.amazon.com/Elements-Computing-Systems-Building-
Principles/dp/026214087X)

IMHO this "problem" of "incomplete vision" started when device drivers were
introduced in general-purpose OSes. Professional application developers
started to target APIs instead of hardware. A milestone in this trend for me
is Windows 3.0 (1990). This also marks the demise of the fixed-hardware
computers that hobbyists had favored until then.

------
tluyben2
For knowing 'everything' about computers, I suggest buying a few 8-bit 80s
systems with everything included for <$50, buying
<http://www.xgamestation.com/> and mastering that hardware/software. It's
relaxing, a lot of fun, and you'll 'get' modern hardware/architectures/OSes as
well after that. The shame is that you cannot mold modern hardware/software
nearly as easily, so after learning all this old stuff you might end up
spending more time making hardware extensions for those systems rather than
writing your RoR CRUD screens.

------
olalonde
I think a more realistic goal for our time is to aim at being at level 1 of
ignorance (ignorance with awareness) [1].

[1] [http://andrewboland.blogspot.com/2008/08/five-levels-of-
igno...](http://andrewboland.blogspot.com/2008/08/five-levels-of-
ignorance.html) (just submitted on HN:
<http://news.ycombinator.com/item?id=3643445>)

------
haraldk
My take on this is that, in our society, one shouldn't try to understand
everything. We get smarter as a whole if different individuals focus on
different areas. I'm not saying no one should collaborate, but 6 billion
jacks-of-all-trades, masters-of-nothing are simply not smart enough to care
for the next billion.

Read "The Shallows" by Nicholas Carr, it discuss this very topic. Humanity had
the the same discussion when the printing press became normal. People could
start to look up things in books instead of remembering everything (analogy:
you're searching the interwebs).

~~~
epscylonb
I agree, but apparently we are in the age of the polymath.

------
ericHosick
I think it's the addition of new information/technology without reflecting on
and simplifying existing information/technology which is leading to this
problem of information overload.

------
bnegreve
Well, on the other hand, if you know a bit about hardware, it is fairly easy
to build a simple yet fully functional processor from basic components;
implementing an OS with the most common features, and a compiler with advanced
error checking and an optimizer, is also doable. You probably need to be
familiar with the math as well. Anyone with this knowledge can then become an
expert in whatever field he is interested in. Why would you need to know
_everything_, from electrons to software?

------
alimbada
My strategy is just-in-time learning. Anyone attempting to eager load all the
knowledge is either crazy or exceptionally brilliant.

------
thesash
There was an article on HN in the last year that addressed this, i.e. the
amount of knowledge required to understand a complex application from
distributed software to assembly language to hardware. I think the title was
something about unknowable depths of complexity; does anyone remember the
link?

------
mark_l_watson
The article talks about what I would call "pre-emptive understanding," but
what about "just in time understanding?"

Have a good general understanding of a platform (all the way down to hardware,
if you wish) and know where to look when either curiosity or need arises.

------
olalonde
I think being on level

------
rsanchez1
That's kind of the point. Modularity allows all these complicated parts to
work together as a unified whole, without the creator of any one part needing
to know exactly how the other part works. It's how high school kids are able
to write their own computer programs instead of having to go to college for 12
years. It's what allowed cars to be assembled en masse. It's not a problem,
and if it ever is, at least you have 11,000 pages of text from the creators
giving you a complete explanation.

------
its_so_on
Let's say you've just had a baby, a little bundle of joy, and you want to read
all the books about babies so you can understand it completely.

You can "understand a baby" just fine. You can never gain a "complete
understanding" of man.

------
codeonfire
11000 pages at 3 min per page is roughly 550 hours of reading, which is
roughly 3 months of full time reading. Double that for some exercises and
review and a good understanding is doable in about 6 months.
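
For what it's worth, the arithmetic holds up (the 40-hour reading week is my
assumption, not the commenter's):

    pages = 11_000
    minutes_per_page = 3
    hours = pages * minutes_per_page / 60    # 550 hours of reading
    weeks = hours / 40                       # 13.75 forty-hour weeks
    print(hours, weeks, round(weeks / 4.33, 1))   # ~3.2, i.e. "roughly 3 months"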

~~~
barrkel
There is no way you can do exercises at a rate of 3 minutes per page of
material, for material of any density. For many concepts, you'd do well to do
exercises at a rate of one day per page; exercises that would be essential to
a good understanding.

I remember first encountering tree views in Borland Delphi 2. Took me a good
couple of days before I got the hang of them, and how they interacted with
image lists, to fully get how the recursive structure held together, how the
dependencies interlinked.

Similarly with OpenGL. Took quite some time to get the hang of the sequencing
of matrix operations, world and camera transforms, etc. - all just the very
basics, before getting into anything interesting at all.

For real learning, 8 hours a day is simply unrealistic. It wasn't until the
third or fourth time I tried many things that I really understood them, to the
point that I didn't have to follow any recipes anywhere.

------
csomar
That's not true. You can have complete understanding. You actually should
have, if you want to be recognized in the industry. You need more than just
learning, you need to live it.

Pick an operating system and an environment. I've been using Windows for more
than 10 years now, and that's why I'm not moving to OS X any time soon. I'm
incredibly good with Windows, and it can handle all of my daily business work
and software development, as well as entertainment.

I've been doing Web development for more than 4 years. It's an overwhelming
field, if you ask me. Lots of stuff: Web servers (picked Abyss, played with
Apache), HTTP, HTTPS (and SSL), server side (picked PHP), databases (picked
MySQL, played with Mongo), HTML5 (lots of new APIs), JavaScript (not that
simple, and a new spec is coming), CSS3, Photoshop (some skills to get by).

And then there are lots of other things: caching, optimization, patterns, ORM,
CMSes (WordPress, Drupal...), frameworks (CodeIgniter, Symfony...), libraries
(jQuery, Prototype...), best practices, security (SQL injection,
encryption...), unit testing (Jasmine, PHPUnit...), code inspection (JSLint),
IDEs (Aptana, PhpStorm...), mobile (Titanium, Sencha...)

And certainly for business (you probably run your server on Ubuntu) there are
Postfix and Dovecot for email, SFTP for an FTP server, Mercurial or Git for
version control, and some other stuff to monitor your servers (and maybe
services).

And then there is more stuff, if you run an online business like HN,
SmashingMagazine, GitHub, Envato, AppStore and all social networks, magazines,
markets...

So how can you do it? First, it'll take years. Second, you must live it. When
you live it, you'll forget about the quantity, and with time you'll keep all
those things in your head.

~~~
charliesome
I doubt you have _complete_ understanding though.

Do you understand how SSL works? Do you know your way around PHP's internals?
Do you know exactly how TCP, IP, Ethernet, etc work? Could you write a
Photoshop plugin? Do you understand how parsers and compilers work?

I doubt you (or anyone else for that matter) could answer yes to all those
questions. That's the point I think this blog post is getting at. It's no
longer possible (question for the reader: was it ever possible, given how deep
the rabbit hole goes?) to have a complete understanding of how all the
technologies you use every day work.

A lot of people have a decent understanding of most things they need to deal
with. That's probably good enough to get by.

~~~
masklinn
> was it ever possible, given how deep the rabbit hole goes?

You probably couldn't handle the breadth of it, but the depth of computing in
the 60s was pretty shallow. You had maybe a dozen levels of (pretty shallow)
abstractions; even "higher-level" languages (like Fortran) were pretty
uncommon, so you mostly had assembly, an assembler, machine code and the
hardware you were running on.

You probably couldn't hold the whole machine in your head, but you could hold
a good idea of its cross-section.

~~~
kruhft
I was programming the Game Boy about 10 years ago, and it was an amazing
machine. You literally could hold the entire machine in your head when writing
assembly code. It really let you push the machine to its limits, right down
to the clock cycle.

~~~
masklinn
> It really let you push the machine to its limits, right down to the clock
> cycle.

This is still commonly done for video game consoles (although it's
significantly harder), especially late in the cycle when the console's
hardware is well understood and its behavior is mapped, in order to wring
every bit of performance out of the machine.

Probably not to the point it was in the 16-bit era and earlier, but it's one
of the "edges" consoles have over the PC's raw power.

