A Complete Understanding is No Longer Possible (dadgum.com)
122 points by angrycoder 2035 days ago | 77 comments

Article makes a good point. However, no one mentioned Google.

I think people are afraid to admit how often they Google search things.

I started teaching myself programming when I was 7. I am 34 now. There is plenty of stuff that I remember, but a lot more that I don't and I don't try to. And I find out about a new piece of software, service or whatever every day or every few days.

I am proud of the fact that I Google so much.

I think it's actually more important and more useful to be able to Google things, quickly get the gist of a new thing (or look up some old syntax), and drill down into a solution than it is to know lots of stuff off the top of your head.

Mainly because there are so many tools, frameworks, and programming languages coming out that if you don't Google first, you will usually end up wasting a lot of time.

I think the main thing holding science and technology (including information technology, of course) back right now is the limited ability to integrate knowledge, thought, and their application. Google is the closest we have come to a solution to that problem.

But I think we may see something like a metasystem transition resulting in some higher-level integration and organization for knowledge, technology, and its application to problem solving. It could come through some kind of artificial general intelligence, or maybe just some type of advanced knowledge/semantic engineering/collaboration system. Or maybe high bandwidth brain-computer interfaces will allow brains to be networked and even integrated wirelessly.

A complete understanding, if you ask me, is one which requires no impenetrable "black-box abstractions" to be substituted in the abstraction hierarchy. In the case of a CPU, you would have to have a full gate-level understanding of it and an understanding of how the gates operate.
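As a toy illustration of what "gate level" bottoms out in (my example, not the poster's): every Boolean function can be built from NAND gates alone, so a gate-level understanding of a CPU reduces, in principle, to constructions like this one-bit half adder:

```python
# Toy sketch: a half adder built entirely from NAND, the universal gate.
def nand(a, b):
    return 1 - (a & b)

def xor(a, b):
    # XOR built from four NANDs, the textbook construction
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def half_adder(a, b):
    """Return (sum, carry) for one-bit addition using only NAND-derived gates."""
    return xor(a, b), 1 - nand(a, b)  # carry = AND(a, b) = NOT(NAND(a, b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
```

Chain enough of these (full adders, registers, multiplexers) and you have an ALU; that is the sense in which the abstraction hierarchy can, with enough patience, be followed all the way down.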

To be honest, the "complete understanding" died way before the original post. It died the moment that they started building computers with integrated circuits inside them (before 1970) and complexity rocketed exponentially. It died doubly so the moment that electrical engineering and the practical side of computer science diverged into "them" and "us".

If you want to use something which you understand, you will probably need to buy a crate of transistors, resistors and diodes and wirewrap yourself an entire computer from scratch PDP-7 style.

This fact is a warning: We really are building things with too deep abstraction hierarchies causing knowledge to be divided. One day we will have no hope of comprehending any of it in a lifetime.

By that standard, blacksmiths in the Middle Ages did not 'understand' the forging of swords. Only modern materials science allowed us to 'understand' why forging creates harder metal.

This is a semantic discussion about the meaning of 'understanding': does it mean you can globally explain how the system works and could come to understand the smallest detail of every part? Or does it mean you understand the smallest detail of every part?

The latter is a nonsensical definition: if that is the case, then nobody understands processors, because nobody understands transistors, because nobody understands quantum mechanics, because nobody understands why the fundamental forces act in certain ways. Nobody understands Newton's laws, nobody understands where babies come from and nobody understands what it means to perform a 'computation'[1].

Of course, that means the former was also a nonsensical definition.

[1] http://plato.stanford.edu/entries/church-turing/

> Then blacksmiths in the Middle Ages did not 'understand' the forging of swords. Only modern materials science allowed us to 'understand' why forging creates harder metal.

The interesting way to construe the article's claim is not that it's impossible to know everything, but that it's impossible to know everything that people already know about the field you work in.

Were there blacksmiths who knew everything anyone knew about forging swords? Did Newton or Da Vinci know everything anyone knew about the various fields they were expert in? Are there farmers now who know everything anyone knows about how farming works? The article claims that at some point it became a certainty that programmers cannot know everything that anyone knows about how to use the tools they use and what those tools do. The stack is too complex. That's at least a sensible and interesting claim.

No blacksmith, farmer or famous scientist/artist/Renaissance man in history has ever known everything there was to know (about the field they worked in). Even back then, there was already more knowledge being produced than they could possibly ever obtain. As with us, the ultimate problem is a lack of time. Whether the time required to travel to a neighbouring city to learn from their guild or the time required to read a paper on the internet, the problem is time.

All stacks have always been too complex. A farmer, to produce optimally, would have to understand everything there is to know about meteorology, biology, sociology, economics, and earth sciences concerning his specific area. No farmer ever has.

In that sense of 'understand', nobody has ever completely understood any system they worked with/in. Given a system and a person, you can come up with a legitimate detail question about the system that you also believe the person couldn't answer.

Semantically speaking, understanding is simply knowing enough to recreate something, preferably with your own acquired skills and knowledge. We're all just fancy parrots wrapped up in monkey bodies.

To be honest, the bedrock abstraction should stop at "what humans can realistically create with their own hands from nothing". You can make your own transistor quite easily and Ebers-Moll provides a nice set of rules to work with.
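For the curious, a rough sketch of the kind of rule Ebers-Moll gives you, simplified here to the forward-active case (the parameter values below are illustrative assumptions, not from the post):

```python
import math

def collector_current(v_be, i_s=1e-15, v_t=0.02585):
    """Simplified forward-active Ebers-Moll relation for a BJT:
    I_C ~= I_S * (exp(V_BE / V_T) - 1).
    i_s is an assumed saturation current; v_t is the thermal voltage
    at room temperature (~25.85 mV)."""
    return i_s * (math.exp(v_be / v_t) - 1)

# A base-emitter voltage around 0.6-0.7 V yields currents in the
# microamp-to-milliamp range, which is where the familiar hobbyist
# rules of thumb for biasing come from.
print(collector_current(0.65))
```

The point being: a handful of equations like this really is enough to design working circuits with home-made transistors, which is why it makes a plausible "bedrock" for understanding.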

The quantum physicists and philosophers can then keep arguing about technicalities and let the rest of the world observe, understand, and create.

> what humans can realistically create with their own hands from nothing

could medieval blacksmiths really create a sword from nothing but their own hands? how would they get the iron? would they have the knowledge to locate veins of iron and then mine it? could they build the kiln and forge the equipment necessary for forging?

arguably, by this standard, even farmers could not farm. farmers may know how to plant their crops, but without current crops to gather seeds from, would they know how to find the strains of plants that suit farming and then gather the seeds from those plants?

Well actually, I reckon yes. Much to my parents' latent terror (I didn't tell them until they came back from holiday), at the age of 15 a friend and I built a small blast furnace using some ceramic pots, bits of stone lying around and a hoover. I managed to get 50g of what looked like pig iron out of that bugger before it basically fell to bits and set fire to the lawn. That was from about 2kg of ore I found at the bottom of a cliff face next to a beach at Skegness. Was great fun! (flux was limestone from the shed, coke was 3 large bags of barbecue charcoal)

I'm sure that and the rest of the process wasn't beyond people with a higher budget and requirements...

As for plants - it's all knowledge and experience. There is no abstraction. Eat this, don't eat that. I grow quite a few edible plants myself and there is little abstraction.

> You can make your own transistor quite easily

Not from scratch no. Not even with all the intermediate knowledge available but none of the tooling and technology. Unless by "quite easily" you mean "in under a dozen generations".

You're joking right? Taking apart an existing diode to build a transistor from it comes nowhere close to making the transistor from scratch. Not to mention the clamps and files, the microscope, the phosphorous bronze for the contacts, ...

This video is great, but it's a century or two of progress away from "from scratch".

I do all my "fun" computing these days on a BBC Micro, in assembly language. I don't have total understanding of the ICs, it's true. But in its 64k of addressable memory (32k RAM), I've a pretty good idea of where everything is and what happens when. Very satisfying.

Very cool. I still have the old Osborne 1 on which I learned to program.

Interesting historical point: the BBC Micro was designed by Acorn Computers, who went on to create the ARM processors that are so ubiquitous today.

Indeed, the original ARM was designed on a BBC Micro + 6502 Co-Pro. Amazing what "real work" you can get done on one :-)


I was the proud owner of an ARM copro [1] many years ago (I still have the Master it was plugged into) and the first Acorn RISC machine (an A310 with 512k RAM if I remember correctly).

They were and still are extremely powerful and productive machines.

[1] http://en.wikipedia.org/wiki/BBC_Micro_expansion_unit#ARM_Ev...

Returning to your original post,

> We really are building things with too deep abstraction hierarchies causing knowledge to be divided

Abstraction is necessary, true, but it's not clear to me what the abstraction-level we have now really gets us. In other words, say we had a BBC Micro with a 2GHz 6502 in it. What productive computing tasks that we do now could it not do? Or let's imagine an Atari ST with a 2GHz 68000, to get us a little more memory. What could it not do, that we need to do now? I'm struggling to think of anything.

It doesn't get us anything at all other than a fucking huge rabbit hole to stare down every time you do anything. Let's look at a pretty naff case for the .NET CLR on x86 for Windows Workflow:

application -> xaml -> framework -> server -> container -> c# -> cil -> bytecode -> x86 instructions -> microcode.

Now forth on a 68000:

Forth screen -> 68k instructions

To be honest, for what I consider to be life and death stuff, a 10MHz 68000 is good enough (I have one in my TI92 calculator).

The main thing is that you'd need a new set of abstractions for security, and then you'd need to implement HTML5 on it anyways to do all the things we can do on a computer now.

> to do all the things we can do on a computer now

Such as what? 99% of websites are a) screens where you enter something into predefined fields, to be stored in a database and/or b) screens that format nicely for display things that you or other people have previously entered. They were doing that in the 1970s. Only the buzzwords change.

HTML5 is a piss poor evolutionary abstraction of what was effectively SGML and scripting carnage.

If you could start again, would you really end up with HTML5?

Any idea where someone (such as me) born too late to have an original Micro might be able to get hold of one?

If you're in the UK, there's a liquid market in BBCs, C64s and similar on eBay. There are a few sellers who refurb them (e.g. new power supplies, cleaned up cases etc). You can easily adapt a BBC to use SCART too (the lead will cost about a fiver) and use it on a modern TV, if you don't fancy using a big old CUB monitor.

I hope that the Raspberry Pi will be a re-run of the BBC Micro; they could do a lot worse than bundle it with a modern version of BBC BASIC, which was very advanced for its time.

Brandy Basic is pretty much that: http://sourceforge.net/projects/brandy/

Get a Cub monitor though - it's not the same without one!

I have two :-)

Cool - I bow before your Eliteness! (pun intended ;-)

Good for you!

I (the parent of your post) actually have a BBC Master (and the advanced reference manuals) lying around still for precisely that reason. It's quite a handy and very powerful little machine to be honest.

It even runs LISP (AcornSoft LISP).

Will the game change once again when 3D printers are able to print circuits?

Not until the printers can make their own CPUs and complex parts such as threaded rods. That is a long time off.

3D printers are supposedly promoted as printing themselves, i.e. as self-replicating. They are not. They print a small fraction of their own non-complex parts.

A complete understanding is no longer necessary. It never was. Think about any time in history where a complete understanding WAS necessary.

Now we come to the question of when a complete understanding was possible. As soon as the age of computing dawned with ICs (Integrated circuits) rather than vacuum tubes, the age of complete understanding was dead. Fewer and fewer people knew the internals of the most complex invention of humanity (computers). I think at some point in the IBM PC age, the number of people with a complete understanding dwindled to zero.

> As soon as the age of computing dawned with ICs (Integrated circuits) rather than vacuum tubes, the age of complete understanding was dead.

That's complete nonsense.

Until the second half of the 16 bit era (before the 68K series) it was possible to completely grok the schematic of a computer and all of the associated parts, including disk drives and the guts of each of the chips, and to know each and every byte of software that ran on it. It was a lot of work but it could be done.

Be it at the block level or for the die-hard fanatics all the way down to gates (and the transistors that made up those gates).

From there forward it got harder and harder to keep the whole stack from software to the lowest hardware layers in mind when writing code. And it wasn't necessary anyway! After all, code is already a very high level of abstraction. But even today it helps to know the guts of compilers, caching, branch prediction, register optimization, the difference between various buses and their typical latencies when writing high performance software.

So even if it is no longer possible to know each and every detail there still is a degree of necessity.

And there always will be. But now we will have to be satisfied with an understanding limited by the resources that are at our disposal. The result is that we specialize. Which happens in every branch of science that is sufficiently advanced.

A single second of cycles on a modern computer is more than you could conceivably analyze in a lifetime because of the volume. But in principle, given enough patience it could be done. In principle, given enough time and resources you could understand your computer at the same level as before. But our lives are too short and our brains too small to hold these quantities of information so we deal with abstractions instead.
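To put a number on that claim (the 2 GHz clock rate and the one-cycle-per-second pace of analysis are my assumptions):

```python
# Back-of-envelope: how long would it take to examine, one by one,
# every cycle executed in a single second on a modern CPU?
cycles = 2_000_000_000               # one second on an assumed 2 GHz core
seconds_per_cycle_review = 1         # suppose one second of analysis per cycle
years = cycles * seconds_per_cycle_review / (60 * 60 * 24 * 365)
print(round(years, 1))               # roughly 63 years: one second, one lifetime
```

So "more than you could conceivably analyze in a lifetime" is not hyperbole even at a wildly optimistic pace of analysis.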

Just like you can't have a complete understanding of every atom in the banana that you eat there is a degree of useful knowledge to which you can abstract it: ripeness, flavor, color, general quality and so on.

Plenty of people have opened up VLSI chips and put them under a microscope just as plenty of people have looked at a thin slice of banana. And in both cases they came away with an increased understanding. It's all about the investment in resources, not about the possibility.

The moment we can no longer completely understand our machines is when they stop being deterministic. And by that time, chances are we'll call them biology.

The specifications for the original IBM PC were published and could be mostly understood by a high-school student. Things get a bit crazy when the unfettered prominence of the CPU starts to wane and the GPU and bridge chipsets take over more and more responsibility for making things work.

Even now it's not like you can't understand it, but the amount of time you'd have to invest in understanding it is well beyond the commitment level of most.

I learned on an Apple II. It came with the necessary manuals to almost fully understand it, including a hardware manual with a schematic of the motherboard. As I recall, the ICs were treated as black boxes by that manual, but barely so. By the Apple IIe, I think this level of documentation had been dropped by Apple, but they were still at least making an effort to explain the inner workings of the machine:


Ch 4: The Inside Story.

When you connected your disk drive to your Apple IIe, you popped the top and got your first look at the integrated circuits, or chips, inside the computer. In this chapter, you'll learn what some of the chips do, and how they are used to process your data.


It's interesting that the more complicated the computer got, the sparser the end-user documentation became.

Yes, the author's observation is true, but I don't think it's particularly interesting for two reasons:

1) This isn't anything new. A complete understanding is not only no longer possible, it was never possible. What level of detail counts as a "complete" understanding? Even when computers were much simpler, you probably wouldn't have a thorough understanding of every layer of abstraction, from software to hardware to the physics of the underlying electrical components.

2) This sort of complete understanding is unnecessary anyway; these references exist as references for a reason. To do your job, you need familiarity and layered understandings of the relevant processes, and when you absolutely need to know the fine details of something, you can simply look it up.

Respectfully, I disagree.

Such a complete understanding is eminently possible. It is as easy to understand as any other complex system. You work at a high level of abstraction to understand the breadth of complexity, and then work down into the component level to understand the depth of complexity. Rinse, and repeat for every subsequent level.

The caveat is that it takes time and dedication.

I can't comment on whether it's necessary or not. There might be scenarios where it is. There might be other scenarios where you're better off studying kung fu, because that might be a quicker route to enlightenment.

I think we are talking about different meanings of "understand". You seem to be using the general meaning -- any expert in a field, given enough time and dedication, can come to understand both the high-level complexity and the individual components of a complex system; I agree with you here.

But the sort of understanding I'm referring to, and what I believe the author is referring to, is a comprehensive, infinitely detailed understanding -- the sort of understanding of OS X APIs after internalizing 5000+ pages of reference.

I think this sort of "complete" understanding is not what we mean when we say an expert has an excellent understanding of a subject; instead, we mean that the expert knows what is important.

For example, you could be an expert (and have an excellent grasp of the material) of some programming language, but if someone asked you about the exact name of a rarely used library call, you might not know it off the top of your head. You would know where to look it up, and you would know how to use it, and you would know that that library call exists, but you wouldn't recall the exact name.

I think many people can obtain these expert understandings of complex systems, but that's not what the author is referring to (or if he is, then I think his argument is unsound because no one would approach gaining an expert understanding of OS X by reading 11,000 pages of reference material). The author is talking about an unnecessarily detailed "complete" understanding in which you know everything about the system.

Clifford Stoll, in "The Cuckoo's Egg"[1] describes the hardest interview he ever had. The interviewer asked "Why is the sky blue?" and then, after Stoll gave his answer, asked "Why?" This process repeated several times, until Stoll was describing in great detail some advanced physics and chemistry and math.

In that spirit:

Please could you describe electron and hole flow in semiconductors? Why? Why? Why?

There are lots of bits of computing that work because they work, even though we don't really know why.

…never possible…

I disagree. Consider a computer like the Apple ][. I didn't 'completely understand' it at the time, but with what I've learned since (including a CS degree), it is now very plausible to me that it could be exhaustively understood, down to every wire and line of OS/firmware code. I can see the boundaries of the areas I don't know clearly, and know that a bounded amount of extra effort could get me down through every 6502 opcode, or all relevant functional physical/electrical properties of all the subsystems.

(While I'm still not strong in the electrical engineering parts, the exercise in my undergrad EECS class where we built a 4-bit CPU from TTL components was the key for me in connecting the electrical world of circuits and logic gates to the abstractions of executable software.)

Maybe, nowadays, the typical systems (like the article's MacBook Air example) are too big, drawing from too many specialties and involving too many details, for any one person to decompose 'all the black boxes' with comprehensive understanding. But just a few decades ago, that wasn't yet the case.

What I think was possible was understanding down to the level that you could reliably know what caused the Apple II to act in any given way.

Was it possible all the way down to the transistor level? Maybe not, but it didn't have to be in order to understand the behavior.

It was certainly possible down through the IC level, and there were any number of people who achieved that sort of mastery.

I don't believe that's still possible with any consumer general purpose computer. I think there are people out there with that level of understanding of specialized devices, probably up to and including things like Xbox and PS3. I don't know if there are people that understand a MacBook Air to that level, or a desktop PC, or their webserver. Maybe, but it seems unlikely.

as usual, it's already written in the bible: http://www.joelonsoftware.com/articles/LordPalmerston.html

"There was a time when if you read one book by Peter Norton, you literally knew everything there was to know about programming the IBM-PC"

That's mainly because DOS couldn't do anything. If you want to reproduce the capabilities of DOS - single tasking, single threading file and screen IO in text or simple non-windowed graphics - you really wouldn't need to learn much more nowadays.

A lot of people did a lot of real work on DOS (or CP/M, or whatever) systems. And they knew a lot more than today's Ruby on Rails For Dummies crowd.

Not comparable. The then-contemporary equivalent to "today's Ruby On Rails For Dummies crowd" was hobbyist BASIC programmers.

The modern day equivalent to people making DOS jump hoops would be professional C++ games programmers.

I'll take a guess and say that that time was well in the past by 1990. I remember the confusion over XMS / EMS and how to write programs that could use extended / expanded memory.

It all depends on how far into the system you want to look. How many users of the FAT file system actually learnt the innards of NTFS? How many programmers specialized in Borland IDEs rather than Watcom? There's not really that much more to learn these days if you want to just write software; understanding the entire system is a matter of jumping down the rabbit hole. If I understand how electron tunneling prevents processor architecture going much beyond the current state of the art, do I have a greater understanding of computers than someone who knows the OpenGL API backwards?

I agree. And in case you have some specific bottleneck you can always go into that. As knowledge grows, it happens in every science. However, it's another feeling when you have a good grasp of the entire system.

A 'complete' understanding including every single tiny detail has never been possible in any complex subject ever, and this isn't as bad as you seem to think. Certain knowledge has more utility in your craft than other knowledge [0].

Also, and more importantly, some knowledge, when held together with other knowledge, creates emergent understanding of still other knowledge. You are able to intuit from core principles. The ability to understand is far more important than the ability to recall.

[0] http://en.wikipedia.org/wiki/Pareto_principle

That's why we need systems to be more like this: http://www.vpri.org/pdf/tr2011004_steps11.pdf

And even though the STEPS project above is about software, The Elements of Computing Systems [ http://amzn.to/xMTkWX ] proves that hardware can be understandable as well. Of course that architecture is far too simple to be useful today, but there are still a lot of opportunities that we are missing today: http://bit.ly/ySQf25

I agree with the point of this article, and I find this quite annoying.

The problem is not that a single individual cannot master everything but that the reward for learning a part of the book pile becomes lower and lower in proportion to the required effort. This trend clearly discourages students from "jumping" into some of these fundamental reference books. And teachers probably find it very difficult to provide applied exercises for their theoretical teaching.

"Which do you choose? How do you know which one will be useful in 5 years?"

My suggestion is that academia standardize on a virtual machine, like what Knuth started to do with MIX and MMIX. These VMs can provide a stable, simpler target to help students and professionals enjoy a "more complete understanding". I expect this to make learning a lot more fun.

When there is a clear breakthrough in technology (like multicore systems), then a new version of the VM can be specified. This can help improve communication between academia and the real world: btw, MMIX was specified with input from experienced chip architecture designers.

If you learn the basics well (of logic, hardware, functional programming, imperative programming etc) (and by well I do not mean a bit of practical tinkering; I mean a few years of studying theoretical foundations in-depth and applying those on several non-trivial practical cases), this will be useful in 5 years. It will be useful the rest of your life.

My theoretical foundations in CS, AI, Math (esp discrete) and physics from the late 80s/early 90s allow me to read cutting edge papers (memristors, spintronics), learn new languages fast, read 'modern' source code and understand modern hardware architectures. This is 'knowledge' from 20 years ago; not that much has changed tbh on low levels.

Actually, my dad enjoyed his education under Dijkstra in Eindhoven (and others, but this I now find cool :) and he also understands hardware and software on a low level after almost 50 years.

I don't see this panicky 'what you learn is obsolete when you finish uni'; it might be true for some studies, but for math/physics/CS it definitely isn't (again, IFF the education is not flaky; there are plenty of unis I know of which teach CS courses that are WAY too practical; students coming from there 'know' C# and find it very hard to do anything else. That's not a solid foundation at all. Example: the University of Dubai).

A few books exist that try to restore some sort of "complete understanding":

Software programming starting from assembly language under Linux: http://www.amazon.com/Programming-Ground-Up-Jonathan-Bartlet...

A study going from chip design to high-level programming:


IMHO this "problem" of "incomplete vision" started when device drivers were introduced in general-purpose OSes. Professional application developers started to target APIs instead of hardware. A milestone in this trend for me is Windows 3.0 (1990). This also marks the demise of the fixed-hardware computers that hobbyists had favored so far.

For knowing 'everything' about computers, I suggest buying a few 8-bit 80s systems with all included for <$50, buy http://www.xgamestation.com/ and master that hardware/software. It's relaxing, a lot of fun, and you'll 'get' modern hardware/arch/OSs as well after that. Shame is you cannot mold modern hardware/software so easily, so after learning all this old stuff you might end up spending more time making hardware extensions for those systems rather than writing your RoR CRUD screens.

I think a more realistic goal for our time is to aim at being at level 1 of ignorance (ignorance with awareness) [1].

[1] http://andrewboland.blogspot.com/2008/08/five-levels-of-igno... (just submitted on HN: http://news.ycombinator.com/item?id=3643445)

In our society, my take on this is that one shouldn't try to understand everything. We get smarter as a whole if different individuals focus on different areas. I'm not saying no one should collaborate, but 6 billion jacks-of-all-trades, masters-of-nothing simply are not smart enough to care for the next billion.

Read "The Shallows" by Nicholas Carr; it discusses this very topic. Humanity had the same discussion when the printing press became normal. People could start to look things up in books instead of remembering everything (analogy: you're searching the interwebs).

I agree, but apparently we are in the age of the polymath.

I think it's the addition of new information/technology without reflecting on and simplifying existing information/technology which is leading to this problem of information overload.

Well, on the other hand, if you know a bit about hardware, it is fairly easy to build a simple yet fully functional processor from basic components; implementing an OS with the most common features and a compiler with advanced error checking and an optimizer is also doable. You probably need to be familiar with math as well. Now anyone with this knowledge can become a real expert in whatever field he is interested in. Why would you need to know everything, from electrons to software?

My strategy is just-in-time learning. Anyone attempting to eager load all the knowledge is either crazy or exceptionally brilliant.

There was an article on HN in the last year that addressed this, i.e. the amount of knowledge required to understand a complex application, from distributed software to assembly language to hardware. I think the title was something about unknowable depths of complexity; does anyone remember the link?

The article talks about what I would call "pre-emptive understanding," but what about "just in time understanding?"

Have a good general understanding of a platform (all the way down to hardware, if you wish) and know where to look when either curiosity or need arises.

I think being on level

That's kind of the point. Modularity allows all these complicated parts to work together as a unified whole, without the creator of any one part needing to know exactly how the other part works. It's how high school kids are able to write their own computer programs instead of having to go to college for 12 years. It's what allowed cars to be assembled en masse. It's not a problem, and if it ever is, at least you have 11,000 pages of text from the creators giving you a complete explanation.

let's say you've just had a baby, a little bundle of joy, and you want to read all the books about babies so you can understand it completely.

you can "understand a baby" just fine. you can never gain a "complete understanding" of man.

11,000 pages at 3 min per page is roughly 550 hours of reading, which is roughly 3 months of full-time reading. Double that for some exercises and review, and a good understanding is doable in about 6 months.
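For what it's worth, the arithmetic holds up (the ~21 working days per month is my assumption):

```python
# Sanity-check the reading-time estimate from the comment above.
pages = 11_000
minutes_per_page = 3

hours = pages * minutes_per_page / 60   # total reading hours
work_days = hours / 8                   # full-time 8-hour days
months = work_days / 21                 # assumed ~21 working days/month

print(hours, round(work_days, 1), round(months, 1))  # 550.0 68.8 3.3
```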

There is no way you can do exercises at a rate of 3 minutes per page of material, for material of any density. For many concepts, you'd do well to do exercises at a rate of one day per page; exercises that would be essential to a good understanding.

I remember first encountering tree views in Borland Delphi 2. Took me a good couple of days before I got the hang of them, and how they interacted with image lists, to fully get how the recursive structure held together, how the dependencies interlinked.

Similarly with OpenGL. Took quite some time to get the hang of the sequencing of matrix operations, world and camera transforms, etc. - all just the very basics, before getting into anything interesting at all.

For real learning, 8 hours a day is simply unrealistic. It wasn't until the third or fourth time I tried many things that I really understood them, to the point that I didn't have to follow any recipes anywhere.

That's not true. You can have complete understanding. You actually should, if you want to be recognized in the industry. You need more than just learning; you need to live it.

Pick an operating system and an environment. I've been using Windows for more than 10 years now, and that's why I'm not moving to OS X any time soon. I'm incredibly good with Windows, and it can handle all of my daily business work and software development, as well as entertainment.

I've been doing Web development for more than 4 years. It's an overwhelming field, if you ask me. Lots of stuff: Web servers (picked Abyss, played with Apache), HTTP, HTTPS (and SSL), server side (picked PHP), databases (picked MySQL, played with Mongo), HTML5 (lots of new APIs), JavaScript (not that simple, and a new spec is coming), CSS3, Photoshop (some skills to get by).

And then there are lots of other things: caching, optimization, patterns, ORM, CMSes (WordPress, Drupal...), frameworks (CodeIgniter, Symfony...), libraries (jQuery, Prototype...), best practices, security (SQL injection, encryption...), unit testing (Jasmine, PHPUnit...), code inspection (JSLint), IDEs (Aptana, PhpStorm...), mobile (Titanium, Sencha...)

And then for business (you probably run your server on Ubuntu) there are Postfix and Dovecot for email, SFTP for file transfer, Mercurial or Git for version control, and some other stuff to monitor your servers (and maybe services).

And then there is more stuff if you run an online business like HN, SmashingMagazine, GitHub, Envato, the App Store, and all the social networks, magazines, markets...

So how can you do it? First, it'll take years. Second, you must live it. When you live it, you'll forget about the quantity, and with time you'll keep all those things in your head.

This depends on the definition of "complete". Yes, you've described a pretty comprehensive summary of the web stack and related tools. But this is all application software. What about the underlying network protocols? The OS kernel? The CPU architecture? Cache coherency protocols? Direct memory access to shuffle data to the network card? The physical layer signalling to get this data across the wire? PCB layout? Schematic design? How about the power supply for all this?

A couple decades ago one could actually buy a computer as a kit and put the circuit boards together. And even then it was unlikely that you could know everything: even if you were Woz, you probably would not have seen the masks used to make the CPU.

> You can have complete understanding.

No you can't. Nobody can. Literally. So you're using Windows right now. I suppose you know that a complete Windows desktop takes more than 200 million lines of code? That's about 10,000 books.

If you really want _complete_ understanding of your system, you really have to read and understand 10,000 technical books. Let's say you learn really, really fast and can read 1 book each week. That's about 50 books per year. Reading everything would therefore take you 200 years. I suggest you sign up for cryonics, or donate to an anti-ageing research program.
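A quick sanity check of those numbers (the 200-million-line figure and the ~20,000 lines-per-book size implied by "10,000 books" are the commenter's assumptions, not measured values):

```python
# Sanity-check the "200 years" estimate from the comment's own assumptions.
lines_of_code = 200_000_000
lines_per_book = 20_000                  # book size implied by "10,000 books"

books = lines_of_code // lines_per_book  # 10,000 books
books_per_year = 50                      # one book a week, ~50 weeks a year
years = books / books_per_year           # 200 years

print(f"{books:,} books -> {years:.0f} years of reading")
```

The conclusion holds under those assumptions: even at a book a week, the backlog outlives the reader.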

You may think counting actual lines of code is not fair, but what else is there? Systems have bugs and limitations that you need to be aware of if you want to really comprehend them. Also, I have not counted the design of the hardware.

Nevertheless, I think there is hope: http://vpri.org/html/work/ifnct.htm

This looks fantastic.

I doubt you have complete understanding though.

Do you understand how SSL works? Do you know your way around PHP's internals? Do you know exactly how TCP, IP, Ethernet, etc work? Could you write a Photoshop plugin? Do you understand how parsers and compilers work?

I doubt you (or anyone else for that matter) could answer yes to all those questions. That's the point I think this blog post is getting at. It's no longer possible (question for the reader: was it ever possible, given how deep the rabbit hole goes?) to have a complete understanding of how all the technologies you use every day work.

A lot of people have a decent understanding of most things they need to deal with. That's probably good enough to get by.

> was it ever possible, given how deep the rabbit hole goes?

You probably couldn't handle the breadth of it, but the depth of computing in the 60s was pretty shallow. You had maybe a dozen levels of (pretty shallow) abstractions; even "higher-level" languages (like Fortran) were pretty uncommon, so you mostly had assembly, an assembler, machine code, and the hardware you were running on.

You probably couldn't hold the whole machine in your head, but you could hold a good idea of its cross-section.

I was programming the Game Boy about 10 years ago, and it was an amazing machine. You could literally hold the entire machine in your head when writing assembler code. It really let you push the machine to its limits, right down to the clock cycle.

> It really let you push the machine to its limits, right down to the clock cycle.

This is still commonly done for video game consoles (although it's significantly harder), especially late in the cycle when the console's hardware is well understood and its behavior is mapped, in order to wring every bit of performance out of the machine.

Probably not to the point it was in the 16-bit era and earlier, but it's one of the "edges" consoles have over the PC's raw power.

was it ever possible, given how deep the rabbit hole goes?

I don't think so. People have always complained that the more you know, the more you realize you don't know.

But lately we're more and more confronted with our limitations as humans. There are so darn many details, so much information.

And no field is static, it's hard to keep up. As programmers/software engineers we're required to have a lot of specific product knowledge. And much of it is very important at one point, and nearly useless the next moment, as new versions are released, as new systems replace the older ones, and old assumptions are overturned.
