I think people are afraid to admit how often they Google search things.
I started teaching myself programming when I was 7. I am 34 now. There is plenty of stuff that I remember, but a lot more that I don't and I don't try to. And I find out about a new piece of software, service or whatever every day or every few days.
I am proud of the fact that I Google so much.
I think it's actually more important and more useful to be able to Google things, quickly get the gist of a new tool (or look up some old syntax), and drill down into a solution than it is to know lots of stuff off the top of your head.
Mainly because there are so many tools, frameworks, and programming languages coming out that if you don't Google first, you will usually end up wasting a lot of time.
I think the main thing holding science and technology (including information technology of course) back right now is the limited ability to integrate knowledge and thought and its application. Google is the closest we have come to a solution to that problem.
But I think we may see something like a metasystem transition resulting in some higher-level integration and organization for knowledge, technology, and its application to problem solving. It could come through some kind of artificial general intelligence, or maybe just some type of advanced knowledge/semantic engineering/collaboration system. Or maybe high bandwidth brain-computer interfaces will allow brains to be networked and even integrated wirelessly.
To be honest, "complete understanding" died long before the original post. It died the moment they started building computers with integrated circuits inside them (before 1970) and complexity rocketed exponentially. It died doubly so the moment electrical engineering and the practical side of computer science diverged into "them" and "us".
If you want to use something which you understand, you will probably need to buy a crate of transistors, resistors and diodes and wirewrap yourself an entire computer from scratch PDP-7 style.
This fact is a warning: we really are building things with abstraction hierarchies so deep that knowledge gets divided. One day we will have no hope of comprehending anything in a lifetime.
This is a semantic discussion about the meaning of 'understanding': does it mean you can globally explain how the system works and could come to understand the smallest detail of every part? Or does it mean you understand the smallest detail of every part?
The latter is a nonsensical definition: if that is the case, then nobody understands processors, because nobody understands transistors, because nobody understands quantum mechanics, because nobody understands why the fundamental forces act in certain ways. Nobody understands Newton's laws, nobody understands where babies come from and nobody understands what it means to perform a 'computation'.
Of course, that means the former was also a nonsensical definition.
The interesting way to construe the article's claim is not that it's impossible to know everything, but that it's impossible to know everything that people already know about the field you work in.
Were there blacksmiths who knew everything anyone knew about forging swords? Did Newton or Da Vinci know everything anyone knew about the various fields they were expert in? Are there farmers now who know everything anyone knows about how farming works? The article claims that at some point it became a certainty that programmers cannot know everything that anyone knows about how to use the tools they use and what those tools do. The stack is too complex. That's at least a sensible and interesting claim.
All stacks have always been too complex. To produce optimally, a farmer would have to understand everything there is to know about meteorology, biology, sociology, economics, and earth sciences as they apply to his specific area. No farmer has ever had that.
In that sense of 'understand', nobody has ever completely understood any system they worked with/in. Given a system and a person, you can come up with a legitimate detail question about the system that you also believe the person couldn't answer.
To be honest, the bedrock abstraction should stop at "what humans can realistically create with their own hands from nothing". You can make your own transistor quite easily and Ebers-Moll provides a nice set of rules to work with.
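In the forward-active region the Ebers-Moll model collapses to a single exponential, which is why it makes such a workable "bedrock". A minimal sketch (the saturation current and thermal voltage below are typical illustrative values, not from any particular device):

```python
import math

def collector_current(v_be, i_s=1e-14, v_t=0.02585):
    """Simplified Ebers-Moll: forward-active collector current of a BJT.

    v_be : base-emitter voltage in volts
    i_s  : saturation current in amperes (device-dependent; 1e-14 A is
           a typical small-signal value, assumed here for illustration)
    v_t  : thermal voltage kT/q, roughly 25.85 mV at room temperature
    """
    return i_s * (math.exp(v_be / v_t) - 1)

# A small change in V_BE swings the collector current by decades:
for v in (0.55, 0.60, 0.65, 0.70):
    print(f"V_BE = {v:.2f} V  ->  I_C = {collector_current(v):.3e} A")
```

The exponential is the whole point: one simple rule, directly checkable against a transistor on your bench.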
The quantum physicists and philosophers can remain arguing about technicalities then and let the rest of the world observe, understand and create.
Could medieval blacksmiths really create a sword from nothing but their own hands? How would they get the iron? Would they have the knowledge to locate veins of iron and then mine it? Could they build the kiln and forge, and the equipment necessary for forging?
Arguably, by this standard, even farmers could not farm. Farmers may know how to plant their crops, but without current crops to gather seeds from, would they know how to find the strains of plants suited to farming and then gather seeds from those plants?
I'm sure that, and the rest of the process, wasn't beyond people with a bigger budget and greater requirements...
As for plants - it's all knowledge and experience. There is no abstraction. Eat this, don't eat that. I grow quite a few edible plants myself and there is little abstraction.
Not from scratch no. Not even with all the intermediate knowledge available but none of the tooling and technology. Unless by "quite easily" you mean "in under a dozen generations".
This video is great, but it's a century or two of progress away from "from scratch".
Interesting historical point: the BBC Micro was designed by Acorn Computers, the parent company of the ARM processors that are so ubiquitous today.
They were and still are extremely powerful and productive machines.
"We really are building things with too deep abstraction hierarchies causing knowledge to be divided"
Abstraction is necessary, true, but it's not clear to me what the abstraction level we have now really gets us. In other words, say we had a BBC Micro with a 2 GHz 6502 in it. What productive computing tasks that we do now could it not do? Or let's imagine an Atari ST with a 2 GHz 68000, to get us a little more memory. What could it not do that we need to do now? I'm struggling to think of anything.
application -> XAML -> framework -> server -> container -> C# -> CIL bytecode -> x86 instructions -> microcode.
Now forth on a 68000:
Forth screen -> 68k instructions
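The layer count is easy to make concrete even in a scripting language. A Python sketch (purely as illustration) exposing one of the intermediate layers, the interpreter's bytecode, that sits between source and silicon:

```python
import dis

def add(a, b):
    return a + b

# Even this one-line function passes through a bytecode layer before
# any machine code runs; dis exposes that intermediate form.
for ins in dis.get_instructions(add):
    print(ins.opname)
```

And under the bytecode still sit the interpreter, the OS, the instruction set, and the microcode, none of which this view shows.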
To be honest, for what I consider to be life-and-death stuff, a 10 MHz 68000 is good enough (I have one in my TI-92 calculator).
Such as what? 99% of websites are a) screens where you enter something into predefined fields, to be stored in a database, and/or b) screens that format for display the things you or other people previously entered. They were doing that in the 1970s. Only the buzzwords change.
If you could start again, would you really end up with HTML5?
I (the parent of your post) actually have a BBC Master (and the advanced reference manuals) lying around still for precisely that reason. It's quite a handy and very powerful little machine to be honest.
It even runs LISP (AcornSoft LISP).
3D printers are promoted as printing themselves, i.e. as self-replicating. They are not. They print a small fraction of their own non-complex parts.
Now we come to the question of when a complete understanding was possible. As soon as the age of computing dawned with ICs (integrated circuits) rather than vacuum tubes, the age of complete understanding was dead. Fewer and fewer people knew the internals of humanity's most complex invention, the computer. I think at some point in the IBM PC age, the number of people with a complete understanding dwindled to zero.
That's complete nonsense.
Until the second half of the 16-bit era (before the 68K series) it was possible to completely grok the schematic of a computer and all of the associated parts, including disk drives and the guts of each of the chips, and to know each and every byte of software that ran on it. It was a lot of work, but it could be done.
Be it at the block level or for the die-hard fanatics all the way down to gates (and the transistors that made up those gates).
From there forward it got harder and harder to keep the whole stack from software to the lowest hardware layers in mind when writing code. And it wasn't necessary anyway! After all, code is already a very high level of abstraction. But even today it helps to know the guts of compilers, caching, branch prediction, register optimization, the difference between various buses and their typical latencies when writing high performance software.
So even if it is no longer possible to know each and every detail, there is still a degree to which that knowledge is necessary.
And there always will be. But now we will have to be satisfied with an understanding limited by the resources that are at our disposal. The result is that we specialize. Which happens in every branch of science that is sufficiently advanced.
A single second of cycles on a modern computer is more than you could conceivably analyze in a lifetime because of the volume. But in principle, given enough patience it could be done. In principle, given enough time and resources you could understand your computer at the same level as before. But our lives are too short and our brains too small to hold these quantities of information so we deal with abstractions instead.
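A quick back-of-envelope check on that claim (the clock rate and analysis pace are illustrative assumptions, not measurements):

```python
# One second of a 3 GHz core, analyzed one cycle per second by a
# human working an 8-hour day.
cycles = 3_000_000_000          # one second of cycles at 3 GHz
seconds_per_day = 8 * 3600      # an 8-hour workday of analysis
days = cycles / seconds_per_day
years = days / 365
print(f"{years:,.0f} years")    # on the order of centuries
```

Even at these generous rates it comes out to roughly three centuries, so "more than a lifetime" holds with plenty of margin.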
Just like you can't have a complete understanding of every atom in the banana that you eat there is a degree of useful knowledge to which you can abstract it: ripeness, flavor, color, general quality and so on.
Plenty of people have opened up VLSI chips and put them under a microscope just as plenty of people have looked at a thin slice of banana. And in both cases they came away with an increased understanding. It's all about the investment in resources, not about the possibility.
The moment we can no longer completely understand our machines is the moment they stop being deterministic. And by that time, chances are we'll call them biology.
Even now it's not like you can't understand it, but the amount of time you'd have to invest in understanding it is well beyond the commitment level of most.
Ch 4: The Inside Story.
When you connected your disk drive to your Apple IIe, you popped the top and got your first look at the integrated circuits, or chips, inside the computer. In this chapter, you'll learn what some of the chips do, and how they are used to process your data.
It's interesting that the more complicated the computer got, the sparser the end-user documentation became.
1) This isn't anything new. A complete understanding is not only no longer possible, it was never possible. What level of detail counts as a "complete" understanding? Even when computers were much simpler, you probably wouldn't have a thorough understanding of every layer of abstraction, from software to hardware to the physics of the underlying electrical components.
2) This sort of complete understanding is unnecessary anyway; these references exist as references for a reason. To do your job, you need familiarity and layered understandings of the relevant processes, and when you absolutely need to know the fine details of something, you can simply look it up.
Such a complete understanding is eminently possible. It is as easy to understand as any other complex system. You work at a high level of abstraction to understand the breadth of complexity, and then work down into the component level to understand the depth of complexity. Rinse, and repeat for every subsequent level.
The caveat is that it takes time and dedication.
I can't comment on whether it's necessary or not. There might be scenarios where it is. There might be other scenarios where you're better off studying kung fu, because that might be a quicker route to enlightenment.
But the sort of understanding I'm referring to, and what I believe the author is referring to, is a comprehensive, infinitely detailed understanding -- the sort of understanding of OS X APIs after internalizing 5000+ pages of reference.
I think this sort of "complete" understanding is not what we mean when we say an expert has an excellent understanding of a subject; instead, we mean that the expert knows what is important.
For example, you could be an expert (and have an excellent grasp of the material) of some programming language, but if someone asked you about the exact name of a rarely used library call, you might not know it off the top of your head. You would know where to look it up, and you would know how to use it, and you would know that that library call exists, but you wouldn't recall the exact name.
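In a language with introspection, the lookup itself is cheap, which is exactly why recalling exact names matters so little. A Python sketch (illustrative only):

```python
# You may not recall the exact name of a rarely used call, but you
# know roughly what it does and where to look. Introspection is one
# such lookup path:
candidates = [name for name in dir(str) if "split" in name]
print(candidates)
```

One line recovers 'split', 'rsplit', and 'splitlines'; `help()` on any of them fills in the rest.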
I think many people can obtain these expert understandings of complex systems, but that's not what the author is referring to (or if he is, then I think his argument is unsound, because no one would approach gaining an expert understanding of OS X by reading 11,000 pages of reference material). The author is talking about an unnecessarily detailed "complete" understanding in which you know everything about the system.
In that spirit:
Please could you describe electron and hole flow in semiconductors? Why? Why? Why?
There are lots of bits of computing that work because they work, even though we don't really know why they work.
I disagree. Consider a computer like the Apple ][. I didn't 'completely understand' it at the time, but with what I've learned since (including a CS degree), it is now very plausible to me that it could be exhaustively understood, down to every wire and line of OS/firmware code. I can see the boundaries of the areas I don't know clearly, and know that a bounded amount of extra effort could get me down through every 6502 opcode, or all relevant functional physical/electrical properties of all the subsystems.
(While I'm still not strong in the electrical engineering parts, the exercise in my undergrad EECS class where we built a 4-bit CPU from TTL components was the key for me in connecting the electrical world of circuits and logic gates to the abstractions of executable software.)
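The "every 6502 opcode" claim is plausible because the instruction set is genuinely small: 151 documented opcodes covering 56 instructions. A toy table-driven decoder for a few of them, just to show the shape of the task (the opcode values shown are the standard documented ones):

```python
# A handful of 6502 opcodes, enough to hint at why the whole set
# is a bounded, learnable thing.
OPCODES = {
    0xA9: ("LDA", "immediate"),
    0x8D: ("STA", "absolute"),
    0x4C: ("JMP", "absolute"),
    0xEA: ("NOP", "implied"),
    0x60: ("RTS", "implied"),
}

def describe(byte):
    mnemonic, mode = OPCODES.get(byte, ("???", "unknown"))
    return f"${byte:02X}: {mnemonic} ({mode})"

for b in (0xA9, 0x8D, 0x60):
    print(describe(b))
```

Extending the table to all 151 entries is tedious but entirely finite, which is the whole point: the 6502 fits in one head in a way a modern out-of-order x86 core does not.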
Maybe, nowadays, the typical systems (like the article's MacBook Air example) are too big, drawing from too many specialties and involving too many details, for any one person to decompose 'all the black boxes' with comprehensive understanding. But just a few decades ago, that wasn't yet the case.
Was it possible all the way down to the transistor level? Maybe not, but it didn't have to be in order to understand the behavior.
It was certainly possible down through the IC level, and there were any number of people who achieved that sort of mastery.
I don't believe that's still possible with any consumer general-purpose computer. I think there are people out there with that level of understanding of specialized devices, probably up to and including things like the Xbox and PS3. I don't know if there are people who understand a MacBook Air to that level, or a desktop PC, or their web server. Maybe, but it seems unlikely.
"There was a time when if you read one book by Peter Norton, you literally knew everything there was to know about programming the IBM-PC"
The modern day equivalent to people making DOS jump hoops would be professional C++ games programmers.
It all depends on how far into the system you want to look. How many users of the FAT file system actually learnt the innards of NTFS? How many programmers specialized on Borland IDEs rather than WatCom? There's not really that much more to learn these days if you want to just write software; understanding the entire system is a matter of jumping down the rabbit hole. If I understand how electron tunneling prevents processor architecture going much beyond the current state of the art, do I have a greater understanding of computers than someone who knows the OpenGL API backwards?
Also, and more importantly, some knowledge when held with other knowledge creates emergent understanding of other knowledge. You are able to intuit from core principles. The ability to understand is far more important than the ability to recall.
The problem is not that a single individual cannot master everything, but that the reward for learning a part of the book pile becomes lower and lower in proportion to the required effort. This trend clearly discourages students from "jumping" into some of these fundamental reference books. And teachers probably find it very difficult to provide applied exercises for their theoretical teaching.
"Which do you choose? How do you know which one will be useful in 5 years?"
My suggestion is that academia standardize on a virtual machine, like what Knuth started to do with MIX and MMIX. These VMs can provide a stable, simpler target to help students and professionals enjoy a "more complete understanding". I expect this would make learning a lot more fun.
When there is a clear breakthrough in technology (like multicore systems), a new version of the VM can be specified. This can help improve communication between academia and the real world; by the way, MMIX was specified with input from experienced chip architecture designers.
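A teaching VM doesn't have to be elaborate to make the point. A toy stack machine (an illustration only, nothing like MMIX's actual design) already gives students a target whose entire instruction set they can hold in mind at once:

```python
# A toy stack machine in the spirit of a stable teaching VM: the
# instruction set is tiny and fully specified.
def run(program):
    stack = []
    for op, *args in program:
        if op == "PUSH":
            stack.append(args[0])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown op {op}")
    return stack.pop()

# (2 + 3) * 4
print(run([("PUSH", 2), ("PUSH", 3), ("ADD",), ("PUSH", 4), ("MUL",)]))
```

Because the spec is frozen and tiny, every behavior of the machine is derivable from a page of rules, which is exactly the property a teaching target needs.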
My theoretical foundations in CS, AI, math (esp. discrete) and physics from the late 80s/early 90s allow me to read cutting-edge papers (memristors, spintronics), learn new languages fast, read 'modern' source code, and understand modern hardware architectures. This is 'knowledge' from 20 years ago; not that much has changed at the low levels, to be honest.
Actually, my dad enjoyed his education under Dijkstra in Eindhoven (and others, but this one I now find cool :) and he also understands hardware and software at a low level after almost 50 years.
I don't see this panicky 'what you learn is obsolete when you finish uni' attitude; it might hold for some studies, but for math/physics/CS it definitely doesn't (again, provided the education is not flaky; I know of plenty of universities teaching CS courses that are WAY too practical: students coming from there 'know' C# and find it very hard to do anything else. That's not a solid foundation at all. Example: the University of Dubai).
Software programming starting from assembly language under linux :
A study going from chip design to high level programming :
IMHO this "problem" of "incomplete vision" started when device drivers were introduced in general-purpose OSes. Professional application developers started to target APIs instead of hardware. A milestone in this trend for me is Windows 3.0 (1990). This also marks the demise of the fixed-hardware computers that hobbyists had favored until then.
 http://andrewboland.blogspot.com/2008/08/five-levels-of-igno... (just submitted on HN: http://news.ycombinator.com/item?id=3643445)
Read "The Shallows" by Nicholas Carr; it discusses this very topic. Humanity had the same discussion when the printing press became commonplace: people could start to look things up in books instead of remembering everything (analogy: you're searching the interwebs).
Have a good general understanding of a platform (all the way down to hardware, if you wish) and know where to look when either curiosity or need arises.
You can "understand a baby" just fine. You can never gain a "complete understanding" of man.
I remember first encountering tree views in Borland Delphi 2. Took me a good couple of days before I got the hang of them, and how they interacted with image lists, to fully get how the recursive structure held together, how the dependencies interlinked.
Similarly with OpenGL. Took quite some time to get the hang of the sequencing of matrix operations, world and camera transforms, etc. - all just the very basics, before getting into anything interesting at all.
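The tree-view case above is the easier one to sketch: the recursive shape is the same in any language (Python here purely as illustration, not Delphi):

```python
# The recursive structure a tree view holds: each node owns a list
# of children of its own type, and traversal follows that recursion.
class TreeNode:
    def __init__(self, label, children=None):
        self.label = label
        self.children = children or []

    def walk(self, depth=0):
        yield depth, self.label
        for child in self.children:
            yield from child.walk(depth + 1)

root = TreeNode("root", [
    TreeNode("a", [TreeNode("a1"), TreeNode("a2")]),
    TreeNode("b"),
])
for depth, label in root.walk():
    print("  " * depth + label)
```

The "couple of days" go into the surrounding machinery (image lists, selection state, change notifications), not the recursion itself, which is a few lines once seen.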
For real learning, 8 hours a day is simply unrealistic. It wasn't until the third or fourth time I tried many things that I really understood them, to the point that I didn't have to follow any recipes anywhere.
Pick an operating system and an environment. I've been using Windows for more than 10 years now, and that's why I'm not moving to OS X any time soon. I'm incredibly good with Windows, and it can handle all of my daily business work and software development, as well as entertainment.
And then there are lots of other things: caching, optimization, patterns, ORM, CMSes (WordPress, Drupal...), frameworks (CodeIgniter, Symfony...), libraries (jQuery, Prototype...), best practices, security (SQL injection, encryption...), unit testing (Jasmine, PHPUnit...), code inspection (JSLint), IDEs (Aptana, PhpStorm...), mobile (Titanium, Sencha...)
And certainly for business (you probably run your server on Ubuntu) there are Postfix and Dovecot for email, SFTP for file transfer, Mercurial or Git for version control, and some other stuff to monitor your servers (and maybe services).
And then there is more stuff, if you run an online business like HN, SmashingMagazine, GitHub, Envato, AppStore and all social networks, magazines, markets...
So how can you do it? First, it'll take years. Second, you must live it. When you live it, you'll forget about the quantity, and with time you'll keep all those things in your head.
A couple decades ago one could actually buy a computer as a kit and put the circuit boards together. And even then it was unlikely that you could know everything: even if you were Woz, you probably would not have seen the masks used to make the CPU.
No you can't. Nobody can. Literally. So you're using Windows right now. I suppose you know that a complete Windows desktop comes to more than 200 million lines of code? That's about 10,000 books.
If you really want _complete_ understanding of your system, you have to read and understand 10,000 technical books. Let's say you learn really, really fast and can read one book each week. That's about 50 books per year. Reading everything would therefore take you 200 years. I suggest you sign up for cryonics, or donate to an anti-ageing research program.
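Spelled out (all numbers are the rough estimates used above, not measurements):

```python
# Back-of-envelope: 200M lines of code, read as ~10,000 books at
# one book per week.
lines_of_code = 200_000_000
lines_per_book = 20_000          # ~200M lines / 10,000 books
books = lines_of_code // lines_per_book
books_per_year = 50              # one book a week, optimistically
years = books / books_per_year
print(books, "books ->", years, "years")
```

Even halving the estimates only brings it down to a century, so the conclusion is robust to the sloppy inputs.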
You may think counting actual lines of code is not fair, but what else? Systems have bugs and limitations that you need to be aware of, if you want to really comprehend them. Also, I have not counted the design of the hardware.
Nevertheless, I think there is hope: http://vpri.org/html/work/ifnct.htm
Do you understand how SSL works? Do you know your way around PHP's internals? Do you know exactly how TCP, IP, Ethernet, etc work? Could you write a Photoshop plugin? Do you understand how parsers and compilers work?
I doubt you (or anyone else for that matter) could answer yes to all those questions. That's the point I think this blog post is getting at. It's no longer possible (question for the reader: was it ever possible, given how deep the rabbit hole goes?) to have a complete understanding of how all the technologies you use every day work.
A lot of people have a decent understanding of most things they need to deal with. That's probably good enough to get by.
You probably couldn't handle the breadth of it, but the depth of computing in the 60s was pretty shallow. There were maybe a dozen levels of (pretty shallow) abstraction, and even "higher-level" languages (like Fortran) were pretty uncommon, so you mostly had assembly, an assembler, machine code, and the hardware you were running on.
You probably couldn't hold the whole machine in your head, but you could hold a good idea of its cross-section.
This is still commonly done for video game consoles (although it's significantly harder), especially late in the cycle when the console's hardware is well understood and its behavior is mapped, in order to wring every bit of performance out of the machine.
Probably not to the point it was in the 16bit era and earlier, but it's one of the "edges" consoles have over the PC's raw power.
I don't think so. People have always complained that the more you know, the more you realize you don't know.
But lately we're more and more confronted with our limitations as humans. There are so darn many details, so much information.
And no field is static, it's hard to keep up. As programmers/software engineers we're required to have a lot of specific product knowledge. And much of it is very important at one point, and nearly useless the next moment, as new versions are released, as new systems replace the older ones, and old assumptions are overturned.