Ask HN: Muggles and wizards
2 points by nollidge on Aug 24, 2010
I was just reading the Successful Software blog post "10 things non-technical users don’t understand about your software" and the ensuing HN discussion, and was inspired to start spitballing here.

My idea is this: I have always wanted to start some sort of educational website for people like those Andy describes, people who have never used copy-paste, don't understand what files are, and so on.

To me, the fundamental problem is that most people think computers are magic, that they're turtles all the way down. And this is partly a problem of our own making, as programmers and engineers. We create these abstractions for users, but when the abstractions inevitably break down, we expect users to grok the next level down. Is there a way to break this cycle, to educate these muggles in the Ways of Computation? Personally, I learned because I was self-motivated. I noticed, probably around age 8 or 9, that there was this world inside the machines that could be manipulated by those who spoke the right incantations, and I had the audacity to think I could learn to be such a wizard. Can someone learn these things without that motivation? Do they need to?
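To pick a trivial example of what I mean by abstractions breaking down (a Python snippet of my own, not something from Andy's post): decimal arithmetic in most languages is an abstraction over binary floating point, and the seams show the moment you type:

    >>> 0.1 + 0.2
    0.30000000000000004

At that moment we expect the user to go learn what binary floating point is, when all they wanted was to add two numbers.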

I can understand that not everyone wants to be a computer programmer, certainly (I don't want to be, say, an architect), but can't everyone learn what it is to be one? We all understand, roughly, how an architect does their job and the constraints they must consider, because building physical objects is incredibly intuitive to anyone who's ever stacked wooden blocks. For me, computation is just as intuitive, but what was the key insight? Was it gradual? I feel like my assembly language and binary logic classes may have been when my feet finally touched the ground, but that wasn't until college. Does it need to take that long? Certainly my learning came in fits and starts; my brain was stuck in procedural BASIC mode for probably a decade before I discovered OO and beyond.

Another roadblock is that, in some sense, programmers and computer engineers are a bunch of liars. We talk about, for example, object-oriented programming, but there aren't any objects. We just made them up. Our "object" is a set of bits, probably not even in a contiguous sequence in memory. And we point at it and call it an object when it looks nothing like all the other nouns. It's an utter deception. We are conjurers, not in any supernatural sense, but in the Penn-and-Teller sense that there's a secret trick to everything we do. Teller's not actually teleporting around the stage; he's just crawling (albeit deftly and with great practice) through a hidden chamber. Gamers aren't blowing up aliens; they're just standing atop a mountain of abstractions designed to fool them (willingly) into having that experience. There's a reasonable explanation for everything happening on the screen.
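To make that concrete, here's a toy Python sketch (my own illustration; the field layout is arbitrary). A "point object" is, underneath, nothing but a run of bytes that we all agree to interpret a certain way:

    import struct

    # Our "object": a 3D point. Underneath, it is 24 bytes, nothing more.
    point = (1.5, -2.0, 0.25)
    raw = struct.pack("ddd", *point)   # flatten the "object" into its bits
    print(raw.hex())                   # the raw bits, no objecthood in sight
    print(struct.unpack("ddd", raw))   # the same bits, read back as a "point"

The object was never there; there were only the bytes and our shared agreement about what they mean.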

I guess my dream would be to teach computing in a completely non-magical way, to have people understand that the turtles stop, as far as computing goes, at on-off switches. You can't start there, though, because most people would get discouraged by the vast disconnect between "on-off" and "Toy Story".
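Just to sketch the bottom of the stack (a toy in Python, which is of course cheating, since Python itself sits miles up the turtle pile): grant yourself one switch-like primitive and everything else is composition:

    def nand(a, b):          # the single primitive: an arrangement of switches
        return 0 if (a and b) else 1

    def not_(a):    return nand(a, a)
    def and_(a, b): return not_(nand(a, b))

    def xor(a, b):           # exclusive-or, built from four NANDs
        n = nand(a, b)
        return nand(nand(a, n), nand(b, n))

    def half_adder(a, b):    # two bits in; a sum bit and a carry bit out
        return xor(a, b), and_(a, b)

    print(half_adder(1, 1))  # (0, 1): one plus one is binary 10

From half-adders you get full adders, then an ALU, and eventually, a very tall stack of compositions later, Toy Story. The pedagogical problem is pacing that climb so nobody falls off.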

Anyway, in case it isn't obvious, this is all very fuzzy in my head at this point. I know I was particularly inspired by the style and flow of guides like Joel Spolsky's http://hginit.com (a Mercurial tutorial), _why's Poignant Guide to Ruby, and Learn You a Haskell for Great Good!, but those are primers on specific technologies, not on computing itself, or however you would describe what I want to teach. I'm not sure their style of communication would translate.

Does this make any sense? Has anyone tried to do something like this before?
