The drive to UI simplicity deters kids from learning about how computers work (quandyfactory.com)
33 points by RyanMcGreal on June 28, 2012 | 53 comments



(After reading this over, this is a little more misanthropic than I'd intended... I'll post it regardless though.)

The drive to turn general-purpose computation devices into information appliances is something that I find interesting and a bit disturbing. I'm not sure that it's just a fault of streamlined UIs, though. People actively want to neuter their machines to present only what they want and are familiar with, and not a great deal else.

I'm not a Luddite; I don't believe in bashing up the machinery of the internet because it shall enslave us all. It is bewildering, though, that when given ever-better tools for learning and problem-solving, we've managed to infect everything with the same bullshit advertising and consumerism memes that plagued our parents.

Given the opportunity and mechanism for solving problems that were utterly infeasible mere decades ago, for connecting people of different walks of life across untold miles, and for storing several times over the combined sum of mankind's knowledge, a good chunk of startups appear to be either banking on advertising or data mining (for better advertising) as their core business.

Worse, our customers (read: product) flock in droves to our sites and give up their information, buy our products knowing full well our anti-competitive practices (secure boot, trusted computing, and all the rest) because they're too lazy to maintain their own systems, and cheerfully pull down any of their members who question the economic assumptions that condemn "piracy"--and we laud and encourage them!

From time to time we see folks making light of the FSF and Stallman and GNU and that whole mess. I'm basically convinced at this point that the only real mistake those neckbeards have made is assuming that the users are worth protecting and that their freedom is worth ensuring.

Why are people concerned about educating the next generation of willful idiots?


Very well put together; I just want to say I fully agree with your point. To give it a face, I would say that Apple is one of the most blatant examples of what you are describing.

Furthermore, the same dumbing down is happening with websites: we lose control over information and customization because we want a slick design. Just take a look at the single-function modern websites that are posted almost daily on here. They just feel so empty compared to the old and clunky websites of just a few years ago.

I don't want to make this sound like a rant, but I really wonder where we went wrong as a society and decided, again, to take the simplest path, ignore the details, and stay dumb about a technology that governs us all.


iOS may be one of the most blatant examples of what he's describing, but Apple are not. For all the eye candy of OS X, it's still the only mainstream operating system where a bash shell is just a click away.


Depending on the definition of 'mainstream' there is only one OS where the bash prompt is not one click away...


> educating the next generation of willful idiots?

You could ask the same question of any teacher, and maybe the answer is that without this education the world would be even worse. Actually, the world was much worse before. Not so long ago, say a couple of millennia ago, an accomplished man was the one who killed the most enemies, and complete genocide was the glorious achievement of wars with neighbors. On Egyptian temples you can see warriors offering plates of enemies' penises to the pharaoh.


I've never seen a plumber upset that not everyone they meet knows how to do basic plumbing. Chefs don't get mad that people they meet can't cook a fancy souffle. Why is it that so many programmers expect the world to understand computers?

Yes, people use computers daily. They also use telephones, cars, and projectors. They make use of cd players, mp3 players, and water fountains. Do you think that, back in the '30s, everyone who used the then-new FM radio had any idea how the hell it worked? Most people who watch movies don't know jack about how to produce, or act, or edit, or manipulate film.

Kids aren't learning about computers because most kids don't care about computers. That's just fine. Focus on those who care.


> I've never seen a plumber upset that not everyone they meet knows how to do basic plumbing. Chefs don't get mad that people they meet can't cook a fancy souffle. Why is it that so many programmers expect the world to understand computers?

Because computers are tools, and we'd like people to be independent and to be able to use their tools efficiently.

How much does a plumber charge to change a $0.25 washer?

Some people have a bunch of data, and they need to do stuff to it, but they have no idea how to start or what to do. Not just 'how to create a pivot table' but 'what icon do I click on?'.

I agree that it's not a good idea to think that everyone can pipe sed and awk together, or can create amazing VBA tools, but it'd be nice if people could open a program and use it well.


You know how to enable people to use their tools efficiently? Make them usable. Make a great UI so the users don't have to think "how do I do X in this program?", don't have to learn to program so they can fix your lame program, but can just use it.


This. And if they want to learn the whats and whys and get into the guts of it, they'll find a way outside of that UI. I'm sure that's how we all got to where we are on the web: we care and we're curious, and neither of those traits is disappearing in kids.


Computers are not tools. Computers are platforms on which you can build and use tools. The programs, such as MS Office and Facebook, are the tools. Most people are content to just use tools, not build them.


>Computers are not tools.

I think this is part of the trend that we're criticizing. Computers used to be tools, and to use them one necessarily had to learn something about them. Over the last decade or so, the computer has been transformed into a platform, one over which the user has decreasing control. Instead of empowering its users, it has become yet another mechanism for marketing. Sure, the rising tide has lifted all boats, and users have become more empowered. But there was so much more potential (and still is). We're not seeing this potential being explored in the current era of startups (or at least it's not front and center).


> Because computers are tools, and we'd like people to be independent and to be able to use their tools efficiently.

Hence the drive towards UI simplicity.


Considering that the foundation of our realities is information, it seems foolish not to educate the masses on how to manipulate it efficiently. Here we have a cheap, general-purpose information-transforming machine, and yet we have dimwits arguing that it's not so important to learn how to use it. The only other things that have transformed society as much as these "useless" machines are writing and the printing press. But we don't ask anyone to learn how to write, now do we?

To put it another way, knowledge is power. Having the ability to use a machine which provides access to, for all intents and purposes, unlimited knowledge is of paramount importance to society in the long term. It's just as important as literacy.


The average person doesn't need to be able to cook a fancy souffle - but how about an omelet? Most chefs I know wish more people had basic cooking skills.


The world would be a much better place if everyone knew the basics of multiple disciplines and had some basic knowledge about how all the machines they use every day work.

That being said, in the real world people don't give a shit, and the companies that are successful in any industry recognize and accept this fact.


Computer programming is - or should be regarded as - more basic, more fundamental a skill than the ability to use/build/maintain specialized machines. A toaster processes sliced bread, whereas a computer processes information, which is global, ubiquitous and open-ended in its application.

The ability to write a computer program is too broadly powerful across a whole swath of human endeavours to keep in the hands of professional programmers, in the same way that the ability to read and write was too powerful to keep in the hands of professional scribes.

Our present-day society is inconceivable without the advent of social literacy, starting in the 15th century and accelerating through the 18th and 19th centuries. If we allow programming to become a disciplinary cul-de-sac instead of spreading it as widely as possible, what potential future society are we cutting off?


Let's not care about companies; let's care about people, human beings. Is it better for your health, independence, and well-being to know how to cook the food you eat, or to just buy burgers?

Same for computers. Say you have kids: will you not at least try to push them toward controlling such important and intimate things as their computers or phones? Would you agree with them when they tell you they don't care how Facebook works, what info it has about them, or what other choices are available? All this means they need to be computer literate. If they aren't, they will be slaves.


> I've never seen a plumber upset that not everyone they meet knows how to do basic plumbing. Chefs don't get mad that people they meet can't cook a fancy souffle. Why is it that so many programmers expect the world to understand computers?

You haven't? Man, every "car person" I know gets pissed off whenever someone goes to a mechanic for an oil change or tire change...


Touché.


Oh, there is nothing like the smell of Luddites in the morning! (People against change, mixed with fear, mixed with "those were the times...".)

So I assume this person really knows how the clothes she wears are made. Of course if she wears socks or a sweater she will know knitting, or at least how to manage a sewing machine (everybody should know just the basics).

Of course she will also know how her car works and repair it when needed, and if she has ever flown she will know how to pilot a plane.

Look at me!! I AM A GEEK! I know how to do all of the above, so everybody should!!

Now you also need qualities that are not in the geek's nature. It seems like we have to be everything: play the piano well, be a good lover, be the funniest at parties, work at our jobs like machines, all while being the best parents to our children and the best givers to the community.


I'm a little surprised to see the label of luddism applied to an argument that we should be encouraging more people to learn to program computers.

I can understand what you're saying here, but I'm not making an argument for the "good old days" of command line interfaces and arcane commands. What I'm arguing for is what you expressed in this comment:

"Of course if she wears socks or a sweater she will know knitting, or at least how to manage a sewing machine (everybody should know just the basics)."

I don't know how to knit but my wife does, and I can sew well enough to make a Hallowe'en costume, fix a button or hem a pair of pants.

All the time I see people manually, painstakingly processing data as a necessary part of their jobs - repetitive, time-consuming stuff that could easily be automated with a fairly simple script. If we made basic computer programming, like basic literacy and math, a part of core education curricula, we would empower many more people to write amateur code that's good enough for the task at hand, just as most amateur writing is good enough to communicate meaning without being slick and professional.
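
To make that concrete, here's the kind of "good enough" amateur script I have in mind - a minimal Python sketch (the file name and column names are made up for the example):

    # Total up hours per person from a spreadsheet export that would
    # otherwise be tallied by hand. "timesheet.csv" and its "name" and
    # "hours" columns are hypothetical stand-ins.
    import csv
    from collections import defaultdict

    totals = defaultdict(float)
    with open("timesheet.csv", newline="") as f:
        for row in csv.DictReader(f):
            totals[row["name"]] += float(row["hours"])

    for name, hours in sorted(totals.items()):
        print(name, round(hours, 1))

Nothing a professional would ship, but it turns an afternoon of copy-and-paste into a second of runtime - which is exactly the level of programming I'd like core curricula to aim for.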

I'm not saying everyone should work as a software developer, any more than I'm saying everyone should work as a writer. However, the general ability to read and write is absolutely invaluable no matter the career, to the extent that people who are functionally illiterate are barely employable and can scarcely function in our society.

Similarly, I believe a general ability to write computer programs - to express a set of steps that execute some data processing task - can become invaluable as well and unlock a huge boost in general productivity.


>All the time I see people manually, painstakingly processing data as a necessary part of their jobs - repetitive, time-consuming stuff that could easily be automated with a fairly simple script. If we made basic computer programming, like basic literacy and math, a part of core education curricula, we would empower many more people to write amateur code that's good enough for the task at hand, just as most amateur writing is good enough to communicate meaning without being slick and professional.

I think what's more important is the ability to at least identify repetitive work and ask someone more skilled in programming to implement a fix. It's amazing what kind of basic things people miss when you ask them what could be automated. Sometimes you need to sit next to them and watch them work just to identify the problem.


Agreed, and to say more: the few great or successful people I've met all took their control over new technologies very seriously. I was surprised by an old lawyer who knew, and even used, almost every corner of his email client. Take control of your tools or they will control you.


It has to be said that we only have what the article calls social literacy - i.e. a largely correct assumption that everybody can read and write - because there are laws in place that make it so. Everybody has to learn reading and writing in school.

I don't think social computer-literacy can be achieved without making it a mandatory part of everybody's school curriculum.

So the question we have to debate is whether computer-literacy should be such a mandatory part of the curriculum. I am not completely certain on this, but my instincts tell me that the answer is Yes.


Maths is a mandatory part of the curriculum in the UK, and I'm sure in the US and many other countries. Famously, we are not very good at it as a country. Teenagers are asking for more 'real world' maths; they don't like the abstract nature of mathematics as found in standard syllabi.

How about incorporating mathematical modelling as part of school maths? Using Excel/Calc or a programming environment such as Scratch?

http://scratch.mit.edu/
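
For instance, a complete "real world" model fits in a dozen lines of Python (a made-up sketch - the numbers are arbitrary, and the same loop is easy to rebuild in Scratch or a spreadsheet):

    # How does a savings balance grow with monthly deposits and interest?
    balance = 0.0
    deposit = 50.0         # saved each month
    annual_rate = 0.03     # 3% interest, compounded monthly

    for month in range(1, 12 * 10 + 1):   # ten years
        balance += deposit
        balance *= 1 + annual_rate / 12
        if month % 12 == 0:
            print("year", month // 12, "balance", round(balance, 2))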

Any mileage? The history of school maths, by the way, is pretty political: no stats before about 1950 in the UK, and then only for the top 10%. Arithmetic for the workers.


The answer is no.


The problem is not UI simplicity, but rather the consumer monoculture of consumption destinations like facebook or even reddit. As the article points out, many young users rarely go beyond popular sites and fail to learn basic computing skills. This is a problem of users viewing the web as a new form of couch potato TV, but I would not blame simpler UI for this phenomenon.


  > This is a problem of users viewing the web as a new
  > form of couch potato TV
And this is a problem because?


Because it's a bad way of living your life.

And the non-judgmental, "you're all special snowflakes, do whatever" attitude is, too.


> And the non-judgmental, "you're all special snowflakes, do whatever" attitude is, too.

Oh? How are you so sure?


~3,000 years of recorded history point to it.


Is that so?


And this is a good thing in general. Computers are tools, after all. When I drive a car, I don't need to know how internal combustion engines, transmissions, steering, etc. work, the way drivers did in the early days. The same is true for computers today. Why should the user be forced to learn all the peculiarities of the system if all he needs is a way to talk to his friends on the other side of the globe? If it works, it works, and if it doesn't, you call an expert.

True, you should know a bit about cars if you want to do crazy endurance races and you should know a bit about computers if you want to use their full potential, but the vast majority of the users won't ever need that.


The "just works" however should be only the outer shell, and if you want to do more, or if you want to learn, this shell should be easy to remove. As you can open your car and look inside, mess with it.

Instead, many modern systems are designed to lock you into that most basic shell and deny you all the rest. Like the Apple iPad, for instance.


It needs to be pointed out that there were legal struggles last century over the right to repair automobiles. Car companies didn't want third parties to be allowed to fix their cars. At the time, governments generally decided in favour of consumer rights to their own purchases and the benefits of competition.

As a disturbing sidenote, the "right to repair" issue has re-emerged with the computerization of cars: http://www.r2rc.eu/



I don't actually see the problem. The analogy between learning computer programming and learning to read and write is flawed. The written word is a tool which, to be used, requires both parties to know how to read and write. There is no way to use written media unless you know how to read. Computer programs are not like that. You don't need to know programming to use programs, and there is really no problem if most people don't know how to program. The real analogue of programming is knowing how to operate a printing press: you need this knowledge to produce the end product (programs and printed books), but not to use it. And that's fine, really.


We're heading towards the Idiocracy future:

http://acousticmonster.com/wp-content/gallery/wtf-files/wind...


Necessity is the mother of invention. No necessity means no invention. Right now, we're building an entire generation that will be unable to produce computer innovation because they've become complacent: they believe that facebook is the end of the computer world. That would be fine if facebook actually were the end of the computer world, but it isn't. In thirty years we'll need those people to manage sprawling databases, automate more and more complex systems, and generally bend computers to their will, and they'll be unprepared and inadequate.


I don't see the relation here. If there were no Facebook or other apps, people would not use the computer; they would watch TV, go to the pub, or stuff like that. The people who use Facebook mostly just don't care about computers; they want to interact with their friends and have fun. What would you suggest? Making Facebook harder to use? Making people learn to program before they can publish a message on Twitter?

Users are just users, they don't want to know how it works. It was always like this.


Most people never need these things.

In the early 80s, selling a million units of a home computer used mostly for games was a major milestone. The best-selling home computer of all time by far - the C64 - "only" sold 20 million or so.

Most of the people using them never learned to do more than load and start games.

While we're probably building a generation where a smaller proportion of those who use computers extensively know details about them, it'd be pretty incredible if the absolute number of people with development skills at the level of those of us who grew up with those home computers weren't significantly higher.

E.g. AIDE - an IDE for writing Android apps on Android devices, quite the special niche - shows up as 100k - 500k downloads on Google Play.

Ohloh reports individual projects with more than 10k participants, over half a million projects in total, and 1.6 million total users (though that's likely over-reporting, as there might be substantial duplication among unclaimed user names).

I find it a pretty safe assumption that the number of programmers has never been higher - we just make up a smaller percentage of computer users.


Innovation is brought by those who care. I don't know where this notion that everyone should care about computers comes from. There are tool makers, and there are tool users. The former are always a minority compared to the latter. There are generations of people who don't know how to farm or how to build an automobile. So what?


Comparing computer illiteracy with analphabetism seems a little exaggerated. I remember seeing this same divide between geeks and people who couldn't care less ten years ago.

And about the quote: "The kids I have, and that is roughly two dozen of the brightest young digital artists a semester, often have no idea what Microsoft Word is. They can't tell a Mac from a PC. And forget Excel," he says. He struggles to get his students to use basic computing etiquette." Maybe they should raise the bar for entrants to this course... really.

Update: how can those students be the brightest digital artists if they can't even tell whether they are working on a Mac or a PC?


As someone said, geniuses or experts are always special. Just because everybody uses a computer, it does not mean that everybody will be the next Euler.

But it does not mean either that there are not going to be more Eulers. Quite the contrary: the barriers to knowledge, for those who want it, are smaller than ever. And there are people who care. They are a minority, but they are there.

The worst thing you could do is try to force it on everyone, because what was a pleasure becomes a sin, as Feynman realized.


As a demonstration of the difference in barriers: when I was 10 I wanted to learn assembly. I didn't have any books about it. None of my friends did. This was 1985, so I didn't have internet access (nor would I have a modem for another 7 years). My local library had some low-level books on electronics with vague descriptions of how CPUs worked. None of the local book stores had anything.

We "knew" we had to get machine instructions into memory somehow, but we didn't know how they were encoded, and we didn't know anything about how even BASIC was encoded in memory on our C64's. But we tried to figure it out, and got pretty much nowhere. At some point we resorted to trying to POKE the names of BASIC keywords into memory. Not only was it not assembler, it's not even how C64 BASIC does it. Our efforts petered out.

After ages I got hold of a pirated copy of an assembler written in BASIC that I could start figuring out. It got me a bit further, but not much. It was only one or two years later that I finally got hold of a decent assembler and tutorials and started learning assembly properly.

I do think it's sad that programming is not made more "visible", though. Take automation, for example: most OS X apps support being automated with AppleScript, but how many users even know it exists? Increasingly, Linux applications support automation via DBus, but it's the same situation.
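
For example, on a Linux desktop a few lines of Python are enough to drive another program over DBus. A minimal sketch, assuming the dbus-python bindings and a standard freedesktop notification daemon are installed (most desktops ship one):

    # Pop up a desktop notification via the org.freedesktop.Notifications
    # DBus service on the session bus.
    import dbus

    bus = dbus.SessionBus()
    svc = bus.get_object('org.freedesktop.Notifications',
                         '/org/freedesktop/Notifications')
    notify = dbus.Interface(svc, 'org.freedesktop.Notifications')
    # Notify(app_name, replaces_id, icon, summary, body, actions, hints, timeout_ms)
    notify.Notify('demo', 0, '', 'Hello from a script',
                  'Sent over the session bus', [], {}, 5000)

That's the whole program - yet almost no casual user ever finds out this layer exists.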

I've not seen widespread use of cross-application automation since ARexx on AmigaOS. Despite how horribly lacking ARexx is in user-friendliness, it turned a large number of casual users into programmers (even if "only" for simple stuff to automate their own workflows), simply because it helped them get their stuff done better and they could start small (one-line scripts to tell an application to do something when they pressed some icon, for example) and build on that.

Just making automation tools more discoverable would be a major step towards getting people more interested.


When I was younger, between the ages of 9 and 15, I loved hardware hacking. I would take apart Xboxes, PlayStations, and even my computers. I would install mod chips, homebrew software, and even tacky lighting. Being a part of the hardcore gaming community (CS, Team Fortress, WoW) almost instilled the notion of modding your 'rig'. People would overclock and liquid-cool their PCs, which is hardcore stuff. I feel like there are two things that have led to a decline in the hardware hacking environment (which I feel parallels the author's point).

1. Manufacturers began to 'get it'. Computers began to look cool: no more beige boxes and lame-looking side panels. With the birth of Alienware, the Dell XPS, and Falcon Northwest came a huge blow to the modding community. Instead of spending days modding your PC to make it look cool and perform well, you could buy one out of the box (sometimes even overclocked)! This started to eat away at the modding culture. Even console makers stepped up their game in terms of connectivity. They began to provide more codecs for video playback, which was one of the main reasons to mod a console: to allow video playback.

2. Another thing happened which pulled me and others away from the modding/hacking community: I bought a Mac. Ever since I moved into the Apple ecosystem I haven't opened up a piece of hardware. I bought my first MacBook when I was 16 and loved it. It worked; it did everything I needed it to. I moved more into the lower-horsepower social-game landscape, where I didn't need heavy-duty processing power. Once I was in the Apple ecosystem I didn't need a home NAS server running with my torrents; I began buying my music, movies, and TV shows. This content then streamed to my devices (Apple TV and mobile) perfectly. It required no hacking to work; it all came together effortlessly. I feel like this is happening to a lot of hackers. Deep down, I feel Apple has made people complacent with current technology and has really eliminated the need to hack your hardware and software.

This raises a question: has the paradigm of pushing the envelope with hardware and software shifted from enthusiasts to the technology companies?


Not everyone needs to know how computers work. There will always be some who do, and most who don't. That's the way it has been and the way it probably always will be.


The objective of simplifying the UI is to reduce as much as possible what a user has to learn. The thinking is that the less a user has to learn, the more a user gets to do and as a result the greater satisfaction gained by the user. Satisfied users are good for sales, so UIs have been getting simpler to generate more money.

It's no surprise that UI simplicity deters kids from learning about how computers work, since UI simplicity is supposed to deter learning, or eliminate the need to learn as much as possible.


I wonder if this same argument was made when mice and GUIs were first introduced to computers...

It's a silly, pointless argument. Absolutely without merit.


So you build tools that attempt to make it easier to build mobile apps (PhoneGap, Trigger, Titanium, etc.) and the geeks cry "use native code!".

I'm looking forward to the day where we can build any website/app/program/idea by simply having a conversation with our computers (or the cloud). 12 year-old: "I want a mobile app that finds my friends on Facebook nearby", comp: "Sure, how's this?", 12yo: "Cool, I want to be able to post a message to all nearby friends", comp: "Here you go. Do you want it to send an SMS or a push notification or?"...etc...

As much as 98% of HN will hate this concept, the faster coding can become more modularized and easy enough for the mainstream to build whatever idea comes to mind, the better off society will be and the faster humanity will progress.


If we ever build computers smart enough for that, they'll probably also be smart enough to tell us to go f* ourselves. Even more probable if by that time we are still preoccupied with social networking trivialities.


True, computer literacy today is harder than it should be (because we are only in the early days). That being said, the core of programming (being able to formalize what you want) is a rare and valuable skill that isn't going to be automated away any time soon.


Someone would have to program the computer, either to the point that it can learn by itself or to the point that it can already do everything. Lots and lots of programmers would still be needed. And bug fixing would be an honored and stable vocation performed by master programmers; it would be like the most minute and delicate brain surgery.




