The drive to turn general-purpose computation devices into information appliances is something that I find interesting and a bit disturbing. I'm not sure that it's just a fault of streamlined UIs, though. People actively want to neuter their machines to present only what they want and are familiar with, and not a great deal else.
I'm not a Luddite; I don't believe in bashing up the machinery of the internet because it shall enslave us all. It is bewildering, though, that when given ever-better tools for learning and problem-solving, we've managed to infect everything with the same bullshit advertising and consumerism memes that plagued our parents.
Given the opportunity and mechanism for solving problems that were utterly infeasible mere decades ago, for connecting people of different walks of life across untold miles, and for storing several times over the combined sum of mankind's knowledge, a good chunk of startups appear to be either banking on advertising or data mining (for better advertising) as their core business.
Worse, our customers (read: product) flock in droves to our sites and give up their information; they buy our products knowing full well about our anti-competitive practices (secure boot, trusted computing, and all the rest) because they're too lazy to maintain their own systems; and they cheerfully pull down any of their members who question the economic assumptions that condemn "piracy"--and we laud and encourage them!
From time to time we see folks making light of the FSF and Stallman and GNU and that whole mess. I'm basically convinced at this point that the only real mistake those neckbeards have made is assuming that the users are worth protecting, and that their freedom is worth ensuring.
Why are people concerned about educating the next generation of willful idiots?
Furthermore, the same dumbing-down is happening with websites: we lose control over information and customization because we want a slick design. Just take a look at the single-function modern websites posted on here almost daily. They feel so empty compared to the old, clunky websites of just a few years ago.
I don't want to make it sound like a rant, but I really wonder where we went wrong as a society and decided, again, to take the simplest path, ignore the details, and be dumb about a technology that governs us all.
You could ask the same question of any teacher, and maybe the answer is that without this education the world would be even worse. Actually, the world was much worse before. Not so long ago, say a couple of millennia ago, an accomplished man was the one who killed the most enemies, and a complete genocide was the glorious achievement of a war with one's neighbors. On Egyptian temples you can see warriors offering plates of enemies' penises to the pharaoh.
Yes, people use computers daily. They also use telephones, cars, and projectors. They make use of cd players, mp3 players, and water fountains. Do you think that, back in the '30s, everyone who used the then-new FM radio had any idea how the hell it worked? Most people who watch movies don't know jack about how to produce, or act, or edit, or manipulate film.
Kids aren't learning about computers because most kids don't care about computers. That's just fine. Focus on those who care.
Because computers are tools, and we'd like people to be independent and to be able to use their tools efficiently.
How much does a plumber charge to change a $0.25 washer?
Some people have a bunch of data, and they need to do stuff to it, but they have no idea how to start or what to do. Not just 'how to create a pivot table' but 'what icon do I click on?'.
I agree that it's not a good idea to think that everyone can pipe sed and awk together, or can create amazing VBA tools, but it'd be nice if people could open a program and use it well.
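For readers who don't know what "piping sed and awk together" looks like in practice, here is a minimal sketch with made-up log data (the ERROR lines and field names are purely hypothetical): sed filters the lines of interest, then awk tallies one field.

```shell
# Hypothetical log data: keep only ERROR lines, then count
# occurrences of the second field (the subsystem name).
printf 'ERROR disk\nINFO net\nERROR disk\nERROR cpu\n' \
  | sed -n '/ERROR/p' \
  | awk '{count[$2]++} END {for (k in count) print k, count[k]}' \
  | sort
# prints:
#   cpu 1
#   disk 2
```

The `sort` at the end is there because awk's `for (k in count)` iterates in an unspecified order.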
I think this is part of the trend we're criticizing. Computers used to be tools, and to use them one necessarily had to learn something about them. Over the last decade or so, the computer has been transformed into a platform, one over which the user has decreasing control. Instead of empowering its users, it's become yet another mechanism for marketing. Sure, the rising tide has lifted all boats, and users have become more empowered. But there was so much more potential (and still is). We're not seeing this potential being explored in the current era of startups (or at least it's not front and center).
Hence the drive towards UI simplicity.
To put it another way, knowledge is power. Having the ability to use a machine which provides access to, for all intents and purposes, unlimited knowledge is of paramount importance to society in the long term. It's just as important as literacy.
That being said, in the real world people don't give a shit, and the companies that are successful in any industry recognize and accept this fact.
The ability to write a computer program is too broadly powerful across a whole swath of human endeavours to keep in the hands of professional programmers, in the same way that the ability to read and write was too powerful to keep in the hands of professional scribes.
Our present-day society is inconceivable without the advent of social literacy, starting in the 15th century and accelerating through the 18th and 19th centuries. If we allow programming to become a disciplinary cul-de-sac instead of spreading it as widely as possible, what potential future society are we cutting off?
Same for computers. Say you have kids: will you not at least try to push them toward controlling such important and intimate things as their computers or phones? Would you agree with them when they tell you they don't care how Facebook works, what info it has on them, or what other choices are available? All this means they need to be computer literate. If they aren't, they will be slaves.
You haven't? Man, every "car person" I know gets pissed off whenever someone goes to a mechanic for an oil change or tire change...
So I assume this person really knows how the clothes she wears are made. Of course, if she wears socks or a sweater she will know knitting, or at least how to manage a sewing machine (everybody should know just the basics).
Of course she will also know how her car works and repair it when needed, and if she has ever flown she will know how to pilot a plane.
Look at me!! I AM A GEEK! I know how to do all of the above, so everybody should!!
Now you also need qualities that are not in a geek's nature. It seems like we have to be everything: play the piano well, be a good lover, be the funniest person at parties, work at our jobs like machines, all while being the best parents to our children and the best givers to the community.
I can understand what you're saying here, but I'm not making an argument for the "good old days" of command line interfaces and arcane commands. What I'm arguing for is what you expressed in this comment:
"Of course if she wears socks or a sweater she will know knitting, or at least how to manage a sewing machine (everybody should know just the basics)."
I don't know how to knit but my wife does, and I can sew well enough to make a Hallowe'en costume, fix a button or hem a pair of pants.
All the time I see people manually, painstakingly processing data as a necessary part of their jobs - repetitive, time-consuming stuff that could easily be automated with a fairly simple script. If we made basic computer programming, like basic literacy and math, a part of core education curricula, we would empower many more people to write amateur code that's good enough for the task at hand, just as most amateur writing is good enough to communicate meaning without being slick and professional.
I'm not saying everyone should work as a software developer, any more than I'm saying everyone should work as a writer. However, the general ability to read and write is absolutely invaluable no matter the career, to the extent that people who are functionally illiterate are barely employable and can scarcely function in our society.
Similarly, I believe a general ability to write computer programs - to express a set of steps that execute some data processing task - can become invaluable as well and unlock a huge boost in general productivity.
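As a concrete (entirely hypothetical) illustration of the amateur-grade automation meant here: totaling a spreadsheet column that someone might otherwise sum by hand, with a one-line awk program. The file contents and column names are made up for the sketch.

```shell
# Made-up CSV export: sum the "amount" column, skipping the header row.
printf 'name,amount\nAlice,12.50\nBob,7.25\nCarol,30.00\n' \
  | awk -F, 'NR > 1 {total += $2} END {print total}'
# prints: 49.75
```

Nothing about this is slick or professional, but like most amateur writing, it's good enough for the task at hand.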
I think what's more important is the ability to at least identify repetitive work and ask someone more skilled in programming to implement it. It's amazing what kinds of basic things people miss when you ask them what could be automated. Sometimes you need to sit next to them and watch them work just to identify the problem.
I don't think social computer-literacy can be achieved without making it a mandatory part of everybody's school curriculum.
So the question we have to debate is whether computer-literacy should be such a mandatory part of the curriculum. I am not completely certain on this, but my instincts tell me that the answer is Yes.
How about incorporating mathematical modelling as part of school maths? Using Excel/Calc or a programming environment such as Scratch?
Any mileage? The history of school maths by the way is pretty political: no stats before about 1950 in the UK, and then only for the top 10%. Arithmetic for the workers.
> This is a problem of users viewing the web as a new
> form of couch potato TV
And the non-judgmental, "you're all special snowflakes, do whatever" attitude is part of it too.
Oh? How are you so sure?
True, you should know a bit about cars if you want to do crazy endurance races and you should know a bit about computers if you want to use their full potential, but the vast majority of the users won't ever need that.
Instead, many modern systems are designed to lock the user into the most basic shell and deny access to everything else. Like the Apple iPad, for instance.
As a disturbing sidenote, the "right to repair" issue has re-emerged with the computerization of cars: http://www.r2rc.eu/
Users are just users, they don't want to know how it works. It was always like this.
In the early '80s, selling a million units of a home computer used mostly for games was a major milestone. The best-selling home computer of all time by far - the C64 - "only" sold 20 million or so.
Most of the people using them never learned to do more than load and start games.
While we're probably building a generation in which a smaller proportion of those who use computers extensively know the details about them, it'd be pretty incredible if the absolute number of people with development skills at the level of those of us who grew up with those home computers weren't significantly higher.
E.g. AIDE - an IDE for writing Android apps on Android devices, quite the special niche, shows up as 100k - 500k downloads on Google Play.
Ohloh reports projects with more than 10k participants and over half a million projects total, and 1.6 million total users (though that's likely over-reporting, as for unclaimed user names there might be substantial duplication).
I find it a pretty safe assumption that the number of programmers has never been higher - we just make up a smaller percentage of computer users.
And about the quote: "The kids I have, and that is roughly two dozen of the brightest young digital artists a semester, often have no idea what Microsoft Word is. They can't tell a Mac from a PC. And forget Excel," he says. He struggles to get his students to use basic computing etiquette."
Maybe they should raise the bar for entrants to this course... really.
Update: how can those students be the brightest digital artists if they can't even tell whether they are working on a Mac or a PC?
But that does not mean there aren't going to be more Eulers, either. Quite the contrary: the barriers to knowledge, for those who want it, are lower than ever. And there are people who care. They are a minority, but they are there.
The worst thing you could do is try to force it on everyone, as what was a pleasure becomes a sin, as Feynman realized.
We "knew" we had to get machine instructions into memory somehow, but we didn't know how they were encoded, and we didn't know anything about how even BASIC was encoded in memory on our C64s. But we tried to figure it out, and got pretty much nowhere. At some point we resorted to trying to POKE the names of BASIC keywords into memory. Not only was it not assembler, it's not even how C64 BASIC does it. Our efforts petered out.
After ages I got hold of a pirated copy of an assembler written in BASIC that I could start figuring out. It got me a bit further, but not much. It was only one or two years later that I finally got hold of a decent-quality assembler and tutorials and started learning assembly properly.
I do think it's sad that programming is not made more "visible", though. Take automation, for example. Most OS X apps support being automated with AppleScript, but how many users even know it exists? Increasingly, Linux applications support automation via DBus, but it's the same situation.
I've not seen widespread use of cross-application automation since ARexx on AmigaOS. ARexx turned a large number of casual users into programmers (even if it was "only" for simple stuff to automate their own workflows), despite being horribly lacking in user-friendliness, simply because it helped them get their stuff done better. They could start small (a one-line script to tell an application to do something when they pressed some icon, for example) and build on that.
Just making automation tools more discoverable would be a major step towards getting people more interested.
1. Manufacturers began to 'get it'. Computers began to look cool: no more beige boxes and lame-looking side panels. With the birth of Alienware, the Dell XPS, and Falcon Northwest came a huge blow to the modding community. Instead of spending days modding your PC to make it look cool and perform well, you could buy one out of the box (sometimes even overclocked)! This started to eat away at modding culture. Even console makers stepped up their game in terms of connectivity. They began to provide more codecs for video playback, which was one of the main reasons to mod a console - to allow video playback.
2. Another thing happened which pulled me and others away from the modding/hacking community: I bought a Mac. Ever since I moved into the Apple ecosystem I haven't opened up a piece of hardware. I bought my first MacBook when I was 16 and loved it. It worked; it did everything I needed it to. I moved more into the less horsepower-hungry social-game landscape, where I didn't need heavy-duty processing power. Once I was in the Apple ecosystem I didn't need a home NAS server running my torrents; I began buying my music, movies, and TV shows. That content then streamed to my devices (Apple TV and mobile) perfectly. It required no hacking to work; it all came together effortlessly. I feel like this is happening to a lot of hackers. Deep down I feel Apple has made people complacent with current technology and has really eliminated the need to hack your hardware and software.
This raises a question: has the paradigm shifted, from enthusiasts pushing the envelope with hardware and software to the technology companies doing it?
It's no surprise that UI simplicity deters kids from learning about how computers work, since UI simplicity is supposed to deter learning, or eliminate the need to learn as much as possible.
It's a silly, pointless argument. Absolutely without merit.
I'm looking forward to the day where we can build any website/app/program/idea by simply having a conversation with our computers (or the cloud). 12 year-old: "I want a mobile app that finds my friends on Facebook nearby", comp: "Sure, how's this?", 12yo: "Cool, I want to be able to post a message to all nearby friends", comp: "Here you go. Do you want it to send an SMS or a push notification or?"...etc...
As much as 98% of HN will hate this concept, the faster coding can become more modularized and easy enough for the mainstream to build whatever idea comes to mind, the better off society will be and the faster humanity will progress.