
The Computer Revolution Has yet to Happen - brucehauman
https://medium.com/absurdist/the-computer-revolution-has-yet-to-happen-f1dbf983d477#.3czl0zgtl
======
IIAOPSW
I'm going to say something a bit controversial here.

Most people are not creative. It's true. There isn't some horde of people who
want to program but don't know it yet because they own a tablet instead of a
computer. There isn't some horde of musicians that will never know it because
their music comes from an mp3 instead of their own instrument. There isn't
some horde of artists who will never know it because their images come from a
camera instead of owning a paintbrush. Heck even on a web-forum where
contributing content is as low friction as sharing links, only ~10% of users
do it. And only 10% of that 10% actually make the content that's posted. 90%
of people are happy to passively consume content.

I wish the author were right. I wish there was this huge hidden demand for a
real computer revolution. I still think that when I buy a device I should
actually own it (which entails freedom to modify). But let's face it,
idealistic nerd types: we lost. Most people are consumers, not creators. Get
over it, go to work, program _for_ them, and wipe away your tears with a stack
of money.

~~~
sam_goody
I disagree. As soon as it became as easy to create video as it was to draw,
people created video content, not just consumed it. Once sharing became easy
(YouTube), levels of creation grew without bound.

The amount of media in all forms being created today is astounding. That
includes digital drawing.

If coding were as easy as writing, we would see at least as much growth
there. I know dozens of people with ideas who would try to code if it were as
conceptually easy for them as writing a memoir.

Such a revolution could definitely happen.

~~~
sanderjd
Is programming really harder than writing a memoir? It seems to be the other
way around to me.

~~~
noname123
IMHO,

Coding a simple maintenance script is about as hard as writing a diary entry.

Coding a CRUD web app is as hard as writing an average short story in a
college writing class.

Coding an app that leans heavily on systems programming principles,
concurrency, and networking is probably equal to MFA-level creative writing,
and so on...

------
ised
To the author: I could not agree more. In the long term history of computing,
I would hope this stage we are in now is not the height of the "revolution".

In my humble opinion, to which I am entitled, current Apple hardware is still
well-designed like the Apple hardware of the past, but none of it resembles a
"bicycle for the mind".

These phones and tablets are "computers" but are programmable only by
permission; they are consumption instruments that are meant to support some
plan to dominate the communications, media, and entertainment industries. Not my
idea of a programmable, pocket-sized, networked computer.

All due respect to Apple and their wild commercial success, but looking to the
future, I get more excited about my RPi or Teensy than I do about my Apple
devices.

I have little interest in paying for a license to a bloated, complex,
proprietary IDE (Xcode) and seeking approval from an "app store" when I can
write ARM assembly from a netbook or laptop using a free and open source
assembler and run it instantly on the RPi.

The revolution is yet to come. I hope. kparc.com/o.htm

~~~
MCRed
Don't be silly: Apple gives Xcode away for free; it is not bloated or
complex, nor particularly proprietary (who do you think made LLVM?), and you
don't need to seek anyone's approval to upload your software to GitHub or to
run it on your own device. You can write ARM assembly on your laptop and run
it on your iPhone even more easily than you can on an RPi.

You are really wasting time hating on Apple and spinning them (if you really
are an Apple device owner), since Apple is the one who broke open the mobile
device so you could run software without permission (before, you had to get
AT&T's or Verizon's permission to put code on your phone) and who made
high-quality development tools and platforms available for free.

Apple is the one who shipped the Apple 1 with BASIC... and they haven't
stopped.

~~~
greggman
Xcode is not free. It costs one Mac. Given the price difference between the
cheapest Mac and the cheapest alternatives, the cost of Xcode is something > $0.

Posted from a mac

~~~
interpol_p
If we follow your argument to its logical extreme, there is no such thing as
free software, because the computer to run such software almost always costs
money?

(Unless you receive a computer as a gift or it is freely leased; but then that
applies to Macs as much as any computer.)

~~~
scintill76
The GNU toolchain (or whatever) is licensed to run on any hardware that you
can possibly use it on, which will often be cheaper and more accessible than a
Mac. Xcode requires Mac OS, which is only licensed for use on Apple hardware.

You can buy used Apple hardware fairly cheaply, but I think that post still
has a point about relative cost.

~~~
interpol_p
I'm not suggesting that hardware can't be cheaper.

I'm saying that if you define free software as software that must be available
on free hardware, then there is not much free software at all.

~~~
MawNicker
That's not his argument. His argument is that an Apple computer is required
for Xcode. An Apple computer is a regular computer except that it costs more
and comes with a bundle of software; Xcode is in the bundle. So the price is
indeterminate but definitely not $0. He hedged specifically against your
reductio ad absurdum by mentioning the price difference.

~~~
interpol_p
The argument was made against the statement "Apple gives Xcode away for free."

They do give Xcode away for free. I didn't think that factoring a modern
development computer into the price was a reasonable stance to take against
that line.

~~~
TeMPOraL
Mac isn't _a_ modern development computer. It's a _very specific_ modern
development computer, and not necessarily the best in terms of quality/cost.

We're getting unnecessarily deep into dissecting this single line anyway. The
original poster's point was that Apple's hardware and software ecosystem,
while of great quality, still isn't the "bicycle for the mind" because it's
terribly locked down.

~~~
interpol_p
I find that argument to be bizarre. You can do computer science on paper.

As long as you can write programs and run them then who cares whether you
compile them directly to the hardware or interpret them, or whatever?

Apple locks down its distribution platform. It doesn't lock down your brain
from thinking or writing programs, and it doesn't lock down its computers from
running them for you.

------
kemiller
I think Minecraft fits the bill pretty well, actually. Kids make all kinds of
things in there, and it's not fragile in the way traditional programming is.

The article sort of gets hung up on form factor. Tablets are yet another
window into the universe of computing, and there's a lot of creation
happening.

I suspect that the real promised land will need VR for Lego-like construction
of components and/or AI-assisted compilers enabling a sort of DWIM
programming. As it is, programming is just too fragile to be of interest to the
"laity."

~~~
malloryerik
I thought of Minecraft too, and the other games that can be modded.

The ComputerCraft mod is interesting, especially a version where you can use a
simple GUI inside Minecraft itself to create Lua scripts that control robot
"turtles" and computers in the game world:
[http://computercraftedu.com/](http://computercraftedu.com/)

A seven-year-old can make programs (maybe from smaller functions that they've
already written) that will hopefully find the diamonds, build the houses, and
manage the farms of their Minecraft life.

This seems possible because of the limited number of objects and actions in a
Minecraft world, while Minecraft is still rich enough for plenty of
experimentation, trial and error, and aha moments...
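To give a flavor of what that looks like: a beginner's turtle program in
ComputerCraft's Lua might be only a handful of calls to the mod's standard
`turtle` API (`turtle.dig`, `turtle.forward`, `turtle.digUp`,
`turtle.turnRight` are real API functions; the simple tunnel-digging pattern
here is just an illustrative sketch):

```lua
-- Dig a short tunnel ahead of the turtle, then walk back home.
-- Assumes the turtle has fuel and room in its inventory.
local LENGTH = 8

for i = 1, LENGTH do
  turtle.dig()      -- clear the block directly ahead
  turtle.forward()  -- step into the cleared space
  turtle.digUp()    -- clear headroom so a player can follow
end

-- Turn around (two right turns) and retrace the path.
turtle.turnRight()
turtle.turnRight()
for i = 1, LENGTH do
  turtle.forward()
end
```

A seven-year-old can read that loop aloud and predict what the robot will do,
which is a big part of why it works as a first programming environment.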

------
MCRed
I grew up before the generation of kids who booted into BASIC -- I started
programming with a soldering iron. I designed my first computer. Then I
designed a video card for it. Then I had to write a BIOS for it.

Here's the reality though: HyperCard was a failure. People didn't use BASIC
on the Apple II. Some did, sure, but most didn't.

Most people are not programmers and not programming inclined. Apple gives away
the tools you need to build software for your iPad or iPhone -- in the form of
Xcode, which is FREE FOR EVERYONE to use. It's one of the best tools out
there-- when Microsoft was still charging thousands a year, Apple put theirs
out for free (and of course, these days Linux and GCC and Ruby on Rails and
the whole programming tradition of open source puts even more tools in
people's hands.)

But here's the thing. Most people don't care. It's never been a better time
be a programmer.

But the vast majority of people don't want to be.

~~~
nitrogen
_But here's the thing. Most people don't care. It's never been a better time
to be a programmer.

But the vast majority of people don't want to be._

That just means the tools aren't done yet.

~~~
Pamar
Hammers and screwdrivers are - I think - pretty mature tools. Still, lots of
people do not care for DIY and prefer to buy premade or pay someone else to
take care of custom solutions.

~~~
nitrogen
Hammers and screwdrivers still require quite a bit of thought and planning.
IMO, the tools of the "real computer revolution" won't be done until they're
at least as effortless as the Star Trek holodeck ("computer, give me a table
here, made out of metal").

------
ntumlin
In my opinion, the most powerful and useful thing a computer does isn't
Flappy Bird or finding massive primes, but allowing people to communicate more
easily. Everything the computer does on top of letting people tell things to
other people halfway across the world at the speed of light (or close enough)
is just icing on the cake.

------
corv
There are many more people capable of programming their devices nowadays, but
I'd wager there are fewer programmers per user now.

------
agumonkey
Here's my newest favorite computer
[http://museum.mit.edu/150/19](http://museum.mit.edu/150/19)

~~~
platz
[https://youtu.be/IOiZatlZtGU?t=22s](https://youtu.be/IOiZatlZtGU?t=22s)

------
joelg
For the record, the title was taken from the title of Alan Kay's 1997 OOPSLA
keynote, which is well worth a watch.

[https://www.youtube.com/watch?v=oKg1hTOQXoY](https://www.youtube.com/watch?v=oKg1hTOQXoY)

------
bobby_9x
more than a minority of specialists?

All of the software, apps, and services out there should tell you that it's
more than this.

I've seen multiple articles talking about people as young as 10 years old
creating apps. This wasn't possible in the 80s and 90s.

I'm still not sure why adblock was thrown in there. It has only made it more
difficult for indie sites and the average person to make money, and is helping
to create an environment where only large corporations can survive.

The same revolution happened with the music industry: unless you are signed to
a major label, it will not pay the bills.

~~~
nitrogen
_I've seen multiple articles talking about people as young as 10 years old
creating apps. This wasn't possible in the 80s and 90s._

Yeah it was. Some of us started programming at 5 or 6 years old in the 1980s
and 1990s. That doesn't mean we're better at it than people who start at later
ages, but it does show that it was possible for kids to code (and I'm sure
some even made money at it) back then.

~~~
TeMPOraL
And so it was in the late 90s / early 2000s. I first touched code when I was
6-7, but only started seriously programming when I was 12-13. The _only_
thing you needed was a computer, a library in town, and parents who didn't
limit your computer time very much.

------
mercer
As a programmer and tinkerer (hacker?) I cannot agree more with the sentiment
of the article. And what follows is not a direct response to the article, which I
found surprisingly positive altogether, but more of a general rumination on my
own relationship with 'computers' and what I often see happening among my
peers as well.

I think it's important not to only see things as a 'computer specialist',
especially if that perspective (perhaps rightfully) can lead to pessimism
these days.

Throughout my childhood, the main reason why computers excited me was the
promise of realizing all the sci-fi stuff I read about and saw on television:
tricorders, virtual reality, video communication, voice- and touch-interfaces,
zoom-in-and-enhance high-resolution maps, instant access to the knowledge of
the world through some kind of AI (all voice-enabled, obviously).

And now, all these things actually exist (to a _large_ degree), and in a
device that I carry in my pocket!

The child that I was did not for the most part care about _building_ these
tools, or being able to _modify_ and _inspect_ them. He cared about _using_
them. And he's excited about the immense progress in what feels like a very
short time.

This adult that I am, meanwhile, has a tendency to instead mostly complain
about wifi issues, Siri not picking up on my commands, the inability to
install f.lux on my phone, app crashes, the new Google Maps interface,
dropped Skype calls, the state of front-end development, and so on.

However justified that may be, I've found that focusing on what that kid wants
and overcoming the issues that stand in the way has been a much better
motivator than focusing on that adult.

To name a specific example: based on the articles and discussions here, I
sometimes feel a bit... sad that I've mostly been working in web development
since coming of age. Apparently we're reinventing the wheel badly, JavaScript
is a pretty bad or at best mediocre language, html/css are terrible because
they were not intended for app development, npm is a shitty package manager,
and so on. Sometimes I even start feeling nostalgic for the good old days by
proxy.

But then, when I finish a little journalling/project logging tool that
scratches a personal itch, and I can instantly release that to the web and let
my brother play around with it, or when I write a little bookmarklet that
allows me to fold/unfold/upvote HN comments using the letters on my keyboard,
well, then I feel good again.

Because then I remember that not that long ago I wrote a game in Delphi. It
required trying to figure out how to do something based on random computer
magazines and a single Delphi for Dummies book, it required waiting days for
help from some dude in Florida who thankfully was happy to assist me. It
required putting the game on a floppy disk and hoping that as it was passed
along to my dad and his colleagues, it would somehow get into the hands of
others.

That's when I get excited again about working with computers, and the progress
we've made. And that's the mindset that makes it easier for me to try and
think about ways to get my younger siblings and others as excited about
_building_ and _tinkering_ as I am.

------
kordless
_The Computer Revolution Happens_ would be a better subject to discuss. This
isn't something that is going to quit happening one day. It'll never stop
happening.

------
tejapr
Indeed.

I am particularly excited about Intel/Micron's 3D XPoint technology, which
will be on sale next year. Extrapolating the exponential rate of improvement
in storage technologies, I wouldn't be surprised if we had persistent TB
storage at SRAM speeds by 2040.

------
zepto
This is gibberish. The programming tools on the iPad are already quite
powerful - far better than those old BASIC-powered micros.

There is also no reason to believe that Xcode or an equivalent will not come
to iOS now that the devices are nearly performant enough.

As far as a completely open system like Smalltalk goes - that was a wonderful
world which I wish we lived in, but we don't - not because of Apple, but
because of malware, black-hats, and cyberwarfare.

~~~
_ph_
Apple does not have to open up the whole system - there are some good reasons
for limiting the access of applications to the system part of the filesystem
and the bare hardware.

However, Apple also very much limits the ability to run e.g. Squeak as an app
on iOS: you may run it, but not download source code over the internet. And
if you read the AnandTech iPad Pro review, you can see how much software is
held back by the restrictions on exchanging data between apps.

I am writing this as an Apple user and owner of several iOS devices. iOS was a
huge accomplishment in touch UI usability, now it is time to develop the
computing part to similar levels.

~~~
type0
The iPad dumbs down personal computing by doing things this way.
[https://www.youtube.com/watch?v=gTAghAJcO1o](https://www.youtube.com/watch?v=gTAghAJcO1o)
(Alan Kay talk, 14-15 min)

~~~
zepto
I love Alan Kay, but he's just wrong there.

And since the PC and Android are not locked down - why don't we see the magic
there?

~~~
anonbanker
Windows 10 is a walled garden.

Android is Google's walled garden by default.

There is no magic in walled gardens.

