
Coding Horror - Welcome to the Post PC Era - kreutz
http://www.codinghorror.com/blog/2012/03/welcome-to-the-post-pc-era.html
======
zalew
Ok folks, let me know when you throw out your computers. I mean, seriously. I
don't want to hear the 100th story about how the iPad changed your life unless
you literally get rid of your computer. When the number of people who
_replaced_ a computer with a tablet exceeds the number of people who _still
have_ one, we can talk about that post-PC mumbo-jumbo. So, how's it gonna be?

~~~
polemic
+1.

They're still just expensive toys. And I really, really mean that.

Have I ever received an email from someone using an iPad?

Nope. Not once.

Has anything I've looked at online in the last week been created on an iPad?

Nope.

Post-PC? Try post- _TV_.

~~~
ugh
I write all my email on the iPad. I'm not really sure how you would know
whether someone wrote you an email with an iPad or not.

(This comment was written on an iPad while relaxing on a comfy chair. You can
tell because I'm not using typographically correct apostrophes.)

~~~
spicyj
(If you hold down on the apostrophe key, you can make curly ones.)

~~~
ugh
(I know. I do that sometimes, but usually it’s too much work for me.)

------
julian37
I think Atwood is right: PCs are on their way out, simply because with tablets
there is so much less that can go wrong. He mentions tech support in the
article: THAT is why they will win, regardless of whether or not they fully
replace PCs, or whether or not they are better suited to writing emails.
Millions of people who can't be bothered with anti-virus updates and the like
will just find it a much more pleasant experience: "it just works" indeed.

(I'm a Mac user and I find myself spending a lot less time on tech-support
type issues than back in the day when I was using Linux and Windows; still,
it's hard to deny that iPads are currently the epitome of simplicity relative
to their power.)

And that's what I'm worried about: "No user serviceable parts inside". I'm
concerned that the very qualities that will let tablets take over the PC
market also mean that tomorrow's kids' experience will be very different from,
and I would argue poorer than, my experience when I got my first computer, a
Commodore VIC 20.

Ironically, my VIC 20 was also very user-friendly. You turn it on, it's on.
You put in a diskette -- well, it didn't load automatically, but making it
load was a very simple incantation, and once the game started, it was started.
Things were simple and worked, for the most part, quite well. No tech support
needed. But the BASIC shell was right there at your fingertips -- hell, the
manual came with example BASIC programs.

Quibbling over the merits of BASIC aside, it was a simple experience but it
was also tremendously open.

Things have gotten much more complicated since then; network connectivity in
particular introduced previously unknown threats to users, so of course this
is a simplified comparison. Still, I wonder: is giving up the freedom to
tinker with the system really the price tomorrow's consumers will have to pay
in order to buy simplicity?

And if you're thinking about replying with links to IDEs running on the iPad:
no, something that runs in the cloud and has no direct access whatsoever to
the internals of the machine does not qualify as a modern-day replacement for
this experience, no matter how sophisticated.

~~~
victork2
I think the whole question of tablet vs. computer is pretty dumb in the first
place. It's not a versus, it's a peaceful cohabitation; only some evangelists
and/or journalists want to push us to believe there's a war, but that's really
not true.

Tablet and PC simply correspond to two totally different uses of media. A
tablet is consumption-centered, whereas the PC can produce complex goods and
services that can then be consumed on a tablet. Will the tablet push PC usage
down to very low levels in some areas? Probably! Will it replace the PC for
every use? Are you kidding me: development, 3D effects, advanced audio
processing, advanced photo processing. That's just as stupid as saying that
Photoshop is dead because Instagram can do sepia...

PS: I will absolutely never buy a tablet. I fight every day to stay on the
creator side of society; with a tablet I would just give up!

~~~
julian37
_It's not a versus, it's a peaceful cohabitation; only some evangelists and/or
journalists want to push us to believe there's a war, but that's really not
true._

I wouldn't call it a war, but Atwood (who, by the way, is neither an
evangelist nor a journalist with an agenda) makes the point that tablets are
about to take over large shares of the PC market. And my point was that this
way, people (non-hackers) get even further removed from the "close to the
metal" experience that I and others in my generation used to have.

With all due respect, I think you and dozens of other posters in this thread
are missing the point by "defending" the notebook and saying it won't get
fully replaced. Of course it won't, but that's not an interesting question.
The interesting question, in my mind, is: how can we avoid the TV-ization of
computing? How can we make sure future generations don't equate a computer
with a consumption device rather than a productive device? How can we make
sure they can find out that this "magical device" isn't actually driven by
fairy dust? Whether or not some share of the population still uses notebooks
doesn't matter very much at that point, once the default for the millions and
billions of non-hackers out there is to use a tablet.

Fortunately there are initiatives like the Raspberry Pi. It's telling that
one of the main initiators of that project, David Braben, is also an
old-school guy who grew up with computers in the 80s. Or so I'm guessing -- at
any rate, he wrote some of my favorite 80s games, Elite and later Frontier.
But I fear that if you plot the Raspberry Pi's sales curve onto Atwood's graph
it won't look anything like the iPad's.

~~~
victork2
Don't worry, you're full of respect ;).

It's maybe a philosophical question, but who _doesn't_ have an agenda? I
mean, when you buy a device, don't you want to convince others that you
haven't bought a stupid toy or a useless object? Everybody has an agenda; even
someone who just bought an iPad has incentives to convince others that his
device is great. The only ones who don't have an agenda are the ones who
genuinely don't care.

I have an agenda: I am trying to push people to see computers as something
other than a tool to consume media and slack around with. But given the
current design of tablets, especially Apple's iOS devices, it's getting really
hard to see them as anything other than a "pure-consumption" product.

To continue on the philosophical level, I think humans deep down aspire to do
very little while receiving a lot of pleasure. The most productive of us have
managed to channel this desire towards building systems that, once they are
working, will provide us with a lot of reward for little additional work (the
startup!). In a way, self-driven programmers have achieved a post-slacker era
in their minds, and god knows how hard it is to keep that up. The
iPad/iPhone/iPod Touch/Android phones are enablers of the slacktivist part in
every one of us, and that's one of the big reasons for their success.

Sure, the Raspberry Pi is great, but, again, it's in the realm of
post-slacktivism and thus it will probably NEVER be popular. That doesn't mean
it won't be successful.

I hope this comment will find interested minds!

~~~
ImprovedSilence
I think tablets will replace "PCs". But I think it won't be as bad as you
think: you'll still have your dev environment, your mouse, keyboard, and
touchpad all working in harmony. Here's a snippet from another post I made
(discussing mobile vs. desktop OSes); hope it provides an alternative
viewpoint:

"I'm pretty sure this is the direction all OS's we will be using are going to
end up in the coming years. Some people will cry and yell and holler that
there is no way something can be made for both keyboard, mouse, and touch
screen. I see this as simply a failure of vision. That's absolutely the way
things are going to be going, and it's coming sooner rather than later. (I
actually think the OSX launchpad is pretty close to allowing this
implementation) Devices are getting smaller and smaller. I use a Mackbook air.
but guess how it gets hooked up at home? That's right, wireless keyboard and
mouse, nice big monitor, I never see the actual computer. It's only a matter
of time before my laptop gets replaced by a tablet that has comparable
hardware specs. The OS allow for my normal desktop interfaces, along with a
nice touch screen interface. I'll use it to pick out movies to play on my TV,
from my couch. I"ll use the same device to write code at my desk (with big
monitor and keyboard). My kids will use it to play video games, both mobile,
and on the TV. I'll use it to send email in the backseat of a car moving at
65mph. (As a wireless comms guy, I fully appreciate the technology it takes to
perform that last action.) But the bottom line is, It's going to happen.
Sooner rather than later."

I guess my point is, mobile and desktop can blend perfectly. Those who don't
need a monitor or keyboard won't get one; devs and graphics guys will.
Technology will keep shrinking the size of processing power. Could you ever
have imagined something as small as an MBA being a full-on computer?

Last bit, and my only reservation about this whole "movement": I hope they
keep it more open, more like OS X, versus an overly walled garden à la iOS.
But the signs are there; I think it can be done, and done very well. It's only
a matter of time.

edit: It seems your biggest problem is that these devices are built for
consumption rather than creation. And I'll agree, that's my biggest
reservation about the mobile concept. I guess I could have been clearer about
that. I'd like my iPad (not that I actually have one) to be more like OS X,
where I can hook up a monitor and keyboard and go to town on my OS, while
being able to switch into "Launchpad" mode while mobile.

------
ChuckMcM
Atwood's thesis is that the new iPad improved on those things that make
tablets useful.

I agree with him that the updated display is the killer feature. But you have
to give props to a high performance wireless network architecture too.

That being said, given two tablets with identical specs, I'd prefer an
Android tablet and its more accessible ecosystem. The problem for Android
right now is the R&D and supply-chain investment competition: ten
manufacturers each investing $10 million in their own tablet designs is not
nearly as efficient as one company investing $100 million in a single tablet
design.

I keep hoping Google will address that issue by providing
hardware/manufacturing R&D, but it isn't one of their core competencies. When
I worked there, folks would call me to get an introduction to the people
designing the next Android phone; I'd chuckle and explain that they had to go
talk to Motorola or HTC. Perhaps now, with the Motorola acquisition, they will
be in a place to make that investment. Time will tell.

------
martythemaniak
"But until the iPhone and iPad, near as I can tell, nobody else was even
trying to improve resolution on computer displays"

The original Droid came out in Nov '09 and had a 265 ppi screen (essentially
the same density as the new iPad's 264 ppi). When I upgraded from my 1st-gen
iPhone to the Nexus One shortly after that and put the two side by side, I
simply could not stop admiring how incredible the screen was.

Sorry, but it really was just simple economics and we would've gotten high-DPI
displays regardless.

~~~
Xuzz
I don't think it's about iPhone vs. Droid vs. Nexus: it's about phones and
tablets versus desktops and laptops. Most laptops today aren't sold with a
200+ DPI screen; I think I'd have a hard time finding any. And yet most phones
(and tablets, with the new iPad) do have these high-density, _better_,
screens. Jeff's point is that the switch to phones and tablets is what enabled
that improvement to occur.

------
VonLipwig
I disagree with the post.

"""The slope of this graph is the whole story. The complicated general purpose
computers are at the bottom, and the simpler specialized computers are at the
top."""

This is a terrible graph. The Mac has always been a premium niche computer.
28 years ago computing was in its infancy; computers were expensive,
unnetworked, and of limited benefit to the average household.

Compare this to the iPad / iPod Touch and iPhone. These are mainstream
devices that achieved immediate traction. It is unrealistic to compare Mac
sales with those of mainstream devices.

You then have the desktop market as a whole. If you compare any single
company or brand of desktop against iPad sales, desktop sales would look to be
in trouble. However, if you compare overall desktop sales to tablet sales, it
is clear that tablets are still only in their infancy.

BUT TABLETS ARE SELLING FASTER THAN DESKTOPS!

I don't know if this is true, but it doesn't matter, and it doesn't say much
about the state of desktops. Everyone has a computer; the market is saturated.
Almost no one has a tablet yet. It makes sense that tablet sales would rocket.

Buying a tablet also makes a lot of sense for people who just consume
content. The fact is, though, that the iPad isn't great for productivity. It
is far better for consuming content, and that is what most people do.

However if you want to program, edit images, write a novel, maintain
spreadsheets, make movies etc a desktop / laptop is what you need.

The PC isn't dead and it isn't dying. It has reached a point where people
only buy replacements. Now yes, some people may switch permanently to a
tablet. That's fine. For the foreseeable future, though, there will be a large
market of businesses and consumers who require more than a tablet can provide.

---

The article also touches on how great the new iPad screen is. I don't think
it's all that. I walked past the demo of 'the new iPad' twice before asking a
sales guy to point out which one was 'the new iPad'. Yes, if you put the
screen close to your face you see less pixelation on icons. For general web
surfing, though, I saw no perceptible difference. Hell, I don't really see any
pixelation on my iPad 2. Perhaps I am not holding it close enough to my face...

~~~
seanmcdirmid
As has been pointed out in the replies about 30 times already, post-PC
doesn't mean the PC is dead; it only means that the PC market is mature and
innovation slows way down as everyone focuses on newer, more dynamic markets.
The fact that people feel no need to upgrade their PCs anymore, yet are still
rushing out to buy the new iPad, is symptomatic of that. Surely I'll buy a new
PC in the future to replace my old one, but I don't expect my sister or dad to
do the same.

I once had a workmate in 1999 whose sentiment was similar to your post, just
focused on a different device. We got into an argument about whether the
desktop PC market was over or not. He didn't really believe laptops were more
than niche devices: they were too slow, their displays too small; most people
would use desktop PCs for decades to come. You could never do more than surf
the web on one; you couldn't hack serious code with it.

Tablets are for consumption only...really? Diagramming (e.g., OmniGraffle),
vector graphics production, sketching, mixing and producing music, editing
images, updating a spreadsheet...why not? Many are using tablets for
production already. Every year someone says the iPad can't do X; the next year
someone releases something that does X, and it actually doesn't suck.

I often use my iPad 2 in bed (~1 foot away); the screen is really close and I
can see the pixels. The new iPad is absolutely frigging amazing; I'll never
look at my crappy Dell/HP monitors at work in a positive way again (yes, I can
see the pixels!). Why have we been stuck at the same crappy 1920x1200
resolution for at least 5 years now? Could it be that no one cares about
innovating in the PC market because they can't make any money? That a 10" iPad
has a higher resolution than my 24" PC monitor is totally post-PC.

~~~
przemelek
Simply wait ;-) With age your eyesight will probably decline, and at some
point you'll simply stop seeing the difference. I guess that 70% or more of
people in their 30s will have trouble seeing the difference (except for
brighter colors). Of course, in one or two years everything will have screens
with more pixels, but past some point it will only ever be a marketing thing.

As for now: if you look at a screen 22 cm wide with a horizontal resolution
of 1280 pixels, held at a distance of 0.5 m, one pixel subtends about 1.2
minutes of arc, so some, and maybe even most, people aren't able to
distinguish single pixels.
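The arithmetic behind that figure, as a quick Python sketch (only the numbers
given above go in; the ~1 arcminute acuity limit in the comment is the usual
20/20 rule of thumb):

```python
import math

# Angular size of one pixel: a 22 cm wide, 1280-pixel screen viewed from 0.5 m.
screen_width_mm = 220.0
pixels_across = 1280
viewing_distance_mm = 500.0

pixel_pitch_mm = screen_width_mm / pixels_across            # ~0.17 mm per pixel
angle_rad = math.atan(pixel_pitch_mm / viewing_distance_mm)
arcminutes = math.degrees(angle_rad) * 60

# Normal 20/20 acuity resolves roughly 1 arcminute, so ~1.2 arcminutes per
# pixel puts this screen close to the limit of what many eyes can distinguish.
print(round(arcminutes, 2))  # prints 1.18
```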

Some people are using tablets for work, and some are doing the same on
phones, but for many, computers are much more comfortable. It is a very
personal thing: for me, a tablet is cool for playing some addictive games like
Cut the Rope or Where's My Water?, but I will kill anybody who proposes
swapping the computer I use for development for any tablet. Likewise, after
reading several books on a tablet, I much prefer reading on a Kindle; it
doesn't have that ugly glossy screen where I can see myself instead of the
text ;-)

~~~
seanmcdirmid
Higher PPI is more important, not less, as your eyesight degrades. It turns
out pixelation is bad for your eyes. The advantage of a higher PPI is that
things are clearer, not smaller, as some Windows aficionados have come to
expect.

Nobody is significantly investing in desktop PC display technology right now
as far as I know, not Samsung or LG; it's very stagnant. That's the point: PC
R&D seems as good as dead. It's a big shame, as I would really love a decent
display for my workstation.

------
pygorex
The day I can code an iPad app on the iPad is the day I enter the post-pc era.

Or code, test & deploy web apps from an iPad.

Or design & layout web pages on an iPad.

We live in a post-mainframe era even though mainframes are still alive and
kicking. But a mainframe isn't required to create, test, and deploy PC
software. Once PCs are no longer required to create, test, and deploy code for
mobile devices we'll be in a post-PC era - that is, once PCs are no longer a
required part of the general-purpose computing ecosystem.

~~~
trimbo
In that way, I think of the iPad a lot like I think of game consoles. They're
not self-hosting (or whatever you want to call it), and they're not designed
to be anytime soon. They're designed to be a software/content distribution
channel for a hardware manufacturer who takes a royalty on that content in
addition to selling the device.

------
stcredzero
_But Steve Jobs certainly saw the Post PC era looming as far back as 1996:

The desktop computer industry is dead. Innovation has virtually ceased.
Microsoft dominates with very little innovation. That's over. Apple lost. The
desktop market has entered the dark ages, and it's going to be in the dark
ages for the next 10 years, or certainly for the rest of this decade.

If I were running Apple, I would milk the Macintosh for all it's worth – and
get busy on the next great thing. The PC wars are over. Done. Microsoft won a
long time ago._

The fact that Steve Jobs could see this coming back in 1996 is incredible.
Add to that that Steve also knew, all through the Tablet PC initiatives of the
early 2000s, that it was too soon, that the technology wasn't yet able to
properly support the post-PC form factors, and that styluses were a loser
proposition. Bill Gates saw the same forces at play, but he responded in the
early 2000s with a failed attempt to move the PC into the next era. I guess
that's a natural reaction for the winner of the PC era -- to try and carry it
on.

You can see the same forces at play in the audio industry. Audio has gone
beyond the early DIY tinkerer days, through a period of standardization and
ubiquity, through an era where big, complicated, elaborate machinery was a
status display, to a time of maturity where convenience and design prevails
and the technology fades into the background.

The problem with audio is that it doesn't let us extrapolate to the future.
What's next? I can see a period where feature phones increase in capability to
the point where they're like the 1st-generation smartphones, but with networks
that are 1000X more capable -- just in time for widespread adoption by the
developing world.

Will the developing world leapfrog the developed world in much the same way
that it skipped over the era of hardwired networks straight to mobile? Will
they be coming to economic power just as the digital realm has obsoleted many
forms of material wealth?

~~~
arn
I agree the quote from Steve Jobs was impressive, but it also fits the
well-known line that "the best way to predict the future is to invent it."

------
nikatwork
As happens all too often at HN, these comments are devolving into a pedantic
argument about definitions and irrelevant technicalities.

Atwood does not say "hey throw out your PCs". In fact, he opens by stating
that _PCs are already ubiquitous_. This article examines the idea that major
innovation is currently happening in the post PC area. He posits that the new
iPad display is a killer feature.

This is a much more interesting talking point than "o noes but my Linux
netbook is kewler".

------
mcantelon
Laptops were way more revolutionary than *pads.

"Post-PC" is just a talking point. The post-PC era will eliminate PCs like
McDonald's eliminated kitchens.

~~~
to3m
Unconvinced.

It might seem like laptops are revolutionary, or were, but useful ones have
been around for 20 years, and to what end? They've become progressively more
useful in terms of screen quality and battery life, but they have consistently
failed to produce any kind of revolution. And why should they? They're just a
more expensive version of your desktop computer that's small enough to carry
around.

I'm happy to class this as useful, but the laptop's persistent insistence on
being merely a small version of the desktop falls somewhat short of
"revolutionary".

~~~
mcantelon
Laptops made new types of work and socializing possible. Affordable laptops
enabled people to cowork, work nomadically, come together for "data
potlucks", etc. Laptops largely replaced turntables and hardware instruments.
*pads merely iterate on this initial revolution, making it more convenient and
providing training wheels for those not born into technology-influenced
culture.

------
danso
>> _iPad 3 reviews that complain "all they did was improve the display" are
clueless bordering on stupidity. Tablets are pretty much by definition all
display; nothing is more fundamental to the tablet experience than the quality
of the display._

While I agree that the display improvements are noteworthy, I would say that
tablets are not "by definition all display." The tactile interaction is just
as much a part of a tablet's worth.

~~~
experiment0
iOS devices have had their tactile interaction down since the beginning. They
are so responsive compared to Android devices (in my experience) that I don't
see where Apple could have improved in this area.

~~~
w1ntermute
I don't know which Android devices you've used, but I simply haven't found
this to be the case. In my experience, the touch response on high-end Android
devices (such as the Galaxy Nexus) is indistinguishable from iOS devices.

~~~
dasil003
I'm on a Nexus One at the moment. The touch is good when it works, but it's
buggy. The more quickly you tap, the more likely it is to go into a buggy mode
where a big dead spot develops in the lower center part of the screen. It's
absolutely infuriating, and it makes me much more envious of iPhone users than
any of the apps do.

If the Galaxy Nexus is really on par with the iPhone (and I'm highly skeptical
of that) then I might stick with Android for my next phone, but otherwise I
have to go iPhone just based on touch characteristics alone.

~~~
commandar
Didn't the Nexus One ship with a notoriously poor capacitive sensor? I recall
it being blamed for the N1 struggling with multi-touch compared to other
phones.

EDIT:

[http://androidandme.com/2010/03/news/is-multitouch-broken-on-the-nexus-one/](http://androidandme.com/2010/03/news/is-multitouch-broken-on-the-nexus-one/)

------
program247365
Jeff is making the point that [Apple is] "fundamentally innovating in
computing as a whole...".

I see the iOS devices (iPhone/iPod Touch/iPad), as just the beginning. Apple
has made astounding progress on these mobile devices where their competitors
haven't really caught up to yet:

1) The display: now amazingly higher-res than HDTVs (2048-by-1536 resolution
at 264 pixels per inch (ppi))

2) Connectivity (Wifi/multiple carriers/LTE even)

3) Battery life to weight ratio (9 hrs using the cellular network, at 1.46
lbs) [1]

4) Graphics/Processor punch (Dual A5X processor with quad-core graphics)

5) An ecosystem of applications that are, on the whole, of good quality and
generate a lot of revenue (25 billion apps downloaded [2])
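That 264 ppi figure in point 1 checks out against the quoted resolution (the
9.7-inch diagonal is my assumption; it isn't stated above):

```python
import math

# PPI from the quoted resolution: 2048x1536 on an assumed 9.7" diagonal.
width_px, height_px = 2048, 1536
diagonal_inches = 9.7

diagonal_px = math.hypot(width_px, height_px)  # 2560 px corner to corner
ppi = diagonal_px / diagonal_inches
print(round(ppi))  # prints 264
```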

When I say "just the beginning", I really mean that. And I don't mean that
Apple will always be at the top, probably not by a long shot. What I want to
ask, and I wonder why everyone else isn't asking is, what's next? Apple won't
be at the tippy top forever, so won't the next generation of innovators please
stand up?

[1] Anyone remember lugging around their 8+ lb Dell laptop with a battery
that lasted maybe 2 hours, in 2005? I do. My back still hurts.

[2] [http://www.apple.com/pr/library/2012/03/05Apples-App-Store-Downloads-Top-25-Billion.html](http://www.apple.com/pr/library/2012/03/05Apples-App-Store-Downloads-Top-25-Billion.html)

------
RyanMcGreal
I realize I'm straying dangerously close to Get Off My Lawn territory, but I
worry about the post-PC era. As a programmer, I worry about turning into a
sharecropper on a closed platform [1].

[1] [http://www.tbray.org/ongoing/When/200x/2003/07/12/WebsThePlace](http://www.tbray.org/ongoing/When/200x/2003/07/12/WebsThePlace)

As a consumer, I worry about "buying" a closed, networked device and
discovering that I don't really own it, but merely have a licence to use it
under draconian terms of service. I don't want to "buy" a book or a movie or a
piece of software, only to have it yanked off my device remotely.

Meanwhile, the government of my country (Canada) is rolling out a new
copyright law that will make it a criminal offence to circumvent "digital
locks", even if it's for legitimate, legal personal use.

However, I still hold out hope that as the technology matures, costs fall,
and competitors catch up to Apple's tight vertical integration, it will become
more feasible to run a full-featured open-source device with a healthy
ecosystem built around it.

------
firefoxman1
_"nothing is more fundamental to the tablet experience than the quality of the
display"_

Well, I'd have to say the OS's interface and overall UX are more important.
Maybe not for some, but they're a big deal for me.

------
mark_l_watson
I love my iPad 2, don't get me wrong. The thing is, for me the iPad
supplements my MacBook Pro: it acts as a second monitor, I use it for reading,
and it is just large enough to be good for watching movies.

The thing is, most of my computer use is programming and writing. I need a
keyboard, support for a term window, Emacs and IntelliJ, TeXShop, etc.

I might be able to earn a living using just an iPad, running a terminal app
and doing development on a remote Linux server with SSH, Emacs, etc., and
doing my writing in Pages, but that would be like running a race hopping on
one foot.

I think that "iPad 5" might do it for me however: a larger physical screen
size, about half the weight, and great IDE support for doing Lisp, Java,
Clojure, etc. I'll wait 2 years and see. I think that this will require new
paradigms in writing and programming tools. Mind is open.

------
ezy
Not yet.

My wife is not an advanced user of computers by any stretch of the
imagination, but I would never buy an iPad for her.

She has a DSLR and 60k photos in iPhoto (~320 GB). iPads will eventually
store this much, maybe in 3 years, but you will want it backed up somewhere
(and in 3 years the "cloud" _still_ won't handle this much (personal) data at
a reasonable cost compared to a physical device). And she will not be tolerant
of doing edits or looking at photos on a 10" screen.

Not to mention that once you increase the screen size, portability goes out
the window -- you will have a monitor. And guess what? Touch input doesn't
really work ergonomically on a 24" monitor.

There are solutions; AirPlay will obviously be involved, as will BT input
devices. But it's not quite ready for prime time yet.

~~~
alain94040
Everyone's case is different, of course, but it's very possible that soon
your wife will be on the iPad only.

For large monitors, AirPlay to the TV works great today. I can snap a picture
on my iPhone, step inside my house and the next second, show the picture on a
50" TV to all my guests. Instantaneous. I don't have to connect to a computer
at all. It even works for videos.

Combine that with an Eye-Fi card and cloud storage, and I'd say that in two
years tops you'll feel like it's a completely normal and safe way to manage
your pictures. Use the iPad to arrange them into collections. Show them on the
TV. Store them online.

~~~
jiggy2011
The only thing missing there is editing. If you are trying to do something
beyond adding a few filters, you are probably going to want either a mouse or
something closer to a proper graphics tablet (which is quite different from a
capacitive touchscreen).

------
malux85
Yep, the article is right, BUT it's about consumers. The iPad has allowed my
grandmother (who is so old-fashioned she doesn't believe women should _drive_)
to pick up a device and video call me on the other side of the world. We told
her "It's not a computer, it's a new invention," and she picked it up and
invested time in learning it.

Most people here are actually producers when it comes to computers, so most
people say things like "I still need to produce (spreadsheets | programs |
lolcat images)" and the PC will never go away. But I think this article was
talking about your web-surfing, Facebook-updating, web-_reading_ consumers.

------
wisty
To all the people who are sad - it will be OK.

Workstations killed mainframes because you didn't need that stupid command
line; you could point and click. But real software engineers would install
UNIX tools, or SSH (or telnet) into a server, and take advantage of the pretty
display for richer reports.

PCs killed workstations because they were cheaper and their OS was more
user-friendly. Unfortunately, "user friendly" meant "dumbed down". You could
take advantage of innovations in IDEs, word processors, email programs, and
web browsers, but it was harder to install UNIX tools. You could still SSH
into a server (which would be a workstation, if you were on a tight budget)
and get UNIX stuff done. You could also configure X and do workstation stuff.

Now tablets are killing PCs, because they are cheaper, more portable, and
more "user friendly" (dumbed down). There's probably no way to install UNIX
tools. But you can get a virtual server for peanuts (or set up a home PC),
hook up a USB keyboard and an SSH "app", and everything is fine. The display
is a bit small, but many tablets these days can power an HD display (look at
the Raspberry Pi, which uses mobile-phone components). Apple may or may not
include the right port, but Android tablets can. And you can take advantage of
innovations in touch interfaces.

------
bicknergseng
"Very, very small PCs – the kind you could fit in your pocket – are starting
to have the same amount of computing grunt as a high end desktop PC of, say, 5
years ago. And that was plenty, even back then, for a relatively inefficient
general purpose operating system."

I would argue that it's more than they need, and that a service like OnLive is
ahead of its time but no more than a few years from hitting its prime. I would
say get ready for the Post-Hardware Era.

~~~
jiggy2011
That's going to need some serious network infrastructure to back it up. You
will essentially have every computer user in the world constantly streaming
very high resolution video for 8 hours+ a day.

The 1 GB/month limit most carriers provide is going to be nowhere near enough;
you will need more like 1/2 TB per month, and it needs to be provided
reliably.
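The half-terabyte figure is roughly right if you assume a ~5 Mbps stream (the bitrate is my assumption, not the commenter's):

```python
# Back-of-envelope check: total traffic for streaming a remote desktop
# as constant-bitrate video.
def monthly_traffic_gb(bitrate_mbps, hours_per_day, days=30):
    """Gigabytes transferred for a constant-bitrate stream."""
    seconds = hours_per_day * 3600 * days
    bits = bitrate_mbps * 1e6 * seconds
    return bits / 8 / 1e9  # bits -> bytes -> gigabytes

print(monthly_traffic_gb(5, 8))  # 540.0 GB/month, i.e. about half a terabyte
```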

~~~
icebraining
I think that's completely possible if we change the model of our current
infrastructure.

Here in Portugal, our main (wired) ISP has been putting Foneras[1] in every
subscriber's home for a couple of years or so, which means that every
subscriber they have can connect for free anywhere there's a home with their
service. A couple of months ago they launched an iPhone app which can use
their network of Wifi routers to make calls through them - they essentially
became a mobile provider using Wifi.

I can see this approach taking place on a larger scale; not only related to
connectivity, but to actual computing: imagine that each router was also a
beefy server, particularly in terms of CPU and GPU capabilities; these would
provide the computing needs for wireless devices connected to them.

Using existing ways of moving the "state" of an application between machines,
you could have your wireless devices hopping from server to server as you
moved around, without having a great impact on the rest of the network.
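As a toy sketch of that state-hopping idea (the names and structure are mine; real systems would use something like live VM or session migration):

```python
import json

# A session's state is just a serializable blob that follows the user
# from basestation to basestation.
class Session:
    def __init__(self, user):
        self.state = {"user": user, "open_docs": [], "scroll_pos": 0}

    def serialize(self):
        return json.dumps(self.state)

def hand_off(session, old_station, new_station):
    """Move a session between basestations as the device roams."""
    blob = session.serialize()
    old_station.pop(session.state["user"], None)
    new_station[session.state["user"]] = json.loads(blob)

station_a, station_b = {}, {}
s = Session("alice")
station_a["alice"] = json.loads(s.serialize())
hand_off(s, station_a, station_b)  # alice walks out of A's range into B's
print("alice" in station_a, "alice" in station_b)  # False True
```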

This would still mean hardware in each home, of course, but it'd be a black
box hidden from sight and chosen, bought and managed by some company.

[1]: <http://www.fon.com/en/info/whatsfon>

~~~
jiggy2011
The crux of the problem is that bandwidth is currently oversold since it is
largely unused most of the time.

Let's compare my session here on HN with how it would be if I was streaming my
web browser from somewhere else.

Currently I pull down a few KB of ASCII every time I load up a comment thread and
then spend maybe 20-30 minutes reading it during which time no bandwidth is
used.

If I was streaming this from a browser in the cloud, I would be pulling down
an order of magnitude more data since scrolling down the page could easily be
a few MB vs no data at all.

You could have hardware "basestations" as you suggest which would be an
interesting idea. Of course locating these geographically would be a difficult
task. You might end up with rural areas where it would be impossible to do any
computing at all because no basestation was in range, or you would end up with
urban areas where there would be massive demand.

You would also have to account for what happens if somebody decides to have a
big party or event somewhere and suddenly the local basestation computer goes
from only having to service 10 - 100 clients to suddenly having to service
1000+?

~~~
icebraining
_You could have hardware "basestations" as you suggest which would be an
interesting idea. Of course locating these geographically would be a difficult
task. You might end up with rural areas where it would be impossible to do any
computing at all because no basestation was in range, or you would end up with
urban areas where there would be massive demand._

Rural would probably fallback to mobile networks and real datacenters,
possibly with degraded service. Urban would just be adjusted based on trends -
places with higher demand would get more powerful boxes; the same as scaling
any other servers, really.

 _You would also have to account for what happens if somebody decides to have
a big party or event somewhere and suddenly the local basestation computer
goes from only having to service 10 - 100 clients to suddenly having to
service 1000+?_

Assuming a decent infrastructure, you'd only need to have enough wireless
bandwidth; computing resources wouldn't necessarily have to be tied to local
devices. So a party would just mean that more machines from the neighborhood
would help via wired connections (I'm assuming basestations in the same
neighborhood could speak to each other using local routers, without having to
do a round trip to the ISP datacenter).

Of course, this still means that machines shouldn't be running at 100%
capacity, but that's just common sense.

~~~
jiggy2011
It's certainly an interesting concept with some merit. However it would
change the game from being "damn I can't get any signal, no internet for me
but I can still work offline" to "no signal, I literally can't do any
computing at all!"

So what I could see happening is people at that point carrying around
"personal compute servers" with them for just such an event, or at least
having one in their home/office.

Also bear in mind that CPUs/GPUs appear to be getting pretty cheap as well as
pretty small and you would need one anyway to do the video decompression so
having something on your device capable of doing the basic UI etc isn't much
of a stretch.

Perhaps you end up with 3 tiers of processing, i.e. your device, your local
node and "the cloud" (i.e. a proper datacenter). The issue is making all of
this transparent from a usability perspective.

~~~
icebraining
Oh, I don't think we'll ever return to actual dumb terminals - I think the
difference will be that some services will be available and others won't. Or
some others will just be scheduled until you get near a basestation.

------
ericfrenkiel
I was so inspired by this article that I actually went to the Apple store in
San Francisco and purchased the iPad this evening. After spending a couple
hours with it, I'm certainly impressed by the hardware, but I've already
decided to return the device tomorrow.

It's a great couch device, but it's too heavy to carry with you. I don't carry
a murse, and I'd rather sling a Macbook Air than this for the extra
functionality.

Minor things include it running too hot, and the flip cover leaving striated
streaks on the display. In my opinion, iOS doesn't scale properly at this size
- there is too much white space between the icons on the home screen, for
example, and I would prefer to see widgets to show more information without
having to open an app.

Ultimately, whether you're happy with the iPad comes down to how much you buy
into the Apple cloud versus the Google cloud.

In all truth, there is very little difference between the iPad 2 and the iPad
HD. If you squint and peer very close to the screen you can see how high
quality the screen is, but looking at the device from where it rests in your
hands it's almost impossible to tell the difference.

------
sdfjkl
Just today in the IT department (this is why it's good to occasionally work in
the office): "So my old PC at home finally bought it. I've seen you're all
using Macs (this guy is observant) and I've been wondering if I should get one
of them. Or could I just get an iPad instead? I've been tempted to get one of
them anyways".

Turns out the iPad does everything he and the wife want, which is mostly
surfing the web, listening to iTunes[1] and getting pictures from their point-
and-shoot camera to someplace safe. Remaining issue: printing the odd
document. But you can get an AirPrint enabled smearjet for £35.

Our recommendation: Get the iPad now, see if it does everything you want. If
it does, get another iPad for the wife instead of that Mac you'd share between
you.

[1] Explaining iTunes Match made his eyes light up: "So I won't need to back
it up[2] to a USB disk anymore? And I don't need to buy the 64 GB iPad to fit
it all on?" (Hey, it's like someone designed this service for people like
him).

[2] Most casual computer users seem to do this these days because they've
already learnt the hard lesson of losing all their stuff once.

------
kriskeyman
Two years ago I bought my mom an iPad. I was hesitant to do so because I
wasn't sure if she would actually use it. A year ago I bought her a new iMac
(upgrade from a previous gen. iMac). She likes it, but now considers the iPad
to be her main computing device.

My 5 year old son spends 90% of his supervised 'computer time' on an iPad. He
doesn't understand the concept of a pixel, or why it's good that we don't see
them. He has an older iMac in his room, but still prefers using the iPad as
his 'main computing device'.

Talk about burning a candle at both ends! The term 'desktop' is archaic when
thinking about a computer as a tool. Designers and engineers on the forefront
of technology are spending their time creating devices for the 'Post PC'
generation... not creating a better 'desktop'.

Reading some of the comments here, I cannot help but picture the accountant
who still keeps his numbers in a [paper] notebook. At least they still make
pencils!

------
thwest
PCs will stick around, of course. The worry for me is that when PCs serve only
engineers and content creators, production scale will diminish and prices will
increase.

------
PhilipG
For me, the time per day I spend on a PC has not changed since getting a
tablet. However, I can count on my hands the number of times I have bothered
to turn on the TV since getting the tablet. The tablet has replaced the TV for
me.

------
melling
Apple already announced the Post-PC era. It'll be a slow transition for a
couple more years. We really need better input options. Fingers are great, but
sometimes I want precision.

What's interesting about Jeff coming around is that he's a pretty hard-core
Windows guy. It was funny to have Macs come up on the StackOverflow podcast.
Joel would recommend them for family, etc., and you could hear it in Jeff's
voice that he didn't see the need. We'll find out in 6 months if it's going to
be a
3 horse race.

Now Jeff has a reason to learn C. Not to worry, ObjC on iOS is not your
father's C. It's a lot more fun.

------
mangoman
As a hacker, I feel like I am missing out on the wonders of tablets and the
iPad that I see non-hackers getting extremely excited about. I can get super
excited about my smartphone, but that's because the devices it replaced were
infinitely less exciting (my dumbphone, my mp3 player). But for iPads, I just
feel like everything I could do with them can be done on my laptop better. My
laptop has a bigger screen, so watching movies is a more pleasant experience.
I can play more fun games on my laptop (the many years of Flash games are
probably a more vast catalog than the games on the iPad). I can compose
documents more comfortably than I can type on a touch screen. I can CODE on my
laptop! Granted, my laptop doesn't have 10 hours of battery, or that level of
pixel density (where I completely agree with Atwood: displays need to get
better), but I have yet to find a reason to buy a tablet. I think that
non-hackers who don't really use that many features of a PC should get VERY
excited about tablets, but until I can program on a tablet comfortably, I
don't think I could bite the bullet and buy one.

------
j_baker
Email has been around (in some form) since 1965. However, I think it's a bit
too early to say that we're living in a "post-mail era". Will such a thing
ever truly happen? I don't see that happening until we invent teleporters for
all the physical items we need shipped.

By that same token, are we _truly_ in a post-PC era, or is it simply an era
where tablet use is on the rise and PC use is in decline? Nifty charts and
screen resolutions aside, I don't see any evidence that we're past the age of
the PC. My parents still haven't replaced their computer with an iPad. Neither
have any of my old friends from school. I use my tablet for more and more, but
I just still can't avoid using my laptop for certain things. And I most
certainly don't have a tablet or phone hooked up to my television. And how
many offices have you seen where all the PCs have been replaced by mobile
devices?

In short, while I see things moving more towards tablets and mobile devices,
I'll be convinced that we're in a post-PC era when I see it. Until then, it's
all arbitrary speculation.

~~~
stcredzero
_However, I think it's a bit too early to say that we're living in a "post-
mail era". Will such a thing every truly happen? I don't see that happening
until we invent teleporters for all the physical items we need shipped._

Well, we are definitely past the start of the post document-mail era. As for
the rest, it's just a matter of implementation: is the advanced fabrication
tech in your living room, across town, or across the ocean? So long as you get
your product in a timely fashion, what's the difference? (Answer:
manufacturing and supply-chain costs and how these impact profits and the
consumer.)

~~~
j_baker
True, fewer and fewer documents get mailed. But I think it's still a bit early
to say that we're in a post-mailed-document era. I still get documents mailed
to me every day! There are still some uses of physical mail that haven't been
replaced by email.

~~~
stcredzero
_There are still some uses of physical mail that haven't been replaced by
email._

Yes, but it's clearly just a matter of time. There's no infrastructure or
technology we need to wait on. There's no document transmission function that
email couldn't fulfill for more than 90% of the populace, no barriers to that
becoming a norm.

Cars didn't cause 100% abandonment of horses instantaneously. Before the
development of the largest trucks and most capable offroad vehicles, there
were certain uses only draft animals could fulfill for quite a while. Still, I
think the post horse-carriage era did begin with the first commercially sold
cars.

------
test4334
Screen resolutions are not growing because Windows doesn't scale displays well.

Real world: many people, and a business I know, use HD resolution on Full HD
displays (and 800x600 on 1280x1024 displays) just because the letters are
bigger.

I really hate the current standard resolution, 1366x768. It fits 20 lines of
code in Visual Studio. Even 10 years ago we had better monitors.

------
ivanhoe
There is an inverse relation between required DPI and the distance of the eye
from the printed material or monitor (the further away you are, the less DPI
is needed). Therefore I'm not sure retina screens are such a great idea for
big desktop screens; in a usual desktop setup you probably just wouldn't
notice that much difference.
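Under the standard visual-acuity assumption (one pixel per arcminute for 20/20 vision; the formula and thresholds are my sketch, not the commenter's), the required PPI is inversely proportional to viewing distance:

```python
import math

def required_ppi(distance_inches, arcmin_per_pixel=1.0):
    """PPI at which one pixel subtends the given visual angle
    (1 arcminute is roughly the limit of 20/20 vision)."""
    pixel_inches = distance_inches * math.tan(math.radians(arcmin_per_pixel / 60))
    return 1 / pixel_inches

print(round(required_ppi(12)))  # ~286 PPI at handheld distance (12")
print(round(required_ppi(24)))  # ~143 PPI at a typical desktop distance (24")
```

By this estimate, a desktop panel at arm's length needs roughly half the pixel density of a handheld screen to look "retina".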

~~~
orangecat
If you can see individual pixels, as you should easily be able to on typical
desktop and laptop displays, then you can use higher resolutions.

------
eslaught
"all the existing HCI research tells us that higher resolution displays are a
deep fundamental improvement in computing"

Citations, please?

~~~
chalst
See jeff's post:

<http://www.codinghorror.com/blog/2006/12/printer-and-screen-resolution.html>
("Printer and Screen Resolution", December 2006)

He recapitulates Tufte's argument that information density is crucial to speed
of finding information and observes that stagnation in the improvement of
screen resolution is holding us back.

------
bitsweet
_Updates. Toolbars. Service Packs. Settings. Anti-virus. Filesystems. Control
panels. All the stuff you hate when your Mom calls you for tech support?_

Sounds more like a Windows problem than a PC problem. Since converting my
family to Macs, I haven't had to handle any of this stuff but when they used
Windows it was a monthly annoyance.

~~~
MarkMc
Let's not go overboard - both Mac and iOS have Updates and Settings and
service packs, it's just that these are simpler to use than on Windows.

Also the first call I got from my Mum after buying her an iPad 2 was to ask
how to increase the size of the Notes font (turns out it couldn't be done).

~~~
Xuzz
(Try Settings -> General -> Accessibility -> Large Text: it seems to work on
my iPad for bumping notes to a larger font size.)

------
asdfpoiu
Open hardware to the rescue of the PC!

I think the (not-too-distant) future of home computing will have a lot in
common with today's PCs. Articles like this one always seem to underestimate
the amount of real creative or productive work people want to do -- something
which is still a challenge on the iPads of today. Have YOU compared typing on
a small screen to a big one?

With custom manufacturing becoming affordable, as traditional computers start
dying, open source hardware will start replacing them. I think we are seeing
the first steps in this direction with the Raspberry Pi, hobbyist open
hardware people, 3D printing (in the future: cheap chip printing?),
CyanogenMod, etc.

~~~
jiggy2011
Interesting; maybe in the future you have 2 options: either a fully closed
ecosystem or a fully open one.

I wonder if you will be able to use the open one to create stuff for the
closed one?

~~~
orangecat
_I wonder if you will be able to use the open one to create stuff for the
closed one?_

More disturbingly, I wonder if the open ones will be allowed to have a
fraction of the functionality. Hollywood would love to only allow Internet
access from locked down devices, and if 98% of the population ends up on
appliances, things like that actually become feasible to enforce.

~~~
jiggy2011
Perhaps though the 2 devices will have massively different use cases.

This is already happening for me to an extent, I run a dual boot of Linux and
Windows 7. Essentially Linux is my "programming workstation" and Windows 7 is
my entertainment.

I would quite happily replace my Windows 7 boot with something more like iOS
since the only applications I really use there are Steam and Netflix.

I'm actually quite glad that Linux does not support most of my games, since
this keeps my productivity high.

The only irritation is when I want to watch something from netflix in the
background whilst working on some code but even that is easily overcome by
running netflix from a PS3 on my TV beside my computer.

Regarding restricting internet access to certain devices only: at a government
level, that would be difficult legislation to pass.

I can see domestic ISPs only supporting specific devices perhaps to keep their
support costs down, but I'm sure there will always be ways around this. I
remember the guy who came around to install my first broadband connection
insisting that I would not be able to access the internet at all from my Linux
laptop until I showed him how I did it.

------
shahar2k
I'd argue that there is one last barrier to these slate based computers
(smartphones, tablets) and that is to take these screens and make them
sunlight readable (reflective), color, same retina resolution, and same
refresh rate.

Once the screen no longer requires a backlight, and can be read any place a
book can, computers will truly be ubiquitous. As for creation tools on these
machines, they are beginning to arrive now, and they will become more and more
mature as time goes on. Hopefully a less controlled ecosystem will win out,
but that much I don't know that I can predict.

------
jroseattle
I usually agree with most everything Jeff writes, but I'm highly skeptical
that what pushed the "post-PC" era over the top was better display support.

I've seen the screens, they look nice and crisp and clear and...I kind of
don't care. Oh, the views on it are great, but it offers me nothing new.

Quite frankly, as product releases go, I question whether Steve would have let
this out the door. It's a nice product, but insanely great? In comparison to
prior developments, I don't think it meets that bar.

------
ridruejo
It is probably not so much the hardware itself as the concept. Everybody can
figure out how to use a tablet, the app store, etc., but not everybody can
figure out how to use Windows or Ubuntu or OS X. We are close to the point at
which everything a 'normal' person (not a developer) may want to do with a
computer, such as listening to music, watching videos, receiving and sending
email, or buying plane tickets, he or she can do with an iPad.

------
gtruilmopl
I've been netbooking it for my chillin' computin' for the past couple years.
Since getting an 800x400 AMOLED screen on a smartphone, I haven't even thought
about getting a new netbook. There are all sorts of useful form factors. The
iPad _IS_ a PC. It just uses a different CPU and proprietary hardware.

Now... a small and light PC that you interact with entirely via a touchscreen,
wireless data, etc...

That's a goddamn miracle.

Bloggers just love to historicize the present.

------
netcan
I think the demand for tablet-like-PCs is potentially huge. If I'm right,
their first couple of years will probably tell us where the frontiers between
PCs and post-PCs are and what the world will look like.

------
pixelcort
I'm curious how that chart would look taking into account US and international
population growth in the last few decades. Perhaps devices per capita is a
better ratio?

------
greggman
Really? Hi-def = no more pc? What does an iPad 3 do that an iPad 2 doesn't?
How did hd res change anything?

The end of the PC era may be near but it's not because of 10inch hd screens.

~~~
rimantas

      > Really? Hi-def = no more pc?
    

No. Post-X does not mean "no more X". It never meant that.

------
cowkingdeluxe
An apple product changed everything? Again? Does this one even connect via USB
for data transfer? What about HDMI?

------
ck2
It's 264 PPI, not DPI.

My monitor is 2048 x 1152 23 inches.

20.05" × 11.28" (50.92cm × 28.64cm) = 102.16 PPI, 0.2486mm dot pitch, 10437
PPI²

At one foot away I still cannot see the individual dots.
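The arithmetic checks out. As a sketch (the helper function is mine), PPI follows from resolution and diagonal size:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch, from pixel resolution and the panel's diagonal."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

print(round(ppi(2048, 1152, 23), 2))  # ~102.16, matching the monitor above
print(round(ppi(2048, 1536, 9.7)))    # ~264, the new iPad's density
```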

There's been a 200 PPI option for over a decade now:

<https://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors>

If the ipad3 has added expense or limited availability to get ppi above 100,
it was a waste.

~~~
sounds
One reason Apple jumped straight to 264 PPI is to exactly double each
dimension. This makes the 2x scaling beautiful.

(I'm repeating hearsay, but it seems intuitively obvious to me - feel free to
correct me.)
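The hearsay matches the public specs: the new iPad doubles the iPad 2's 1024x768 in each dimension, so every legacy point maps cleanly onto a 2x2 block of physical pixels. A toy illustration (the function is mine, not Apple's API):

```python
# Doubling both dimensions preserves the aspect ratio and gives an integer
# scale factor, so old layouts need no reflowing.
old = (1024, 768)                # iPad 1/2 resolution
new = tuple(2 * d for d in old)  # the new iPad
print(new)                       # (2048, 1536)

def logical_to_physical(x, y, scale=2):
    """Top-left physical pixel of the 2x2 block backing logical point (x, y)."""
    return (x * scale, y * scale)

print(logical_to_physical(512, 384))  # (1024, 768): the centre maps to the centre
```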

~~~
ck2
A smartphone with 854x480 resolution on a 3.7" display also has 264 PPI.

(Motorola Droid, Defy, etc.)

Might be related. It's "just" the same kind of matrix scaled to a three times
larger display.

------
justinhj
I think the article is correct except for its timing. The iPad 1 did all of
this, the updated display resolution really doesn't change the device
fundamentally. Sure the fonts and icons look great to Jeff, who just bought
his first iPad. But guess what? They looked great on iPad 1 too.

------
rmk2
What irks me the most about the "post pc era" is its genesis. It is portrayed
as a revolution that will fundamentally change computing.

However, by dubbing the tablet the "post-pc", its very definition still
depends on the pc, being a definition ex negativo. Many of the arguments here
are about whether the tablet can do _everything a pc does_.

The tablet does not revolutionise cultural artifacts or cultural goods, it
merely allows for a consumption that is somewhat different from the
consumption the pc offers. The consumed goods do not change, social media,
media, art and science remain exactly the same, namely consumable goods.

The established dichotomy between the pc and the tablet is a fake one, used to
artificially create demand that can then be met through new supplies.

The tablet does not change the mode of production, it is merely another facet
of consumption, this time even more openly advertised as such by being
advertised a "consumer device".

Calling it innovation is, in my opinion, a misrepresentation, because it
basically reinvents the wheel once more: it undergoes the same syntheses and
evolution as the PC did, just accelerated thanks to already available
knowledge.

The tablet is a clever remix of technologies already available. It combines
different modes of consumption previously divided, but it only changes the
mode of reception, not production.

True innovation on the other hand has to change the modes of production first.
The tablet cannot achieve anything other items cannot _also_ do, and is thus
not innovation or revolution, it is merely evolution, or even just another
playing field added alongside others.

Film, recorded music, books and the radio changed the means of production, as
did the pc. The walkman, the tablet and the tv may have changed the modes of
reception, but nothing new can be created through them that wasn't already
available beforehand and is merely refined, not substantially altered.

The "war" of tablets vs pc is just another sales pitch to make either more
appealing to their individual audiences.

>> edit (for ease of understanding, an example)

As an example that is less abstract, take the "Hipster". Wearing clothing that
is tattered might be "cool" to you and you wear it "ironically, as a
statement". If you are poor however, you might have to wear tattered clothes
because you lack the means to afford something else. Ironically or not, you
are fundamentally still wearing tattered clothes. Irony does not influence the
plane of action, but instead the plane of ideas, or in other words: ideology.
It's the same in the debate of tablet vs pc: Whether you type a blog-post on
your touchscreened tablet or with a keyboard on your pc doesn't change the
underlying action, just the mode of action. Whether you prefer one over the
other is ideology, the basic action remains unchanged.

>> end edit

------
jQueryIsAwesome
Let's play clairvoyant: the post-tablet era is going to be glasses-PCs (circa
2025), and after that, micro holographic touch projectors (circa 2040); and
those are going to be more like terminals with everything in the cloud.

------
gcb
A guy who writes a blog about coding, happy that the new 'era' will not allow
you to do any coding.

~~~
rimantas
Really? Where did you read that? I thought he wrote how people who never wrote
a line of code in their life and never intended to do so now have their needs
fulfilled by this device.

------
twelvechairs
The author conflates 'PC' with 'desktop' and 'general purpose computer', which
I take issue with. Three points:

Firstly: The iPad is a PC, a 'personal computer'

Secondly: The quotes refer to 'desktop computer' era being over (though one
uses the term 'PC' in the context of 'PC wars'), so why not use 'the desktop
computer era is over' as the title? (More accurate, though less shocking.)

Thirdly: Through all this conflation, the author also tries to say that the
'general purpose computer' era is over. Personally, I believe there is a big
future in 'general-purpose computer' tablets/phones/etc. (non 'desktop'), as
well as in desktops and there is not a shred of evidence in the article which
challenges this view...

------
gavingmiller
The lack of a decent point in this article makes me wonder whether Apple paid
Atwood to write this piece.

~~~
jonursenbach
Do you _really_ think that Apple with all of their billions of dollars would
pay someone like Jeff Atwood to write this, or anything? Come on.

~~~
robryan
There would have been a cost involved recently in giving a group of
journalists private briefings on OS X Mountain Lion, which would have been
done to get more press.

