
The ‘Post-PC Era’ Never Really Happened and Likely Won’t - kerng
https://techpinions.com/the-post-pc-era-never-really-happened-and-likely-wont/53610
======
paulgerhardt
I worked at the Post-PC lab in Colorado for a while. The premise, at least
when the term was coined, was never that desktops or the (then-new)
"Pen Computing" interfaces would go _away_. Rather, it was always that
"computerization" would invade all walks of life until suddenly everything
would become a computer.

Much of the research we did was looking at what life would be like when
suddenly everything you interact with would have a programmable interface,
rather than only being able to program things through your PC.

30 years on, much of that has turned out to be more or less true, with cars,
lightbulbs, door locks, and so on. But of course, like all "predictions" of
the future, it was made using the terminology in vogue at the time, and it
sounds funny to us now that it's here and other terms like IoT have eclipsed
it in popularity.

------
Isamu
> Steve Jobs’ vaunted, much quoted ‘Post-PC Era’.

Actually this traces back to an aside by Steve Jobs, where speaking of the
iPad he said "we think these are post-PC devices, we really do".

This was immediately taken as a statement that he felt the PC was going
away, rather than that the devices were not trying to be PCs in a new form
factor, but something post-PC.

People should have gotten a clue when Apple did not stop selling laptops and
desktops.

~~~
nradov
But Apple has reduced their R&D investment in laptops and desktops, at least
on a relative basis. They haven't updated their "pro" desktop line in years. I
suspect they will eventually offer a more powerful line of iOS devices to
address content creation use cases, and gradually phase out the legacy macOS
product line. Unifying their product lines around a single OS would eventually
deliver a lot of savings.

------
Aloha
I've been waiting for a while for someone to call this.

While consumers may now spend most of their device time in front of a phone,
tablet, or phablet, the rest of us still spend many hours a day in front of a
computer.

There is still a sizable functionality gap between a tablet or phone, and a
laptop or desktop computer.

~~~
bryanlarsen
Banks still use mainframes, and mainframes still have capabilities that
laptops generally don't have. Neither argument means we're not in a post-
mainframe era.

Today the majority of computer and internet use happens on a phone. But it's
still a small majority, so we can't yet call it a post-PC era. If/when that
turns into a substantial majority we'll be in the post-PC era, even if you and
I still primarily use PCs.

~~~
hyperman1
Interesting: can you provide some examples of mainframe capabilities not
available on laptops, or on x86 machines in general?

~~~
larrik
Good question. I used to sell AS/400s, which aren't quite mainframes, but fit
the typical picture in your head (green screens, dot-matrix printers, high
price tags, and all that).

1) Stability.

That stuff changes VERY rarely, and when it does it tends to be very backwards
compatible. We had software binaries compiled in the '80s that were still
fully operational.

2) Reliability.

You don't reboot the server. You don't wipe it and reinstall it. You don't
even upgrade it or replace it that often.

You can do things like add or remove drives from your RAID array while the
machine is still at full utilization.

Heck, an IBM AS/400 will detect the error, order the part, and have an IBM
guy there in the morning to install it. You just have to let him in the door.

In fact, you can hot-swap things like the RAID controller card itself, which
is WAY not something you can do on x86.

They last forever, too. It wasn't uncommon for us to have machines in daily
use for 10+ years without a single bit of maintenance, or an IT guy even
looking in their direction. For a business-critical machine, mind you.

3) Single vendor

Ever buy a Dell or HP server, hit an issue, and spend two weeks trying to get
HP/Dell vs. Microsoft vs. some other company to admit fault and fix the damn
thing, then find the answer online yourself because that's nuts? Well, IBM is
responsible for both hardware and software on these machines (besides what
software you put on them), so if something isn't working, you call them and
it's _their_ damn problem to get it sorted. Kind of like Apple, except
they'll send as many people as necessary for as long as necessary to sort it
out for you.

4) Entrenchment

It works, why bother changing it? It doesn't even slow down as it ages,
either, since you aren't upgrading a GUI OS like Windows or whatever.

~~~
starsinspace
Now this sounds fascinating, the stability and reliability, but it leaves me
wondering how such decent systems can come from the same company that makes
the horror that is IBM Notes...

~~~
dingaling
IBM was more like a federation than a single coherent entity. System/38 and
AS/400 evolved in Rochester MN, mainframes in White Plains NY, and PC software
all over the place. None of the divisions had much to do with each other.

------
buboard
Except it has happened. It used to be that, for 100% of computer-using
people, switching from entertainment (playing games) to something creative on
the computer (making games) was a download away. Nowadays a minuscule number
of people use a computer in "work mode" (i.e. not a laptop).

~~~
fipple
I don’t know about making games but music making software on iOS is a hell of
a lot cheaper and easier to get started with than on PC.

~~~
buboard
OK, let's except music; touch interfaces naturally lend themselves to it.
Then again, I'm not really sure you can have a finished, mastered song using
an iOS app.

~~~
mikestew
Finished, mastered songs have been done on much less than the iOS version of
GarageBand, by people you've probably heard of, too. Twenty or thirty years
ago I might not have killed someone for a copy of iOS GarageBand, but I would
have been willing to seriously maim.

~~~
pq0ak2nnd
I find it interesting to consider the implications of the quality that comes
out of mastering on iOS vs. a real studio (where you have a trained engineer
and dedicated equipment): the former is of lower quality, but it doesn't
really matter because we've become a culture of "good enough" that settles
for sub-par achievements. I would say that falling standards have made content
creation more prolific, to the disappointment of those who care (and not just
perfectionists). A culture raised on overcompressed and pitch-fixed
productions just won't care about dynamics and good mixing. To quote a friend
discussing food: "If all you've ever had is Velveeta..."

~~~
mikestew
_but it doesn't really matter because we've become a culture of "good
enough" that settles for sub-par achievements_

Congratulations, that strikes me as one of the more elitist things I've read
in a while. We are not a society of "good enough"; we are a society that
decided we don't need to go to the high priests in order to cut a demo.
Because what you might call "quality" others might call "over-produced".
Because some people would rather produce quality music than twiddle knobs.
There are all kinds of reasons that GarageBand is adequate to replace a room
full of equipment, if for no other reason than that every kid with an iPhone
who wants to be a rock star can pretend to put an album together (which I'm
sure will be of basement quality).

 _A culture raised on overcompressed and pitch-fixed productions just won't
care about dynamics and good mixing._

And how are they supposed to learn that for less than $200/hour? If only there
were free software...

(And don't for a minute think I'm denigrating the importance of quality studio
work. It's just not always necessary.)

------
travbrack
I'm counting the days until I can just plug my smartphone into a KVM dock and
have it be as useful as a full-blown desktop.

~~~
sevensor
That was the whole idea with the Ubuntu phone -- convergence. I don't know
that it will ever come to pass. No matter how powerful the thing in your
pocket is, I have a box next to my desk that can hold 50 of them. And we're
always going to find ways to use more power.

~~~
394549
> No matter how powerful the thing in your pocket is, I have a box next to my
> desk that can hold 50 of them. And we're always going to find ways to use
> more power.

Yeah, but convergence could mean transferring a memory image to the dock's
larger RAM, transferring control to a beefier processor in the dock, and then
halting the phone's CPU. In a docked state, the only phone hardware in active
use would be its storage.

Convergence should be more about seamlessly transferring state from one device
to another than using one device with different peripherals for all things.

~~~
jjeaff
I think the idea that Kurzweil and others have put forth is that your device
state will be stored in the cloud and the devices themselves will be so
unimportant as to just be lying around or available wherever you go. You
could just walk up to any terminal or grab a phone from a bin at the grocery
store and your biometrics would instantly identify you and you would pick up
where you left off.

------
scott_s
I find the author's claims odd. Yes, people who used computers for work
before everyone had a smart phone and tablets were plentiful still use
computers at
work. That hasn't changed. But what people use at _home_ for things that are
not _work_ has most definitely changed. I know people in their 20s and early
30s who do not have a computer with a keyboard - and if they do, I've never
seen them use it at home. In 2008, I would expect most people in their 20s and
early 30s to have a laptop at home, that they used for basic internet stuff. I
don't expect that anymore.

The author seems to think only about professional activities. But from what I
have seen, home consumption and day-to-day activities have moved off of
"computers" and on to devices like smart phones and tablets. This is true for
me personally: I have a work laptop, but I don't have a functioning one at
home anymore. At home, I only use my tablet and smart phone. Years ago, my
personal laptop broke, and I realized there was no point in replacing it.

~~~
ascagnel_
I think this is also why tablet sales have slowed down in recent years -- once
you get to about 2013/2014, the benefits gained from buying a new tablet
basically leveled off for most users, when the average use case is email, web
browsing, watching video, and maybe some light gaming. The same thing happened
with cheap laptops in the early 2010s.

------
ksec
From whose perspective? From Steve Jobs' and the consumer's? Of course we are
in the post-PC era. There are roughly 300M+ iPads in active use, more than
double the number of Mac users, reached in a much shorter time frame. You
want everyone to own a PC? Lots more people are now accessing the internet
via mobile.

The post-PC era Steve described never meant it would completely replace the
PC. He said it himself, in I think the D8 interview: PCs are like trucks;
they're useful, they won't be gone, but we just need far fewer of them.

And that "steady" PC shipment number mentioned? You should thank PC gaming
for that, and especially China. At peak there were over 100M Chinese players
playing PUBG at the same time.

The problem with tablets right now is that Apple has done little to nothing
to make using the iPad with a keyboard better. The UX for lots of PC tasks,
such as answering email and working in Excel, is suboptimal, especially for
those whose jobs consist mostly of those tasks.

------
Nasrudith
In all seriousness, it depends on how you define post-PC. Some people already
use only a phone or tablet as their computing device. Shockingly, we aren't
even in the post-mainframe era, as some still use mainframes instead of
server racks. And those are actual machines, not virtualizations of them,
running proven but basically unmaintainable software.

------
Spooky23
The author is delusional. Post-PC doesn't mean "buy iPads". The nail in the
coffin of PCs was placed with the release of EC2 in 2006 and hammered in by
the 2007 iPhone release.

PCs today are a dead-end legacy technology. All of the PC technologies on the
edge (things like enterprise laptops, POS terminals, etc) are being replaced
by mobile or virtual devices. Companies are getting sick of the IT business
and are increasingly allowing BYO devices. The legacy Windows stuff is going
to get delivered with remote apps, virtual desktop, etc. It's already
happening.

How do you justify spending $750-1,000 every 36 months on device replacement
when you can spend $20/month for Amazon WorkSpaces and buy a cheap thin
client every 8 years, or have employees BYO?
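The arithmetic above works out roughly as follows. This is a back-of-the-envelope sketch, not a real TCO model: the $200 thin-client price and the amortization periods are assumptions, and real WorkSpaces pricing varies by bundle.

```python
# Rough monthly cost comparison for the figures in the comment above.
# Assumption: a $200 thin client amortized over the stated 8 years.

pc_monthly = 1000 / 36                # $1,000 PC replaced every 36 months (upper bound)
workspaces_monthly = 20               # Amazon WorkSpaces subscription, per the comment
thin_client_monthly = 200 / (8 * 12)  # assumed $200 thin client over 96 months

print(f"PC refresh cycle:  ${pc_monthly:.2f}/month")                        # $27.78
print(f"VDI + thin client: ${workspaces_monthly + thin_client_monthly:.2f}/month")  # $22.08
```

Even at the $750 low end (about $20.83/month), the PC only roughly breaks even on hardware alone, before counting the IT labor the comment alludes to.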

If we weren't in the post-PC era, Microsoft wouldn't be porting their entire
stack of Office products into web delivery platforms. When was the last time
you landed a new fat-client application at work?

~~~
whatshisface
Somewhere, underneath all the spreadsheets for moving money around, and
underneath the middle managers writing memos in Word, are the actual workers
that read the memos and do the work that goes into the spreadsheets. The
workstation market was the first big application of one-user computers, and
will probably be their last foothold.

~~~
jarfil
Reading memos is best done on a tablet or smartphone. Doing the work depends
on the kind of work: only content creators and programmers need a full PC;
everyone else, working in the non-digital physical world, doesn't.

~~~
AnimalMuppet
I see no aspect of reading memos that is better on a tablet or smartphone than
it is on a PC. However, I do see some ways in which it is inferior on a
smartphone - screen size in particular. So "best done on a tablet or
smartphone" seems like a tremendous overstatement at best, and flat-out wrong
at worst.

------
cicero
I hope this means that Apple will give the Macintosh line more attention.

