
CES2011:  The end of the PC era - roadnottaken
http://www.asymco.com/2011/01/06/this-is-the-most-exciting-ces-ever/
======
kenjackson
Intel's arch-enemies have been AMD and IBM in the past -- and MS has, at
times, backed architectures from both of those companies over Intel.

Remember that x86-64 came from AMD, and Intel supported it only after it was
clear that it, not Itanium, was going to be the 64-bit architecture. And Intel
was pretty POed that MS went with IBM over Intel for the Xbox 360.

My point: these things have happened in the past and will continue to happen.

The end of the PC era began with the emergence of the web. Everything else
we're seeing is simply a side effect of the web. Ironically, MS saw this
coming 15 years ago, and their attempts at slowing it may have inadvertently
accelerated it.

------
code_duck
I think what I'll be using in 5 years is a 4 GHz device the size of my phone,
which can be connected to external displays and input devices. I can already
use Bluetooth keyboards and mice with my phone. Once it can attach to a
1920x1200 monitor I won't need anything else.

~~~
bryanlarsen
It might take less than 5 years. Both Motorola and Lenovo have introduced
Android phones with a "laptop dock". Google for the "motorola atrix dock".
It'll be a little while before it's useful, but it doesn't look like it'll
take 5 years.

~~~
code_duck
It might be that long before we get what I really want, a monitor that rolls
up into a little scroll I can stick in my pocket!

------
potency
I disagree. The PC era is evolving, not ending. How many people still have and
use a traditional desktop or laptop computer at home? Microsoft is filling a
growing demand for an OS on smaller form factor devices by supporting ARM
processors with the next release of Windows, but the traditional PC market is
still alive and kicking; it's just not "cool" anymore. It's seen as more of a
utilitarian tool than a novelty at this point, whereas smartphones and
iTablets currently retain the novelty factor, thus their pervasiveness at CES.

~~~
pohl
I hope you don't mind. I just thought I'd try this on for size...

 _I disagree. The Mainframe era is evolving, not ending. How many people still
have and use a mainframe when they interact with their bank? PCs are filling a
growing demand for an OS on a smaller form factor, but the traditional
mainframe market is still alive and kicking; it's just not "cool" anymore.
It's seen as more of a utilitarian tool than a novelty at this point, whereas
PCs currently retain the novelty factor, thus their pervasiveness at
BigIronExpo._

Yup. I actually recall people making this argument back in those days. And
they were right in a sense: big iron still exists in its niche, after all.
Regardless, the PC revolution was still a massive shift -- large enough to
justify describing the rise of the PC as heralding the demise of big iron.

~~~
nihilocrat
When I saw the iPad launch, I was very concerned that we are moving towards a
market of consumer-only devices: devices which you can't use creatively to
make content, program applications, etc. Things like programming will be
considered a "specialist" activity, and new generations of nerds will have
nothing to cut their teeth on because the only PCs left are development
workstations in corporate offices. Consumer computing will turn into pressing
buttons and seeing magic happen, with no ability for curious people to figure
out what's going on behind the scenes or modify it. A good litmus test is:
can you use applications on the device to make entirely new applications for
that device? You can do this on a TI-86 calculator, but not on an iPad (please
correct me if I'm wrong).

I'm not entirely convinced things will pan out like this, but the thought
still occurred to me.

~~~
Splines
We've been heading in that direction for some time, but it's only because of
the increasing complexity of our devices.

Televisions, radios, cars, and home appliances are all devices that are harder
to tinker with today than when they were first invented. Computer hardware,
for the most part (outside of hobby kits), is all magic black boxes plugged
into each other. Something wrong with your video card? There's not much you
can do about it.

Luckily, the hobby/amateur market in computer hardware/software (and other
hobbies, like cars) is still alive and kicking. IMO though, corporations seem
to like taking steps to make that sort of work harder (e.g., using encryption
to ensure that only "blessed" hardware/software can be used).

~~~
jokermatt999
_IMO though, corporations seem to like taking steps to make that sort of work
harder (e.g., using encryption to ensure that only "blessed" hardware/software
can be used)._

I've given some thought to this, and I don't think that it can mostly be
blamed on the corporations. Sure, they're part of it, but I think the ever
increasing complexity of technology is mostly what causes things to be harder
to tinker on. There's far more to understand than in older technology, and it
makes tinkering/hacking on things far more difficult.

It's kind of like classic cars vs cars today. Older cars were much more
tinkerable/hackable/etc, but newer ones have certainly benefited from the
increased complexity in the form of better gas mileage, safety, etc at the
cost of hackability.

~~~
nitrogen
I don't think complexity is nearly the enemy of hackability that manufacturer
lockdown is. Modern cars can still be hacked -- everything from OBD-2
interfaces to modified firmware for engine control units. FPGAs provide a way
to tinker with hardware concepts.

The demise of hackability is and will be due to DRM and the hardware
equivalents (like impossible-to-find proprietary screwdrivers).
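
Incidentally, the OBD-2 (OBD-II) interface is just a query/response protocol,
so the decode step is easy to sketch. Here's a minimal, hypothetical Python
example of decoding one engine-RPM reply per the standard SAE J1979 PID table;
the serial I/O to a real ELM327-style adapter is omitted, and the sample reply
string is made up for illustration:

```python
# Sketch: decoding an OBD-II engine-RPM reply as text, the way an
# ELM327-style serial adapter returns it. The request "010C" asks mode 01
# (current data) for PID 0C (engine RPM); the reply echoes "41 0C" followed
# by two data bytes A and B, and RPM = ((A * 256) + B) / 4 per SAE J1979.

def parse_rpm(reply: str) -> float:
    """Decode an OBD-II mode-01 PID-0C reply like '41 0C 1A F8' into RPM."""
    parts = reply.split()
    if parts[:2] != ["41", "0C"]:
        raise ValueError("not a mode-01 PID-0C reply: " + reply)
    a, b = int(parts[2], 16), int(parts[3], 16)
    return (a * 256 + b) / 4.0

# Example: bytes 1A F8 -> A=26, B=248 -> (26*256 + 248)/4 = 1726.0 RPM
print(parse_rpm("41 0C 1A F8"))
```

Modified ECU firmware is a much bigger undertaking, but reading and logging
live data like this is within any hobbyist's reach.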

------
roadnottaken
The PC era won't end as long as people need to use keyboards to input text. A
lot of home/casual computing might evolve away from PCs, but they'll always be
on your desk at work and I'm guessing most people will keep at least one
traditional desktop/laptop at home for decades to come.

~~~
cryptoz
> I'm guessing most people will keep at least one traditional desktop/laptop
> at home for decades to come.

I think you're being _very_ shortsighted. "Most people" haven't even had a
laptop for a _single_ decade yet. Most people have had one for 1-10 years, I'd
say. Also, "most people" haven't had desktops for more than a decade and a
half.

What on Earth makes you think these form factors / devices will still exist in
most people's homes for _decades_?! To me, "decades" implies at least 20
years....

So do you think that in the year 2031 we'll still use desktop computers?
That's crazy. No, we will have moved on.

Keyboards will be phased out for _most users_ by 2031. I imagine devs will
still use them, and they'll be common in specialty scenarios, but they won't
be for "most users".

> they'll always be on your desk at work

Also shortsighted. Workstations will evolve too, even if this is slower.

~~~
roadnottaken
I'll bet you $1000 that _most people_ will still regularly use keyboards in
2031. Isn't there a website somewhere where we can do this?

~~~
cryptoz
Haha okay, I will probably take that bet! But we need to decide on more
specific definitions, as hinted by pingswept. What defines a keyboard, and
what defines "most people"?

Does "keyboard" include both physical and virtual keyboards? Most people on
planet Earth, or most people in the USA? Or by income?

If we can settle these specifics, I think we should indeed register at
longbets.

~~~
roadnottaken
Maybe we should just do a HN Poll. Note: I'm also betting on the fact that
I'll have an extra $1000 by 2031. :)

~~~
roadnottaken
HN Poll: <http://news.ycombinator.com/item?id=2076992>

------
blinkingled
When is the end of jumping the gun going to begin? :p

Some people are just too eager to see the PC die. Rest assured that it will
some day, but you need to be a bit more patient. Windows on ARM or ARM-based
tablet announcements do not make the PC dead -- not so fast.

~~~
irons
The PC doesn't have to die for the PC era to gracefully decline, any more than
Henry Ford had to personally shoot every horse.

~~~
threepointone
[OT] That's a brilliant way of putting it, and had me in chuckles. Thanks for
driving the point home :)

~~~
iamdave
Henry Ford... "Driving" the point home.

Oh you guys..

------
yread
Who were the Windows-exclusive OEM customers? I thought all of them tried
releasing Linux netbooks; they just didn't sell.

~~~
kprobst
I have an HP mini110 that runs Windows 7 32-bit. I got it as a gift this
Christmas and I remember thinking "oh boy, let's see how this thing runs". But
it does just fine. I mean it's no speed demon, but for the limited purpose of
the form factor, it's perfectly fine. I have to admit I was surprised that it
works so well. I kinda expected Windows to be a hog on that hardware (1GB RAM
fixed, with no upgrade path), but it really isn't. Even the 5400RPM HDD seems
to chug along well, contrary to my prior experiences with laptops running
slower hard drives. I would have preferred an SSD though. And the battery life
is insanely good.

For ~$350 or so, it's a really good device.

------
InclinedPlane
I don't know why commentary has taken this bizarre turn lately of expressing
predictions 5, 10, or even 15 years in the future in the present tense.

Sure, the mobile OS market has crazy momentum right now and may well end up
obsoleting the entire PC industry. However, that hasn't happened yet. The PC
industry is still bigger, people developing PC apps are still making much more
money.

This is not the end of the PC era, at best it's the prelude to the beginning
of the end.

------
kayoone
Why Intel? Some years ago AMD was ahead of them in terms of x86 CPUs... even
more years ago there were companies like Cyrix or Via, all building x86 chips
that ran Windows very well.

------
jister
One thing's for sure...it's not going to happen anytime soon.

------
jasonwatkinspdx
The irony is that we're now finally giving up x86 years after the point where
moving to a RISC architecture would have offered the biggest jump in
price/performance (the P5 era).

------
zmanian
What's the era after the PC called? Is this the mobile computing era? Or the
ubicomp era? I'm inclined towards the latter.

~~~
junkbit
Augmented Reality

~~~
nooneelse
Not yet. One important thing about real reality is its persistence. Looking
through a pocketable screen can give you a window into augmented reality, but
there is a lack of persistence to that user interface. AFAIK the only thing
that will really bring about an era of augmented reality is good HUDs that
carry little to no social stigma.

------
ConceptDog
One thing is abundantly clear:

If you're a developer or a designer and you aren't learning at least Android
or preferably all three mobile platforms, you're looking at having a huge hole
in your skill set by the end of the year.

~~~
uytrtyuikuyt
Really? All my enterprise DB systems are going to be obsoleted by a cell
phone that can do Facebook?

I remember 10 years ago, when the web was going to make big iron irrelevant,
and 20 years ago, when PCs were going to replace mainframes.

------
ergo98
Intel exclusivity? What?

x86 is an architecture, like ARM, that has seen many contenders come and go
over the years. Intel's dominance is because of engineering excellence, albeit
coupled with some questionable business practices. Show me an ARM device that
compares with a Core i7-950. Show me one that compares with one with 3 of the
4 cores turned off.

Microsoft, for that matter, has had versions of Windows for alternate
platforms going back well over a decade. Those platforms got abandoned not out
of some strategic affinity for x86 (much less for Intel), but simply because
they didn't offer enough of an advantage to be worth the bother: they were
pursued during the "the future is RISC" era, in preparation for the purported
demise of x86.

As to manufacturers that were "Microsoft exclusive", show me even _one_. Such
a thing hasn't existed in years.

Everything I have ever read on Asymco is full of glossy big conclusions, yet
absolutely falls apart under any scrutiny at all. It is garbage analysis.

I _love_ the smartphone era. I love the fact that embedded chips in everything
from your HVAC control to your PVR to inside televisions and cars, etc, are
all massively benefiting from this explosion of reasonable performance, low
power devices.

But let's get real. The _only_ reason x86 is behind right now is that ARM had
been working on low-power processors for years before low power became the hot
area. Intel isn't sitting on their hands, though, and I have no doubt they'll
have some killer solutions in the very near future. People are so quick to
jump to conclusions, though, carried along by simplistic analysis of the sort
that Asymco performs.

~~~
wtallis
_"But let's get real. The only reason x86 is behind right now is that ARM had
been working on low-power processors for years before low power became the hot
area. Intel isn't sitting on their hands, though, and I have no doubt they'll
have some killer solutions in the very near future."_

This reads like something from 2007, back when the Atom was almost ready.

When Atom arrived, it simply drew too much power to compete with ARM SoCs,
and in the intervening years the ARM chips have been improving more quickly
than the Intel chips.

It's been more than 2.5 years since the Atom shipped. When are we going to
start seeing the results of Intel's earnest effort?

~~~
ergo98
Atom was created for the netbook market, where the power-consumption target
was significantly higher than it was for smartphones and, later, tablets.

How many netbooks shipped with ARM? Atom did what was necessary to win that
race, even if it wasn't prepared for the subsequent explosion of very low
power devices. The Atom chips themselves are actually remarkably efficient --
with a TDP down to 0.65W for some models, well into ARM territory -- but Intel
never bothered to make efficient supporting circuitry, while failing to fully
incorporate it into a SoC.

ARM is improving dramatically because it became a critical field with some
low-hanging advances that were possible. The pace of innovation is impressive,
but it has slowed dramatically in the past year -- recall that the Tegra 2 was
demoed over a year ago, and today the high point of the ARM ecosystem is
_still_ the Tegra 2. People today are doing the sort of naive extrapolation
that we've seen over and over again.

~~~
ippisl
Intel did bother to make Atom SoCs.

They enabled other companies to do an Atom SoC using a third-party fab; nobody
took them up on the offer.

At the same time, they did their own SoCs -- for example the CE4100, which is
especially fitted for digital TVs and includes graphics (PowerVR) + memory
controller + DSP + HD decode on chip.

It took the place of the Tegra in the Boxee Box, because the Tegra wasn't able
to decode High Profile H.264 HD streams. Pretty disappointing for the Tegra
after all the hype.

But still, the power consumption is high: 11.2W for the Boxee Box (it's common
for streamers to have half that). But Intel claims it hadn't done sleep mode
on the CE4100.

Edit: Sony will also use the CE4xxx (Sodaville) for TVs.

