
PC obsolescence is obsolete - evo_9
http://www.extremetech.com/computing/134760-pc-obsolescence-is-obsolete
======
MattRogish
I think this pretty well reflects Microsoft's long-term consumer risk, too.

More than half of MSFT's revenue
(http://www.tannerhelland.com/4273/microsoft-money-updated-2012/) comes from
Windows and Office (and over half the profit, Windows:
http://news.yahoo.com/microsoft-defer-revenue-windows-upgrade-offer-140940839--finance.html).

Back in the 90s and early 00s, the rate of change in the PC market was
amazing. The hardware you could buy in 1998 basically didn't exist in 1996, so
there always was something new and exciting to spur new hardware sales.

New hardware sales represent the bulk of Microsoft's Windows licensing (the
so-called "Microsoft Tax") revenue and so a large part of MSFT's profits. All
MS had to do was be the default OS and their profit would grow without lifting
a finger (obviously that's greatly simplified, MS had to do a lot of things).

Now that this hardware revolution is no longer taking place, MS' Windows
licensing revenue (and thus the huge profits) is threatened. Competition in the
PC market has driven hardware profits to zero. MS and their clone makers have
painted themselves into a corner.

Contrast this with Apple's strategy - produce desirable hardware that people
buy for non-technical reasons (design, fashion, software, etc.) instead of raw
figures (CPU, Memory, etc.). Folks will upgrade because they _want_ to, not
because they _need_ to. That's a lot harder to pull off, but Apple seems to be
doing a great job of it.

Ultimately, I think Apple's strategy is going to make a lot more sense in the
long run. Drive the cost of operating system upgrades to zero (I'd be
shocked if the next OS X upgrade costs anything; I was quite surprised that
Mountain Lion was {edit: _not_ } a free upgrade), limit the age of devices that
can run it, and create products that people want.

~~~
Zirro
"I was quite surprised that Mountain Lion was a free upgrade"

My knowledge of American anti-competition laws is lacking, but I remember back
when they were able to release iOS upgrades for free for the iPhone, but not
the iPod Touch. At the time, there was supposedly something on the legal side
of things that they were afraid of running afoul of by releasing free iPod
Touch updates.

I suppose, if they were to release new versions of OS X for free, they may
risk being sued for trying to push competitors out of the market by giving
away a product that others are trying to sell.

No, it doesn't seem right to me, but sometimes laws are outdated or simply
abused.

~~~
greedo
It was due to the way Apple had been accounting for OS upgrades, not due to
competitive/monopoly concerns.

~~~
WildUtah
Specifically it's because Apple likes to take the Research and Experimentation
Tax Credit for operating system development. The R&E credit requires that you
work on a new product and not on simple refinement of an existing product and
its features. There are a lot of subjective factors involved in qualifying and
Apple felt more secure by requiring some money to upgrade. The big advantage
was not the income but the big fat check from the federal government at the
end of the year.

The R&E credit is a pork barrel scam for big companies with clever accounting
departments, of course. But if it's there, why not take it?

~~~
MattRogish
According to this article
(http://www.nytimes.com/2012/04/29/business/apples-tax-strategy-aims-at-low-tax-states-and-nations.html?pagewanted=all):

"In 1996, 1999 and 2000, for instance, the California Legislature increased
the state’s research and development tax credit, permitting hundreds of
companies, including Apple, to avoid billions in state taxes, according to
legislative analysts. Apple has reported tax savings of $412 million from
research and development credits of all sorts since 1996."

$412M out of the hundreds of billions Apple has earned since 1996 is pennies.
That doesn't seem like much... Are there more?

------
ZoFreX
> Now, compare an early Core i7 system (Nehalem) against what’s shipping today
> (Ivy Bridge). Clock speeds are up a bit, and there are cheaper/lower-end
> options available, but the Core i7-920 that launched at 2.67GHz with four
> cores and eight threads is still absolutely capable of powering the latest
> games and applications.

A big, big factor in the slowdown of obsolescence is that the current
generation of consoles has stretched out far longer than is normal. Because
the consoles have fairly PC-esque capabilities (arguably more parity than
there has ever been before), many games released today are released on both
consoles and PC, often from a single code base, or at the very least a single
asset base.

Once the next generation of consoles is out, we will see games taking
advantage of that extra power, and in turn PC requirements for new games will
start to climb again.

~~~
quaunaut
This can't be emphasized enough: the _only_ reason nothing has moved forward
is that gaming hasn't paved the way. Most major hardware advancements we've
had on the desktop have been the result of devices added for higher
requirements in games. And right now, with the death of the B-tier game
studio, there's no one pushing PC-only experiences who really has a reason to
ignore the consoles. That's it.

Now, whether the next console tier will actually allow that or not? I don't
know. Investors seem to be catching on to just how low-margin the games
business is, and as a result seem to think social gaming is the way to go (if
Zynga's large and horrible decline hasn't convinced you that they are wrong
yet, give it time. It will.), which means less marketing dollars justifying
that new console, and thus less reason to push the desktop.

But we haven't stopped because there's no more need. Just, the market is in a
bit of a wonky place right now. Console manufacturers would love to have a new
one to sell, but developers are pushing back because to actually develop at
the kind of fidelity required to go past where we are now, your _average_
game's budget will quickly double or more, with few returns in terms of
profits- those margins just keep getting thinner.

~~~
scott_s
You're putting the cart before the horse. The reason that performance has not
improved in the desktop _and_ gaming are due to hitting fundamental
limitations in the design of hardware.

~~~
transpostmeta
The PlayStation 3 has 256 MB of RAM, not exactly on par with a current PC
gaming system, which can easily have 16 GB.

I would argue that the rising production costs to produce games that actually
make use of ultra-advanced hardware are more to blame for the slowing of the
graphics race.

You can always do things faster by working in parallel. It just gets much
harder to program.
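As a rough sketch of that trade-off, compare a serial sum-of-squares with a
chunked parallel version (a toy Python example; the function names and the
chunking scheme here are made up for illustration):

```python
from concurrent.futures import ProcessPoolExecutor

def work(chunk):
    # CPU-bound work on one slice of the data
    return sum(x * x for x in chunk)

def serial_sum_squares(data):
    # the straightforward version: one line, one core
    return sum(x * x for x in data)

def parallel_sum_squares(data, workers=4):
    # the parallel version: split, farm out, merge.
    # More code and more failure modes, for the same answer.
    data = list(data)
    step = (len(data) + workers - 1) // workers
    chunks = [data[i:i + step] for i in range(0, len(data), step)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(work, chunks))

if __name__ == "__main__":
    assert serial_sum_squares(range(1000)) == parallel_sum_squares(range(1000))
```

The parallel version only pays off once the per-chunk work dwarfs the cost of
spawning processes and shipping the data around, which is exactly the
"harder to program" part.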

~~~
scott_s
Gaming PCs also have a general purpose OS on them. It's not an apples-to-
apples comparison. One is a general purpose computer, the other is a device
designed for a single purpose.

But that's beside the point: the games industry drives the graphics card
business, but that is only one part of the consumer computer business.
General purpose processors are a larger part, and processor performance
improvement is not the exponential curve it was a decade ago.

------
monkeyfacebag
No doubt it has been replaced by _device_ obsolescence, where the screen,
camera, wireless radio, etc. are bundled together and an improvement in any one
component compels the purchase of a whole new unit.

~~~
Silhouette
Sad, but true. Faced with the reality that well-designed, flexible, modular
systems have simply become too good, manufacturers are now effectively
building in forced obsolescence, either tying components together (the Apple
way) or arbitrarily declaring that only certain combinations are permitted
(which also works with consumables, like the way my HP colour printer refused
to print a black and white document the other day until I replaced an empty
colour toner cartridge).

Software seems to be going the same way. SaaS is a fairly transparent rip-off
in many cases. Certain big-ticket professional software vendors restrict sales
to a network of resellers they can control, who dutifully pull last year's
version as soon as this year's is available, so if you want to buy an extra
copy to go with the 10 you already have, you can't (unless you upgrade your
entire organisation to maintain compatibility).

Bridging the gap nicely are drivers. Here we see a range of dubious
techniques. One popular one seems to be releasing equipment with drivers for
certain specific operating systems that are available at the time, but then
conveniently not releasing any drivers to support new operating systems
released within what would otherwise be the normal lifetime of the device,
thus artificially shortening its life, often by many years. Another good trick
(I'm looking at you, nVidia) is to take essentially the same hardware, but sell
one version with nerfed drivers and another premium (sorry, "workstation")
version with "certified" drivers that actually use the hardware to its full
capabilities.

I'm a little surprised that we haven't yet seen more people pushing back
against these obviously customer-hostile trends, because a lot of people are
spending a lot more money and suffering a lot more hassle than they should
have to. Certainly some rip-off merchants do come unstuck; I've heard about
what would have been extremely lucrative sales of very expensive specialist
equipment that got flushed when the prospective customer discovered some form
of artificial nerfing in software and took their cash to someone else who
didn't do that. But it seems that everyday hardware and software are in one
big race to the bottom.

Even premium products, like those high-end graphics cards and professional
software installations with their multi-thousand price tags per unit, seem to
be tolerated because _almost everyone_ is doing it now. Perhaps this is partly
because the more honest competition mostly comes from smaller organisations in
the hardware field or in software's case via independent software houses and
open source communities, and these kinds of suppliers are (rightly or wrongly,
probably a bit of both) regarded as not being up to the job of supplying
Serious Business Customers(TM).

What I haven't yet figured out is why this is still happening. Do you know
anyone who, in either a personal or a professional capacity, actually thinks
any of this stuff is done to help them, or that spending money on these things
is a good deal? In a logical market, we would expect competition to spring up
and exploit that vulnerability, promoting brands based on honest dealings and
good quality, almost certainly charging a higher price for it, but with a
greater perceived value that justifies that price. And yet, this doesn't seem
to be happening, which suggests that many of the markets in technology
industries are not effectively competitive, or some of the big name players
are in practice dominant or outright monopolies even if they're not formally
recognised as such.

~~~
cynicalkane
The computer market is not a conspiracy of manufacturers that get together and
decide to screw customers. And absent a conspiracy, there's no profit in that.
Actually, I'd argue that the computer world has been increasingly tending
toward openness, not away from it; but consumer devices are a special use case
where almost all of computer-buying humanity simply doesn't care about the
things you care about.

Anyway, I'd like to see a "modular, flexible" hardware device that meets the
specs of an iPhone or MacBook Air. As for printers and gfx drivers, those have
sucked since the beginning of time.

~~~
wes-exp
"The computer market is not a conspiracy of manufacturers that get together
and decide to screw customers."

I agree, but for the record, there have been a variety of customer-screwing
conspiracies within the computer market. If memory serves correctly, price-
fixing arrangements have occurred in at least RAM and LCDs. Not to mention
various anti-trust issues.

~~~
talmand
If my memory serves, there have been quite a few of them over the last couple
of decades; people just tend to forget them over time.

Plus, a few lawsuits were avoided or defeated thanks to the generosity of the
US federal government making laws that make it more difficult to be a
consumer; the DMCA comes to mind. Laws pushed by the very companies that are
not in a conspiracy but are willing to work together for their own benefit
when needed. You want a third-party ink cartridge? Screw you, because the law
says you can't and it's even illegal for you to try.

Nope, nothing to see here.

------
donpdonp
Core i7? PC hardware hit a usability plateau with the Core 2 Duo. For modest
tasks, my 4.5-year-old ThinkPad and home desktop are both Core 2 Duo machines
and handle most tasks fabulously. I do look forward to an upgrade but I have
thought the same thing that this article states - that PCs are lasting much
much longer than they used to.

What is more interesting is smaller hardware: Mini-ITX systems, the
Raspberry Pi, and Atmel/Arduino systems. Writing C code for a 16KB Atmel is a
great way to appreciate the hardware in an 'older' PC.

~~~
freehunter
It is funny that hardware has taken a turn towards slower/lower-end. Appliance
computing, I guess. There was a mad dash to make everything as fast as
possible, then when it was faster than really required, attention shifted to
making computers that are specifically designed for various tasks. Why have
one computer that's not really good at anything (but does it fast enough to
not care) when you can have a half dozen tiny and power-efficient computers
that are great at their job (so good that you don't notice how slow they
really are)?

~~~
cupwithyourname
Maybe I live in a different universe, but I feel like computers are so slow
these days. How is everyone claiming that they are sufficiently fast?

Starting up Outlook takes 10 full seconds (just the splash screen consumes
half of that time). When I open up a web browser or click a link in an email,
it takes about 5 seconds to fully load. Even distilling a file to pdf or
loading Facebook takes almost 5 seconds.

I don't think it's an issue of my computer being particularly slow. Everyone
at work with their quad-core processors and 6GB of RAM seems to just sit and
wait without noticing. Have you ever been in someone's office while they
search for an email? It takes minutes! And don't even try doing any work while
Adobe is trying to update or the antivirus scanner is running...

I spend so much time _waiting_ for a computer. Waiting for IDEs to redraw,
waiting for my computer to let me disconnect my usb drive, waiting for splash
screens. What is going on?

~~~
jzawodn
Do you have an SSD?

~~~
cupwithyourname
I do on one computer, and it does make things noticeably faster, but it still
feels slow compared to how people are describing their computers. I am
constantly waiting a couple of seconds between actions.

------
farinasa
Software and games have traditionally driven obsolescence. Now that software
is shifting toward efficiency, remote processing, and web based interfaces, we
don't need the power as much anymore.

It was easy to double the processing requirements of software when we were at
a low exponent of Moore's law, but now that is looked at as poor development.
We're able to do more with less code.

Also, since facebook and other web services have diluted the demographic with
people who mostly browse the web, the demand for power has waned.

However, I am annoyed with all the posts asserting that desktops are dead.
This is far from true. Gamers will not give up their desktops. For power
users, a laptop just won't support four 1080p monitors. I personally prefer
having a desktop so I don't have to mess with docking/plugging my laptop into
a ton of cables when the desktop does a better job anyway. The desktop also
serves as my media consumption station and is connected to the TV.

------
swalsh
I used to do a lot of server work, mostly virtualization in a lab environment.
We created an elaborate test rig where, literally, the more computers you
threw at the problem, the faster you could iterate. About 4 years ago I left,
but I looked back at hardware a few weeks ago. It blew my mind how things have
doubled since then.

I think servers for real work are still important, and they're not done
getting better and faster.

~~~
greedo
The server market is almost as stagnant, though. Multicore CPUs are important
because of server virtualization, and it's always nice to be able to put 1/4TB
of memory into a system, but there's a dearth of reasons to upgrade short of
EOL or support being discontinued.

------
leviathant
It really depends on what you do with your computer. I had a Core 2 Quad
system that was perfectly fine for me in 2008, but as I dabbled further into HD
video editing, I dropped money on a new system a month or two ago. Now I can
preview effects in real time, and rendering is so much faster. Likewise for
working with photos from my Canon T2i. The ability to chuck 24GB of memory
into the system doesn't hurt either, and SSDs? I actually put my 4-year-old
computer on Craigslist for free because it seemed pretty useless to me after
upgrading to this new system.

I'm looking forward to what kind of technology will be available in 2016.

------
mhurron
I am absolutely amazed that the picture at the beginning of the article seems
to have taken on a life of its own.

More on topic: quick PC obsolescence was always something of a myth. Parts
were only really obsolete if you absolutely had to run the latest shooter at
the absolute highest settings known to man. So it wasn't that the hardware was
obsolete; it was that the user needed to justify their desire to buy the
latest for bragging rights.

~~~
HCIdivision17
Less on topic, but more on the picture: I was wondering about that image.
Something struck me as off. And then I realized that the computer had a
steering wheel. So for those interested, yes, it _is_ odd for a computer to
have a steering wheel that large.

<http://www.snopes.com/inboxer/hoaxes/computer.asp>

~~~
keithpeter
"Many a prognosticator who has tried to envision the future has been tripped
up by a failure to correctly anticipate the direction of technological
change."

I straight away thought of Kubrick's 2001. The complete failure to imagine
_mobile_ phones. Floyd uses a fixed _video phone_ to phone home from the space
ship. Personal mobile technology is the thing.

Back nearer the topic: When I can hook my phone up to a monitor and keyboard
I'm done.

<http://www.ubuntu.com/devices/android>

~~~
russell
If you remember "Stranger in a Strange Land" (1960s) set around the year 2000,
one of the characters, a reporter I believe, had a mobile phone that was the
size of a briefcase. Another character marveled that he must be rather well
off because mobile phones were so expensive.

------
steveh73
I would say you're right for most cases. But not for gamers or "power users":

* Game developers constantly up the ante. Try playing Battlefield 3 at 2560x1440 on Ultra on your 2008 PC and see how you go, if it even starts at all.

* Virtualisation has resulted in consolidation (for me at least). Whereas I might have had to keep around a Windows box to do testing, now I just fire up a VM on my laptop.

------
fjorder
The relentless evolution of the PC was driven largely by PC gaming. In the
90's and early 2000's you could rely on the fact that a PC bought one year
would struggle with games made the following year and be incapable of playing
at least some games made 2-3 years later. As such, gamers eagerly adopted new
hardware, overclocked, overspent by factors of 2 or 3 in the vain hope that it
would buy them a few extra months (when in fact it was more sensible to buy a
cheaper PC and upgrade more often).

Then a funny thing happened. The video game market outgrew the movie market.
It started making financial sense to invest tens of millions in the
development of a single game. The cost of the content (i.e. character designs,
level designs, script, voice acting, motion capture, etc.) far outstripped the
cost of coding the engine. It became a complete no-brainer to port games to as
many platforms as possible to maximize your audience. The corollary to this is
that it became necessary to _design_ games such that it was possible to port
them to as many platforms as possible. That meant that most developers stopped
exclusively developing for one platform and instead started designing for them
all. While you can always tack on some superficial eye-candy when the
hardware supports it, games could not be designed to have basic gameplay
requiring more resources than the slowest platform was capable of. Thus,
consoles became an anchor that brought the relentless march of PC game
requirements to a screeching halt.

The Xbox 360 and PS3, not to mention the Wii, are handily outperformed by
today's desktops. Mobile phones are poised to pass them
within a couple of years! However, the games you can play on a PC are largely
identical to the console versions. Sure, there are some improved graphics,
additional content, etc., but these differences are superficial. In fact, if
your hardware isn't the latest and greatest, most games can simply scale back
the eye-candy to maintain performance. Even PC-only developers, such as
Blizzard, have recognized that they must build games with conservative minimum
requirements in today's market.

The end result is that PC gamers with older hardware can run new games with
only superficial inferiority to what the latest generation of desktops is
capable of.
They no longer _need_ to upgrade their desktops. At least, they won't until
the next generation of consoles arrive. When that happens there will be many
new games designed with greatly increased resources in mind and a big wave of
desktop upgrades will ensue.

In the end, PC obsolescence isn't really dead. It's just quantized
differently.

------
johngalt
_CPU_ obsolescence

OTOH everything else has been playing catch-up: disk, network, bus, etc. Sure,
my CPU has hit the ceiling, but it spends 95% of its time idle anyway.
Broadband five years ago was probably a 3-5Mbps link, and storage was a
7200rpm platter. Now broadband is 20Mbps and storage is an SSD.

------
fein
I'd say that this is even more valid for home servers. Of the 7 spare boxes I
have, five are 5+ years old while two are 2-year-old shit home laptops
(Inspiron, yay!).

These were all under $50 apiece when you factor in that 3 were free. All of
these guys sit on Ubuntu Server or Arch Linux and never have problems. Hell,
you can go buy surplus/old 620's from corporate shops for < $100 and build
yourself a full Beowulf cluster for under $500.

~~~
jewel
What's the power usage of old PCs like that? I've been using a SheevaPlug as
my server at home because it only uses something like 5 watts.

It seems like ARM servers are the best choice for home servers. They have
adequate processing power and are cheap.
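For a ballpark answer, power draw translates into running cost like this (a
quick sketch; the 100W figure for an always-on old desktop and the $0.12/kWh
electricity rate are assumptions, not measurements):

```python
def annual_cost_usd(watts, usd_per_kwh=0.12):
    # 24/7 operation: watts -> kWh per year -> dollars
    hours_per_year = 24 * 365
    kwh = watts * hours_per_year / 1000.0
    return kwh * usd_per_kwh

plug_server = annual_cost_usd(5)    # ~$5/yr for a 5W plug computer
mac_mini    = annual_cost_usd(12)   # ~$13/yr at the 12W figure below
old_desktop = annual_cost_usd(100)  # ~$105/yr at an assumed 100W draw
```

So a repurposed old PC can easily cost more per year in electricity than the
hardware itself cost secondhand.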

~~~
icefox
I also have a SheevaPlug as my server, but if I did it over I would grab a
Mac mini. My big reason for jumping to the Sheeva was the power usage, but
then I found out the mini only uses 12W! Along with the 12W you get x86, with
an HD, CD drive, wifi, etc., in a box that I can easily load Linux on
(upgrading the Sheeva's kernel isn't the easiest...), and in a few years when
I'm done with it I can easily sell the mini, unlike the Sheeva.

------
__alexs
According to cpubenchmark.net, the Nehalem Core i7-920 (from November 2008)
mentioned in the article has about half the performance of the Ivy Bridge Core
i7-3770K released in April. In 2004 the best we had were Pentium 4s, which
were around 8x slower than the i7-920 that came 4 years later.
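Those figures are easy to turn into back-of-envelope annual improvement rates
(a sketch; the time spans are rough approximations from the dates above):

```python
def annual_speedup(total_speedup, years):
    # compound annual rate: total_speedup == rate ** years
    return total_speedup ** (1.0 / years)

# 2004 Pentium 4 -> Nov 2008 i7-920: roughly 8x in ~4 years
old_rate = annual_speedup(8, 4)    # ~1.68x per year
# Nov 2008 i7-920 -> Apr 2012 i7-3770K: roughly 2x in ~3.4 years
new_rate = annual_speedup(2, 3.4)  # ~1.23x per year
```

By that crude measure, yearly CPU performance growth fell from roughly 68% to
roughly 23%, which is the slowdown the article is describing.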

------
pjmlp
This is quite true.

My parents are still using a desktop system from 2002 for all their stuff, and
it's still working quite well.

This is the problem most OEMs are facing nowadays, as there is no need to
upgrade every few years like there used to be.

------
Axsuul
You also have to factor in the life expectancy of these machines. Usually
processors last about 5 years, and by the time that happens you're probably
going to want the latest, which will most likely use a different socket
and require a new motherboard.

Furthermore, PC enthusiasts and gamers will never be satisfied. These folks
live for the next hardware releases to get even the tiniest gain on their
overclocks or go from 2x AA to 4x AA. The PC enthusiast community is a force
to be reckoned with. Don't believe me? Just check out the plethora of forums
out there and see how active they are. I don't see that going away anytime
soon.

------
FilterJoe
For those who don't game or edit videos, this has been true for about 8 years.
I retired my 2004 1GB RAM Dell desktop in 2012 and still keep it as a backup.
The ever increasing speed of browsers and the shift of my work from the
desktop to the cloud kept my system operating at the same speed it did 8 years
ago - not blazing fast, but good enough.

I expect my Sandy Bridge replacement system to go another 7-10 years before I
replace it.

------
TylerE
Yep, seems pretty accurate to me.

In fact, my home system is pretty much identical to their hypothetical 2008
i7-920 system.

I have made a couple of upgrades over the years (replaced the original Radeon
4870x2 with a GeForce 560 about 9 months ago when the ATI card died, and
recently upgraded from the original 6GB of RAM to 24GB).

Looking at what's out there now, there's nothing that makes sense to upgrade
to.

~~~
dman
Raytraced games probably come closest to what would cause people to upgrade.
But even Intel seems to be backing away from raytraced games of late.

~~~
sp332
Intel backed off because its futuristic hardware platform was too futuristic
to make a good GPU.
<http://en.wikipedia.org/wiki/Larrabee_(microarchitecture)>

------
adsr
I'm not sure I agree with this fully. There are compromises made in software
as a result of lacking resources; with these resources now available, some of
these compromises can be avoided. For example, let's keep more resources in
memory at all times, or let's use more resource-hungry algorithms that just
weren't feasible before, etc.

------
keithpeter
Let's turn this thread on its head.

Does anyone think a desktop PC with a 20 year life is possible now? What kind
of design?

~~~
JackC
If nothing else, I think power consumption is a dealbreaker for a 20-year
desktop PC. In (say) five years, there will be an iPhone with the same specs,
and then you'll be running a desktop PC for 15 years at 100 times the power
consumption for no reason. A good analogy is the way otherwise perfectly-good
CRTs disappeared so quickly in favor of relatively expensive and low-contrast
flat-panels. I'm using extreme numbers, but at that timescale even small
efficiency increases will make it worth upgrading, and efficiency increases
are a huge focus right now.

The place to look for a 20-year computer is probably the same place they are
now -- embedded systems with minimal power consumption and ample specs for the
limited job they're designed for. E.g., cars.

~~~
keithpeter
Interesting view which I had not thought of previously in quite that way.

Devil's advocate: my old workstation has a 440W power supply but uses much
less than that most of the time, a small percentage of the total electrical
power that my house uses. What pressure is there to switch the workstation for
a more efficient PC?

------
pasbesoin
Two (well, three) things have helped me extend the effective life of a few
machines for myself and family: 1. (particularly for laptops) purchase top-end
discrete graphics processing; 2. max out the RAM before it becomes terribly
expensive (as its form factor is passed by); 3. when storage space becomes
constrained, throw a new hard drive in there.

Particularly the graphics processing on laptops. Inevitably, this ends up
making the difference between a machine becoming "a little slow" versus it
becoming "unusable", for general, day-to-day use.

In the next ten years, who knows? But I'm betting that going with the best
graphics processing available (again, particularly in laptops, where this is
often un-upgradable) is going to be a good investment in terms of extending a
unit's effective lifespan.

------
AznHisoka
When I first read this, I thought this said "PG obsolescence is obsolete"

------
ableal
Dennard scaling: <http://en.wikipedia.org/wiki/Robert_H._Dennard>

It's about the shrinking of MOSFETs that happened over the 35 years since he
wrote about it. My electronics textbooks mentioned the scaling, but did not
make this named attribution (perhaps just a bibliographic reference).

