The PC is not dead, we just don't need new ones (idiallo.com)
566 points by firefoxd 1141 days ago | 396 comments

I've felt this way since I built my last desktop in 2008. I was sort of waiting for the "gee, it's time to upgrade" mark to roll around in 3 or 4 years, but it hasn't happened yet. It still runs any game I want to play very well, and it still feels fast to me even compared to modern off-the-shelf systems.

When my friends ask for laptop-buying advice, I tell them that if they like the keyboard and screen, it's just plain hard to be disappointed with anything new.

I think I can pinpoint when this happened: it was the SSD. Getting an SSD was the last upgrade I ever needed.


Beyond that, PCs aren't necessary for a lot of people, because people do not need $2,000 Facebook and email machines. For the median person: if you bought a PC in 2006, then got an iPad (as a gift or for yourself) and started using it a lot, you might find that you stopped turning on your PC. How could you justify the price of a new one then?

Yet if there was a major cultural shift to just tablets (which are great devices in their own right), I would be very worried. It's hard(er) to create new content on a tablet, and I don't really want that becoming the default computer for any generation.

I think it's extremely healthy to have the lowest bar possible to go from "Hey, I like that" to "Can I do that? Can I make it myself?"

I think it's something hackers, especially those with children, should ask themselves: would I still be me if I had grown up around primarily content-consumption computing devices instead of more general-purpose laptops and desktops?

Tablets are eating into sales of low-end PCs, but we as a society need the cheap PC to remain viable if we want to turn as many children as possible into creators, engineers, tinkerers, and hackers.

We would be better off steering into the skid. History has plenty of examples of people who've tried to hang onto the old ways 'because that's how I learned it'.

The way forward isn't to try to keep cheap PCs viable for creativity's sake, but to ensure that creative desires are being met on the newer devices. Would I have learned memory management and dual-booting if I'd had a tablet instead of a 386? Probably not. But now that same money buys a high-end tablet and a pile of hours on an EC2 micro instance.

Would I still be me? No, I would be even better. All those weeks wasted fighting with modem racks for my BBS I'd gladly trade for weeks spent on a Nexus 10 and a linode.

As a kid, my parents were really anti-videogames (except for a few educational ones), anti-TV (we had one, but it was in its own room and rarely used), and would only let me use the home computer for homework.

As a result, I spent all of my childhood and teenage years reading (mostly fantasy and sci-fi; several books a week, at my peak) and programming. I taught myself programming in 7th grade because I really wanted to play Sim City, but having no gaming computers/consoles, I couldn't. However, I did have a programmable calculator for my math class.

I spent all of middle school and high school programming my own games on this underpowered (a few MHz of CPU and a few KB of memory), inconvenient (typing thousands of lines of code on that tiny keyboard is a feat) device (and yes, I did make a turn-based Sim City for the TI-82). While I was doing this, a lot of my friends were playing Warcraft 3 and The Sims, and watching dumb TV shows (early-to-mid 2000s; reality TV was just becoming big in France).

Similarly, my girlfriend's parents were very anti-TV etc. As a result, she spent her childhood and teenage years drawing, and eventually she went to a top art school (RISD) and now makes a living from her illustration and teaching art.

I'm not sure who I would be if I had grown up with an iPad and a PlayStation, or who my girlfriend would be if her parents had let her watch TV; no one can tell. However, I think our situations worked out really well for us, and when I have kids/teenagers (not for another decade or so), I will most definitely give them a life analogous to what I experienced rather than a media-consumption-heavy one. For instance, I haven't had a TV ever since I moved out of my parents' house (I do have a projector for watching movies from the computer), and I intend to keep it that way.

I obviously have a fairly tech-heavy life right now, as a tech worker in SF, but I am trying to cut down. I notice that I whip out my phone every time I have a spare 30 seconds, I have several laptops and iPads lying around the house, etc., and I like it less and less. I'm slowly selling off my excess devices, and am thinking of getting rid of my iPhone when my contract runs out (and just getting a dumbphone as a replacement for emergencies). I bought a really cheap netbook, installed Arch Linux + xmonad on it, and am using it as my primary machine at home for web browsing + programming + LaTeX. It's harder to get distracted with this machine.

In 2011, Tom Preston-Werner said the only hardware he had at home was a waffle maker, a microwave, and a bike [0]. I like that mentality.

19 year old me would have pre-ordered a Google Glass from day one and used it with religious fervor; now, I am absolutely not interested in such a device, as I know it is just the ad billboard of the 21st century manufactured by Google.

I still have a few guilty digital pleasures; for example, I buy a lot of used video games that I wanted to play in my childhood and never could (mostly Game Boy Advance/GameCube; the upside is that you can get 20 of those games used on Amazon for the price of 3 new current-gen games).

My hope is that over the next few months and years, I will revert to reading as much as I used to, and spend less time on Facebook/Twitter/etc. (HN is not completely in that bucket, as it leads me to write introspective comments like this, which I think is good). I think a big part of it is removing the devices that will call for your attention. My Game Boy Advance or MP3 player will never call for my attention- it just waits for me to use it. However, my iPad or iPhone will call for my attention every few minutes, which is not liberating at all. Tech should be liberating.


[0]: http://tom.preston-werner.usesthis.com

Quite interesting, thank you for sharing. I like the idea that constraints can guide creativity. It reminds me of pg's essay on distractions.

I've got something of a different view. I was never limited in TV or game consumption. My parents had the mindset of "you'll have to learn self-moderation at some point; better to do it when the stakes are small." They wanted me to learn to recognize when something I enjoyed was having a negative impact. They would guide me subtly by asking me to think about time spent in various areas and what that meant. Now, as an adult, I've not struggled with moderation in any area. I never got pulled into MMO-style games, diet isn't a challenge, etc. My biggest worry is that I work too much and don't make enough time for fun.

I remember finding it odd when I'd visit a friend's house and they were only allowed a certain number of sodas per day, and had regimented rules about computer time. Later on, as adults, these friends almost universally struggled with various addictions, real and digital (I've seen lives ruined by MMOs). To me, it appeared that they weren't able to manage their own desires without outside input. Obviously it was a different experience for you.

Ultimately, all humans are pretty different, and there is likely no one size fits all.

My younger brother always had a really hard time with the constraints my parents set; he would watch TV and play video games in secret, and even sneakily used their credit cards to pay for the MMO he played (he was in his early teens, and my parents didn't give us any allowance; they'd just give us some money if we wanted to go out with friends, or if we wanted a book they'd buy it for us. That worked for me, but not for him). At some point my parents did try not setting any limits on his TV-watching habits etc., thinking he would learn to self-moderate, as you described; for those few years, he basically spent his life in front of the TV and video games, doing literally nothing else. He's in his early 20s now, and still having a hard time with such matters.

What worked for you may or may not have worked for me, and definitely didn't work on my brother; what worked for me would have probably not worked for you, and definitely did not work for my brother.

Humans are interesting creatures, aren't they :)

I had one or two good friends whose parents were more like yours, and I loved going to their houses because we would play video games until 4am and eat pizza and drink unlimited Coca-Cola. Doing this once or twice a month was like heaven for me, and my parents were aware of it but were fine with it happening; I guess they thought "as long as it's not in my house and not too often, it's fine". However, one of those friends did not do immensely well later in college/life, to which my parents responded "I'm not surprised".

How I wish those questions had clear cut black/white answers :P

You sound like I imagine my son in 10 years ...

The reason I occasionally let him stay with friends for the junk food/games/TV binge with friends is so that he learns all types of experiences. I don't believe in banning anything, just regulating. How else can we understand our world if we don't experience it for ourselves?

If I were to host the junk food/TV/games binges, I'd have to buy the console and games and stock the junk food ... Things that I just don't want to do .. So it's easier to let someone else do it.

In return, I take others' kids on bush walks, to BBQs, roller skating, etc., so their kids get to do something different too.

There may be some "it doesn't happen in my house" as you suggest, but it's not my primary or secondary motivator.

Your observations are interesting. I'm in the early days of this process with my kids, but I do limit their TV and game time. I've found (like the grandparent poster) that an absence of TV and games leads to other pursuits: sports, reading, programming, writing books, etc.

To give you an example: I just got the national school results. My son kicked ass in a school that underperformed. One of the key differences I've observed is ... TV and games. Other parents buy the books and encourage kids to do homework, but TV and games are too tempting for developing/young minds. The other thing I notice is that after a while, routine kicks in, and study has become fun for my kids.

The other obvious issue I notice is the mental fatigue my kids have suffered when staying with friends for lengthy gaming sessions. We talked about how they felt, why they felt that way, and the side effects (short attention, lethargy, etc.). But they had no regrets and wanted to do it again. Reminds me of smokers... they know it's bad, but don't care.

When my kids are old enough, they can be responsible for themselves. Until then, they'll have to live with my constraints: healthy minds and bodies. I'll just have to accept your possible scenario that it will all collapse into a screaming heap in their early adulthood. Let's say I'm dubious about your claim, particularly as most levels of freedom and responsibility are gradual with most kids ... but I have seen stranger things.

Having thought more about this, I'm leaning more towards the pro-limitations side of the argument. Clearly games are more addictive now; I doubt many lives were ruined over Pac-Man and Missile Command. If young me had grown up around WoW, perhaps I'd be different?

You mention the gaming binges at friends' houses. If you have the opportunity, ask the parents whether those marathon sessions happen on a normal night. There are a few instances in my youth where I recall overindulgence, and most were at the urging of friends: friends who were manic in their desire to get as much time in as possible before their parents returned. The idea that "limits against overuse don't apply the second you leave my house" is precisely what got many of my friends in trouble later in life. I'm merely trying to assist with perspective. Please don't take this as a critique of your view (as I'm in agreement) or parenting ability (congrats on the high test scores).

> I'm dubious about your claim, particularly as most levels of freedom and responsibility are gradual with most kids ... But I have seen stranger things.

I'd like to point out that the scenario I described wasn't a free-for-all. Falling short of expectations was met with discipline and restrictions, as any child could expect. It was simply that regulation of TV/game time was a personal choice, provided that expectations were met.

The important part is the guidance: finding the best way to show the kiddos how to recognize a bad habit. I'm thinking a good middle ground would be a discussion where the time limits are decided, but then applied universally. Something doesn't stop being a bad habit just because you're at a friend's house.

Oh man, if there's one thing I could change in my childhood, it'd be trading half of my gaming time for programming time. I was unfortunately growing up in the golden age of both web programming and games, and 'chose' the latter. Unfortunately, you can't do much with just 20 years of gaming experience. My parents mostly let me do whatever because I already had a skill (basic tech support) which we thought would guarantee a self-supporting job after college...


It's becoming increasingly important to get college right the first time through if only because of insane, rising costs. If it were cheap, I could probably find a class on self-moderation.

On the bright side, I find most of today's games boring, so it's rather easy to get productive.

Consider community or tech college. It's still not cheap, but it's way more affordable than a university or state college. They also tend to be more flexible and understanding, since most of their students are part-time. Employers care most about your skills, and a year or two at a community college can give you more non-academic skills than any Computer Science bachelor's degree.

This is why I love my Kindle Paperwhite so much. I could read on my iPad, but it's so easy to jump into something else on that. The Kindle's limitations let me get on with reading without being tempted by other things.

I think you may be my doppelganger, but I started off on a TI-83+. ;)

There is something special about introducing yourself to programming on limited hardware like that. It's the sort of device that you can grow into, and which challenges you to get innovative when you start to reach its limitations. I don't believe it's the only environment in which somebody with a hacker mentality and approach can be created, but it does seem to be a good way to do it.

> I am thinking of getting rid of my iPhone when my contract runs out (and just get a dumbphone as a replacement for emergencies)

iPhones are great for tourists. Unnecessary otherwise.

Except when you are in another country, and roaming charges are outrageous.

I am unsure if you can attribute your abilities to your anti-TV upbringing. I grew up with parents who had a relaxed attitude about video games and TV, and thus consumed a lot of both.

But I also spent so much of my time drawing, painting, and programming because I love creating things. I am unsure if TV had any negative effects. The ownership and pleasure you get from your own creations is unlike anything that TV or games can provide, and it is utterly addictive.

"However, my iPad or iPhone will call for my attention every few minutes, which is not liberating at all."

This feeling has been creeping on me ever since I succumbed to a smartphone for work a few years ago.

Instead of releasing me from some of the lower-level tasks I deal with day to day, this phone, these devices, nag me to pay attention to all of my friends back home, all of my old classmates, former colleagues, and all of the things they're producing / forwarding / commenting on.

I moved out of the US this summer, and it took me a while to get a SIM here. I had roaming data on my work phone but couldn't use it much because of the obvious cost.

It's been liberating to just live inside my head and in my immediate environment these past months. When I finally got a local number, I put the SIM in a 20EUR Samsung flip phone. I find few issues with eventually just living with that phone.

You can turn all that nagging off, so the only notifications you'll get will be calls and texts. I really think some people blame the smartphone for being a distracting element, and it's not wrong that it can be distracting, but it's because the user allows it to be. You can turn off all notifications, so you decide when to check whether there is something of interest. For instance, my phone does not tell me when a mail arrives; the little badge will show how many new mails there are if I choose to flip to the screen with the mail app. Facebook never notifies, not even with a badge; I decide when I want to see Facebook (which is rarely on the phone, since that app is horribly slow). The same goes for most of my apps. Only when people try to contact me directly by call or text will it go off, and if I'm particularly busy I will set it to "do not disturb" and rarely be bothered by it.

"You can turn all that nagging off, so the only notifications you'll get, will be calls and texts."

Very true.

I think what I've been feeling, and trying to describe, is a rejection of expanding my consciousness into the Internet. A large portion of my social groups use those social networks as extensions of themselves, for communication and interaction.

With immediate access to those channels it's difficult to ignore the draw of that technology. And not using those services regularly ends up being the same as not using them at all.

I know it all too well. At some point I felt so annoyed by the smartphone's intrusion into my life that I began to ignore the notifications. It was great for a while, but I still felt a slight nagging. So I did the logical thing and turned it off entirely. I haven't missed anything of significance, and I'm still in touch with the people I want to be in touch with over the social networks.

Beautifully worded :)

I've got a 4th gen iPad, a laptop, an external monitor + kb + mouse and a Lumia 920 right now. I still wonder at times if I should cut back a bit, but most of these devices have their uses for me.

Why did your parents, who only let you use the regular computer for homework, let you use your calculator for programming? Is it possible that your identity-building time spent creating SimCity for the TI-82 was not something deliberately enabled by your parents' somewhat closed-minded attitude toward technology, and was, instead, a "waste of time" that they simply failed to realize was happening?

Because they had no clue that a calculator could be used for other things than calculating :)

If you want to read more, trash the iPads and push Michelet's Histoire de France (19 vols.) from Project Gutenberg to your Kindle.

Same here, by the way: no TV, my laptops are my wife's second-hand ones, and I've never owned an Apple device.

I fear the ergonomics of mobile computers do more damage than desktops. Source: own experience

I'm not sure if you mean that the comfort of working on a tablet connected to remote virtual servers is higher than that of a single local laptop or desktop.

But if that is your final point, I very much disagree. Besides the fact that a computer keyboard and mouse are often vastly superior to the cramped peripherals you typically see connected to a tablet, large screens and user interfaces designed for large screens often have a huge ergonomic and productivity advantage over the tiny, touch centric interfaces for tablets.

Note: the previous paragraph only applies to the current state of apps for creative and engineering activities. It could be that five years from now, tablets and their content-creation software will have evolved to a point where they rival or surpass desktop software. Or they might not. Either way, my point is that we're not there now, and creating software etc. on tablets is a suboptimal experience today.

Long form writing seems close to impossible on a tablet. A long hacker news comment is possible, but once you are looking at even a modest book report, I can't see it.

Maybe with some sort of accessory keyboard, but then you are looking at more of an ad hoc laptop than a pure tablet.

You'd expect long-form writing to be unthinkable on today's mobiles, but plenty of people wrote whole novels on Palm PDAs with only the on-screen keyboard/Graffiti. I also did extensive text entry on Palm PDAs for my middle/high school notes and assignments, and didn't consider it a bad experience. These examples, and the above commenter with his Sim City clone for the TI-82, demonstrate that when you make up your mind to do something on a limited device, it's often surprisingly possible.

I do not disagree; I just wanted to mention that nowadays it is still possible to learn multi-booting with Android tablets and custom ROMs.

I'm starting to see a trend of tablets-as-laptops where people have a case that integrates a keyboard and they type their papers or whatever else on their iPads or other types of tablets.

Having a 1.5- or 2-pound laptop with a 12-hour battery life that you can detach the keyboard from, for $300, is a much better form factor than the current typical laptop. Many of these tablets also come with Wacom pen digitizers or touch, allowing a creative input that is missing from many laptop form factors.

Also, you can still create web apps and other such things with tools like node.js on Android tablets today. JavaScript really is the BASIC of this generation.

I won't be surprised to see full IDEs that are viable for creating general-purpose apps in the near future. I really think Android and iOS will eventually become the next 'desktop' OSes, with a full suite of apps as powerful as the current desktop applications. Concerns about tablets as consumption-only devices will probably go away within the next decade as the world transitions to these 'mobile' OSes.

Android-based IDEs already exist. I should know. I was looking for one the moment I bought my ASUS Transformer a couple years ago. I found the only thing I couldn't do with the device was write and compile programs. I would bring it to hacking sessions, but I couldn't test code on it. At the time, the only option was a text editor that could style and check code syntax.

The main downsides of these devices for programming are screen real estate, CPU speed, and support from major IDE/compiler vendors.

Example: https://play.google.com/store/apps/details?id=com.aide.ui

I follow you, but I cannot make myself comfortable with a non-elastic screen size. I like my tablet to be 8", while anything less than 12" is not good for a laptop (I tried with UMPCs and netbooks). However, I can certainly imagine plugging an 8" tablet into a 27" monitor and keyboard/mouse combo for comfortable work at the office.

>because people do not need $2000 Facebook and email machines

Devil's advocate: if that is true, then why are MacBook Pros such a hot seller? I'm typing this in a college library's lobby. When I look around, I see roughly 3/4 of the laptop-using students on a MacBook Pro, with a few MacBook Airs littered around. If I were to walk around and glance at what people were working on, it'd probably be something like 70% YouTube/Facebook and 30% using some word processor.

My point is that the consumer's decision to buy or abandon a product isn't solely driven by how good the product is; the value of the new item as a "status icon" also has to be taken into account. All you need to get the customer to justify that $2,000 price tag is a culture of rabid consumerism and the guarantee that they will be cooler than their friends if they buy this extremely expensive laptop that does all sorts of things they will never ever use.

In your example involving the 2006 PC and the new iPad, I would argue that a huge contributor to the consumer's abandoning of the PC is that it's nowhere near as potent a status icon as an iPad.

The MBP is one of the most functional devices available. The wanky, hipster appeal is way overplayed; plenty of people use them in spite of their image.

Phones satisfy the mobile convergence thing between organisers, phones, cameras and handheld games. Tablets satisfy the "computer as a bicycle" vision of Jobs and bring portable computing to the masses. But neither are really full featured enough for a developer or a college student.

When it comes to a full featured, keyboard equipped, programmable device there is bugger all that is light, powerful, has a long battery life and ships with a decent environment (posix, term, ssh) out of the box. Dell's XPS 13 DE delivers a bit of the picture but it doesn't have the quality bundled iWork apps that would appeal to a student and has worse battery life. They are getting there though.

The rest of the laptop market is hindered by shipping with Windows only and the difficulty of getting pre and post-sales driver support for the developers and students who require an environment like Linux. The industry has really fucked themselves trying to keep to the model of the glory days.

Hopefully Steam will help set things straight by creating a large mainstream market that more niche users like developers, students and scientists can benefit from.

Manufacturers are going to have to stop the race to the bottom and start building fewer models, at higher quality and in bigger volumes, if they are to compete with Apple in price, quality, and profitability. And they are going to have to explore well-integrated Chrome OS and Linux packages and work on developing drivers with better performance and power efficiency.

> The MBP is one of the most functional devices available. The wanky, hipster appeal is way overplayed. Plenty of people use them inspite of their image.

They are fine machines, but if you don't care about subjective things, like how a laptop looks or how much it weighs (an extra pound or two never genuinely matters to a use case), and you only care about getting exactly what you need (not want) for the lowest price, then there is no way you would ever buy an MBP. They are a product we convince ourselves we need or deserve because we lust after it, even when a cheaper alternative would do just fine.

I have bought macs for a while, but if I have to be really honest with myself there was always a cheaper alternative that was 'good enough', it's just that I subjectively want apple's products because they are 'nicer'.

> The rest of the laptop market is hindered by shipping with Windows only and the difficulty of getting pre and post-sales driver support for the developers and students who require an environment like Linux. The industry has really fucked themselves trying to keep to the model of the glory days.

Quite good for those of us that work across multiple OSs.

Just install GNU/Linux on VMware with CPU virtualization enabled and get two OSes on one laptop with a minimum of fuss.

> Just install GNU/Linux on VMWare with CPU virtualization enabled and get two OS in a laptop with the minimum of fuss.

And far less battery life.

I often run Ubuntu or Windows on my MBA with VMware to compile software. It's one of the best ways to drain your battery quickly.

Well, I do have a laptop with Windows 7 and a netbook from Asus that was sold with a Linux distribution.

I have used GNU/Linux since 1995, and on a laptop it still looks like 1995 to me, in terms of graphics card, battery, hibernation, and wireless support.

My Asus, which was sold with supported Linux hardware, does not hibernate.

In terms of battery life, I have yet to find a laptop where GNU/Linux lasts longer than Windows.

People who believe in being able to buy social status are the ones perfectly fooled by advertisements! Someone can be butt ugly, poor, and stupid, but still have the highest social status and popularity in most social environments. What makes someone an attractor of popularity is demonstrating power, will, and the ability to achieve what you want.

Who would be more popular?

A) A rich guy who thinks he can buy friends, by inviting everybody to an expensive cocktail to an exclusive restaurant?

B) A guy who owns lots of expensive high-tech and a luxury car + his own house?

C) A guy with charm and charisma, who can make anyone feel special or have a fun time, even in the dirtiest place?

You choose. I believe you have all experienced A, B, and C already. But I would rely on the scientific evidence that C has a higher long-term chance of remaining an attractor of popularity. (I'm sorry, I couldn't put the references together, but I hope you understand.)

Side story: a weird guy in my old class had Bluetooth wireless headphones with an integrated MP3 player in 2006, a 3000EUR laptop, and all sorts of other very expensive gimmickry. That didn't make him cooler at all. People still didn't like him, and it just made him more vulnerable to attacks from some of the more primitive pupils.

> People who believe in being able to buy social status

Trust me, you can buy social status. If you pay enough, you have the status.

> Someone can be butt ugly, poor and stupid, but still have the highest social status and popularity in most social environments.

Most social environments he exposes himself to, yes. Most social environments, no.

He said that a product being a status icon justifies a high price, not that a high price makes something a status icon.

edit: parent comment has completely changed since I replied

Yea, guys can buy people out of their league, but their wives are usually miserable, and cheat. Women have a hard time doing the same. Actually, the whole success thing works in reverse for women: they become less attractive with money, unless they earned that money with their mind. Even then, the men these women want still don't want them. I was trying to soften the blows I will get for this thread.

In Marin, I see so many attractive women who settled for some fugly dude just because he has money. Maybe that's what keeps guys striving for the next Facebook?

I could go on and on about this subject, but I'll get clobbered.

If you can't achieve C, you sure can use your money to achieve A or B.

Also if you're lucky enough to be able to carry off C, you're more likely to have options A and B at your disposal.

I'd say laptops have been improving quite a bit in non-computational ways. Weight, battery life, and screen quality particularly.

An MBP can be carried around to show how cool you are; a Dell workstation, not so much. Maybe if someone builds a handle for the Mac Pro borg cylinder. ;)

Because MacBook Pros have HDMI output, more pixels, larger hard-drives, and run Starcraft.

Because people like new things.

My grandma taught me the saying "More money than sense." It's proven incredibly wise.

When you start college, you normally get a new computer. But otherwise I agree with your premise. A high-powered desktop PC is no longer a status symbol for anyone compared with laptops, tablets, or phones.

While many people buy them for the technical side, as pointed out in other replies, there is no denying that others buy them as a status symbol, in the same way they would buy a BMW for a car.

> When my friends ask for laptop-buying advice I tell them if they like the keyboard and screen, then its just plain hard to be disappointed with anything new.

That's exactly what I'm disappointed with on everything new. The ThinkPad T60p, from 2006, remains superior on both points to everything new, from my point of view.

That's exactly what I'm disappointed with on everything new. The Thinkpad T60p, from 2006 remains superior on both points to everything new from my point of view.

While a very fine machine in its day, the screen, battery life, processing power, disk speed, size, and lack of heat on a modern machine like the 13" Retina MacBook Pro are on another level. You're talking about 7 years of evolution.

Well, I have T61 guts and an SSD in mine. It isn't so far behind as you might expect.

I've used a 15" Retina Macbook Pro. The screen is not better. Sure, it's higher density and brighter, but the viewing angles are not better. Subjectively, I'd say the color reproduction is worse (I used them next to each other). Reflections, glare and fingerprints are significantly worse on the MBP.

Yea, I have a ten year old Toshiba P26-S2607 I still use on a daily basis.

I have a one year old MBP that I haven't used much; I'm not sure why, but the screens seem the same?

I do know one thing about most old laptops; they were made to last longer than 2 years.

That said, HP has made crappy laptops for quite some time. I bought one a few years ago and it was horrid on an engineering basis, but it looked slick.

I had an HP laptop that I bought around 2009. It lasted a year before the insane heat issues started; I agree with you completely regarding the engineering issues. In the end, I ended up buying a Lenovo about two years ago, and it's been perfect for me.

Getting back to the original story, I don't think I've bought a desktop since 2003. I prefer using them, but laptops are just so convenient.

Are you sure it was simple heat issues and not a defective Nvidia GPU? A bunch of them in that time period were prone to premature failure. Many HPs used them, as did a few Thinkpads.

I also enjoy quite old laptop (HP2530p, 2009, 12", Core2 Duo 1.9GHz, sturdy, lightweight). I upgraded to 256GB SSD and 8GB RAM. Plenty enough for the next couple of years, me thinks :) Perfectly runs W8.1/LinuxMint/Android-x86ICS

Completely irrelevant to the rest of the discussion at hand, but if you are going to run Android as a netbook OS on stock PC hardware with keyboard and mouse/trackpad, I've found that everything from Jelly Bean and up is dramatically better to work with than Android 4.0.

They've done some deep level fixes which just makes everything flow and stick together as one "netbook" user-experience in a much better way.

My experience is mostly from Asus Transformer type devices and not regular x86 laptop hardware, but I suspect the same improvements should be valid in x86 country. You certainly have nothing to lose by trying it [1].

My 2 cents.

[1] http://www.android-x86.org/releases/build-20130725

Agree, and to wander further afield: how's your luck running apps? The Google apps work great, I use Opera and some hack to get Flash games to work, but the vast majority of apps just fail to load.

In spite of the problems, it's amazingly fast and usable, esp. compared to Windows/Ubuntu.

running the stable android-x86 on a Samsung NB505

Better than newer T series? I've been looking into Lenovos because the keyboard is the main differentiator I care about on a laptop.

I've typed on a T530. It was pretty good, but the removal of several keys I actually use and the relocation of others bothers me. I don't think the feel is quite as good on the T530 as on the T60, but it's still one of the best laptop keyboards.

Yes! I have an X1C, and they have messed up the Home/Del/PgUp/PgDn island, and placed Print Screen between the right Alt and Ctrl. I'm tripping over the keys all the time.

How hard can it be to design keyboards with all keys in the correct place? Thinkpads used to be the only ones which got this design issue right.

I think Lenovo wanted increasingly large trackpads with increasingly short screens. They've mentioned in their design blog that some of the changes are driven by "consumerization".

The problem with that is Thinkpads were never meant for mainstream consumers, Lenovo isn't going to out-Apple Apple, and if it wanted to try, it would do better using a model line that doesn't have a business-oriented reputation going back decades.

I'm hoping and waiting for some third party like infocomp to supply proper aftermarket thinkpad keypads for the new lenovo machines, with the buttons restored to proper order. Otherwise I will not be "upgrading", ever.

Don't get me started on the single audio socket for both recording and playback.

I have the same laptop running debian. Even runs small VMs ok and has survived multiple liquid accidents and dropping

I've upgraded it with a T61 motherboard, 8gb RAM, a tweaked BIOS and an SSD. It runs not-so-small VMs now. I try not to drop it or spill things in it, but the spill-resistant keyboard is good peace of mind.

This is a common generational worry, right? We learned X a certain way, that pedagogy was necessarily linked to the technology at the time, we worry in retrospect that that specific technology was a necessary condition for ever learning X.

I'm only 28, and I do it, too-- e.g., how will kids expand their imagination and learn about the world without only having paper books to immerse themselves in for hours at a time? How will anyone learn the basics of programming without finding QuickBASIC on an old Packard-Bell 386, playing around with Gorillas or Snakes, or entering their own code from books in the library?

I think there will be a sufficient number of hacker types around for the cynical and simple reason that corporations need to inspire kids to learn how to code so that they can hire folks in two decades. This ought to inspire a token amount of educational support and tool building so that entry-level development will always be accessible to kids.

These days, everyone has access to an excellent cross-platform learning platform for programming: the browser. So yeah, I think if anything programming is more accessible than ever.

Not that I disagree with what you've said, but browsers aren't a platform for learning to program on a tablet.

The worry isn't unfounded, though, and in the case of programming it has actually been measured; the decline in quality of CS students is what inspired the Raspberry Pi project:


people do not need $2000 Facebook and email machines

I dunno, Facebook and GMail seem to get slower each month. They're unusably slow, even in Chrome on Windows, on both my and my wife's circa-2008 machines.

HTML5 combined with crazy CSS can make a C2D crawl these days. It's not just Flash anymore that's sucking CPU power down.

I still use Facebook on an old iPad 2, and it runs as fast as it did the day I bought it.

That's generally the solution to the problem of accessing the Web on old PCs: request the mobile version of the website. Heck, with enough trickery you can even get Opera Mini to run.

I would pay for a silent PC. I paid a lot for a quiet Apple PowerPC Mac Pro back in the day, and built a quiet PC, but they are not silent, and now that I have moved to a quieter place I can hear them (yes, the Mac runs Fedora now). I have some silent ARM machines, but they don't quite cut it yet, though maybe the new quad-core one will.

Silent PC Review ( http://www.silentpcreview.com ) occasionally reviews complete systems. Last year they measured an i7 machine from Puget Systems at 11dB in their anechoic room ( http://www.silentpcreview.com/Puget_Serenity_Pro ) which is the best I've seen for an off-the-shelf PC.

I haven't used a Puget machine myself, but I've relied on Silent PC Review's component recommendations for nearly a decade, and never been disappointed.

I have seen full-size ATX cases from Silverstone with passive cooling, where the entire case is machined out of aluminum and acts like an enormous heat sink. I have yet to find a good fanless power supply. The new Mac Pro is supposed to be only 12 dB. It's hard to beat that; most rooms have ambient noise of at least 18 dB.

Seasonic seems to have a pretty good fanless power supply.

> The new Mac Pro is supposed to be only 12 dB.

Until the dust hits it...

Solution: clean it once in a while :)

>Solution: clean it once in a while :)

Teach me oh great one. Honestly I've never managed to clean a fan. You can brush off the obvious dust but the noise comes from dust getting into the fan itself.

Spray it with compressed air inside and out. You can get into the slit between the fan housing and the motor and blow it out there as well. Most fans start making noise due to bearing failure / wobble, not the dust. I have a small home server with 4 fans. After almost 10 years, 3 out of the 4 fans are as quiet as they were on day one; the 4th has developed a bearing issue and needs to be swapped out eventually. I am assuming that Apple will use higher quality fans than the $3 crap fans I am using.

>Most fans start making noise due to bearing failure / wobble, not the dust.

That exactly is the issue. They don't just mysteriously develop bearing failure / wobble... they do so because dust gets into the bearings. And that exactly is my point: no amount of compressed air can fix dust in the ball-bearing grease. I just end up replacing all the fans after a few years...

There are some complete 'mini' systems that achieve this, presumably by pushing the transformer out of the computer and into a wall-wart or similar: http://www.damnsmalllinux.org/store/Mini_ITX_Systems/533MHz_... http://www.amazon.com/Intel-D2500-Fanless-Mini-ITX-D2500CCE/...

Buying these as stand-alone components seems to be more of a challenge.

My MacBook Air (2012) is completely silent most of the time (except when viewing videos in Flash or Silverlight). Given it's silent, I can actually hear the power supply of my monitor buzzing! (Actually, it's a very quiet buzzing.) Be careful what you wish for, I guess.

but the video is so loud!

Yeah, I know, it's awful. If my 450MHz Celeron could play DVDs with no effort, I don't know why this can't play videos at 5-10% CPU and not spin up the fan.

Most video encoding is mp4-type now (rather than the mp2-type used by dvds). Mp4 is typically a lot more processor intensive.

Any even vaguely modern laptop offloads MPEG-4 Part 10 decoding onto a decoder chip or the GPU.

The older Mac laptops had (SMCUtil?) a utility to change the threshold for the fan - I know lots of Wintel systems have this in BIOS/EFI.

I have one, from http://www.quietpc.com/

It's completely silent, you can only tell that it's on by looking at LEDs.

Mac mini has fans but is basically silent.

things with fans are not silent enough - the mac pro seemed ok but now it seems too noisy

I guess that's what I get for living in the city...

my silent PC solution from 2008. Still the best value for money if the architecture of your house supports it :) http://solnyshok.blogspot.com/2008/03/my-silent-pc-solution....

https://store.tinygreenpc.com/tiny-green-pcs/intense-pc.html A very quiet, fanless PC. I have one and I am glad.

If an i3 is fast enough for you, and you don't need high-end graphics, you might want to consider an Intel "Next Unit of Computing" system... You'll need to get laptop components for it, though. I've considered it, but I'm not that into quiet, and want a bit more power.


There are passive coolers for i7s out there. You can even find passively cooled GPUs and PSUs. It's pretty cheap to build a silent pc.

It's tricky getting a passive GPU to actually work; mine has a tendency to overheat and halt the system rather than throttle down under load. It required a fair bit of tweaking to case fans and driver underclock settings to get it stable.

Those passive coolers generally still rely on some airflow. With no case fans at all, I'm told they won't be able to hold up under any decent load.

Intel NUCs [1] are well engineered and extremely quiet.

[1] http://www.intel.com/content/www/us/en/nuc/nuc-kit-d54250wyk...

In the Netherlands there is https://ikbenstilcomputers.nl selling high-end fanless computers.

I believe there will be a major cultural shift to tablets/phones/handheld.

The social implications worry me -- mainly that the most popular handheld devices (iOS) are _locked down_, you can't actually install whatever software you want on it.

I don't know if the actual experience of using Android, for non-techies, might end up seeming similar?

The social implications of this worry me. We spend increasing amounts of time on our computers, and have decreasing power and freedom over what software they run how.

I think your concern about it being harder to create on tablets, and the social implications therein -- is also legit, but it worries me less than the loss of control over our devices. People will find a way to create, although the nature of what they create will be affected by the affordances of the device, for sure. (there will be a lot more 140 char poetry, heheh)

Not disagreeing with your overall concern about locked down platforms...

mainly that the most popular handheld devices (iOS) are _locked down_

With an 80% market share[1], I think it's safe to say Android is by far the most popular handheld platform, with iOS being the niche market.

[1] http://techcrunch.com/2013/08/07/android-nears-80-market-sha...

Most game makers are targeting consoles, which just went through a longer than usual shelf life. With the new ones coming online, we will see more PC upgrades from gamers.

Also resolution stopped at 1080p for a decade. Now that 4k is happening, there is an easy path to performance for gpu manufacturers.

Resolution dropped

High-end for 4:3 was 2048x1536 (3 megapixels)

High-end for 16:10 was 1920x1200 (2.2 megapixels)

High-end for 16:9 was until very recently 1920x1080 (2 megapixels)

Just to clarify, the high end for the last two ratios is 2560x1600 and 2560x1440. Both were available at least two years ago when I built my current machine, although after some quick research, the 16:9 format had some available as early as 2007.

My laptop bias is showing. Those were the highest resolutions available on laptops. They were also the highest commonly-available resolutions for desktop screens, though none of those was the highest that existed.

It's really _only_ code that is harder to create on tablets though, and it strikes me as extremely narrow-minded to write off the music, art, text, huge Minecraft sculptures, photos and videos of singing, dancing, playing, that have been gleefully created by people young and old on these devices.

It's the exact sort of snobbery that has almost completely killed art, dance, drama and music in many schools, as if the only valuable acts of creation left to humanity are engineering and science (which, incidentally, are both wonderfully served by the innumerable education apps on these locked-down, post-apocalyptic devices).

Code is harder because editing text is harder. Text input and manipulation are just not very good on tablets. Even if you plug a keyboard into iOS, editing is still worse than WordPerfect circa 1988.

You might type stuff in OK, but manipulating text is awful.

Yeah, I have no idea how Shakespeare got anything done.

Sure. 884,421 words[1]. And with that workload, I bet he used the most efficient technology available at the time to do it.

[1] http://www.opensourceshakespeare.org/stats/

I agree. I bought a high-end Dell 2.5 years ago, and a few months ago it was starting to feel like upgrade time (sluggish performance, etc.). I then stuck in a 240GB SSD and it's faster than it ever was, even when I first got it.

I agree with the general consensus here regarding how computers in general have been aging better now. Especially with different computing devices available today, it's hard for the average person to justify buying a new computer within 3-4 years of theirs.

At the same time, I think even the "average" person would want more than a 128GB SSD, and therefore today's entry-level notebooks equipped with these small SSDs won't age that well. I know, I know, SSDs come in larger sizes --- but they become significantly more expensive, and most entry-level notebooks (with SSDs) come with 128GB. As a comparison, it's almost weird that years ago you could get a 500GB hard drive without giving it a second financial thought. As such, I think that if there is anything a normal person might want, it's more drive space as they fill up their small-ish SSDs --- so that they don't need to worry about deleting things when they have too many pictures, games, etc. The average person won't care whether Chrome takes half the number of milliseconds to open a new tab, or whether their games go from 40fps to 60fps. But to me it seems easy to fill up these smaller drives, and many people might be looking for a new computer to deal with that.

Before someone mentions it: YES, cloud solutions and external solutions exist. But is it part of mainstream usage to store your stuff on an external HDD? And wouldn't people want a future computer where they didn't need to do that anyway? I'm not claiming they have terabytes of data, but I think over the course of 3-4 years, people could pretty easily accumulate > 256GB of data. Otherwise, is there a free and easy cloud solution that gives > 50GB of space that people use a lot today? (Not to my knowledge.)

everyone I know has a 64-128GB SSD and a 1TB internal HDD, and a 1TB external HDD for backups.

Quick solution: USB3 (or e-sata or thunderbolt) external hard drive. Alternatively, NAS (that you can cloud-ify).

I've never needed a huge hard drive. External HDs work fine for me.

Flickr gives you a terabyte free.

surely no one wants to rely on a third-party service, available only online and at the service's whim, for their photos.

Same here. I feel like we've reached the end of the relevancy of Moore's Law. My PC at home is from early 2008, and it's still an excellent machine. It plays all my games wonderfully, even new ones, even 3D ones. I can't imagine a game looking better than The Witcher 2.

Actually, this isn't entirely true; a few months ago, my WinXP install started to play up, so I bought an SSD and installed Win7 on it. Now it runs better than ever. A recent OS helps a lot, as does an SSD.

I can see one reason why I might still want to upgrade, though: The Witcher 3. I doubt it's going to run well on my by-then six-year-old machine. But maybe a new graphics card is all I'll need. Or maybe it'll even just work.

My Macbook Pro is a lot more recent, but it also feels like it might last forever. It can handle everything I throw at it. Why would I ever need something more powerful than this?

If I want anything new from my computers now, it's stuff like smaller size, less noise, less power use, etc. They're powerful enough.

For high performance gaming, I wonder if console gaming has something to do with it. With publishers now focusing on consoles instead of PCs, graphics may be held back to a degree due to that.

But another thing to consider is that the console release cycle has also slowed down, because there's less of a need to upgrade there as well. So you see the lack of desire to upgrade trend emerging for consoles as well as PCs.

I do think that nearly everyone who wants a PC has one at this point. That plus no desire to upgrade means slower sales. If people were actually ditching their PCs entirely, that would be a different story.

I use my 2009 iMac for local heavy lifting and watching TV. (For heavier lifting I use the cloud.) I will upgrade my desktop once the LCD goes retina.

I will continue to upgrade my laptop frequently. Lighter, smaller, faster, longer lived, more durable. Every new laptop has increased my productivity, flexibility.

I just bought a 2013 MBA 13". Most amazing machine I've ever had. Now that I'm accustomed to 13" (vs 15"), I will likely buy a 2013 MBP 13" retina. I'm certain that I'll be very happy.

I agree. Much of this is just because the processing is moving off of the machine too. When you were doing all of your own computations, any incremental thing you did required a stronger PC. Now that computation is happening on the server.

There are programmers, mathematical and financial users who are still stretching their desktops, but for most of the rest of us the need to upgrade is going away. It's almost like it's time to upgrade when there's just too much clutter on the old machine.

I agree with the point about keyboard and screen.

I wanted a gaming laptop, but once I got into that category, I'll be honest--the deciding factor for me was keyboard layout. I'm a developer so it's really important to me to have special keys in the right place, and to have them easily distinguishable by touch.

Nothing is worse than arrow keys with no gap separating them, or an F5 that blends in, or page up/down in some unusable position.

Got some Asus model, and it's great.

> because people do not need $2000 Facebook and email machines

I agree with your post, but I just wanted to point out that a Facebook/email PC does not cost 2000 dollars anymore (and hasn't cost that much for a long time) :) You can get by with a 300 dollar laptop just fine for that kind of usage.

you can use a wireless/USB mouse and keyboard with a laptop or [Windows] tablet and it's pretty close to a PC... you can also use an external monitor on many of them. PCs need to become the size of a Raspberry Pi, that's all.

You're just not playing the latest games or doing any intensive computations, or just don't care that you could finish your task 3 times faster than on your 2008 CPU.

What's your framerate in Battlefield 3 on a 64 player map at 1920x1080, high settings? Or Crysis 3?

You just picked the most expensive calculation you could, more or less, and asked how a general-purpose PC handles it.

A brand new general-purpose PC handles that situation equally well--unplayable.

No, the most expensive would be a 3-monitor setup at 2560x1440 on ultra settings.

That's the thing... most games are still not well multi-threaded, and single-threaded performance hasn't gone up more than maybe 20-30%; they just tacked on more cores.

That's actually not true. Modern high end CPUs are multiple times faster than the CPUs from 2008. The architecture changed too, not just the frequency.

I have an i7 920, purchased in December 2008. Please point out which consumer processors (No $3000 xeons) have "multiple times" greater single threaded performance.


"For what" is the obvious question. Web development with a remote testing environment, office applications, email, web browsing - sure, a Core 2 Duo is more than good enough if your software environment is kept in order. Audio / video / photoshop, gaming, developing software that does math, data analysis - you can never get fast enough.

The limiting factor is if your computer's feedback loop is tighter than your brain's perception loop. If you can type a letter and the letter appears, your computer is fast enough for word processing. But, if you can run a data analysis job and it's done before you release the "enter" key, it just means you should really be doing better analyses over more data. Certain use cases grow like goldfish to the limits of their environment.

Even with gaming there isn't as much of a push as there used to be to constantly be on the cutting edge. This is mostly due to the fact that the industry as a whole focuses primarily on consoles first now, and thus consoles tend to be the gating "LCD" target. If your PC is at least as good as or a little bit better than a console released in 2005 or 2007, you're set. Of course, there will soon be a bump forward here with the next-gen Xbox and Sony systems coming out in a month.

I fit into a lot of the special cases here: Developer, gamer, amateur photographer with many gigabytes of RAW files and even I don't feel the need to upgrade systems like I used to. Now it is about an every 3-4 year thing whereas previously it was yearly or more.

Emphatic agreement. I wind up helping folks a lot with writing high-performance software, and it's very easy to get to the point where the time to run a model is totally determined by how fast the CPU and IO are. I'm talking problems where naively written C would take an hour, but if I'm careful and use some blend of very clever C or Haskell, the computation is done in 2-5 minutes.

in machine learning, as soon as you stop using linear models, which are normally quite fast to compute, models will be as slow as you can tolerate.

for example: random forest, gradient boosting, GAMs, etc. -- you will typically do parameter searches, and the models you get are as good as your willingness to wait. Good software will run at a significant fraction of memory bus speed, and the faster that bus goes, the better your models will be.

exactly! This winds up being a memory locality / layout problem often times!

eg: most CPU memory traffic is in cacheline-sized chunks, so you're best off trying to organize information so you can use all the bandwidth! I've a few ideas on this i'm trying to bake into an array/matrix library i hope to release soon. :)

What kind of problems are those? I would love to find problems, ultimately examples, where smart Haskell and C blends are superior to pure C.

It's not necessarily which language is faster, but which algorithm is faster. He said naively written C, which means the algorithm may be entirely different and run in O(n^2), much slower than a Haskell version which uses a different algorithm and runs in O(n log n).

actually in the specific example I'm thinking about, i'm talking about memory locality being the performance difference (and in this case, array layout for matrix multiplication).

The naive, obvious "dot product" matrix multiply of two row-major matrices is 100-1000x slower than somewhat fancier layouts; even simply transposing the right-hand matrix can make a significant difference, let alone fancier things.

Often the biggest throughput bottleneck for CPU-bound algorithms in a numerical setting is the quality of the memory locality (because the CPU can chew through data faster than you can feed it). It's actually really hard to get C / C++ to help you write code with suitably fancy layouts that are easy to use.

Amusingly, I also think most auto-vectorization approaches to SIMD actually miss the best way to use SIMD registers! I've actually got some cute small matrix kernels where, by using the AVX SIMD registers as an "L0" cache, I get a 1.5x perf boost!

This is like replacing the compiler optimizer algorithm with your own, similar to the method of writing critical function in Assembly, right?

Still, I don't see the connection to Haskell. Can you elaborate?

oh, that's just me rambling about why i don't trust compiler autovectorization :)

well: 1) i've been slowly working on a numerical computing / data analysis substrate for haskell for over a year now.

2) the haskell c ffi is probably the nicest c ffi you'll ever see. Also pretty fast, basically just a known jump/funcall! And no marshaling overhead too!

3) there's a lot of current and pending work over the next year to make it easy to write HPC-grade code using haskell. Some approaches involve interesting libs for runtime code gen (the llvm-general lib for haskell is AMAZING).

There's also a number of ways in which ghc haskell will likely get some great performance improvements for numerical code over the next 1-2 years! (i've a few ideas for improving the native code gen / SIMD support over the next year that I hope to experiment with as time permits)

Right, I read 'naively' as 'natively'. Carry on!

to answer your question

just plain ole mathematical modeling / machine learning, and associated duct taping the tubes.

I am also going to be releasing the start of a "Numerical Haskell" tool chain sometime (hopefully this fall, but life happens and all that jazz)

Your problems sound interesting. Could you elaborate?

just plain ole mathematical modeling / machine learning, and associated duct taping the tubes.

I am also going to be releasing the start of a "Numerical Haskell" tool chain sometime (hopefully this fall, but life happens and all that jazz)

On what planet? I'm not even going to use myself as an example, because I do other heavy stuff with my PC; I'm going to use my non-tech friends: one of them got a new laptop with 8GB of RAM. Why? Because she was complaining about webapps using too much memory and slowing down her previous system.

Regular users don't know or care about memory management; they don't even close old windows or tabs. It's about convenience. That's not a problem on mobile, where necessity is the mother of invention, so memory management is automatic and Chrome reopens the tabs you had by itself. But in a desktop environment (especially Windows), one wrong click and the session restore in Chrome wipes your previous session.

But it was cheap, cheaper than an unlocked iPhone, and it gets the job done, so it's OK for her.

> one of them got a new laptop with 8GB of RAM, why? because she was complaining about webapps using too much memory and slowing down her previous system.

I suspect a few people on HN will be reluctant to see their role in this arrangement. ducks

Don't duck, it's the truth.

We built really powerful computers and decided the best way to use them was to run web apps consisting of a poor-performance scripting language with a poor-performance visual-rendering language, and half the people who build with them seem to think that anything done in a web browser is free.

"Modern" CSS for even a simple site is pretty silly. So many horribly inefficient rules, sometimes dozens of levels of overwriting properties, etc. And gobs upon gobs of Javascript that does very little but is constantly running, checking input boxes, animating every tiny little detail, doing things that can be done without Javascript, etc.

I have better system performance when running a Windows 7 VM, a semi-bulky image editor, or compiling the kernel than I do with a few bulky webapps in Firefox. And this is on a desktop built just 4 years ago.

I have a light-weight Linux setup that uses 80MB of RAM after logging in. It has 1GB of RAM. I can't run GMail and Facebook in Firefox and browse the Internet at the same time, even with minimal open tabs. It's sad.

It's not like it's constrained to the web either. I had a bit of an epiphany about the state of desktop Windows a few years ago when for a hobby project I wrote a UI using only Win32 and C and no external dependencies. Maybe it took longer to write than it should have but I was really shocked at how the thing ran so much faster and smoother than just about any UI I was using on that machine. It dawned on me that all those layers of MFC, WinForms, WPF, Silverlight, WinRT and whatever else they'll come up with in the name of developer productivity are no match for a set of APIs designed to run at 25MHz with 8MB of RAM.

This is believable, but I'd like to dig a little deeper. Do you remember which Windows apps you were comparing yours against? And on what version of Windows?

This was Windows 7, I am pretty sure with dwm disabled, and it was comparing (albeit informally) against every Windows app I used at the time.

I know that some newer frameworks take advantage of GPU features (actually I used to work as a dev in the Windows org) but guess what, GDI is still faster.

This is why I still use Winamp, nothing else comes close to dealing with playlists of thousands of items while also being fairly customisable.

Let us also not forget its power to literally punish ungulates as well.

I can pop-up Activity Monitor any time in the day, with multiple apps open, and it's guaranteed the app using the most memory is a browser (any browser).

As a data point: Safari is using 600+MB now with just HN and GitHub open. It's using more memory than all the other apps (editor, terminal, mail, Rdio, Skype) combined. 600MB is not much by today's standards, but comparatively, is ridiculously wasteful. It's a damn CD.

Hope the Mavericks update improves this a bit since I'm short on RAM on this machine (4GB).
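For anyone who wants to check this outside Activity Monitor, here's a rough Python sketch that ranks processes by resident memory by shelling out to `ps`. It assumes the BSD-style `axo rss=,comm=` flags (which work on both OS X and Linux) and that RSS is reported in KB; function name and structure are my own, purely illustrative.

```python
import subprocess

def top_memory_processes(n=5):
    """Return the n processes using the most resident memory, as (MB, name) pairs."""
    out = subprocess.check_output(["ps", "axo", "rss=,comm="], text=True)
    procs = []
    for line in out.splitlines():
        parts = line.split(None, 1)  # RSS column, then command (which may contain spaces)
        if len(parts) == 2 and parts[0].isdigit():
            procs.append((int(parts[0]) / 1024.0, parts[1]))  # KB -> MB
    return sorted(procs, reverse=True)[:n]

for mb, name in top_memory_processes():
    print(f"{mb:8.1f} MB  {name}")
```

On a typical desktop, the browser (or one of its renderer processes) usually tops the list, matching the observation above.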

Mavericks does amazing things to memory usage. My machine has 4GB as well, and Safari went from 1GB+ memory usage to 200MB for the same number of tabs. Under ML, I was swapping constantly; now, no swap at all.

Did you count up all of the individual processes? WebKit is now finally on par with Chrome in that each tab is its own running process.

Did update today, and it's certainly much snappier, it rarely hits the disk now.

I have 16GB, and usually have dozens of windows/tabs open in Safari, Xcode, TextMate 2, and a few miscellaneous apps. Currently, Activity Monitor shows 15.9x GB used, 10.87 App, 1.26 File cache, 1.91 Wired, and 1.94 Compressed. Swap used 0 bytes, virtual memory 20.12 GB. No noticeable hesitations attributable to compressing/uncompressing when switching windows/apps.

4GB is plenty of RAM for a web browser with many tabs, especially with an SSD for backing virtual memory. If it isn't there is a memory leak, and if there is a leak, 8GB is as bad as 12GB.

My MBA has 4GB and I constantly forget, because it is enough, and I am constantly surprised that it doesn't have 8GB.

All modern web browsers save and restore sessions across restarts.

> 4GB is plenty of RAM for a web browser with many tabs,

I always keep Firefox up to date, so hopefully the memory leaks are minimal. I reboot my Linux machine every month-ish and don't exit Firefox unless necessary. I always keep a few "app" tabs pinned; only a couple of them are heavy. I have 4GB of RAM and turned swap off for an experiment. Sure enough, after a couple weeks of browsing and keeping about 25 tabs open, Firefox's memory consumption would creep up and up and eventually it would crash.

I can only guess/hope that there are memory leaks involved.

Re: MBA/MBP - with Mavericks 4GB is doable given how it aggressively manages memory: http://arstechnica.com/apple/2013/10/os-x-10-9/17/#compresse...

Regular users don't know or care about memory management; they don't even close old windows or tabs. It's about convenience.

Why should they? If a tab's "dormant", the browser can just quiesce JS timeouts and let the tab's memory get swapped out. Not a problem anymore.

That most (all?) web browsers don't do this currently isn't the user's fault.
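To make the quiescing idea concrete, here's a toy Python sketch of a scheduler that holds a dormant tab's timer callbacks instead of firing them. Everything here is hypothetical (real browsers don't expose this structure); it only illustrates the mechanism being described.

```python
import heapq
import itertools

class TabScheduler:
    """Toy model of per-tab timers that a browser could quiesce for dormant tabs."""

    def __init__(self):
        self.now = 0.0
        self._timers = []              # heap of (fire_time, seq, tab, callback)
        self._seq = itertools.count()  # tie-breaker so heapq never compares callbacks
        self.dormant = set()           # tabs whose timers should not fire

    def set_timeout(self, tab, delay, callback):
        heapq.heappush(self._timers, (self.now + delay, next(self._seq), tab, callback))

    def run_until(self, t):
        """Advance the clock to t, firing due timers for active tabs only."""
        deferred, fired = [], []
        while self._timers and self._timers[0][0] <= t:
            entry = heapq.heappop(self._timers)
            _, _, tab, callback = entry
            if tab in self.dormant:
                deferred.append(entry)  # quiesced: hold until the tab is foregrounded
            else:
                callback()
                fired.append(tab)
        for entry in deferred:
            heapq.heappush(self._timers, entry)
        self.now = t
        return fired
```

Waking a tab is then just removing it from `dormant`; its held timers fire on the next pass, which is roughly the behavior the parent comment is asking browsers to adopt.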

That's an interesting idea, but I'm not sure any browser is designed to be able to fully restore the state of a tab that has been swapped out.

Do you even know what swapping is? The whole idea of swapping is that the OS pages out memory without the application being aware of what's happening!

A bit over a year ago RAM was cheap enough you could just stock up. I have 16GB of RAM and paid 80€ or something for it. I keep stuff in tabs or windows now I used to stick in bookmarks. Right now I have Visual Studio open after work as I want to do a deployment in ~3h. No need to close and re-open the rather big project later.

Very true, but the vast majority of computer users aren't pushing the limits of their systems, and I think that's what the author is getting at. If you look at the market as a whole, the need for more powerful computers isn't nearly as big as it used to be.

And those users are overwhelmingly the ones that tablets suit just fine.

That's the problem in this type of "PCs are fine, people just don't have to upgrade" argument. (And I've made it myself, before I really worked it through and took stock of just how far tablets have already come.)

The use cases satisfied by an old PC could all be satisfied just as well by today's tablets. Yes, the software support for keyboards, external displays and server-type services isn't quite there. But that's solely a software limitation. Not a limitation of the hardware platform. It will be addressed and more PC sales will disappear.

And the actual remaining PC-justifying use cases are precisely those where an old core2duo isn't "good enough" and simply buying some more RAM or even an SSD isn't going to obviate the need for new hardware for 5+ years.

So inasmuch as people still need PCs, they have to show up in the sales charts. And inasmuch as we don't, mobile devices are going to eat that market as the software evolves.

There's really no way around it.

Frankly, I think Android and/or iOS are one good upgrade cycle away from decimating laptop sales. If either or both really focus on the 'docked' use case -- better support for external displays and keyboard support to facilitate top-flight bulk text entry/editing -- laptops are dead to the general public.

If a student can so much as use a keyboard case and tablet to write a term paper or code up some homework as easily as on a laptop, it's over. And there's no hardware preventing that. It's software. If Microsoft wasn't so inept, they'd have been way out ahead on this.

And, frankly, between the potential of the docked use case and "BYOD" policies, mobile devices could seriously eat into the enterprise desktop market as well.

You build a device cradle that connects to an external display, power and a keyboard/trackpad, and enables the OS and apps to automatically sense that connection and shift their interfaces accordingly, and you'll see how few people truly need a PC anymore.

I completely agree. I wasn't trying to say that PCs aren't dying, I was just trying to contend the notion that power users make up a significant portion of the market we're talking about. In fact, I think tablets directly show that the vast majority of consumers don't need powerful hardware, because tablets aren't very powerful machines at all.

Oh I was absolutely agreeing with you and just expanding and making explicit how this totally contradicted the article's argument.

I love the idea of device docks and I would very much like to see things go that way.

I think docks have some hurdles to jump in the consumer market, though. They're unsexy - I think most people associate them with boring enterprise computers and boring enterprise work, and they tend to be big and ugly. They're expensive too, considering that most people view them as nothing but plastic with a few connectors and wires inside.

If someone can figure out how to jump those hurdles, though, and make docks sexy to the consumer market, beige-box sales will plummet. Make them smaller, easier, cheaper. Make them less proprietary - at the very least, a Company XYZ consumer dock should be compatible with all or most of Company XYZ's portable offerings. Make them wireless and automatic (OK, I realize this conflicts with "cheaper")! Let me drop my Win8 tablet on my desk and immediately have it display to my dual monitors and let me use it with my mouse, keyboard, printer, big external drive and wired network connection. Let me press a button and have it display to my TV or play to my big speakers. Have your cake and eat it too.

I mention Win8 specifically because the whole idea of it is that it's a tablet and a desktop OS in one. Why on earth is there no Microsoft-produced drop-in docking solution for the Surface so it can actually be both?! The consumer potential is crazy - got a crusty old desktop PC laying around? Toss it, keep the peripherals, buy a Surface+dock and you've got a new tablet/laptop and desktop. You can "dock" a Surface as-is with two cables (a micro HDMI and a USB hub with all your desktop peripherals wired in), but that leaves out wired network (maybe important), certain peripherals like speakers (maybe important), and power (critical), and even two cables is two too many.

Most people who say they need a PC don't need a beige box, they need a keyboard, mouse and monitors on a desk where they can work. The form factor of the box that the CPU, disk and memory come in doesn't matter when you're at the desk, so it might as well take the form of something you can take with you and use when you're not at the desk: a laptop, tablet or phone.

Docks will take off when you can make them wireless.

My tablet can connect to my wifi when I come home and sit at my desk. There's no reason at all why it can't connect to my mouse+keyboard at the same moment; so someone needs to solve the technical problems of it being able to connect to my large monitor wirelessly, and we're set.

None of those use cases are going to grow the PC market. The things you're describing have always been a niche (remember Workstation Class PCs?) that may add a few $$ to the bottom line, but they are not going to drive growth.

The PC market has relied on end users - consumers and business users - for its growth engine for decades, and that appears to be drying up. One of the reasons for that is outlined in the article: for most use cases, we don't need faster machines.

Indeed. I work with RAW photographs fairly often, and simply exporting an album with a few hundred RAW files to JPEG takes a surprising amount of time with fast, modern hardware.

Nothing is ever good enough for a development box when you use an IDE and work with larger and larger projects.

Faster CPUs, more memory, and faster storage are always welcome. I look forward to the day when Eclipse and other IDEs really start taking advantage of GPU stream processors for indexing and validation.

People snack on smartphones, dine on tablets, and cook on PCs.

A lot of people don't want to cook, so are happy with smartphones and tablets.

Why buy a desktop or laptop when an iPad will do everything you need for a fraction of the price? That's what people mean when they sound the death knell for the PC.

Why cook when you can eat chips and order pizza? Probably because it's better for you and because cooking has cultural significance that goes beyond simply replenishing calories.

People who cheerfully proclaim that PCs are dead forget that PCs aren't just devices; they also attained a certain level of cultural significance. If the death of PCs also means the death of PC culture (which involves things like game modding, hobby website making and so on), then the death of PCs is a really, really bad thing.

Division of labor. Not everyone has to be a producer in every sphere; it's okay to be a producer in some, and a consumer in others.

Plenty of people are too busy with other aspects of their lives — doing things which may, for all we know, be of great cultural significance — to spend time being a producer in the digital sphere as well.

Some people devote their lives to cooking for others; others devote their lives to other pursuits, and only ever consume food produced by others.

That's where the analogy falls apart: people who only eat out are a small percentage of the overall population, whereas people who only need a tablet are purportedly the norm.

The numbers don't have to be the same for the analogy to work.

In any case, I think you're underestimating the number of people who never cook, or cook very infrequently. In your average family household, one person typically cooks the vast majority of the meals.

To clarify, by cooking I mean real cooking — beginning with raw ingredients, going through numerous stages of preparation requiring some degree of skill, etc.

It’s awesome to do cool, fulfilling things and for some of those things you need a computer. For others you don’t.

I really don’t see why everyone should use a computer, given the wealth of possibilities out there.

Also, I’m pretty sure the death of the PC doesn’t mean any of what you are insinuating. It will be more like the death of horse riding after the advent of the car. (If I want to go horse riding there is a club that offers that not five minutes from where I’m living. Horse riding is dead – but that doesn’t mean it’s impossible or even hard to go horse riding today.) Only that PCs will probably be an order of magnitude more relevant than horses are today and, while not always as relevant as in the past in certain contexts (at home), they will still be relevant in others (some work, academia, …).

I’m still pretty confident in the prediction that the PC at home will die. (Which will not mean that no one will have one at home. Just far less people than today.) I’m also pretty sure that the PC at work and in academia will not die.

Sounds like a great thing for job security in two decades when almost nobody but people born during the PC era know how to program.

That stuff is never going away. "Death of PCs" doesn't mean the complete disappearance of them, just their death as a dominant consumer item. Professionals and prosumers will never stop needing PCs, and they're the ones who constitute the groups you mentioned.

You took the cooking analogy too far. Programming a hello-world app isn't any better for you than writing the Great American Novel.

It is if the future of humans on Earth is predominated by computer programming skills, even if basic.

A fraction of the price? Not hardly.

A 64GB model of the iPad costs $700 (because 48GB of storage should cost $200 to pad those juicy margins).

I bought an amazing desktop from HP last year on a black friday sale for $779. For what's in it, you couldn't have assembled it from Newegg at that price.

In another generation or two the typical Chromebook will be superior to the iPad on performance, while being half the price.

You should buy a desktop or laptop because you get drastically more computing power at the same price.

It's as if the PC is some sort of professional tool, like a truck.

Production Computer?

I think it's more like a set of tubes.

Nicely said. I don't think we'll see the full conversion for a few years, but this is more or less how I think of it.

As for me, I'd much rather have my personal chauffeur carry around my full kitchen and always fresh ingredients so I can eat in luxury any time I wish.

Thank goodness for tablets with full XWindows support to my desktop and the university supercomputer. I like broken metaphors.

Does a tablet really constitute a full kitchen, with no compromise?

To me a tablet is a cramped working space (small screen, limited memory), difficult to use (requiring additional utensils like a keyboard, mouse to make certain tasks bearable), with limited storage space (no cupboards), limited processing power (more like a camp stove than an oven), and only usable in short bursts.

Edit: In any case, the analogy is as much about the time, effort and skill required to cook as it is about having the relevant equipment at your disposal.

Attribution for that analogy should be given to Citrix http://blogs.citrix.com/2012/01/09/do-ultrabooks-mean-ultra-...

Great analogy.

Very well put!

The PC market isn't dead, but then again, the Mainframe market isn't dead either.

The Post-PC devices[1] (tablets / smartphones) are it for the majority of folks from here on out. They are easier to own since the upgrade path is heading to buy new device and type in my password to have all my stuff load on it. If I want to watch something on the big screen, I just put a device on my TV. Need to type, add a keyboard.

The scary part of all this is that some of the culture of the post-PC devices is infecting the PC. We see the restrictions on Windows 8.x with the WinRT framework (both x86/ARM), the requirements on all ARM machines, and Secure Boot. We see OS X 10.8+ with Gatekeeper, sandboxing, and App Store requirements with iCloud.

The PC culture was defined by hobbyists before the consumers came. The post-PC world is defined by security over flexibility. Honestly, 99% of the folks are happier this way. They want their stuff to work and not be a worry, and if getting rid of the hobbyist does that then fine. PC security is still a joke and viruses are still a daily part of life even if switching the OS would mitigate some of the problems.

I truly wish someone was set to keep building something for the hobbyist[2], but I am a bit scared at the prospects.

1) Yes, I'm one of those that mark the post-PC devices as starting with the iPhone in 2007. It brought the parts we see together: tactile UI, communications, PC-like web browsing, and ecosystem (having inherited the iPods).

2) I sometimes wonder what the world would be like if the HP-16c had kept evolving.

> I truly wish someone was set to keep building something for the hobbyist

I really don't understand your concern.

Hobbyists have a wider selection of computing tools than ever before (although that statement has been true at any time since the 50's). We have the entire Arduino ecosystem for hardware hobbyists, throwaway PCs like the Raspberry Pi for embedding real computers everywhere, several different standards of desktop-capable parts for more powerful systems, and the server ecosystem for the real beefy ones.

Most of those computer types aren't even able to run Windows or OS X. iCloud and Secure Boot won't make them go away.

> Hobbyists have a wider selection of computing tools than ever before

I don't think that's quite true. We had Heathkits and a lot more variety of computers from the late 70's to the early 90's. There is no under-$200 computer sold at major retailers like there was in the 80's.

Hobbyism is not trendy* nowadays. There is nothing aimed at hobbyists for sale at any big retailer. It's not a problem with computers.

* Well, there is a perfectly rational reason for that, and it is not really a problem for hobbyists. But that's the fact.

It's a big problem for the starting hobbyist. A kid will probably receive an iPad rather than something to start them on their way to being a programmer or EE.


There are LOTS of under-$570 computers sold at major retailers today.

I don't consider inflation a valid excuse when we continually hear that computers get cheaper every year and the Mac mini is selling at the Apple II's old price point. The PC industry abandoned the sub-$200 market because of Windows licensing, Intel, and Apple taking the old Apple II price as a floor.

$570 is a lot farther out of reach today for many than $200 was in the 80's.
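For context, the $570 figure in this thread lines up with a plain CPI adjustment of $200 from 1980. A quick sketch, using assumed BLS CPI-U annual averages (the exact index values are my approximations, not from the thread):

```python
# Assumed BLS CPI-U annual averages (1982-84 = 100); rough figures for illustration.
CPI = {1980: 82.4, 1985: 107.6, 2013: 233.0}

def adjust(amount, from_year, to_year):
    """Scale a dollar amount from one year's price level to another's."""
    return amount * CPI[to_year] / CPI[from_year]

print(round(adjust(200, 1980, 2013)))  # ~566, i.e. a 1980 $200 is roughly $570 today
```

Whether equal purchasing power means equal affordability (given wage changes since then) is exactly the point being disputed here.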

Computers DO get cheaper every year. That Core 2 Duo machine you bought in 2006 might have cost $2000, but those components (if you can even find a place to buy them) would likely cost about $300 today.

It's unfair to say that they don't get cheaper considering that a Mac Mini is an entirely different class of machine from an Apple II, in so many ways that it's ridiculous to even try to list them here.

As for your statement about $570 vs $200... what do you base that on?

No, computers in the same price range are more powerful. The price range for computers under $200 has disappeared. The Mac mini and Apple II inhabit the same price range. You can get more power in the same price range next year but that same power never trickles down to the price ranges that disappeared in the 90's.

I built a dev/gaming machine back in early 2010. It's stout, but not a ridiculously expensive (~$1,000) behemoth. The only thing I've done since then is toss some more RAM in so I could have two sets of triple channel DDR3 instead of one. I can still run just about any modern AAA game at the highest settings.

The only time I felt like I've needed an upgrade is while playing Planetside 2, which is/was very CPU bound for my setup. However, when it was initially released, Planetside 2 ran like a three-legged dog even on some higher end rigs. It's much better after a few rounds of optimizations by the developers, with more scheduled for the next month or two.

I dual-boot Linux on the same machine for my day job, 5 days a week all year. For this purpose it has actually been getting faster with time, as the environment I run matures and gets optimized.

As good as it is now, I remember struggling to keep up with a two year old machine in 2003.

> I can still run just about any modern AAA game at the highest settings.

AAA games mostly target the console. Look at GTA5, which isn't even out on the PC. Most AAA games will run on a PS3, which came out in 2006, and has 512MB of RAM (combined system / graphics).

That said, there's a point of diminishing returns - making games look much more realistic will take obscene amounts of resources.

I expect there to be a system requirements explosion for PC games now that the new consoles are right around the corner and AAA game developers can finally target much higher specs.

An interesting opposite is the previously mentioned Planetside 2. It was initially released for PC only, and was a resource hog. Now they're working on bringing it to the PS4, so they're having to do an extremely aggressive round of optimizations to make it run decently. The optimizations will get a lot of testing on the PC version (think it's hitting the test server next week). PC players will benefit as a result of the console port.

Planetside 2 is a weird example, though. I don't think there will be many games that have a 1-year+ delay from landing on PC before they hit consoles.

GTA4 was actually a sore spot for PC gamers. The game was not optimized, so it ran like crap if you didn't have a higher-end set-up.

~$1,000 sounds quite cheap for a gaming machine that can still run any modern game at the highest settings. I remember I built mine at about the same time (2010) for $2,500 - top-notch video card, fastest CPU, lots of RAM - but it's 2013 now and I cannot say that it runs any modern game at the highest settings.

It was surprisingly easy. I'm not saying you overpaid, but for $2,500 I could have built something pretty ridiculous. Most of my money went into the processor and GPU, which are typically your two big-ticket items.

I trawled around Newegg looking for upper-middle-tier components with a higher quantity of good reviews. A lot of the time you won't see much of the recently released stuff with useful reviews, so some of the parts were actually circa 2009-ish instead of the latest and greatest (2010).

I didn't splurge for a super expensive case, and my power supply wasn't modular (making it pretty cheap). i7 with a decent mobo. Went AMD for the GPU since (at the time) they were the best bang for buck. Got some cheaper G.SKILL 1600mhz DDR3 RAM (which has worked awesomely for me) for next to nothing, and I was ready to roll.

Post your specs to give your post some legitimacy… I hope you didn't skimp out on your motherboard…

2010 $1000 rig running all 2013 AAA titles at the highest settings sounds unlikely to me…

Not the parent poster, but I also bought a computer in 2010 for about $1000 and am very happy with it today. I still play games at max or near max settings with no problem. This does not include a monitor though; also I only bought a 1650x1050 monitor, so I save a bit on graphics power by not having the full HD pixel count (or more).

EDIT: don't know my full specs offhand; i5, gtx 460, either 4 or 8 gigs RAM (4 I think), a 120gb SSD and 2G HD

My i7-875K (overclocked, but not exhaustively; my Gigabyte motherboard has an auto-overclock function that raised the multiplier without killing stability) with 16GB of RAM, entirely too many SSDs for any one person, and a Radeon HD6870 is capable of playing every new game I've bought this year (Dishonored, Saints Row IV--I don't buy games from EA so the new title list is pretty short right now) at max settings at no worse than 35FPS--I personally can't detect a difference between 30FPS and 60FPS, so I don't care.

Neither Dishonored nor Saints Row IV is a particularly intensive game. I get in excess of 100FPS in both and my cards (GTX 670s in SLI) barely hit 30-35% usage…

I don't doubt the OPs rig is powerful enough to play modern games, but I seriously question the "any modern game at highest settings" statement.

FWIW, I built a rig a great rig in 2010 which I still use from time to time. i7 920, 12GB DDR3, SSD, SLI GTX 460 2GB.

More like impossible. The Internet has many PC-building communities and none of them could do it. Hell, an i7 and a good GPU at the time easily brought you over $600, which leaves very little money for a motherboard that can keep up.

Well now you should just post some core specs so we can test your claim. Exact CPU, GPU, and RAM amount?

Looking at the Tom's Hardware system builder challenges from a year or two ago, and then comparing them to the $500 and $1000 machines in their current System Builder Marathons, should showcase the tradeoffs in performance vs. price more starkly.

You overpaid. You're better off spending $800 and upgrading 3 times as often.

Particularly with GPUs. I wouldn't touch something like Nvidia's Titan unless I just had gobs of money lying around.

Well yeah, something like the Titan has no price/performance at all. Usually the best value is around $200, i.e., an R7 270X or a 760. And if you want a premium card, the 280X/770 are both devouring modern titles. Anything past that is "I want the best there is right now, screw the cost".

Just picked up an r9 280x ERRRRRRRR... Asus CUII 7970 from Fry's for $279

Happily ran BF4 beta on High/Ultra 2x MSAA at 1080P.

That's not the point. He said he can run any modern game on high settings on the computer he bought 3 years ago, I doubt you can play any game in 2016 on high settings if you buy a computer for $800 today.

I can do the same, also with a $1000 computer bought in 2010.

Why does no-one ever factor in the costs of transferring systems in this upgrade process? It's not trivial to set up a new system and bring your stuff over.

In terms of gaming performance, when you go from mid range machine to high end machine, you are often spending 2-4 times as much without 2-4 times the performance. You hit diminishing returns big time.

I go with generally mid range components with my gaming machine. Even then, I upgrade every 3 generations for CPU and every other generation for video card. CPU performance doesn't impact gaming as much as it used to.

This gives me reasonable performance in most games around high to ultra on a 1920x1080 monitor.

Bingo. And there's now some pretty fierce competition between Nvidia and AMD again, so you can often catch some pretty good deals on a GPU or a combo if you pay attention.

I still use an AMD processor on my gaming/development rig, because upgrading to a better Intel CPU would have cost me nearly twice as much (I already had an AMD motherboard, so that helped). I don't notice a bottleneck in most games.

I think the best strategy is to get mid-range components and just upgrade more often. You get way more bang for your buck, and you can always sell the old components on eBay or whatever.

IMO, you don't need the fastest CPU for gaming. I saved a ton of money by getting an AMD FX-8350 (I already had an AMD motherboard on hand, so that helped too), and although it does bottleneck in really CPU-intensive games, I can run almost all games on the highest settings at 2560x1440. The graphics card is really the only component that you need to splurge on for a gaming rig these days, because it's the bottleneck for the vast majority of games.

2010? If you got an i5-2500K and a 580, you'd probably get a PC that came in around $1200-$1300. If you could reuse components like cases, hard drives, PSUs etc. you could easily get that in under $1000. That should still run 90% of modern games maxed out at 1080p, with the exceptions being pretty much just Metro 2033 and Planetside 2.

If you're doing multi-monitor or >1080p resolutions, then you might need to get something better than the 580 however.

I'm running a 2500K at 4.3-ish with a CM 912+ in push/pull and happily devour pretty much anything. I don't foresee a CPU upgrade anytime soon. I had a pair of 6950s (the older 1GB ones) and just put in a 7970 and another 4GB of RAM for less than 400 bucks, and only did that because BF4 is a resource monster. Everything else was fine.

I bought a ~$1000 setup in 2009 and it still runs most games acceptably (not necessarily at the highest settings). It was a Dell Boxing Day deal: a $500 computer + $350 video card (GTX 260) + a changed case and PSU.

Maybe the OP doesn't play some of the most demanding games?

> Maybe the OP doesn't play some of the most demanding games?

I play some pretty demanding games, but I was able to get a lot of performance per dollar building custom. I knew where it was OK to spend more/less and did a lot of research/shopping around.

Modern games are gated by GPU, not CPU. Except a rare one like Dwarf Fortress.

It's not even GPU today. RAM is pretty much the bottleneck for everything. CPUs have been so much faster than RAM for so long now that they have to keep doing more and more tricks just to keep utilization up at all. With gaming, it's always texture memory and transferring from RAM to the video card that's expensive.

I'm looking forward to the first CPUs or GPUs that are double-layer with memory integrated on the second layer...

Not true. ARMA or BF3/4 multiplayer on large maps can choke your CPU with ease.

I built mine in early 2007 for around $2000 (+-$300, can't remember exactly) and it's just now really starting to show its old age. It could use more RAM and an SSD (maybe), but for 99% of what I do, including gaming, it's plenty fast enough. I can't run the very absolute latest AAA games on highest, but if I turn the resolution down a hair or turn off antialiasing they run fine.

In fact the only thing I really want a faster machine for is some of the latest emulation techniques (Higan) and a vague desire to play around with some virtualization odds and ends.

I guess this will change for a little while as updated consoles come out and games can improve their graphics as a result, and also when 4k monitors start coming out. But yeah, until those two things come into play, older computers still play games just fine.

Just for the hell of it I checked out gaming at 4K (http://www.tomshardware.com/reviews/pq321q-4k-gaming,3620.ht...) and note that it takes Titans in SLI (at $1,000 each) to get good framerates on many modern games.

Like another poster said, with "game" terms replacing "data": "But, if you can run a [game] and it's [good framerate], it just means you should really be doing better [gaming] over more [pixels]."

Don't worry, PC manufacturers are currently selling machines that are already obsolete.

My dad went to Walmart and bought a computer and monitor for $399 (why he didn't ask me for advice, or for one of my spare/old machines, I don't know).

It's an HP powered by a AMD E1-1500. It's awfully slow. Chokes on YouTube half the time. My dad is new to the online experience, so he basically uses it for watching streaming content.

I could have grabbed him a $99 Athlon X4 or C2D on craigslist and it would be better than this thing. I'm not sure if he'll ever experience a faster computer, so I don't think he'll ever get frustrated with this machine, but it's amazing that they sell an utter piece of shit like this as a new machine.

> AMD E1-1500
> Bought a computer _and monitor_

Did he buy a notebook? I've never even heard of the AMD E1-1500 before today. Everything I see says that it's a notebook processor, and a pretty terrible one at that (2 cores/2 threads, and only 512KB of L2 cache!?).

What's worse is that it's a BGA package, meaning it can never be upgraded. If that's really a desktop machine (and not some form of "all in one") that's vendor lock-in at its absolute worst. They've ensured that instead of buying a $99 processor he has to go out and buy a brand new machine.


That's just awful. In 2013 you shouldn't be able to buy a computer that can't stream 1080p movies with ease.

It's a desktop. I've browsed the Sunday ads for Best Buy and Walmart and it's common. Example:


The future is distributed unevenly. Some retail outlets are still in the past.

A tablet is a PC, especially as x86 processors start taking over from ARM processors.

Just because it doesn't sit in a big box doesn't mean it's a different class of system. The real difference is the openness of the platform - compare something like iOS to Windows 8 Pro.

That said, many tablets are basically what we would have thought of as PCs before. Consider something like the Samsung 500T or similar, or the ThinkPad Helix. Components are small and cheap enough that they can be packed behind the LCD, and you have essentially a laptop that doesn't need its keyboard.

Will iPads take over PCs? No. They are too limited, not because of hardware, but because of OS limitations. Will tablets take their place though? Quite possibly. The portability is quite handy. That I can dock a tablet with a keyboard and have a normal PC experience, but have it portable when I need it is a selling feature.

The obvious caveat is that a limited OS is fine as long as the majority of data is cloud based. In that case even development can be done on a closed platform, and the tablet becomes something more akin to a monitor or keyboard. More of a peripheral than a computing device. We might get to that point, but that's not the cause of the current trend.

A tablet is a PC only when attached to a full-sized keyboard and a full-sized screen.

Input and output is the major differentiator, not the processor or OS.

If everyone adopted the attitude of the author of this blog, all innovation everywhere in the world would cease instantly because, for most of us in the developed world, everything is good enough already. There are many points throughout computing history at which existing hardware was overkill for the things that we were asking our computers to do. Had we stopped innovating because of that, the world wouldn't be anywhere near where it is today.

In high school I recall lusting after a $4,500 486DX2 66Mhz machine with an astounding 16MB (not GB) of RAM, and a 250MB hard drive. A few months ago I spent a little less than that on a laptop with 2,000X that amount of RAM, 8,000X that amount of hard drive space, and a processor that would have not so long ago been considered a supercomputer.
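Back-of-the-envelope check of those multipliers (the 32 GB / 2 TB figures for the new laptop are my assumption; the comment only gives the growth factors):

```python
# Rough growth factors for the 486-to-modern-laptop comparison above.
# Assumes the newer laptop has 32 GB RAM and a 2 TB drive (not stated
# in the comment; inferred from the ~2,000x and ~8,000x figures).
ram_1993_mb = 16
disk_1993_mb = 250

ram_now_mb = 32 * 1024          # 32 GB
disk_now_mb = 2 * 1024 * 1024   # 2 TB

ram_factor = ram_now_mb / ram_1993_mb
disk_factor = disk_now_mb / disk_1993_mb

print(f"RAM grew {ram_factor:.0f}x, disk grew {disk_factor:.0f}x")
```

Under those assumptions the multipliers hold up: about 2,000x for RAM and about 8,400x for disk.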

I for one am glad that we have continued to innovate, even when things were good enough.

No, innovation would hardly die; hopefully it would just focus on something important. We will hardly run out of need for innovation any time soon. We rather have a problem of innovative people focusing on the wrong things, because of incentives.


The computer was a great invention. The Internet also is a big enabler. But the latest computers and phones are hardly innovative: Compared to what you got 5 years ago, they might be smaller and have better power efficiency. But on the grand scale, how does that matter?

If you see the latest Macbooks being introduced, you probably want to get one. It's very shiny and the Retina screen will allow you to experience computing in a great way. People try to become happy by spending money on experiences. Tourism, iPhones, hipster coffee shops. They don't do it because it's the universal recipe for happiness and living your life, but because it's what capitalist societies expect you to do. Most "innovation" and "disruption" only leads to zero-sum money shifts inside this system. If you think that a Retina screen is innovative, I think you need to get some perspective.

The same goes for cars. The Germans (where I come from) think that they're innovative because we have a few luxury car makers here. Cars in general are great, they provide mobility and that's useful. But how are the new cars better than what was available 30 years ago? They're not even more fuel efficient.

Bill Gates seems to have got it when he stopped working full-time at Microsoft. All over the world people are using their software, but if we were using OS/2 and Lotus instead of Windows and Office nothing would change. It went very well for him and his company, but nothing they ever did was as important for humankind as what Bill Gates is doing now: giving life to millions by completely eliminating malaria and polio from the planet and supporting AIDS and TB research with huge sums.


> If you see the latest Macbooks being introduced, you probably want to get one. It's very shiny and the Retina screen will allow you to experience computing in a great way.

I do not want to get a Macbook. They are vastly overpriced and underpowered. If I ran a movie studio, I might buy a Mac Pro. But that's about it.

> Cars in general are great, they provide mobility and that's useful. But how are the new cars better than what was available 30 years ago? They're not even more fuel efficient.

Look at traffic fatality numbers over the last 30 years, and tell me that we haven't made innovations that impacted lives. How many kids have parents who were able to raise them because they didn't die in a traffic accident that they would have died in before?

> Compared to what you got 5 years ago, they might be smaller and have better power efficiency. But on the grand scale, how does that matter?

This is insane, wrong, and dishonest. How old are you? I'm 30 and during my adult life (the past decade, basically) we've gone from phones that were essentially unreliable walkie-talkies with shitty battery life to ultra fast portable computers with 10 megabit internet connections and 4 or 5 different onboard technologies (gps, camera, etc).

WTF are you talking about, sir? Do we live on the same planet?

Smartphones and tablets are the new TVs. Of course, hardware is much faster than 10 years ago, but what is it used for? What is the impact of smartphones on humanity? They have changed communication and entertainment patterns quite a lot, and not in a good way I'd argue. Communication is now cheaper and faster than ever, which means that many people don't think for themselves before they tweet or write an email. Also, many people prefer not to think at all and use their phones to distract themselves by consuming a constant stream of meaningless stuff.

By the way, the mobile phones that I had 10 years ago (Sony Ericsson, Siemens) were quite reliable and had good battery life.

I disagree.

I think that if everyone adopted the author's attitude, we get innovation where it is needed most... Which is exactly what is happening: there has been a shift in the focus of processor development over the last ten years or so from "make it as fast as possible and who cares about the power consumption" to a more performance-per-watt oriented approach.

edit: worldsayshi got there first.

Wow, downandout, your post truly has it all:

- Genuine loathing for the blog post OP

- Smarmy "I'm better than you" attitude

- Examples of how you aren't like OP

- Call out to historic reasons why OP's mindset is terrible

- Multiple over-the-top statements ("If everyone adopted the attitude of the author..." is my favorite!)



I was just about to say that about your response. All I said was that if everyone adopted the "good enough" attitude, innovation might have stopped before we ever had computers. You somehow used that to call me a smarmy scumbag. Are you somehow related to the author?

>I for one am glad that we have continued to innovate, even when things were good enough.

Needing more RAM and more HD space (especially in order to offer the same feature set) is not innovation.

It's not that people don't need a new PC because their old PC does just as good a job as it did 5 years ago. It's also not because your average mom and pop are upgrading their own rigs themselves that new PC sales are slow.

It's that when tablets hit the scene, people realized they don't need their PC for 90% of what they do on a "computer". Email, social networking, shopping, music, video etc.

We old geeks who swap hardware, play PC games, tweak OS settings and generally use yesterday's general-purpose PC will be the ones remaining who keep buying new hardware and complete machines.

The general public meanwhile will only buy a PC if their tablet/smartphone/phablet needs expand beyond those platforms.

The market will shrink but it will turn more "pro". The quicker MS evolves into a modern IBM the better.

Replace 90% by 100%. The use case of the PC at home is disappearing fast. What can you do on a PC that you cannot do on a tablet? If you really think about it, not much at all. On the iPad Apple's creativity apps [1] go a long way: the basics for the amateur who wants to create content are covered. At home the only thing I actually need a "PC" for is to import CDs, I cannot think of anything else.

The rest is a matter of taste, I personally prefer PCs (well, Macs) but I can definitely understand people preferring tablets, if only because they are so much cheaper, and generally so much easier to use (to my astonishment my daughter figured out how to use my iPad to watch videos on YouTube while she was still 0 years old, before she could even speak...).

When it comes to work I need a computer. And even that might change in the future [2].

[1] http://www.apple.com/creativity-apps/ios/

[2] http://thebinaryapp.com/

> What can you do on a PC that you cannot do on a tablet?

Easy copy pasting, selecting specific parts of a document... basically anything requiring precision and speed. You can add a keyboard and a mouse to it I suppose, but at that point you may as well have got a laptop.

That's just convenience. Do you really think many people will keep forking out $X00 for PCs just for a little bit of convenience needed only once in a while? I respectfully disagree. A couple of anecdotes, for lack of evidence:

My grandfather (89 years old today) switched to the iPad. He was one of the pioneers of computer usage in France (in the steel industry)... at a time when programs were punched on cards! He retired before the mouse/keyboard thingy became popular, and never quite managed to fully grasp it when he got a computer ten years ago. He is not switching back; the iPad is way easier for him.

My parents went full iPad without knowing it: after getting one a year ago they realized that they just aren't using their laptops anymore. The iPad is just much nicer to browse the internet. In short the iPad is better for them 80% of the time, and worse 20% of the time. It can only get better with time, as we get better at making touch interfaces.

Don't get me wrong, I'm not advocating anything, I just think that's where the market is heading, and it's not coming back to the PC world I (and probably you) grew up in. I for one will keep using my laptop... but I know I'm in the minority.

Well, that "bit of convenience" varies in size a lot over the market. If you intend to do any typing over a long period of time, using a tablet just isn't a good idea ergonomically. There are many other reasons to stick to a PC, but this one seems inherent to the tablet format, and once you add a keyboard and a cradle to position the tablet ergonomically, you've lost every advantage that using it might have had in the first place.

So yes, I'd say that there's a large market of people who would rather shell out $X00 for a little bit of convenience rather than $Y00 for something that will be totally unfit for their application. "it's not coming back to the PC world I grew up in" is a truism, but no indicator of the future market shares of PCs and tablets. Also, is there any basis to the claim that you are in the minority?

> It's that when tablets hit the scene, people realized they don't need their PC for 90% of what they do on a "computer".

That's exactly what I see everywhere. People still need a PC for the 10% of tasks they can't do on a tablet. So they'll keep a good-enough PC around all the time.

Which is a different situation from a few years ago, when people were buying a PC for each family member. Now it's a tablet for each person, a PC for each home, and spare space at the desks.

If 10% of their needs aren't met by a tablet, they need something more that does meet them, which is my problem with recommending a tablet over a PC. Sure, buy a tablet and use it 90% of the time, but own a PC as well for when the tablet isn't enough.

I think it's both. Most people are better served with a new, inexpensive tablet and their current computer than they would be with a new ultrabook.

I think people are pissed off with PCs.

They bought a Windows machine for what to them is a lot of money (more than an iPad), and it didn't last long before it got slow and picked up extra toolbars and all sorts of rubbish. What's worse is that this happened the last time they bought a PC, and the time before, and the time before that. They are not going to add an SSD because that's not how they think + they don't know how + it's throwing good money after bad + they are dubious of the benefits.

The iPad in contrast exceeded expectations, and in the year or two they've had it they've had a better experience. They can't get excited about another Windows machine because it's expensive, more of the same and not worth it really.

I agree that this frustration is likely an appreciable factor in the market. I use a pretty swift laptop for work stuff and need that power there, but lately I've been noticing how much I've gotten spoiled by the relative lack of babysitting that my phone and tablet have required. And I'm not even easy prey for the toolbar and crapware BS you mention like a lot of people who were previously in the PC market. PC makers worked damn hard to bring down the dollar price, but let the price in time+frustration stay high and grow until they got undercut by the overall cost.

Backend devs can probably use more computer resources, particularly cores and RAM. We want to simulate whole clusters on our dev machines and instrument them with tools like Ansible and Docker, and then deploy multiple (fairly heavyweight) processes like JVMs to them. But yeah, 4 (fast) cores and 16GB of RAM is available in a laptop these days, along with an SSD and the best display you can buy, for $3k. (Of course I'm speaking of the MBPr).

Games can always use more resources. AFAIK there is still a lot of progress being made with GPUs. 60fps on a 4K display will be a good benchmark. The funny thing is that GPU makers have taken to literally just renaming and repackaging their old GPUs, e.g. the R9.[1] As for the game itself, there is a looming revolution in gaming when Carmack (or someone equally genius-y) really figures out how to coordinate multiple cores for gaming.[2]

But yeah, most everything else runs fine on machines from 2006 and on, including most development tasks. That's why Intel in particular has been focused more on efficiency than power.

[1] Tom's Hardware R9 review: http://www.tomshardware.com/reviews/radeon-r9-280x-r9-270x-r...

[2] Carmack at QuakeCon talking about functional programming (Haskell!) for games and multi-core issues: https://www.youtube.com/watch?v=1PhArSujR_A&feature=youtu.be...

You really can't get the screen, and the amazing OS support for it, anywhere else.

CPUs have not gotten significantly faster in the last couple years, especially at the high end.

Back in Q1 2010 I got an Intel Core i7 980X which benchmarked at 8911 according to http://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7+X+980+...

Now in Q2 2013 (3 years later) the very top of the line processor available, an Intel Xeon E5-2690 v2, is only about 1.8x as fast at 16164: http://www.cpubenchmark.net/cpu.php?cpu=Intel+Xeon+E5-2690+v...

It used to be that things got faster at a much faster rate. And until this new E5-2690 v2 was released, the fastest CPU was only 14000 or so, which is less than 2x as fast.
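Quick sanity check on those numbers (just the ratios of the benchmark scores quoted above):

```python
# Ratios of the CPU benchmark scores quoted above.
i7_980x_2010 = 8911    # Core i7 980X, Q1 2010
xeon_e5_2013 = 16164   # Xeon E5-2690 v2, Q2 2013
prev_fastest = 14000   # "only 14000 or so" before the E5-2690 v2

speedup_3yr = xeon_e5_2013 / i7_980x_2010
speedup_prev = prev_fastest / i7_980x_2010

print(f"3-year speedup: {speedup_3yr:.2f}x")   # ~1.81x
print(f"Previous best:  {speedup_prev:.2f}x")  # ~1.57x
```

So even taking the scores at face value, three years bought less than a doubling, where earlier generations delivered far more.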

Anandtech has a nice analysis of the progression of CPUs over the last 9 years: http://www.anandtech.com/show/7003/the-haswell-review-intel-...

Depending on what you're doing with CPUs there have been improvements, even at the high end between Nehalem and Haswell (on the Intel side). Anandtech's data shows a nearly 2x improvement between those generations for certain tasks.

But the reality is that for other tasks (e.g. the winrar compression benchmark) the speedup is minimal compared to previous years (see Haswell vs Prescott). In many cases where there is a speedup, performance is already so good that you wouldn't notice.

It's pretty crazy to see that in the consumer world, not even games are demanding so much performance from our CPUs.

CPUs are actually getting slower, once you factor in overclocking.

I have a water-cooled 2600K at 5GHz running 24/7. A bigger GPU and more cores bring additional complexity and more heat. No other CPU can handle this without blowing up.

Does it perform faster in benchmarks than later CPUs overclocked to their lower maximums? Clock speed alone is not a good indicator.

Yes, it does. Newer CPUs have cheaper thermal paste.

You know you're allowed to use your own thermal paste right? Intel won't mind.

If you plan on overclocking your CPU it's probably a good idea to get a slightly better CPU cooler than the standard one anyway.

The cheap thermal paste is under the heat spreader; I would have to crack open the CPU with a razor blade. And even after that operation the 2600K is better.

Can you give an example of a faster rate in the past?

The jump from Pentium 4 to Core2Duo. You could go from a typical Pentium 4 in 2004 to a new C2D in 2006 and get a major performance boost.

The C2D to i7 jump was pretty significant as well.

It was pretty typical to get a 4x performance boost in less than 2 years after an upgrade cycle. The problem is that encryption standards aren't increasing at an exponential rate like CPUs were, so the DoD had to step in and adjust the marketplace. By 2010, it was well known in certain circles that the i7 would be the last serious CPU in the desktop space, and that future CPUs would be targeted at cloud budgets and applications, where they can be managed and hidden away from people who might use the truly modern CPUs you may have heard about, like Tile-x, which the NSA uses to do the very things they were afraid normal people might do: break HTTPS, farm personal data and use it to affect the market, and basically have control over their lives outside of the financial system. Now that the surveillance state is complete, they don't really have to worry about drug dealers breaking away from the financial system anymore.

> You rarely have the need to buy a whole new box.

This is the number one reason why I love the PC above any other kind of computing machine. Need more disk space? Sure, go get a new disk, you may not even need to remove any of the others. Want a better graphics card for that new game? Easy as pie. Your processor died because the fan was malfunctioning? Too bad, but luckily those two are the only things you'll have to pay for. The list goes on.

I bought my current PC in 2009. The previous one still had some components from 2002.

In the enthusiast PC world you swap out parts and upgrade your system bit by bit.

In the emerging post-PC world you just sell your old device and get a new one, not unlike how cars are treated. I doubt anyone has trouble unloading their old iPad 3, they probably get a decent amount of money for it if it's in good condition. This just does not happen with homebrew systems, the risk is too high.

What if one of the reasons we don't need new PCs yet is not that tablets and smartphones are replacing the need for them entirely (although for some people they are), and not that PCs are lasting longer on their own either (although they probably are, too), but that tablets and smartphones are helping PCs last longer by reducing the wear and tear we give them?

I'm still running fine with my 2007 Macbook, but I think my iPhone has extended its life because now my laptop almost never leaves the house and sometimes doesn't even get used in a day, whereas pre-smartphone I used to cart my laptop around rather frequently and use it every day.

I disagree with "The top of the line smart-phone or tablet you own today will be obsolete by the end of 2014 if not earlier."

I will use my 2011 smart phone until it physically breaks. If a 1.2GHz device with a 300MHz GPU, 1280x720 screen, and 1GB of RAM can't make calls and do a decent job of browsing the web, that's a problem with today's software engineering, not with the hardware.

And if Google decides to doom my perfectly good device to planned obsolescence, fuck them, I will put Ubuntu Touch or Firefox OS on it. The day of disposable mobiles is over; we have alternatives now, just like we do on PCs.

> When your processor is too slow, buy a new CPU, or you get a new heat sink and over clock it

The motherboards for PCs built 5 years ago are completely different from those built today, and the CPU sockets have changed every other year. New processors from Intel will be soldered on.

The performance of a PC from five years ago is probably adequate for web browsing and office tasks. For anything more demanding, the advances in power consumption, execution efficiency and process node are huge leaps from five years ago.

> The performance of a PC from five years ago is probably adequate for web browsing and office tasks. For anything more demanding, the advances in power consumption, execution efficiency and process node are huge leaps from five years ago.

In 2000, a three-year-old PC couldn't run modern games. Today, a three-year-old PC simply forces you to switch to lower graphical settings. The race for hogging all the hardware resources did slow down in the past years, which is a good thing for consumers.

> The performance of a PC from five years ago is probably adequate for web browsing and office tasks. For anything more demanding, the advances in power consumption, execution efficiency and process node are huge leaps from five years ago.

Anecdotally, my machine is approaching four years old and I can still run just about every new game on the highest settings. At just under $1,000 at the time, this isn't a luxury yacht type rig, either.

This reminds me of a piece I wrote a couple years ago: http://jseliger.wordpress.com/2011/10/09/desktop-pcs-arent-g... , which makes a similar point. Both articles are less screechy and less likely to get readers than screaming headlines about OMG DEATH!!!

The PC is dead; it's just not dead for computer professionals, and never will be. But for the rest of the world - think mom, dad, gramps, grammy - why on earth do they need the headaches of a full PC (Mac or Windows)? A good tablet is basically enough for almost everyone else.

Anecdotally, I've found that "non-technical users" have trouble adjusting to any new UI, including Apple laptops, phones and tablets.

For them the choice is between keeping an old computer, on which they already know how to send emails and compose documents, and buying a new tablet on which they will need to learn a whole new set of UI idioms. In addition, physical keyboards and larger screens (and no "ipad neck") make laptops and desktop computers more attractive.

Add Windows 8 to that, and oh boy, there's a lot of tears. My aunt's laptop died so I bought her a new one. Every time I talk to her she's complaining about the Windows 8 UI. She still hasn't figured out how to do some tasks she would do before with Windows Vista (which she had no problem with).

I think even for the casual user there are still a lot of use cases in favor of a PC. Basically anything to do with managing large amounts of content like photos or music, or even fairly light content creation like writing long emails, is much easier to do on a computer with a larger screen and mouse/keyboard.

I'll never get this point. Laptop - fine, yes, but a tablet? How do you chat, post FB status, or write an email from the virtual keyboard? It's painful. Especially if you are 50+. I don't see my mom or dad using these devices.

My grandmother is 90 and she does all of those things from her iPad. She's also almost blind and has arthritis, and yet she finds the tablet meets her needs - which mostly involve keeping track of her grandchildren and great-grandchildren via email and FB.

I do all of those things from a phone. It must be a little easier from a tablet.

> The PC is dead

Dell, Lenovo, Toshiba and HP all report sales of laptops are increasing; 3.5% in the last quarter.

Death doesn't sound too bad.


Of course PC sales will be low. When you don't have enough memory, you buy more RAM. When your processor is too slow, buy a new CPU, or you get a new heat sink and over clock it. You rarely have the need to buy a whole new box.

I agree that the increased (functional) life of PCs is a contributing factor to slowing unit sales, but it's laughable to attribute it to the idea that people who once would have bought a new PC are now just buying more RAM and upgrading internals.

The percentage of people who would have any idea how to do that, or even consider it as a viable option, is far too small to have any real impact on demand.

I've been prioritizing human interface over raw power for some time with my laptop (more or less my only PC). It's semi-homebuilt - a Thinkpad T61 in a T60 chassis. I would rather work on this machine than any new one.

The CPU is slow by current standards, but a Core2Duo isn't slower than the low-clock CPUs in many Ultrabooks. The 3-hour battery life could be better, but I can swap batteries and many new laptops can't. The GPU sucks, but I don't play many games anyway. DDR2 is pricey these days, but I already have my 8 GB. SATA2 is slower than SATA3, but I'm still regularly amazed at how much faster my SSD is than spinning rust. It's a little heavy, but really, I can lift six pounds with one finger.

So the bad parts aren't so bad, but nothing new matches the good parts. The screen is IPS, matte, 15" and 1600x1200. Aside from huge monster gaming laptops, nothing has a screen this tall (in inches, not pixels) anymore. I can have two normal-width source files or other text content side by side comfortably. The keyboard is the classic Thinkpad keyboard with 7 rows and what many people find to be the best feel on a laptop. The trackpoint has physical buttons, which are missing from the latest generation of Thinkpads. There's an LED in the screen bezel so I can view papers, credit cards and such that I might copy information from in the dark, also missing from the latest Thinkpads.

This is an over-simplification.

Yes, PCs aren't ageing as fast as they used to.

But they are obsolete for reasons beyond 'not being portable'.

Here is why tablets are winning:

1. Instant on. I can keep my thoughts intact and act on them immediately. No booting, no memory lags, no millions of tabs open in a browser.

2. Focus. Desktop interfaces seem to be desperate to put everything onto one screen. I have a PC and a Mac (both laptops). I prefer the PC to the Mac; better memory management for photoshop and browsing, and I love Snap. But that's where the usefulness stops. With an ipad, I have no distractions on the screen.

3. Bigger isn't better. That includes screens. Steve Jobs was wrong. The iPad Mini is better than the bigger variants. Hands down. Same goes for desktop screens. I want a big TV, because I'm watching with loads of people. I don't need a big screen for a PC because the resolution isn't better than an ipad and I'm using it solo. Google Glass could quite possibly be the next advancement in this theme.

4. Build quality. PCs look and feel cheap. Including my beloved Sony Vaio Z. The ipad in my hand could never be criticised for build quality.

5. Price. The ipad doesn't do more than 10% of what I need to do. But, I do those 10% of things 90% of the time. So why pay more for a PC when the ipad has no performance issues and takes care of me 90% of the time.

I used to think shoehorning a full desktop OS into a tablet is what I wanted. Seeing Surface, I can happily say I was wrong. I don't want to do the 90% of things I do 10% of the time. That's inefficient and frankly boring. PCs and Macs are boring. Tablets are fun. There's one last point why tablets are winning:

6. Always connected. It strikes me as absurd seeing laptops on the trains with dongles sticking out. It takes ages for those dongles to boot up. I used to spend 5-10 minutes of a train journey waiting for the laptop to be ready. My ipad mini with LTE is ever ready. And cheaper. And built better. And more fun.

The PC isn't dead, but it will have next to no investment going forward, so will suffer a mediocre retirement in homes and offices across the world.

Note: I love my PC. I just love my ipad mini more.

You probably own a really nice PC but most people do not. I'd wager that an iPad is at least as expensive as the PC that it is replacing in most homes.

Don't forget your iPad isn't going to be covered in popups or left disabled by ransomware like "PC Spyware Doctor 2014" the way your old Windows PC was.

Haha, the mental image of that has been solidified by years of Windows mediocrity.

What's ironic, though, is OS X out pop-ups Windows 7/8 with annoying prompts and notifications.

Again, tablet OSs just aren't prone to this, so people 'feel' more comfortable using them.

> 1. Instant on. I can keep my thoughts in tact and act on them immediately. No booting, no memory lags, no millions of tabs open in a browser.

Suspend to RAM, 8 GB of RAM and as many tabs open as I like (usually none, but Opera appears to handle 100s quite well, too). You must be using PCs wrong.

> 2. Focus. Desktop interfaces seem to be desperate to put everything onto one screen. I have a PC and a Mac (both laptops). I prefer the PC to the Mac; better memory management for photoshop and browsing, and I love Snap. But that's where the usefulness stops. With an ipad, I have no distractions on the screen.

I also have no distractions on the screen if I don’t want them. You must be using PCs wrong.

> 3. […] The iPad Mini is better than the bigger variants. […] I don't need a big screen for a PC because the resolution isn't better than an ipad and I'm using it solo. Google Glass could quite possibly be the next advancement in this theme.

The iPad mini has a resolution of 1024x768, roughly the same as my X41. If your current PC doesn’t have a higher resolution, you must be using PCs wrong – I expect at least 1200 pixels vertically and 1400 horizontally if I am expected to do any work at this thing. 900 vertically might be enough for casual browsing, but it does feel inferior.

> 4. Build quality. PCs look and feel cheap. Including my beloved Sony Vaio Z. The ipad in my hand could never be criticised for build quality.

I tend to be very wary around Apple hardware, seeing how people put them into protective sleeves, pockets and all other assortments of stuff. I don’t think a tablet would survive the things my T410s survived.

> 5. Price. The ipad doesn't do more than 10% of what I need to do. But, I do those 10% of things 90% of the time. So why pay more for a PC when the ipad has no performance issues and takes care of me 90% of the time.

I won’t argue about performance issues (the iPad lacks the software to even test performance properly…) and at least I pay for a PC because I want 100%, not 90%, of the job done.

> 6. Always connected. It strikes me as absurd seeing laptops on the trains with dongles sticking out. It takes ages for those dongles to boot up. I used to spend 5-10 minutes of a train journey waiting for the laptop to be ready. My ipad mini with LTE is ever ready. And cheaper. And built better. And more fun.

Time from button press to xscreensaver unlock window: 2s. Time to get LTE modem ready: 5s. Time to open 100s of tabs in Opera and decrypt email locally and do all other things I could possibly want to do: nearly instantaneously. With an iPad mini (which would most likely break in my backpack on the way to the train), I couldn't even do half of these things – it admittedly might be cheaper, though, but 1600 € every four-or-so years vs. 480 € seemingly every year are roughly equivalent, wouldn't you say?
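For what it's worth, the cost comparison at the end works out like this (using the figures quoted, and assuming the four-year laptop lifespan and yearly tablet replacement stated above):

```python
# Hypothetical ownership-cost comparison from the comment above:
# a 1600 EUR laptop kept ~4 years vs. a 480 EUR tablet replaced yearly.
laptop_per_year = 1600 / 4
tablet_per_year = 480 / 1

print(f"Laptop: {laptop_per_year:.0f} EUR/year")
print(f"Tablet: {tablet_per_year:.0f} EUR/year")
```

Under those assumptions the "expensive" laptop is actually the cheaper option per year.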

As I said, if you settle with inferior stuff and use PCs as absurdly as described by you, I can see why you like tablets more.

Your point of view is certainly interesting.

As you suggest, I simply must be using PCs wrong. But then, the iPad doesn't need an instruction manual, or to be used perfectly, for me to use it right.

Operating an automobile requires you to have passed a theoretical and practical exam and attended numerous lessons. Bobby-cars[0] are given to kids as young as two-or-so. Which one is preferable to get from A to B?

Of course, you should generally aim to use the tools that suit your needs and capabilities, but preferring a tablet over a proper PC strikes me as rather odd.

Your analogy would make more sense if you added the word 'performance' to the statement.

A car is fairly intuitive to drive. Driving licenses are required for safety.

The comments you make are akin to performance driving in a sports car, where concentration and accuracy are key.

Your car analogy would fit better with PCs as an F1 car and tablets as a Mini. An F1 car is built for speed and requires fine-tuning to get the tyres working right. A Mini you can pretty much drive through all sorts of faults and be none-the-wiser, because performance just isn't an issue. Well, 90% of the time. :)

I think the issue is that the rate of improvement has fallen pretty hard. I remember when Nvidia moved from the 5 series to the 6 series: their new flagship card doubled the performance of any card then on the market. The same thing happened with the 8 series. Processors before multicore showed direct improvements in the speed of the machine, especially if (like the average consumer's) your machine filled up with useless, constantly running crap over time.

These days I just don't see that. Graphics cards seem to improve by 30-50% each generation, and because so many games are tied to consoles now, they often aren't even taking advantage of what's available. With multicore processors and the collapse of the GHz race, there's no easy selling point as far as speed, and much less visible improvement (now all that useless crap can be offloaded to the second core!), and most consumers will never need more than two cores. Crysis felt like the last gasp of the old, engine-focused type of game that made you think "man, I really should upgrade to play this"... and that was released in '07. Without significant and obvious performance improvements, and software to take advantage of them, why bother upgrading?

Five years ago, I bought a MacBook Pro to replace my PowerBook G4, which was itself five years old. The list of obsolescences was enormous: It had only USB 1.1 in a market teeming with new USB 2.0 hardware that couldn't have existed with the slower speeds; it had a single-touch trackpad just as OS X was introducing all sorts of useful multi-touch gestures; it relied on clumsy external solutions for wi-fi and Bluetooth; it had a slow-to-warm CFL LCD that had been supplanted by bright new LED backlit screens; it was even built on a dead-end CPU architecture that Apple had traded for vastly more powerful, energy-efficient, multi-core x86 processors.

Today, the calendar says it's time for me to upgrade again. Yet the pain of obsolescence of a five-year-old laptop in 2013 just isn't the same as in 2008: USB 3.0? What new applications is it enabling? Anything I need Thunderbolt for? Not yet. New Intel architectures and SSDs at least promise less waiting in everyday use... but I'm hardly unproductive with my old machine.

I would just upgrade the RAM and put an SSD in there; you should be good for a while. Dropping an SSD in mine was the single biggest upgrade I've done for productivity and battery life.

I have an old Dell D830 with a 1.8 GHz Core2. I put an SSD in there and it made it feel like a whole new machine. My parents were complaining about their slow computers. I had them buy SSDs.

SSDs help in most cases; however, I once happened onto a netbook (Atom 1.6 GHz, 2 GB RAM) where an SSD did not really help. Slow as molasses in web browsing, Flash video stuttering at 480p. I removed the SSD and sold the netbook immediately.

Quick and dirty guide for having decent PC:

1. Buy a mid-range processor with a lot of L2 cache.
2. Find a mobo that supports lots of RAM and stuff it to the max.
3. An SSD is a must.
4. Buy the second card of the high-tier model, i.e. the cut chip from the most recent architecture (in their time those were the 7950, 570, etc., but with NVIDIA's current branding a total mess, it may require some reading if you are on team green).
5. Any slow hard drive will be enough for torrents.
6. In 2.5 years, upgrade the video card to the same class. In 5 years, if the market is the same, repeat. If it is not, let's hope there are unlocked self-assembled devices on the market.

I have been doing that since 2004 and never had a slow or expensive machine.

Mainly we don't need new ones because the 3-year-old one is still doing the job. That wasn't the case a decade ago: your 3-year-old PC was seriously out of date, couldn't run most games released that year, and probably couldn't install the latest OS release. This rapid progress has flattened out considerably. Now people upgrade to get nice features such as retina displays or SSDs, but that's optional (so you don't do it if you don't have spare money lying around) and the benefit is much smaller than going from a 90 MHz Pentium to a 450 MHz Pentium III.

Agree; even for development, you can compile almost all programs, light or heavy, on a powerful machine built 3 years ago.

Hmm, there seems to be the implication that we've hit some magical end state in hardware development where consumer needs are forever met.

Personally, I think of these hardware market developments with an eye toward interplay with the software market. Historically, software developers had to consider the capabilities of consumer hardware in determining feature scope and user experience. Hardware capabilities served as a restraint on the product, and ignoring them could effectively reduce market size. The effect was two-sided though, with new more demanding software driving consumers to upgrade. Currently, in this model, the hardware stagnation can be interpreted as mutually-reinforcing conditions of software developers not developing to the limit of current hardware to deliver marketable products, and consumers not feeling the need to upgrade. In a sense, the hardware demands of software have stagnated as well.

From this, I wonder if the stagnation is due to a divergence in the difficulty in developing software that can utilize modern computing power in a way that is useful/marketable from that of advancing hardware. Such a divergence can be attributed to a glut of novice programmers that lack experience in large development efforts and the increasing scarcity and exponential demand for experienced developers. Alternatively, the recent increase in the value of design over raw features could inhibit consideration of raw computing power in product innovation. Another explanation could be that changes to the software market brought about by SaaS, indie development, and app store models seem to promote smaller, simpler end-user software products (e.g. web browsers vs office suites).

I wouldn't be surprised if this stagnation is reversed in the future (5+ years from now) from increased software demands. Areas remain for high-powered consumer hardware, including home servers (an area that has been evolving for some time, with untapped potential in media storage, automation and device integration, as well as resolving increasing privacy concerns of consumer SaaS, community mesh networking and resource pooling, etc), virtual reality, and much more sophisticated, intuitive creative products (programming, motion graphics, 3d modeling, video editing, audio composition, all of which I instinctively feel are ripe for disruption).

Wow, big long thread here will take me some time to read but I know what the OP is saying. A few pages not mentioned already...

The mysterious K Mandla gives 10 reasons not to buy a new computer


The TOPLAP project (a real hack: give a teenager an old laptop and Ubuntustudio or similar, light the blue touch paper, retreat). By the way, if anyone has resources for live-coding in puredata, please post here


The Zero Dollar Laptop Project [1] and current progress [2]

[1] http://jaromil.dyne.org/journal/zero_dollar_laptop.html

[2] http://www.furtherfield.org/zerodollarlaptop/

Now, I made a major discovery over the summer: I am actually more productive on a laptop than on a desktop with a large screen. Strange but true, so I am donating the desktops and adopting a couple of Thinkpads off Ebay (X60 from Dec 2006 and X200s from March 2010) as my major computational devices. One with Debian stock and the other with gNewSense 3.0 for a giggle.

I have bought laptops, but not a whole desktop ever in my life. I've been through two desktops, mind you, that were both built from scratch[1].

I think this article gets it about right - I've started enforcing a 3 year cycle for both phone and laptops because they were costing me too much (in a mustachian sort of way) - and I've stuck to it with laptops (I made 3.5 years on a 2009 MBP) and will be doing so with the iPhone (due for replacement spring 2015.) If the nexus devices keep getting cheaper and awesomer, then I might jump to those a bit earlier (particularly if I can sell the 32GB 4S for an appreciable fraction of the new phone cost.)

Working with the 3.5 year old laptop got slightly painful (re-down-grading back to snow leopard from lion was essential, I even tried ubuntu briefly) but perfectly bearable for coding and web browsing. I'll see how slow the phone gets, but I'm quite relaxed about not having the latest and greatest iOS features (I've not seen anything compelling since iOS 5; I only did 6 because some new app requested it.)

[1] or rather, one was, and then I gradually replaced all the parts until I had a whole spare PC to sell on ebay, and one mobo bundle later and I'm still using it with no problems, playing games etc.

I guess it depends who "we" are. The average person doesn't need a new computer, I agree. My mom is still using my old self-assembled desktop from 2008-ish and is perfectly happy with it.

However for those of us that use our computers 8 hours+ every day, I think it makes good sense to upgrade to the newest hardware every 2-3 years.

I just assembled a computer from new parts myself, and it's nice now to have a fully encrypted workstation with zero performance hit. A Q87 motherboard with TPM (Asus Q87M-E) + UEFI BIOS + UEFI GOP-compliant video card (EVGA GeForce GTX 770) + M500 SSD + BitLocker + Win2012R2 (or Win8.1) means you can enable the built-in hardware encryption of the M500 SSDs. It gives me a certain peace of mind to know that a burglar won't be able to grab my personal files and source code if my computer is ever stolen. I also imagine the TPM + Secure Boot combo will make it harder for a rootkit to go unnoticed.

Not to mention the lower idle power usage resulting from the 22nm haswell and 32nm lynx chipset.

My friends at work seem to think I'm crazy for replacing a 2-year-old computer :) Although, as I pointed out to one of them, he spent more than twice as much on a new mountain bike, and I'm sure I spend a lot more time on my computer than he does on his mountain bike ;)

Good point, well made.

Personally, I upgrade incrementally, and I still use my PC on a regular basis. The machine I have now is a hodge-podge of parts from different eras. I have an Intel Q6600 but DDR3 RAM, and a modern, quite beefy graphics card that I bought when it was in the upper echelons in early 2013. It runs most modern games pretty well. I have an SSD for most software but also three big HDDs, one of which I've had since my first build in 2004.

I'm typing this on a PC where I did the same thing as the author. Over the past 10 years, I've swapped out a part every two years or so to keep it running the latest and greatest. But the CPU is five years old and still running fine. I'm planning to donate it to a non-profit to replace a computer that's almost 10 years old and also still running fine.

There was a time when you felt like a new PC was obsolete the second you took it out of the box. But that was because we were just scratching the surface of what we could do with new hardware. We're now at a point where it's hard to find consumer and business applications for all the spare hardware that you can afford.

Mobile adoption has been so quick because everyone is buying devices for the first time (tablets), or there is an incentivized two-year replacement cycle (phones). But I'm still using an original iPad that works just fine, and a 3 year old cell phone with no reason to upgrade. Eventually, I think we'll start to see the same leveling off in mobile as well.

I was ticked off that my 2007 Mac Mini couldn't be upgraded to Mountain Lion, until I realized Snow Leopard ran all the software I needed on that box. I think I'm happy with the hardware and form factor of my phone, too, so I've got all the electronics I need for years to come. Good thing, too, because my rent just went up, and I need a new couch.

Interesting that the author assumes smartphones need to be upgraded almost yearly. My smartphone upgrade path in the last 10 years has been HTC Typhoon -> iPhone 3GS 32GB -> iPhone 5S 64GB, and a large part of the most recent upgrade was a crumbling plastic case on the 3GS.

At no point during the 4-year tenure of the 3GS did it stop being astonishing to me that I had flat-rate, always-on internet in my pocket, all my music, ebooks and audiobooks, videos that I took of my wedding, and photos that I took of our first child, who's now inherited it and mostly uses it for In the Night Garden.

Personally I think that because of the reduced horizons of smartphones, they're actually every bit as long-lasting as your PC. Sure, at some point OS updates stop coming, and with that app upgrades, but the performance of the 3GS was fine, and I'm not afraid to admit that part of the latest upgrade was just embarrassment at having such a naff old phone, as much as I loved it.

Interesting to watch the uptick in 'retina' laptops. Basically people don't need a new PC but will pay for a better PC 'experience' that means longer battery life, 'better' screen (usually retina/IPS/etc), better ergonomics.

Interestingly it seems like some would love to run their old OS on them. My Dad sort of crystallized it when he said "I'd like to get a new laptop with a nicer screen but I can't stand the interface in Windows 8 so I'll live with this one." That was pretty amazing to me. Not being able to carry your familiar OS along as a downside. That reminded me of the one set of Win98 install media I had that I kept re-using as I upgraded processors and memory and motherboards. I think I used it on 3 or 4 versions of machines. Then a version of XP I did the same with.

I wonder if there is a market for a BeOS like player now when there wasn't before.

AFAIK even with OEM versions, only Win8 Pro edition comes with downgrade rights.

The article falls into the fallacy of sampling the wrong population. Of course the author wouldn't buy a new PC; he can upgrade his old one. Heck, almost any tech-savvy person can in fact upgrade one, or build one from scratch. If not, chances are you know at least one person who can help, and after the first time it just gets easier.

The PC market isn't dead; it is slowly receding, and it won't stop. It's because of the new alternatives: assuming a finite budget, when you get one of the alternatives, which cost roughly as much as a consumer-level laptop, you don't have enough left for another PC that you don't need.

The article seems extremely narrow to me in both its outlook and its scope. People don't care about processing power not because it's a marketing gimmick, but because they simply don't care. People who do care are the ones who know enough to care, and they will always be a minority.

For many of the most common business and consumer applications (web browsing, using MS Office), adding a more powerful computer just doesn't do that much for you. Power users will upgrade their old machines, but most users will just keep using them.

4K will be the revival of PC sales, in two ways:

1. Consumer-affordable monitors. You'll need a better GPU, and probably DisplayPort. I don't expect most consumers to want a 30" 4K display. They'll want 22-27" displays at 4K resolution, a la Retina (PPI scaling): everything is still the same size as people are used to (compared to 1080p), but everything is sharp as Retina.

2. 4K adoption of multimedia on the Internet. The more 4K videos that pop up on YouTube, the more people who are going to want to upgrade their hardware. This one isn't specific to PCs though, it could apply to mobile devices as well.

Go to YouTube and find a 4K video (the quality slider goes to "Original"). Now look at the comments. Many of the comments in 4K videos are people complaining how they can't watch the 4K video because of their crappy computer (and sometimes bandwidth).
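To put rough numbers on the "Retina at desktop sizes" claim: a back-of-the-envelope PPI calculation (the `ppi` helper below is my own illustration, not anything from the thread) shows why a 24" 4K panel lands at exactly double the pixel density of a 24" 1080p panel.

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution in pixels divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 24" 4K panel is ~184 PPI; a 24" 1080p panel is ~92 PPI.
# That 2x ratio is exactly the "Retina" scaling factor mentioned above.
print(round(ppi(3840, 2160, 24)))  # 4K at 24"
print(round(ppi(1920, 1080, 24)))  # 1080p at 24"
```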

Yeah, I pretty much agree with that premise. In my experience, faster CPUs and more RAM make little difference compared to the gains from an SSD. Hard disk drives are such a huge bottleneck relative to other components that the average user gets the biggest gains in responsiveness from upgrading to an SSD. And for a lot of PCs that doesn't even necessitate buying a new one.
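If you want to put a number on that bottleneck before and after an SSD swap, here's a minimal and deliberately crude write-throughput sketch (the helper name and the default size are my own choices; OS caching means the result is only indicative, not a rigorous benchmark):

```python
import os
import tempfile
import time

def disk_write_throughput(size_mb: int = 64) -> float:
    """Write size_mb of random data to a temp file, fsync it, and return MB/s.
    Rough estimate only -- filesystem caching and other I/O skew the number."""
    chunk = os.urandom(1024 * 1024)  # 1 MiB of incompressible data
    with tempfile.NamedTemporaryFile(delete=False) as f:
        path = f.name
        start = time.perf_counter()
        for _ in range(size_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # force the data to actually hit the disk
        elapsed = time.perf_counter() - start
    os.unlink(path)
    return size_mb / elapsed

print(f"{disk_write_throughput():.0f} MB/s")
```

Run it on the spinning disk and again on the SSD; the gap is usually what makes the machine "feel" new.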

For laptops it's a different story. The big push seems to be in reduction of power consumption for longer battery life, which sounds pretty sensible to me. I guess if battery life is a big concern for a PC user, then it makes sense to go to a smaller process. That does seem like a pretty small reason to upgrade, though.

Another good indicator that the PC "game" has changed is that the two major commercial PC OSes just released their latest versions (Mavericks & 8.1) for free.

Hardly any academics, or professionals who write articles, reports, or serious documents, are doing so exclusively on their iPad. They probably own an iPad, but the majority of people I see who are trying to contribute something substantial to the world - a book, design, quality video and sound recording, or just professional documents to share with colleagues... these people are using a laptop or a desktop. Their PC might be old, but it does the job. These people own tablets and phones too.

Now do the math. If everyone - smart, average, stupid, young, old, are buying tablets and smartphones, then of course this makes PC sales look like death.

It's more like a "post-PC-avoidance" world we're in now. A lot of stupid people avoided using PCs back in the day. Now all those people own tablets and smartphones and use them for entertainment.

As a guy who has been involved in computers I tend to buy something to last at least 3-4 years. Once I start feeling I'm behind I like to upgrade.

I had a 2005 iMac before acquiring this 2011 iMac, and in between I've bought MacBooks and a MacBook Air. I'm thinking of getting my next desktop in 2015.

Thing is, when I go to my parents' house, I see 2003 computers. I think this reality applies to many families: parents don't care about speed; they get by because their needs are less computational and more casual, like browsing, Facebook and Skype. The trend I'm seeing in Spain is that getting iPads for parents is notably popular. All my friends, instead of upgrading their parents' desktop PCs, are buying iPads, and the parents love it. Are you having the same experiences?

Yes. I'm living in Spain and my wife and I bought an iPad for her parents. They love it.

I actually touched upon this in a blog post I wrote last month: http://ilikekillnerds.com/2013/09/rumours-of-pcs-demise-have... — and I said exactly this. A bad economy coupled with the fact people just don't need to upgrade as much any more are reasons PC sales have slowed. The PC will always be around, tablets and smartphones are great, but they're not comfortable for extended periods of time nor as capable. As I also point out, being a developer means I need a keyboard and multiple monitors to do my job and coding on a tablet is just never going to happen.

That's why we all need tablets. A tablet for you... and a tablet for you... And you get a tablet... and you get a tablet... We all get tablets!..... Oh! These tablets kind of suck to actually produce or do anything on. ....... OK, back to laptops and all-in-ones.

Basically this: computers hit "good enough" a while ago; now you just have to replace parts when they die.

Yes, on paper the latest processor is faster than one released two years ago, but you have to be running specific types of workloads for it to really make a big difference.

Yeah. The last major thing for PCs in the past couple years were probably SSDs.

The upgrade to SSD was glorious. Like 386 to pentium. Watching my machine boot that fast made me grin for months.

The last few times I looked at the desktops available at Targets and Walmarts in the Bay Area, there weren't very many options. Best Buy and Costco are somewhat better equipped. I think that, with the lower margins on desktops relative to laptops and the amount of space they consume, desktop PCs are well on their way out of being attractive to traditional brick-and-mortar retailers.

Haswell architecture couldn't have hit the market at a better time for laptop owners, with more powerful integrated graphics and low power use. I'm sure it isn't a coincidence.

Every time I read an article about the death of the PC and the ascension of mobile, I wonder how much carrier subsidies distort the relative demand for PCs vs mobile devices.

I'm inclined to believe that mobile sales are "artificially" inflated by these subsidies to a large degree.

Of course, if this business model is sustainable over the long term I guess it doesn't matter for mobile h/w manufacturers.

But for s/w developers the fact that people upgrade h/w every 2 years because of subsidies doesn't mean that those h/w sales are translating into a greater user base.

I still use my 5-year-old desktop (upgraded twice) for development. I like to open the box and upgrade it myself; if I want to do anything similar on a laptop, I think twice. The freedom to upgrade it yourself is bliss.

Part of the reason is because we have gone back to the days of terminals. Chromebook is a good milestone in marking what people do with computers and how much power they need. We are past the point where computer as a consumer device, and computer as a professional equipment have parted their ways. We are also lucky that the people who buy CPUs in bulk for their powerhouses are still using architectures similar to the ones we use in our desktops and laptops. Because with our weak demand for new hardware, the prices cannot stay low for long.

I have always pondered this whole question of the PC being dead vs. alive. The interesting thing is that even though, with tablets and smartphones, a lot of regular people can probably get away without a PC just to surf the net, Facebook, etc., the real question is what will happen if someday coding/programming becomes a commodity and more and more regular people actually start coding (to whatever extent) to solve problems. Would that ever happen? What would they use then? PCs? Something else?

I have a 13 inch Acer I purchased in early 2011. Despite its low cost, the thing has run like a charm since day 1. I literally have zero desire to replace this thing at any time in the foreseeable future. It still runs 4+ hours on a battery, which is remarkable, since I use this machine more than 5 hours a day.

I have a desktop with twice the processing speed and twice the ram, but for all intents and purposes, it runs almost exactly the same as the little Acer. Unless I am playing a game or running illustrator, I simply don't need the power.

I ran my 2008 Acer Aspire One ZG5 netbook until I got my current "Ultrabook" a couple of months ago.

The netbook handled just about everything I threw at it, and with FreeBSD and dwm it ran faster than it did when I first bought it.

Unfortunately I'm not too pleased with the HP Envy 15. The AMD A6 Vision graphics aren't so bad, but support for the Broadcom 4313 wifi card is sparse in the *nix world...

Soon I'll be tearing it apart to swap out the bcm 4313 for something supported by FreeBSD, but for now, I'll not be purchasing a new PC any time soon.

Somehow I don't think my mom would trade her iPad for an e1505 with a broken display, external monitor, plus the periodic need to upgrade the hard-drive and install/upgrade Ubuntu :)

I don't know which conclusion I had about this was more useful:

1) I don't need to buy a new PC every two years anymore
2) Someone should make a tablet with slots so it can be upgraded like a PC

" 2) Someone should make a tablet with slots so it can be upgraded like a PC"

Wouldn't that be a Windows 8 convertible laptop with a touchscreen?[1][2] Once you put expansion options on it you start compromising mobility.

[1] http://www.futureshop.ca/en-ca/product/dell-dell-xps-12-5-ul...

[2] http://www.futureshop.ca/en-ca/product/hewlett-packard-hp-sp...

Tablets need to be very small, lightweight and thin. Making them modular and configurable would go against that.

Look at some ipad teardown for instance and watch how everything is packed together. Not to mention that these days everything is in SoCs: you can't upgrade the GPU and CPU separately. Even the RAM can be stacked on the SoC package and if it's not it's soldered right next to it.

Contrast that with the innards of a desktop PC which is mostly empty space.

We have been going in the opposite direction of #2 for at least a decade. Nowadays a motherboard has almost all the peripherals you'll need in a typical PC, and in the ARM world (which is more flexible) we are getting to the peripheral-less single-chip CPU that has only a power supply and I/O pins for display, USB and network.

That's a natural step, because connectors are expensive, power hungry and bulky. As computers get inexpensive, upgrading makes less sense, so it makes less sense to pay more for an upgradeable system.

I think #2 is not done so they force you to buy a new tablet every 2 years.

> Of course PC sales will be low. When you don't have enough memory, you buy more RAM. When your processor is too slow, buy a new CPU, or you get a new heat sink and over clock it. You rarely have the need to buy a whole new box

This is neither end consumers nor businesses. Enthusiasts who build and upgrade their own computers were always a small market.

The article talks about upgrading repeatedly, but I don't think the author can extrapolate their own expertise over the rest of the traditional desktop users.

PCs are far from dead for consumers but, for manufacturers and retailers, the high-churn glory days are over. With high-end gaming now chained to the console cycle, even gamers won't get the itch to upgrade more often than Sony and MS refresh their platforms.

Intel, AMD, etc. might want to consider slowing their desktop product cycles down a tad. Instead of spending extra to bring every incremental performance gain to market as soon as possible, perhaps longer product cycles would bring down costs.

I'd argue a similar pattern is happening with laptops (well, at least ones with exchangeable parts).

My old T400 was "dying" until I put an SSD in it. Blew my mind how significant an upgrade that was. When it started "dying" again I maxed out the RAM @ 16GB.

The CPU is a bit lacking now that I want to run multiple VMs side by side, and the chassis has seen perhaps a bit too much wear, so a replacement is coming -- but I've managed to put it off for years, with relatively inexpensive upgrades.

Hmm, I want to compile huge open source projects quickly. For this I need as many cores as possible at a reasonable price, a lot of memory and an SSD. So it's time to upgrade :)
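The usual starting point for that kind of build is one make job per logical CPU (`make -j N`). A tiny sketch of how you might pick N automatically; the `parallel_make_cmd` wrapper is my own illustration, though the `-j` flag itself is standard make:

```python
import os
import shlex

def parallel_make_cmd(extra_args: tuple = ()) -> list:
    """Build an argv for make, using one job per logical CPU."""
    jobs = os.cpu_count() or 1  # cpu_count() can return None on exotic platforms
    return ["make", f"-j{jobs}", *extra_args]

cmd = parallel_make_cmd(("all",))
print(shlex.join(cmd))  # e.g. "make -j8 all" on an 8-thread machine
# subprocess.run(cmd, check=True)  # uncomment inside the project directory
```

For big C/C++ trees, pairing this with ccache and keeping object files on the SSD is where the new cores and the fast disk both pay off.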

I agree with the author. I built my desktop computer in 2009 (I think) and it's still going strong. I see no reason to upgrade. I also recently purchased a used Thinkpad X220. It's a few years old but has no problem doing everything that I want to do with it.

It's wasteful to be throwing away computers constantly. In the PC world, I've noticed that it's particularly prevalent among "gamers" that are convinced that they need a new computer every couple of years.

Well, CPU/RAM/HDD systems do last a very long time. It's the GPU that needs periodic upgrading. Roberts Space Industries, for instance, will be leveraging CryEngine 3 with nearly 10 times as many polygons as the average 3D FPS. Also, Microsoft keeps adding rendering features to its latest OSes which require hardware updates at the GPU level. I guess what I'm saying is: Nvidia will continue to be a sound stock to add to your portfolio.

A few things led to this, including the obvious tablet/mobile disruption, the decline of PC gaming due to consoles and mobile, and the flattening of Moore's law and processor speeds.

I used to update for gaming and 3d almost entirely.

I also used to update more frequently for processor speed/memory that were major improvements.

If we were still getting huge advances in memory or processor speed, there would be more reason to upgrade. Mobile is also somewhat of a reset, and is going through the same rise now.

The PC is not dead; the market for selling new PCs is just stagnant. PostPC doesn't mean the PC is dead, but it lives on more like a zombie.

I'm hoping that a new generation of largish (24-27") 4K displays will lead to a rebirth in desktop PCs, if only because we depend on them so much for professional work where they've fallen behind in experience when compared to high-end laptops, which shouldn't be the case!

I'm seeing the same thing with my iPhone. I have a 4S, and while I like what the 5s brings, I'm just not sure it's worth upgrading now. There is just starting to be the very hint of slowness in some things on the 4S, but it isn't anything like when I went from the 3G to the 4S. That was a huge upgrade. Now it just doesn't feel necessary to buy the next thing on the same schedule.

Thanks! Somebody finally said it! (or at least this is the first blog post I read about it)

If anything, what is dead is the software need for Moore's law.

This is all true; I can still do pretty much everything on my 2009 PC. But the truth is also that I do it rarely, especially since I got a new console a few years ago and stopped playing on the PC... Everything work-related is on my laptop, and playing games on a console is nicer. Desktop PCs are simply not needed anymore (for what I do, and also for the majority of non-tech users).

This is pretty much dead on. What I think will happen is that PC manufacturers are going to look around for new markets and the obvious one is going to be consoles. Once SteamOS comes out I expect a slow but massive ramp up in PC-Console production in a similar vein to the way that Android powered devices have come to dominate the smartphone market (in numbers shipped).

It seems like the only folks who consistently upgrade their computers every 1-2 years are gamers and people working with big media files. Some friends and I run a website dedicated to helping people build and upgrade their PCs. We see about 130k visitors per month. That's a pretty low number, but it still converts to a quarter of a million in sales every month.

It's weird, but I feel like my PCs are all too slow. I bought a retina MacBook Pro recently expecting to be blown away, but it still feels sluggish to me. I want instantaneous response when it comes down to it. There actually is a qualitative difference between 100 ms and 10 ms response time. I'm surprised; I really thought we would be closer.

When we speak of PCs versus smartphones or tablets we're talking a lot about form factor and portability. I imagine a day when my smartphone has more horsepower than the best desktop today and it can drive a huge 4K monitor while streaming petabits at a time. You'll only need one device and it will be the CPU to all your interfaces.

Mostly agree, however, I think there could be more upgrade waves for home PCs, triggered by some qualitative improvements in technology. My guess, once we have a reasonably powerful, totally silent (fanless, 512-1TB SSD), book sized desktop PC, maybe in 2-3 years from now, it might trigger wave of home PC upgrades. After that, who knows...

This article makes a similar point: http://techland.time.com/2013/04/11/sorry-pc-industry-youve-... (I think it's been posted on HN before, but I couldn't find the post).

Well, I did. I used my last one for almost 8 years and got this one a few months ago. I don't have to upgrade as often, but I still have to upgrade. It's lighter, quieter, generally more powerful, with more RAM, more disk space, and better graphics. These are all the reasons I ever upgraded, just not as often.

Average computing power and storage have gotten to a point where they can handle the everyday stuff with relative ease. High-def video and gaming are the main areas hardware still has to keep up with.

Although one could argue that network bandwidth is still an area that affects the "everyday stuff".

Thinkpad T60 purchased (refurb!) in 2007. Still a rock-solid machine. It does get a little warm, though.

The T60 will even run coreboot; if you fancy putting an SSD in there, you'd have near-instant boot.

Just replace the thermal paste (easy to do on a Thinkpad) and clean the fans, it'll do wonders!

> The PC is not dead, we just don't need new ones

It's really nice when some build process takes less time because of better hardware. Also, try running some upcoming games on an old PC. Obviously the need for some hardware depends on what you are planning to do.

PCs are ugly and clunky, and they take up a lot of space at home. A PC also reduces the appeal of a home or office, compared with a Mac. Honestly, this is one of the reasons I bought a Mac. Of course the Mac is UNIX; that is another major reason.

This is so true. Tablets and smartphones are great portable devices, but I cannot live without a PC or laptop; it's just that I already have a few of them. My first choice would be a PC, then a smartphone, and the tablet last.

I just upgraded from a Q6600 / 4GB to an i7-4770K / 32GB, but actually that Q6600 would have been enough if I had just used an SSD with it. The SSD is the key. The apps I use are Firefox, Thunderbird, Deluge and VLC.

My $.02 is that Microsoft's OS stopped being bloat-ware. I've noticed that since Win7.

When people say the PC is dead, they do not mean that it is not being used and people don't need one... they mean that people simply don't buy it as often and have other options to choose from, like laptops.

In that sense, saying that the PC is dead is correct. Almost everyone I know buys a laptop instead of a desktop. I know a lot of people who do not have a desktop, but I don't think I know a single person who doesn't have a laptop.

It's like saying the Novel is Dead. Plenty of novels are being written, but it is really not the one major form of art that people are discussing. That is being replaced by television and film. Will there be novels written fifty years from now? Most definitely. But still, the idea that the novel is the one true form where the greatest art occurs is over.

Well, if the PC dies, then what are we going to write all our code on?

Tablets and those funky phones are popular today; something else will get popular after them. The PC may never be as popular as they are, but it is here to stay.

My 2009 Core2Quad with 8GB of RAM and an SSD still feels faster than the latest and greatest with a regular HDD. It even runs OS X beautifully ;)

SSDs just changed the game, and it was about 2009 when that started.

I'm the happy owner of a Dell E1505 still working in the living room, where it has survived two little girls of 4 and 2 years. Now I want to rescue it and install Ubuntu after upgrading it to an SSD.

I recently bought a new PC, after 6 years. Not because my old PC was unusable, but because I needed a new one as an HTPC with very low power consumption.

At least Microsoft is helping the PC industry.

Microsoft and its SharePoint platform will keep SharePoint developers upgrading their desktops with every release.

I think computing power/storage is becoming more necessary on the server side than the client side.

With all the guides from tonymac, I enjoy building my own hackintosh with cheaper and better hardware :P

If there's no money to be extracted from it, then it's dead in the eyes of industry.

Most consumers will not even upgrade their PCs, but will replace them with a new PC, laptop, or tablet when the old one is completely broken.

I'm thinking of my parents - they will use that 2000-era PC until it won't boot, and only then will they worry about upgrading.

Below is what I feel is a relevant excerpt from Text of SXSW2013, Closing Remarks by Bruce Sterling [1]:

--- Why does nobody talk about them? Because nobody wants them, that’s why. Imagine somebody brings you a personal desktop computer here at South By, they’re like bringing it in on a trolley.

“Look, this device is personal. It computes and it’s totally personal, just for you, and you alone. It doesn’t talk to the internet. No sociality. You can’t share any of the content with anybody. Because it’s just for you, it’s private. It’s yours. You can compute with it. Nobody will know! You can process text, and draw stuff, and do your accounts. It’s got a spreadsheet. No modem, no broadband, no Cloud, no Facebook, Google, Amazon, no wireless. This is a dream machine. Because it’s personal and it computes. And it sits on the desk. You personally compute with it. You can even write your own software for it. It faithfully executes all your commands.”

So — if somebody tried to give you this device, this one I just made the pitch for, a genuinely Personal Computer, it’s just for you — Would you take it?

Even for free?

Would you even bend over and pick it up?

Isn’t it basically the cliff house in Walnut Canyon? Isn’t it the stone box?

“Look, I have my own little stone box here in this canyon! I can grow my own beans and corn. I harvest some prickly pear. I’m super advanced here.”

I really think I’m going to outlive the personal computer. And why not? I outlived the fax machine. I did. I was alive when people thought it was amazing to have a fax machine. Now I’m alive, and people think it’s amazing to still have a fax machine.

Why not the personal computer? Why shouldn’t it vanish like the cliff people vanished? Why shouldn’t it vanish like Steve Jobs vanished?

It’s not that we return to the status quo ante: don’t get me wrong. It’s not that once we had a nomad life, then we live in high-tech stone dwellings, and we return to chase the bison like we did before.

No: we return into a different kind of nomad life. A kind of Alan Kay world, where computation has vanished into the walls and ceiling, as he said many, many years ago.

Then we look back in nostalgia at the Personal Computer world. It’s not that we were forced out of our stone boxes in the canyon. We weren’t driven away by force. We just mysteriously left. It was like the waning of the moon.

They were too limiting, somehow. They computed, but they just didn’t do enough for us. They seemed like a fantastic way forward, but somehow they were actually getting in the way of our experience.

All these machines that tore us away from lived experience, and made us stare into the square screens or hunch over the keyboards, covered with their arcane, petroglyph symbols. Control Dingbat That, backslash R M this. We never really understood that. Not really. ---

[1]: http://www.wired.com/beyond_the_beyond/2013/04/text-of-sxsw2...

This isn't universally wrong, but it is dead-wrong with present technology. And covering our eyes and pretending that we are currently that advanced doesn't make it so.

Because while basic computation is a universal commodity, what is implemented on top of it certainly isn't - a piece of software always functions as someone's agent. When the systems you end up relying on are entirely defined by someone else, the only thing that represents your will is your mind, and it is effectively executing a complex and ill-defined protocol against always-diligent computers.

You've done the computational equivalent of declining a lawyer.

But it doesn't seem like a big deal, since you're only compromising a little at any given time. But the software is always changing in ways that benefit its controllers while your expectations are mostly based on the capabilities that they've presented. So the progress you perceive is entirely in their desired paradigm. Features that would benefit you but at the expense of Google/Apple/etc are never explored, because you aren't the user of their software - you're its working set!

I can forgive the old-timers who were conditioned by broadcast media to see the world in hierarchical, take-it-or-leave-it terms and don't understand what they're losing by sharecropping in walled gardens. And I can mostly forgive the unclued herd that just buys whatever is advertised.

But for everybody who knows the power of personal computers yet pretends webapps and locked appliances are actual progress, either out of personal laziness, cognitive dissonance, or longstanding need for social acceptance: shame on you for abandoning that self-determination you tasted the first time you truly experienced computing.

Good thoughts here. I am optimistic due to things like raspberry pi and arduino becoming popular, but at the same time these things just feel like this generations version of building radio kits and model airplanes. Ultimately it's the data rather than the computation that's important.

Yeah exactly, it's not that general purpose CPUs will be outlawed like we worried about in the 90s, it's that the generally popular ways of using technology won't be using their capabilities.

It's not just the data itself, but really about protocols used to access the data. Protocols mediate between parties, and by choosing to download a binary blob simply to check your email, you've given up any true bargaining power in that exchange. You still have some autonomy by hacking the blob (userscript injection, etc), but you're only building on unstable ground.

Well, PC performance has never been beaten by tablets and phones.

Either way, terrible news for Intel and Microsoft.

The new fanless PCs are pretty cool.

If only Paul let me vote twice.

Right, that's what it means to say that the market is dying. But if you need to feel clever, feel clever.

Wish I could downvote this. Snark like this is toxic and is neither useful nor interesting.

A lot of folks reading sensationalist articles about the PC market decline are drawing the conclusion that nobody likes or uses PCs anymore since sales are declining. The author is pointing out that it's a poor conclusion to draw, since there are other factors contributing to the decline, including the fact that the usable lifespan of today's PCs is longer than it used to be.

People don't only claim that the market is dying, they also say that PCs will simply disappear [1].

And that makes a huge difference for application developers, so it's not just about sounding clever; it's a very relevant question.

[1] https://www.google.com/search?q=pc+will+disapear&ie=utf-8&oe...
