The PC is not dead, we just don't need new ones (idiallo.com)
566 points by firefoxd on Oct 24, 2013 | 396 comments

I've felt this way since I built my last desktop in 2008. I was sort of waiting for the "gee, it's time to upgrade" mark to roll around in 3 or 4 years, but it hasn't happened yet. It still runs any games I want to play very well, and it still feels very fast to me even compared to modern off-the-shelf systems.

When my friends ask for laptop-buying advice I tell them if they like the keyboard and screen, then it's just plain hard to be disappointed with anything new.

I think I can pinpoint when this happened: it was the SSD. Getting an SSD was the last upgrade I ever needed.


On top of that, PCs aren't necessary for a lot of people, because people do not need $2000 Facebook and email machines. For the median person, if you bought a PC in 2006, then got an iPad (as a gift or for yourself) and started using it a lot, you might find that you stopped turning on your PC. How could you justify the price of a new one then?

Yet if there was a major cultural shift to just tablets (which are great devices in their own right), I would be very worried. It's hard(er) to create new content on a tablet, and I don't really want that becoming the default computer for any generation.

I think it's extremely healthy to have the lowest bar possible to go from "Hey, I like that" to "Can I do that? Can I make it myself?"

I think it's something hackers, especially those with children, should ask themselves: Would I still be me, if I had grown up around primarily content-consumption computing devices instead of more general-purpose laptops and desktops?

Tablets are knocking the sales off of low-end PCs, but we as a society need the cheap PC to remain viable, if we want to turn as many children as possible into creators, engineers, tinkerers, and hackers.

We would be better off steering into the skid. History has plenty of examples of people who've tried to hang onto the old ways 'because that's how I learned it'.

The way forward isn't to try to keep cheap PCs viable for creativity's sake, but to ensure that creative desires are being met on the newer devices. Would I have learned memory management and dual booting if I'd had a tablet instead of a 386? Probably not. But now that same money buys a high-end tablet and a pile of hours for an EC2 micro instance.

Would I still be me? No, I would be even better. All those weeks wasted fighting with modem racks for my BBS I'd gladly trade for weeks spent on a Nexus 10 and a linode.

As a kid, my parents were really anti-videogames (except for a few educational ones), anti-TV (we had one, but it was in its own room and rarely used), and would only let me use the home computer for homework.

As a result, I spent all of my childhood and teenage years reading (mostly fantasy and sci-fi; several books a week, at my peak) and programming. I taught myself programming in 7th grade because I really wanted to play Sim City, but having no gaming computers/consoles, I couldn't. However, I did have a programmable calculator for my math class.

I spent all of middle school and high school programming my own games on this underpowered (a few MHz of CPU and a few KB of memory), inconvenient (typing thousands of lines of code on this tiny keyboard is a feat) device (and yes, I did make a turn-by-turn Sim City for the TI-82). While I was doing this, a lot of my friends were playing Warcraft 3, The Sims, and watching dumb TV shows (early-to-mid 2000s; reality TV was just becoming big in France).

Similarly, my girlfriend's parents were very anti-TV etc. As a result, she spent her childhood and teenage years drawing, and eventually she went to a top art school (RISD) and now makes a living from her illustration and teaching art.

I'm not sure who I would be if I had grown up with an iPad and a PlayStation, or who my girlfriend would be if her parents had let her watch TV; no one can tell. However, I think our situations worked out really well for us; and when I have kids/teenagers (not for another decade or so), I will most definitely give them a life analogous to what I experienced rather than a media-consumption-heavy one. For instance, I haven't had a TV ever since I moved out of my parents' house (I do have a projector for watching movies from the computer), and intend to keep it that way.

I obviously have a fairly tech-heavy life right now, as a tech worker in SF, but I am trying to cut down. I am noticing that I whip out my phone every time I have a spare 30 seconds, I have several laptops and iPads lying around the house, etc. - and I like it less and less. I'm slowly selling off my excess devices, and am thinking of getting rid of my iPhone when my contract runs out (and just get a dumbphone as a replacement for emergencies). I bought a really cheap netbook, installed archlinux+xmonad on it, and am using it as my primary machine at home for web browsing + programming + LaTeX. It's harder to get distracted with this machine.

In 2011, Tom Preston-Werner said the only hardware he had at home was a waffle maker, a microwave and a bike [0] - I like that mentality.

19-year-old me would have pre-ordered Google Glass on day one and used it with religious fervor; now, I am absolutely not interested in such a device, as I know it is just the ad billboard of the 21st century, manufactured by Google.

I still have a few guilty digital pleasures; for example, I buy a lot of used video games that I wanted to play in my childhood and never could (mostly Game Boy Advance/GameCube - the upside is that you can get 20 of those games used on Amazon for the price of 3 new current-gen games).

My hope is that over the next few months and years, I will revert to reading as much as I used to, and spend less time on Facebook/Twitter/etc. (HN is not completely in that bucket, as it leads me to write introspective comments like this, which I think is good). I think a big part of it is removing the devices that will call for your attention. My Game Boy Advance or MP3 player will never call for my attention- it just waits for me to use it. However, my iPad or iPhone will call for my attention every few minutes, which is not liberating at all. Tech should be liberating.


[0]: http://tom.preston-werner.usesthis.com

Quite interesting, thank you for sharing. The idea that constraints can guide creativity reminds me of pg's essay on distractions.

I've got something of a different view. My TV and game consumption was never limited. My parents had the mindset of 'you'll have to learn self-moderation at some point; better to do it when the stakes are small'. They wanted me to learn to recognize when something I enjoyed was having a negative impact. They would guide me subtly by asking me to think about time spent in various areas and what that meant. Now, as an adult, I've not struggled with moderation in any area. Never got pulled into MMO-style games, diet isn't a challenge, etc. My biggest worry is that I work too much and don't make enough time for fun.

I remember finding it odd when I'd visit a friend's house and they were only allowed a certain number of sodas per day, and had regimented rules about computer time. Later on, as adults, these friends almost universally struggled with various addictions, real and digital (I've seen lives ruined by MMOs). To me it appeared that they weren't able to manage their own desires without outside inputs. Obviously it was a different experience for you.

Ultimately, all humans are pretty different, and there is likely no one size fits all.

My younger brother always had a really hard time with the constraints my parents set - he would watch TV and play video games in secret, and even sneakily used their credit cards to pay for the MMO he played (he was in his early teens, and my parents didn't give us any allowance - they'd just give us some money if we wanted to go out with friends, or if we wanted a book they'd buy it for us. That worked for me, but not for him). At some point my parents did try not setting any limits on his TV-watching habits etc., thinking he would learn to self-moderate, as you described; for those few years, he basically spent his life in front of the TV and video games, doing literally nothing else. He's in his early 20s now, and still having a hard time with such matters.

What worked for you may or may not have worked for me, and definitely didn't work on my brother; what worked for me would have probably not worked for you, and definitely did not work for my brother.

Humans are interesting creatures, aren't they :)

I had one or two good friends whose parents were more like yours, and I loved going to their houses because we would play video games until 4am and eat pizza and drink unlimited Coca-Cola. Doing this once or twice a month was like heaven for me, and my parents were aware of it but were fine with it happening - I guess they thought "as long as it's not in my house and not too often, it's fine". However, one of those friends did not do immensely well later in college/life, to which my parents responded "I'm not surprised".

How I wish those questions had clear cut black/white answers :P

You sound like I imagine my son in 10 years ...

The reason I occasionally let him stay with friends for the junk food/games/TV binge with friends is so that he learns all types of experiences. I don't believe in banning anything, just regulating. How else can we understand our world if we don't experience it for ourselves?

If I were to host the junk food/TV/games binges, I'd have to buy the console and games and stock the junk food ... Things that I just don't want to do .. So it's easier to let someone else do it.

In return, I take others kids on bush walks, to BBQs, roller skating, etc - so their kids get to do something different too.

There may be some "it doesn't happen in my house" as you suggest, but it's not my primary or secondary motivator.

Your observations are interesting. I'm in the early days of this process with my kids, but I do limit their TV and game time. I've found (like the grandparent poster) that an absence of TV and games leads to other pursuits: sports, reading, programming, writing books, etc.

To give you an example: I just got the national school results. My son kicked ass in a school that underperformed. One of the key differences I've observed is ... TV and games. Other parents buy the books and encourage kids to do homework, but TV and games are too tempting for developing young minds. The other thing I notice is that, after a while, routine kicks in, and study has become fun for my kids.

The other obvious issue I notice is the mental fatigue my kids have suffered when staying with friends for lengthy gaming sessions. We talked about how they felt, why they felt that way, and the side effects (short attention, lethargy, etc.). But they had no regrets and wanted to do it again. Reminds me of smokers... they know it's bad, but don't care.

When my kids are old enough, they can be responsible for themselves. Until then, they'll have to live with my constraints, healthy minds and bodies. I'll just have to accept your possible scenario that it will all collapse into a screaming heap in their early adulthood. Let's say, I'm dubious about your claim, particularly as most levels of freedom and responsibility are gradual with most kids ... But I have seen stranger things.

Having thought more about this, I'm leaning more towards the pro-limitations side of the argument. Clearly games are more addictive now; I doubt many lives were ruined over Pac-Man and Missile Command. If young me had grown up around WoW, perhaps I'd be different?

You mention the gaming binges at friends' houses. If you have the opportunity, ask the parents whether those marathon sessions happen on a normal night. There are a few instances in my youth where I recall overindulgence, and most were at the urging of friends - friends who were manic in their desire to get as much time in as possible before their parents returned. The idea that 'limits against overuse don't apply the second you leave my house' is precisely what got many of my friends in trouble later in life. Merely trying to assist with perspective. Please don't take this as a critique of your view (as I'm in agreement) or parenting ability (congrats on the high test scores).

>I'm dubious about your claim, particularly as most levels of freedom and responsibility are gradual with most kids ... But I have seen stranger things.

I'd like to point out that the scenario I describe wasn't a free for all. Falling short of expectations was met with discipline and restrictions as any child could expect. Simply that regulation of tv/game time was a personal choice provided that expectations were met.

The important part is the guidance: finding the best way to show the kiddos how to recognize a bad habit. I'm thinking a good middle ground would be a discussion where the time limits are decided, but then applied universally. Something doesn't stop being a bad habit just because you're at a friend's house.

Oh man, if there's one thing I could change in my childhood, it'd be trading half of my gaming time for programming time. I unfortunately grew up in the golden age of web programming and games and 'chose' the latter. Unfortunately, you can't do much with just 20 years of gaming experience. My parents mostly let me do whatever because I already had a skill then (basic tech support) which we thought would guarantee a self-supporting job after college...


It's becoming increasingly important to get college right the first time through if only because of insane, rising costs. If it were cheap, I could probably find a class on self-moderation.

On the bright side, I find most of today's games boring, so it's rather easy to get productive.

Consider community or tech college. It's still not cheap, but it's way more affordable than a university or state college. They also tend to be more flexible and understanding since most of their students are part-time. Employers care most about your skills, and a year or two at a community college can give you more non-academic skills than any Computer Science bachelor degree.

This is why I love my Kindle Paperwhite so much. I could read on my iPad, but it's so easy to jump into something else with that. The Kindle's limitations let me get on with reading without being tempted by other things.

I think you may be my doppelganger, but I started off on a TI-83+. ;)

There is something special about introducing yourself to programming on limited hardware like that. It's the sort of device that you can grow into, and which challenges you to get innovative when you start to reach its limitations. I don't believe that's the only environment in which somebody with a hacker mentality and approach can be created, but it does seem to be a good way to do it.

"I am thinking of getting rid of my iPhone when my contract runs out (and just get a dumbphone as a replacement for emergencies)"

iPhones are great for tourists. Unnecessary otherwise.

Except when you are in another country, and roaming charges are outrageous.

I am unsure if you can attribute your abilities to your anti-TV upbringing. I grew up with parents who had a relaxed attitude about video games and TV, and thus consumed a lot of both.

But I also spent so much of my time drawing, painting, and programming because I love creating things. I am unsure if TV had any negative effects. The ownership and pleasure you get from your own creations is unlike anything that TV or games can provide, and it is utterly addictive.

"However, my iPad or iPhone will call for my attention every few minutes, which is not liberating at all."

This feeling has been creeping up on me ever since I succumbed to a smartphone for work a few years ago.

Instead of releasing me from some of the lower-level tasks I deal with day to day, this phone, these devices, nag me to pay attention to all of my friends back home, all of my old classmates, former colleagues, and all of the things they're producing / forwarding / commenting on.

I moved out of the US this summer and it took me a while to get a SIM here. I had roaming data on my work phone but couldn't use it much for the obvious cost.

It's been liberating to just live inside my head and in my immediate environment these past months. When I finally got a local number, I put the SIM in a 20EUR Samsung flip phone. I find few issues with eventually just living with that phone.

You can turn all that nagging off, so the only notifications you'll get will be calls and texts. I really think some people blame the smartphone for being a distracting element; it's not wrong that it can be distracting, but that's because the user allows it to be. You can turn off all notifications, so you decide when to see if there is something of interest. For instance, my phone does not tell me when a mail arrives - the little badge will show how many new mails there are if I choose to flip to the screen with the mail app. Facebook never notifies, not even with a badge - I decide when I want to see Facebook (which is rarely on the phone; that app is horribly slow). The same goes for most of my apps. Only when people try to contact me directly by call or text will it go off, and if I'm particularly busy I will set it to "do not disturb" and rarely be bothered by it.

"You can turn all that nagging off, so the only notifications you'll get, will be calls and texts."

Very true.

I think what I've been feeling, and trying to describe, is a rejection of expanding my consciousness into the Internet. A large portion of my social groups use those social networks as extensions of themselves, for communication and interaction.

With immediate access to those channels it's difficult to ignore the draw of that technology. And not using those services regularly ends up being the same as not using them at all.

I know it all too well. At some point I felt so annoyed by the smartphone's intrusion into my life that I began to ignore the notifications. It was great for a while, but I still felt a slight nagging. So I did the logical thing and turned it off entirely. Haven't missed anything of significance, and I'm still in touch with the people I want to be in touch with over the social networks.

Beautifully worded :)

I've got a 4th gen iPad, a laptop, an external monitor + kb + mouse and a Lumia 920 right now. I still wonder at times if I should cut back a bit, but most of these devices have their uses for me.

Why did your parents, who only let you use the regular computer for homework, let you use your calculator for programming? Is it possible that your identity-building time spent creating SimCity for the TI-82 was not something deliberately enabled by your parents' somewhat closed-minded attitude toward technology, and was, instead, a "waste of time" that they simply failed to realize was happening?

Because they had no clue that a calculator could be used for other things than calculating :)

If you want to read more, trash the iPads and push Michelet's Histoire de France (19 vols.) from Project Gutenberg to your Kindle.

Same here, by the way: no TV, my laptops are my wife's second-hand ones, never owned an Apple device.

I fear the ergonomics of mobile computers do more damage than desktops. Source: own experience

I'm not sure if you mean that the comfort of working on a tablet connected to remote virtual servers is higher than that of a single local laptop or desktop.

But if that is your final point, I very much disagree. Besides the fact that a computer keyboard and mouse are often vastly superior to the cramped peripherals you typically see connected to a tablet, large screens and user interfaces designed for large screens often have a huge ergonomic and productivity advantage over the tiny, touch centric interfaces for tablets.

Note: the previous paragraph only applies to the current state of apps for creative and engineering activities. It could be that five years from now, tablets and their software for creating content will have evolved to a point where they rival or surpass desktop software. Or they might not. Either way, my point is that we're not there now, and creating software etc. on tablets is a suboptimal experience today.

Long form writing seems close to impossible on a tablet. A long hacker news comment is possible, but once you are looking at even a modest book report, I can't see it.

Maybe with some sort of accessory keyboard, but then you are looking at more of an ad hoc laptop than a pure tablet.

You'd expect long-form writing to be unthinkable on today's mobiles, but plenty of people wrote whole novels on Palm PDAs with only the on-screen keyboard/Graffiti. I also performed extensive text entry on Palm PDAs for my middle/high school notes and assignments, and didn't consider it a bad experience. And these examples, and the above commenter with his Sim City clone for the TI-82, demonstrate that when you make up your mind you're going to do something on a limited device, it's often surprisingly possible.

I do not disagree, just wanted to mention that nowadays it is still possible to learn multi-booting with Android tablets and custom ROMs

I'm starting to see a trend of tablets-as-laptops where people have a case that integrates a keyboard and they type their papers or whatever else on their iPads or other types of tablets.

Having a 1.5- or 2-pound laptop with a 12-hour battery life that you can detach the keyboard from, for $300, is a much better form factor than the current typical laptop. Many of these tablets also come with Wacom pen digitizers or touch, allowing a creative input that is missing in many laptop form factors.

Also, you can still create web apps and other such things with node.js and the like on Android tablets today. JavaScript really is the BASIC of this generation.

I won't be surprised to see full IDEs that are viable for creating general-purpose apps in the near future. I really think Android & iOS will eventually become the next 'desktop' OSes with a full suite of apps as powerful as the current desktop set of applications. Concerns about tablets as consumption-only devices will probably go away within the next decade as the world transitions to these 'mobile' OSes.

Android-based IDEs already exist. I should know. I was looking for one the moment I bought my ASUS Transformer a couple years ago. I found the only thing I couldn't do with the device was write and compile programs. I would bring it to hacking sessions, but I couldn't test code on it. At the time, the only option was a text editor that could style and check code syntax.

The main downsides of these devices for programming are screen real estate, CPU speed, and support from major IDE/compiler vendors.

Example: https://play.google.com/store/apps/details?id=com.aide.ui

I am following you, but cannot make myself comfortable with a non-elastic screen size. I like my tablet to be 8", while anything less than 12" is not good for a laptop (I tried with UMPCs and netbooks). However, I can certainly imagine myself plugging an 8" tablet into a 27" monitor and keyboard/mouse combo for comfortable work at the office.

>because people do not need $2000 Facebook and email machines

Devil's advocate: if that is true, then why are MacBook Pros such a hot sell? I'm typing this in a college library's lobby. When I look around, I see roughly 3/4 of the laptop-using students using a MacBook Pro, with a few MacBook Airs littered around. If I were to walk around and glance at what people were working on, it'd probably be something like 70% YouTube/Facebook and 30% using some word processor.

My point is that the consumer's decision to buy or abandon a product isn't solely driven by how good the product is; the value of the new item as a "status icon" also has to be taken into account. All you need to get the customer to justify that $2000 price tag is a culture of rabid consumerism and the guarantee that they will be cooler than their friends if they buy this extremely expensive laptop that does all sorts of things they will never use.

In your example involving the 2006 PC and the new iPad, I would argue that a huge contributor to the consumer's abandonment of the PC is that it's nowhere near as potent a status icon as an iPad.

The MBP is one of the most functional devices available. The wanky, hipster appeal is way overplayed. Plenty of people use them in spite of their image.

Phones satisfy the mobile convergence thing between organisers, phones, cameras and handheld games. Tablets satisfy the "computer as a bicycle" vision of Jobs and bring portable computing to the masses. But neither are really full featured enough for a developer or a college student.

When it comes to a full featured, keyboard equipped, programmable device there is bugger all that is light, powerful, has a long battery life and ships with a decent environment (posix, term, ssh) out of the box. Dell's XPS 13 DE delivers a bit of the picture but it doesn't have the quality bundled iWork apps that would appeal to a student and has worse battery life. They are getting there though.

The rest of the laptop market is hindered by shipping with Windows only and the difficulty of getting pre and post-sales driver support for the developers and students who require an environment like Linux. The industry has really fucked themselves trying to keep to the model of the glory days.

Hopefully Steam will help set things straight by creating a large mainstream market that more niche users like developers, students and scientists can benefit from.

Manufacturers are going to have to stop the race to the bottom and start building fewer models, at higher quality and in bigger volume, if they are to compete with Apple in price, quality and profitability. And they are going to have to explore well-integrated Chrome OS and Linux packages and work on developing drivers with better performance and power efficiency.

> The MBP is one of the most functional devices available. The wanky, hipster appeal is way overplayed. Plenty of people use them in spite of their image.

They are fine machines, but if you don't care about subjective things, like how a laptop looks or how much it weighs (an extra pound or two never genuinely matters to a use case), and you only care about getting exactly what you need (not want) for the lowest price, then there is no way you would ever buy a MBP. They are a product that we convince ourselves we need or deserve because we lust after it, even when a cheaper alternative would do just fine.

I have bought macs for a while, but if I have to be really honest with myself there was always a cheaper alternative that was 'good enough', it's just that I subjectively want apple's products because they are 'nicer'.

> The rest of the laptop market is hindered by shipping with Windows only and the difficulty of getting pre and post-sales driver support for the developers and students who require an environment like Linux. The industry has really fucked themselves trying to keep to the model of the glory days.

Quite good for those of us that work across multiple OSs.

Just install GNU/Linux on VMWare with CPU virtualization enabled and get two OSes on a laptop with a minimum of fuss.

> Just install GNU/Linux on VMWare with CPU virtualization enabled and get two OSes on a laptop with a minimum of fuss.

And far less battery life.

I often run Ubuntu or Windows on my MBA with VMWare to compile software. It's one of the best ways to drain your battery quickly.

Well, I do have a laptop with Windows 7 and a netbook from Asus that was sold with a Linux distribution.

I have used GNU/Linux since 1995, and on a laptop it still looks like 1995 to me, in terms of graphics card, battery, hibernation and wireless support.

My Asus, which was sold as supported Linux hardware, does not hibernate.

In terms of battery life, I have yet to find a laptop where GNU/Linux lasts longer than Windows.

People who believe in being able to buy social status are the ones perfectly fooled by advertisements! Someone can be butt ugly, poor and stupid, but still have the highest social status and popularity in most social environments. What makes one an attractor of popularity is demonstrating power, will and the ability to achieve what you want.

Who would be more popular?

A) A rich guy who thinks he can buy friends by inviting everybody for expensive cocktails at an exclusive restaurant?

B) A guy who owns lots of expensive high-tech gear and a luxury car + his own house?

C) A guy with charm and charisma, who can make anyone feel special or have a fun time, even in the dirtiest place?

Choose for yourself. I believe you have all experienced A, B and C already. But I would rely on the scientific evidence that C has a higher long-term chance of remaining an attractor of popularity. (I'm sorry, I couldn't put the references together, but I hope you understand.)

Side story: a weird guy in my old class had Bluetooth wireless headphones with an integrated MP3 player in 2006, a 3000EUR laptop and all sorts of other very expensive gimmickry. That didn't make him cooler at all. People still didn't like him, and it just made him more vulnerable to attacks from some of the more primitive pupils.

> People who believe in being able to buy social status

Trust me, you can buy social status. If you pay enough, you have the status.

> Someone can be butt ugly, poor and stupid, but still have the highest social status and popularity in most social environments.

Most social environments he exposes himself to, yes. Most social environments, no.

He said that a product being a status icon justifies a high price, not that a high price makes something a status icon.

edit: parent comment has completely changed since I replied

Yea, guys can buy people out of their league, but their wives are usually miserable, and cheat. Women have a hard time doing the same. Actually, the whole success thing works in reverse for women: they become less attractive with money, unless they earned that money with their mind. Even then, the men these women want still don't want them. I was trying to soften the blows I will get for this thread.

In Marin, I see so many attractive women who settled for some fugly dude just because he has money. Maybe that's what keeps guys striving for the next Facebook?

I could go on and on about this subject, but I'll get clobbered.

If you can't achieve C, you sure can use your money to achieve A or B.

Also if you're lucky enough to be able to carry off C, you're more likely to have options A and B at your disposal.

I'd say laptops have been improving quite a bit in non-computational ways. Weight, battery life, and screen quality particularly.

An MBP can be brought around to show how cool you are, a Dell workstation not so much. Maybe if someone builds a handle for the Mac Pro borg cylinder. ;)

Because MacBook Pros have HDMI output, more pixels, larger hard-drives, and run Starcraft.

Because people like new things.

My grandma taught me the saying "more money than sense." It has proven incredibly wise.

When you start college, you normally get a new computer. But otherwise I agree with your premise: a high-powered desktop PC is no longer a status symbol for anyone compared with laptops, tablets, or phones.

While many people buy them for the technical side, as pointed out in other replies, there is no denying that others buy them as a status symbol, in the same way they would buy a BMW for a car.

> When my friends ask for laptop-buying advice I tell them if they like the keyboard and screen, then it's just plain hard to be disappointed with anything new.

That's exactly what I'm disappointed with on everything new. The ThinkPad T60p, from 2006, remains superior on both points to everything new, from my point of view.

While a very fine machine in its day, the screen, battery life, processing power, disk speed, size, and lack of heat on a modern machine like the 13" Retina MacBook Pro are on another level. You're talking about 7 years of evolution.

Well, I have T61 guts and an SSD in mine. It isn't so far behind as you might expect.

I've used a 15" Retina Macbook Pro. The screen is not better. Sure, it's higher density and brighter, but the viewing angles are not better. Subjectively, I'd say the color reproduction is worse (I used them next to each other). Reflections, glare and fingerprints are significantly worse on the MBP.

Yea, I have a ten year old Toshiba P26-S2607 I still use on a daily basis.

I have a one year old MBP that I haven't used much; I'm not sure why, but the screens seem the same?

I do know one thing about most old laptops: they were made to last longer than 2 years.

That said, HP has made crappy laptops for quite some time. I bought one a few years ago and it was horrid on an engineering basis, but it looked slick.

I had an HP laptop that I bought around 2009. It lasted a year before the insane heat issues started; I agree with you completely regarding the engineering issues. In the end, I ended up buying a Lenovo about two years ago, and it's been perfect for me.

Getting back to the original story, I don't think I've bought a desktop since 2003. I prefer using them, but laptops are just so convenient.

Are you sure it was simple heat issues and not a defective Nvidia GPU? A bunch of them in that time period were prone to premature failure. Many HPs used them, as did a few Thinkpads.

I also enjoy a quite old laptop (HP 2530p, 2009, 12", Core 2 Duo 1.9GHz, sturdy, lightweight). I upgraded to a 256GB SSD and 8GB RAM. Plenty enough for the next couple of years, me thinks :) Perfectly runs W8.1/LinuxMint/Android-x86 ICS.

Completely irrelevant to the rest of the discussion at hand, but if you are going to run Android as a netbook OS on stock PC hardware with keyboard and mouse/trackpad, I've found that everything from Jelly Bean and up is vastly better to work with than Android 4.0.

They've done some deep level fixes which just makes everything flow and stick together as one "netbook" user-experience in a much better way.

My experience is mostly from Asus Transformer-type devices and not regular x86 laptop hardware, but I suspect the same improvements should be valid in x86 country. You certainly have nothing to lose by trying it [1].

My 2 cents.

[1] http://www.android-x86.org/releases/build-20130725

Agree, and to wander further afield: how's your luck running apps? The Google apps work great, I use Opera and some hack to get Flash games to work, but the vast majority of apps just fail to load.

In spite of the problems, it's amazingly fast and usable, esp. compared to Windows/Ubuntu.

running the stable android-x86 on a Samsung NB505

Better than newer T series? I've been looking into Lenovos because the keyboard is the main differentiator I care about on a laptop.

I've typed on a T530. It was pretty good, but the removal of several keys I actually use and the relocation of others bothers me. I don't think the feel is quite as good on the T530 as on the T60, but it's still one of the best laptop keyboards.

Yes! I have an X1C and they have messed up the home/del/pgup/pgdn island, and placed print screen between the right alt and ctrl. I'm tripping over the keys all the time.

How hard can it be to design keyboards with all keys in the correct place? Thinkpads used to be the only ones which got this design issue right.

I think Lenovo wanted increasingly large trackpads with increasingly short screens. They've mentioned in their design blog that some of the changes are driven by "consumerization".

The problem with that is Thinkpads were never meant for mainstream consumers, Lenovo isn't going to out-Apple Apple, and if it wanted to try, it would do better using a model line that doesn't have a business-oriented reputation going back decades.

I'm hoping and waiting for some third party like infocomp to supply proper aftermarket thinkpad keypads for the new lenovo machines, with the buttons restored to proper order. Otherwise I will not be "upgrading", ever.

Don't get me started on the single audio socket for both recording and playback.

I have the same laptop running debian. Even runs small VMs ok and has survived multiple liquid accidents and dropping

I've upgraded it with a T61 motherboard, 8gb RAM, a tweaked BIOS and an SSD. It runs not-so-small VMs now. I try not to drop it or spill things in it, but the spill-resistant keyboard is good peace of mind.

This is a common generational worry, right? We learned X a certain way, that pedagogy was necessarily linked to the technology at the time, we worry in retrospect that that specific technology was a necessary condition for ever learning X.

I'm only 28, and I do it, too-- e.g., how will kids expand their imagination and learn about the world without only having paper books to immerse themselves in for hours at a time? How will anyone learn the basics of programming without finding QuickBASIC on an old Packard-Bell 386, playing around with Gorillas or Snakes, or entering their own code from books in the library?

I think there will be a sufficient number of hacker types around for the cynical and simple reason that corporations need to inspire kids to learn how to code so that they can hire folks in two decades. This ought to inspire a token amount of educational support and tool building so that entry-level development will always be accessible to kids.

These days, everyone has access to an excellent cross-platform learning platform for programming: the browser. So yeah, I think if anything programming is more accessible than ever.

Not that I disagree with what you've said, but browsers aren't a platform for learning to program on a tablet.

The worry isn't unfounded, though, and in the case of programming it has actually been measured; the decline in quality of CS students is what inspired the Raspberry Pi project:


people do not need $2000 Facebook and email machines

I dunno, Facebook and GMail seem to get slower each month. They're unusably slow, even in Chrome on Windows, on both my and my wife's circa-2008 machines.

HTML5 combined with crazy CSS can make a C2D crawl these days. It's not just Flash anymore that's sucking CPU power down.

I still use Facebook on an old iPad 2. And it runs as fast as the day I bought it.

That's generally the solution to the problem of accessing the Web on old PCs: request the mobile version of the website. Heck, with enough trickery you can even get Opera Mini to run.

I would pay for a silent PC. I paid a lot for a quiet Apple PowerPC Mac back in the day and built a quiet PC, but they are not silent, and I have moved to a quieter place where I can hear them (yes, the Mac runs Fedora now). I have some silent ARM machines, but they don't quite cut it yet, though maybe the new quad-core one will.

Silent PC Review ( http://www.silentpcreview.com ) occasionally reviews complete systems. Last year they measured an i7 machine from Puget Systems at 11dB in their anechoic room ( http://www.silentpcreview.com/Puget_Serenity_Pro ) which is the best I've seen for an off-the-shelf PC.

I haven't used a Puget machine myself, but I've relied on Silent PC Review's component recommendations for nearly a decade, and never been disappointed.

I have seen full-size ATX cases from Silverstone with passive cooling, where the entire case is machined out of aluminum and acts like an enormous heat sink. I have yet to find a good fanless power supply. The new Mac Pro is supposed to be only 12 dB. It's hard to beat that; most rooms have ambient noise of at least 18 dB.

Seasonic seems to have a pretty good fanless power supply.

> The new Mac Pro is supposed to be only 12 dB.

Until the dust hits it...

Solution: clean it once in a while :)

>Solution: clean it once in a while :)

Teach me oh great one. Honestly I've never managed to clean a fan. You can brush off the obvious dust but the noise comes from dust getting into the fan itself.

Spray it with compressed air inside and out. You can get into the slit between the fan housing and the motor and blow it out there as well. Most fans start making noise due to bearing failure / wobble, not the dust. I have a small home server with 4 fans. After almost 10 years now, 3 out of the 4 fans are as quiet as they were on day one; the 4th has developed a bearing issue and needs to be swapped out eventually. I am assuming that Apple will use higher quality fans than the $3 crap fans I am using.

>Most fans start making noise due to bearing failure / wobble, not the dust.

That exactly is the issue. They don't just mysteriously develop bearing failure / wobble...they do so because dust gets into the bearings. And that exactly is my point...no amount of compressed air can fix dust in the ball bearing grease. I just end up replacing all the fans after a few years...

There are some complete 'mini' systems that achieve this, presumably by pushing the transformer out of the computer and into a wall-wart or similar: http://www.damnsmalllinux.org/store/Mini_ITX_Systems/533MHz_... http://www.amazon.com/Intel-D2500-Fanless-Mini-ITX-D2500CCE/...

Buying these as stand-alone components seems to be more of a challenge.

My MacBook Air (2012) is completely silent most of the time (except when viewing videos in Flash or Silverlight). Given it's silent, I can actually hear the power supply of my monitor buzzing! (Actually, it's a very quiet buzzing.) Be careful what you wish for, I guess.

but the video is so loud!

Yeah, I know it's awful. If my Celeron 450MHz could play DVDs with no effort, I don't know why this can't play videos on 5-10% CPU and not spin up the fan.

Most video encoding is MPEG-4-type now (rather than the MPEG-2-type used by DVDs). MPEG-4 is typically a lot more processor intensive.

Any even vaguely modern laptop offloads mpeg 4 part 10 decoding onto a decoder chip or the GPU.

The older Mac laptops had a utility (SMCUtil?) to change the threshold for the fan - I know lots of Wintel systems have this in BIOS/EFI.

I have one, from http://www.quietpc.com/

It's completely silent, you can only tell that it's on by looking at LEDs.

Mac mini has fans but is basically silent.

things with fans are not silent enough - the mac pro seemed ok but now it seems too noisy

I guess that's what I get for living in the city...

my silent PC solution from 2008. Still the best value for money if architecture of your house supports it :) http://solnyshok.blogspot.com/2008/03/my-silent-pc-solution....

https://store.tinygreenpc.com/tiny-green-pcs/intense-pc.html A very quiet fanless PC. I have one and I am glad.

If an i3 is fast enough for you, and you don't need high end graphics, you might want to consider an Intel "Next unit of computing" system... You'll need to get laptop components for it though. I've considered it, but not that into quiet, and want a bit more power.


There are passive coolers for i7s out there. You can even find passively cooled GPUs and PSUs. It's pretty cheap to build a silent pc.

It's tricky getting a passive GPU to actually work, mine has a tendency to overheat and halt the system rather than throttle down under load. It required a fair bit of tweaking to case fans and driver underclock settings to get it stable.

Those passive coolers generally still rely on some airflow. With no case fans at all, I'm told they won't be able to hold up under any decent load.

Intel NUCs [1] are well engineered and extremely quiet.

[1] http://www.intel.com/content/www/us/en/nuc/nuc-kit-d54250wyk...

In the Netherlands there is https://ikbenstilcomputers.nl selling high-end fanless computers.

I believe there will be a major cultural shift to tablets/phones/handheld.

The social implications worry me -- mainly that the most popular handheld devices (iOS) are _locked down_, you can't actually install whatever software you want on it.

I don't know if the actual experience of using Android, for non-techies, might end up seeming similar?

The social implications of this worry me. We spend increasing amounts of time on our computers, and have decreasing power and freedom over what software they run how.

I think your concern about it being harder to create on tablets, and the social implications therein -- is also legit, but it worries me less than the loss of control over our devices. People will find a way to create, although the nature of what they create will be affected by the affordances of the device, for sure. (there will be a lot more 140 char poetry, heheh)

Not disagreeing with your overall concern about locked down platforms...

mainly that the most popular handheld devices (iOS) are _locked down_

With an 80% market share[1], I think it's safe to say Android is by far the most popular handheld platform, with iOS serving the niche market.

[1] http://techcrunch.com/2013/08/07/android-nears-80-market-sha...

Most game makers are targeting consoles, which just went through a longer than usual shelf life. With the new ones coming online, we will see more PC upgrades from gamers.

Also resolution stopped at 1080p for a decade. Now that 4k is happening, there is an easy path to performance for gpu manufacturers.

Resolution dropped

High-end for 4:3 was 2048x1536 (3 megapixels)

High-end for 16:10 was 1920x1200 (2.2 megapixels)

High-end for 16:9 was until very recently 1920x1080 (2 megapixels)

Just to clarify, the high end for the last two ratios is 2560x1600 and 2560x1440. Both were available at least two years ago when I built my current machine, although after some quick research, the 16:9 format had some available as early as 2007.

My laptop bias is showing. Those were the highest resolutions available on laptops. They were also the highest commonly-available resolutions for desktop screens, though none of those was the highest that existed.

It's really _only_ code that is harder to create on tablets though, and it strikes me as extremely narrow-minded to write off the music, art, text, huge Minecraft sculptures, photos and videos of singing, dancing, playing, that have been gleefully created by people young and old on these devices.

It's the exact sort of snobbery that has almost completely killed art, dance, drama and music in many schools, as if the only valuable acts of creation left to humanity are engineering and science (which, incidentally, are both wonderfully served by the innumerable education apps on these locked-down, post-apocalyptic devices).

Code is harder because editing text is harder. Text input and manipulation is just not very good on tablets. Even if you plug a keyboard into iOS, editing is still worse than WordPerfect circa 1988.

You might type stuff in OK, but manipulating text is awful.

Yeah, I have no idea how Shakespeare got anything done.

Sure. 884,421 words[1]. And with that workload, I bet he used the most efficient technology available at the time to do it.

[1] http://www.opensourceshakespeare.org/stats/

I agree. I bought a high-end Dell 2.5 years ago, and it was starting to feel like upgrade time (sluggish performance, etc.) a few months ago. I then stuck in a 240GB SSD and it's faster than it ever was, even when I first got it.

I agree with the general consensus here regarding how computers in general have been aging better now. Especially with different computing devices available today, it's hard for the average person to justify buying a new computer within 3-4 years of theirs.

At the same time, I think even the "average" person would want more than a 128GB SSD, and therefore today's entry-level notebooks equipped with these small SSDs won't age that well. I know, I know, SSDs come in larger sizes --- but they become significantly more expensive, and most entry-level notebooks (with SSDs) come with 128GB. As a comparison, it's almost weird that years ago you could get a 500GB hard drive without giving it a second financial thought. As such, I think that if there is anything a normal person might want, it's more drive space as they fill up their small-ish SSDs --- so that they don't need to worry about deleting things when they have too many pictures, games, etc. The average person won't care whether Chrome takes half the number of milliseconds to open a new tab, or whether their games go from 40 fps to 60 fps. But to me it seems easy to fill up these smaller drives, and many people might be looking for a new computer to deal with that.

Before someone mentions it: YES, cloud solutions and external solutions exist. But is it part of mainstream usage to store your stuff on an external hdd? Also, wouldn't people anyway want a future computer where they didn't need to do that? I'm not claiming they have terabytes of data, but I think over the course of 3-4 years, people could pretty easily accumulate > 256gb of data. Otherwise, is there a free and easy cloud solution that gives > 50gb of space that people use a lot today? (Not to my knowledge)

Everyone I know has a 64-128GB SSD, a 1TB internal HDD, and a 1TB external HDD for backups.

Quick solution: USB3 (or e-sata or thunderbolt) external hard drive. Alternatively, NAS (that you can cloud-ify).

I've never needed a huge hard drive. External HDs work fine for me.

Flickr gives you a terabyte free.

Surely no one wants to rely on a third-party service that is only available online and at the service's whim for their photos.

Same here. I feel like we've reached the end of the relevancy of Moore's Law. My PC at home is from early 2008, and it's still an excellent machine. It plays all my games wonderfully, even new ones, even 3D ones. I can't imagine a game looking better than The Witcher 2.

Actually, this isn't entirely true; a few months ago, my WinXP install started to play up, so I bought an SSD and installed Win7 on it. Now it runs better than ever. A recent OS helps a lot, as does an SSD.

I can see one reason why I might still want to upgrade, though: The Witcher 3. I doubt it's going to run well on my by then 6 year old machine. But maybe a new graphics card is all I'll need. Or maybe it'll even just work.

My Macbook Pro is a lot more recent, but it also feels like it might last forever. It can handle everything I throw at it. Why would I ever need something more powerful than this?

If I want anything new from my computers now, it's stuff like smaller size, less noise, less power use, etc. They're powerful enough.

For high performance gaming, I wonder if console gaming has something to do with it. With publishers now focusing on consoles instead of PCs, graphics may be held back to a degree due to that.

But another thing to consider is that the console release cycle has also slowed down, because there's less of a need to upgrade there as well. So you see the lack of desire to upgrade trend emerging for consoles as well as PCs.

I do think that nearly everyone who wants a PC has one at this point. That plus no desire to upgrade means slower sales. If people were actually ditching their PCs entirely, that would be a different story.

I use my 2009 iMac for local heavy lifting and watching TV. (For heavier lifting I use the cloud.) I will upgrade my desktop once the LCD goes retina.

I will continue to upgrade my laptop frequently. Lighter, smaller, faster, longer lived, more durable. Every new laptop has increased my productivity, flexibility.

I just bought a 2013 MBA 13". Most amazing machine I've ever had. Now that I'm accustomed to 13" (vs 15"), I will likely buy a 2013 MBP 13" retina. I'm certain that I'll be very happy.

I agree. Much of this is just because the processing is moving off of the machine too. When you were doing all of your own computations, any incremental thing you did required a stronger PC. Now that computation is happening on the server.

There are programmers, mathematical and financial users who are still stretching their desktops, but for most of the rest of us the need to upgrade is going away. It's almost like it's time to upgrade when there's just too much clutter on the old machine.

I agree with the point about keyboard and screen.

I wanted a gaming laptop, but once I got into that category, I'll be honest--the deciding factor for me was keyboard layout. I'm a developer so it's really important to me to have special keys in the right place, and to have them easily distinguishable by touch.

Nothing is worse than arrow keys with no gap separating them, or an F5 that blends in, or page up/down in some unusable position.

Got some Asus model, and it's great.

> because people do not need $2000 Facebook and email machines

I agree with your post, but just wanted to point out that a Facebook/email PC does not cost 2000 dollars anymore (and hasn't cost that much in a long time) :) You can get by with a 300 dollar laptop just fine for that kind of usage.

You can use a wireless/USB mouse and keyboard with a laptop or [Windows] tablet and it's pretty close to a PC... you can also use an external monitor on many of them. PCs need to become the size of a Raspberry Pi, that's all.

You're just not playing the latest games or doing any intensive computations, or just don't care that you could finish your task 3 times faster than on your 2008 CPU.

What's your framerate in Battlefield 3 on a 64 player map at 1920x1080, high settings? Or Crysis 3?

You just picked the most expensive calculation you could, more or less, and asked how a general-purpose PC handles it.

A brand new general-purpose PC handles that situation equally well--unplayable.

No, the most expensive would be a 3-monitor setup at 2560x1440 on ultra settings.

That's the thing... most games are still not well multi-threaded, and single-threaded performance hasn't gone up more than maybe 20-30%; they just tacked on more cores.

That's actually not true. Modern high end CPUs are multiple times faster than the CPUs from 2008. The architecture changed too, not just the frequency.

I have an i7 920, purchased in December 2008. Please point out which consumer processors (No $3000 xeons) have "multiple times" greater single threaded performance.


"For what" is the obvious question. Web development with a remote testing environment, office applications, email, web browsing - sure, a Core 2 Duo is more than good enough if your software environment is kept in order. Audio / video / photoshop, gaming, developing software that does math, data analysis - you can never get fast enough.

The limiting factor is if your computer's feedback loop is tighter than your brain's perception loop. If you can type a letter and the letter appears, your computer is fast enough for word processing. But, if you can run a data analysis job and it's done before you release the "enter" key, it just means you should really be doing better analyses over more data. Certain use cases grow like goldfish to the limits of their environment.

Even with gaming there isn't as much of a push as there used to be to constantly be on the cutting edge. This is mostly due to the fact that the industry as a whole focuses primarily on consoles first now, and thus consoles tend to be the gating "LCD" target. If your PC is at least as good as or a little bit better than a console released in 2005 or 2007, you're set. Of course, there will soon be a bump forward here with the next-gen Xbox and Sony systems coming out in a month.

I fit into a lot of the special cases here: Developer, gamer, amateur photographer with many gigabytes of RAW files and even I don't feel the need to upgrade systems like I used to. Now it is about an every 3-4 year thing whereas previously it was yearly or more.

Emphatic agreement. I wind up helping folks a lot with writing high performance software, and it's very easy to get to the point where the time to run a model is totally determined by how fast the CPU and IO are. I'm talking problems where naively written C would take an hour, but if I'm careful and use some blend of very clever C or Haskell, the computation is done in 2-5 minutes.

in machine learning, as soon as you stop using linear models which are normally quite fast to compute, models will be as slow as you can tolerate.

for example: random forest, gradient boosting, gam, etc -- you will typically do parameter searches and the models you get are as good as your willingness to wait. Good software will run at a significant fraction of memory bus speed and the faster that bus goes the better your models will be.

exactly! This winds up being a memory locality / layout problem often times!

E.g., most CPU memory traffic is in cacheline-sized chunks, so you're best off trying to organize information so you can use all the bandwidth! I've a few ideas on this I'm trying to bake into an array/matrix library I hope to release soon. :)

What kind of problems are those? I would love to find problems, ultimately examples, where smart Haskell and C blends are superior to pure C.

It's not necessarily which language is faster, but which algorithm is faster. He said naively written C, which means the algorithm may be entirely different and run in O(n^2), much slower than a Haskell version which uses a different algorithm and runs in O(n log n).

actually in the specific example I'm thinking about, i'm talking about memory locality being the performance difference (and in this case, array layout for matrix multiplication).

The naive, obvious "dot product" matrix mult of two row-major matrices is 100-1000x slower than somewhat fancier layouts; even simply transposing the right-hand matrix can make a significant difference, let alone fancier things.

Often the biggest throughput bottleneck for CPU-bound algorithms in a numerical setting is the quality of the memory locality (because the CPU can chew through data faster than you can feed it). It's actually really, really hard to get C / C++ to help you write code with suitably fancy layouts that are easy to use.

Amusingly, I also think most auto-vectorization approaches to SIMD actually miss the best way to use SIMD registers! I've actually some cute small matrix kernels where, by using the AVX SIMD registers as an "L0" cache, I get a 1.5x perf boost!

This is like replacing the compiler optimizer algorithm with your own, similar to the method of writing critical function in Assembly, right?

Still I don't see the connection to Haskell, can you elaborate ?

Oh, that's just me rambling about why I don't trust compiler autovectorization :)

Well: 1) I've been slowly working on a numerical computing / data analysis substrate for Haskell for over a year now.

2) The Haskell C FFI is probably the nicest C FFI you'll ever see. Also pretty fast, basically just a known jump/funcall! And no marshaling overhead, either!

3) There's a lot of current and pending work over the next year to make it easy to write HPC-grade code using Haskell. Some approaches involve interesting libs for runtime code gen (the llvm-general lib for Haskell is AMAZING).

There are also a number of ways GHC Haskell will likely get some great performance improvements for numerical code over the next 1-2 years! (I've a few ideas for improving the native code gen / SIMD support over the next year that I hope to experiment with as time permits.)

Right, I read 'naively' as 'natively'. Carry on!

to answer your question

just plain ole mathematical modeling / machine learning, and associated duct taping the tubes.

I am also going to be releasing the start of a "Numerical Haskell" tool chain sometime (hopefully this fall, but life happens and all that jazz)

Your problems sound interesting. Could you elaborate?

just plain ole mathematical modeling / machine learning, and associated duct taping the tubes.

I am also going to be releasing the start of a "Numerical Haskell" tool chain sometime (hopefully this fall, but life happens and all that jazz)

On what planet? I'm not even going to use myself as an example, because I do other heavy stuff with my PC; I'm going to use my non-tech friends. One of them got a new laptop with 8GB of RAM. Why? Because she was complaining about webapps using too much memory and slowing down her previous system.

Regular users don't know or care about memory management; they don't even close old windows or tabs. It's about convenience. That's not a problem on mobile, where necessity is the mother of invention, so memory management is automatic and Chrome reopens the tabs you had by itself. But in a desktop environment (especially Windows), one wrong click and the session restore in Chrome wipes your previous session.

But it was cheap, cheaper than an unlocked iPhone, and it gets the job done, so it's OK for her.

> one of them got a new laptop with 8GB of RAM, why? because she was complaining about webapps using too much memory and slowing down her previous system.

I suspect a few people on HN will be reluctant to see their role in this arrangement. ducks

Don't duck, it's the truth.

We built really powerful computers and decided the best way to use them was to run web apps consisting of a poor performance scripting language with a poor performance visual-rendering language and half the people who build with it seem to think that anything done in a web browser is free.

"Modern" CSS for even a simple site is pretty silly. So many horribly inefficient rules, sometimes dozens of levels of overwriting properties, etc. And gobs upon gobs of Javascript that does very little but is constantly running, checking input boxes, animating every tiny little detail, doing things that can be done without Javascript, etc.

I have better system performance when running a Windows 7 VM, a semi-bulky image editor, or compiling the kernel than I do with a few bulky webapps in Firefox. And this is on a desktop built just 4 years ago.

I have a light-weight Linux setup that uses 80MB of RAM after logging in. It has 1GB of RAM. I can't run GMail and Facebook in Firefox and browse the Internet at the same time with minimal open tabs. It's sad.

It's not like it's constrained to the web either. I had a bit of an epiphany about the state of desktop Windows a few years ago when for a hobby project I wrote a UI using only Win32 and C and no external dependencies. Maybe it took longer to write than it should have but I was really shocked at how the thing ran so much faster and smoother than just about any UI I was using on that machine. It dawned on me that all those layers of MFC, WinForms, WPF, Silverlight, WinRT and whatever else they'll come up with in the name of developer productivity are no match for a set of APIs designed to run at 25MHz with 8MB of RAM.

This is believable, but I'd like to dig a little deeper. Do you remember which Windows apps you were comparing yours against? And on what version of Windows?

This was Windows 7, I am pretty sure with dwm disabled, and it was comparing (albeit informally) against every Windows app I used at the time.

I know that some newer frameworks take advantage of GPU features (actually I used to work as a dev in the Windows org) but guess what, GDI is still faster.

This is why I still use Winamp; nothing else comes close to dealing with playlists of thousands of items while also being fairly customisable.

Let us also not forget its power to literally punish ungulates as well.

I can pop up Activity Monitor any time in the day, with multiple apps open, and it's guaranteed the app using the most memory is a browser (any browser).

As a data point: Safari is using 600+MB now with just HN and GitHub open. It's using more memory than all the other apps (editor, terminal, mail, Rdio, Skype) combined. 600MB is not much by today's standards, but comparatively, is ridiculously wasteful. It's a damn CD.

Hope the Mavericks update improves this a bit since I'm short on RAM on this machine (4GB).

Mavericks does amazing things to memory usage. My machine has 4GB as well, and safari went from 1GB+ memory usage to 200MB memory usage for the same amount of tabs. Under ML, I was swapping constantly; now, no swap at all.

Did you count up all of the individual processes? WebKit is now finally on par with Chrome in that each tab is its own running process.

Did the update today, and it's certainly much snappier; it rarely hits the disk now.

I have 16GB, and usually have dozens of windows/tabs open in Safari, Xcode, TextMate 2, and a few miscellaneous apps. Currently, Activity Monitor shows 15.9x GB used: 10.87 App, 1.26 File cache, 1.91 Wired, and 1.94 Compressed. Swap used: 0 bytes; virtual memory: 20.12 GB. No noticeable hesitations attributable to compressing/uncompressing when switching windows/apps.

4GB is plenty of RAM for a web browser with many tabs, especially with an SSD for backing virtual memory. If it isn't there is a memory leak, and if there is a leak, 8GB is as bad as 12GB.

My MBA has 4GB and I constantly forget, because it is enough, and I am constantly surprised that it doesn't have 8GB.

All modern web browsers save and restore sessions across restarts.

> 4GB is plenty of RAM for a web browser with many tabs,

I always run an up-to-date Firefox, so hopefully the memory leaks are minimal. I reboot my Linux machine every month or so and don't exit Firefox unless necessary. I always keep a few "app" tabs pinned; only a couple of them are heavy. I have 4GB of RAM and turned swap off for an experiment. Sure enough, after a couple weeks of browsing with about 25 tabs open, Firefox's memory consumption would creep up and up, and eventually it would crash.

I can only guess/hope that there are memory leaks involved.

Re: MBA/MBP - with Mavericks 4GB is doable given how it aggressively manages memory: http://arstechnica.com/apple/2013/10/os-x-10-9/17/#compresse...

Regular users don't know or care about memory management; they don't even close old windows or tabs. It's about convenience.

Why should they? If a tab's "dormant", the browser can just quiesce JS timeouts and let the tab's memory get swapped out. Not a problem anymore.

That most (all?) web browsers don't do this currently isn't the user's fault.
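The bookkeeping for quiescing dormant tabs isn't complicated, either. A toy sketch of the idea (the `Tab` class and the memory budget are invented for illustration, not any real browser's API):

```python
import time

class Tab:
    """A hypothetical browser tab that can be quiesced to reclaim memory."""
    def __init__(self, url, memory_mb):
        self.url = url
        self.memory_mb = memory_mb
        self.last_active = time.monotonic()
        self.dormant = False

    def touch(self):
        """Mark the tab active; a real browser would restore its state here."""
        self.last_active = time.monotonic()
        self.dormant = False

def quiesce_dormant(tabs, budget_mb=512):
    """Suspend least-recently-used tabs until live memory fits the budget."""
    live = [t for t in tabs if not t.dormant]
    live.sort(key=lambda t: t.last_active)   # least recently used first
    used = sum(t.memory_mb for t in live)
    for tab in live:
        if used <= budget_mb:
            break
        tab.dormant = True   # pause JS timers, serialize state, free memory
        used -= tab.memory_mb
    return used
```

With a 512MB budget and three tabs using 300 + 300 + 100 MB, only the oldest gets suspended; the policy only touches tabs the user hasn't looked at in a while.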

That's an interesting idea, but I'm not sure any browser is designed to be able to fully restore the state of a tab that it's swapped out.

Do you even know what swapping is? The whole idea of swapping is that the OS pages out memory without the application being aware of what's happening!

A bit over a year ago RAM was cheap enough you could just stock up. I have 16GB of RAM and paid 80€ or something for it. I keep stuff in tabs or windows now I used to stick in bookmarks. Right now I have Visual Studio open after work as I want to do a deployment in ~3h. No need to close and re-open the rather big project later.

Very true, but the vast majority of computer users aren't pushing the limits of their systems, and I think that's what the author is getting at. If you look at the market as a whole, the need for more powerful computers isn't nearly as big as it used to be.

And those users are overwhelmingly the ones that tablets suit just fine.

That's the problem in this type of "PCs are fine, people just don't have to upgrade" argument. (And I've made it myself, before I really worked it through and took stock of just how far tablets have already come.)

The use cases satisfied by an old PC could all be satisfied just as well by today's tablets. Yes, the software support for keyboards, external displays and server-type services isn't quite there. But that's solely a software limitation. Not a limitation of the hardware platform. It will be addressed and more PC sales will disappear.

And the actual remaining PC-justifying use cases are precisely those where an old core2duo isn't "good enough" and simply buying some more RAM or even an SSD isn't going to obviate the need for new hardware for 5+ years.

So inasmuch as people still need PCs, they have to show up in the sales charts. And inasmuch as we don't, mobile devices are going to eat that market as the software evolves.

There's really no way around it.

Frankly, I think android and/or iOS are one good upgrade-cycle away from decimating laptop sales. If either or both really focus on the 'docked' use case -- better support for external displays and keyboard support to facilitate top-flight bulk text entry/editing -- laptops are dead to the general public.

If a student can so much as use a keyboard case and tablet to write a term paper or code up some homework as easily as on a laptop, it's over. And there's no hardware preventing that. It's software. If Microsoft wasn't so inept, they'd have been way out ahead on this.

And, frankly, between the potential of the docked use case and "BYOD" policies, mobile devices could seriously eat into the enterprise desktop market as well.

You build a device cradle, that connects to an external display, power and a keyboard/trackpad, and enables the OS and apps to automatically sense that connection and shift their interfaces accordingly and you'll see how few people truly need a PC anymore.

I completely agree. I wasn't trying to say that PCs aren't dying, I was just trying to contend the notion that power users make up a significant portion of the market we're talking about. In fact, I think tablets directly show that the vast majority of consumers don't need powerful hardware, because tablets aren't very powerful machines at all.

Oh I was absolutely agreeing with you and just expanding and making explicit how this totally contradicted the article's argument.

I love the idea of device docks and I would very much like to see things go that way.

I think docks have some hurdles to jump in the consumer market, though. They're unsexy - I think most people associate them with boring enterprise computers and boring enterprise work, and they tend to be big and ugly. They're expensive too, considering that most people view them as nothing but plastic with a few connectors and wires inside.

If someone can figure out how to jump those hurdles, though, and make docks sexy to the consumer market, beige-box sales will plummet. Make them smaller, easier, cheaper. Make them less proprietary - at the very least, a Company XYZ consumer dock should be compatible with all/most of Company XYZ's portable offerings. Make them wireless and automatic (OK, I realize this conflicts with "cheaper")! Let me drop my Win8 tablet on my desk and immediately have it display to my dual monitors and let me use it with my mouse, keyboard, printer, big external drive and wired network connection. Let me press a button and have it display to my TV or play to my big speakers. Have your cake and eat it too.

I mention Win8 specifically because the whole idea of it is that it's a tablet and a desktop OS in one. Why on earth is there no Microsoft-produced drop-in docking solution for the Surface so it can actually be both?! The consumer potential is crazy - got a crusty old desktop PC laying around? Toss it, keep the peripherals, buy a Surface+dock and you've got a new tablet/laptop and desktop. You can "dock" a Surface as-is with two cables (a micro HDMI and a USB hub with all your desktop peripherals wired in), but that leaves out wired network (maybe important), certain peripherals like speakers (maybe important), and power (critical), and even two cables is two too many.

Most people who say they need a PC don't need a beige box, they need a keyboard, mouse and monitors on a desk where they can work. The form factor of the box that the CPU, disk and memory come in doesn't matter when you're at the desk, so it might as well take the form of something you can take with you and use when you're not at the desk: a laptop, tablet or phone.

Docks will take off when you can make them wireless.

My tablet can connect to my wifi when I come home and sit at my desk. There's no reason at all why it can't connect to my mouse+keyboard at the same moment; so someone needs to solve the technical problems of it being able to connect to my large monitor wirelessly, and we're set.

None of those use cases are going to grow the PC market. The things you're describing have always been a niche (remember Workstation Class PCs?) that may add a few $$ to the bottom line, but they are not going to drive growth.

The PC market has relied on end users - consumers and business users - for its growth engine for decades, and that appears to be drying up. One of the reasons for that is outlined in the article: for most use cases, we don't need faster.

Indeed. I work with RAW photographs fairly often, and simply exporting an album with a few hundred RAW files to JPEG takes a surprising amount of time with fast, modern hardware.

Nothing is ever good enough for a development box when you use an IDE and work with larger and larger projects.

Faster CPUs, more memory, and faster storage are always welcome. I look forward to the day when Eclipse and other IDEs really start taking advantage of GPU stream processors for indexing and validation.

People snack on smartphones, dine on tablets, and cook on PCs.

A lot of people don't want to cook, so are happy with smartphones and tablets.

Why buy a desktop or laptop when an iPad will do everything you need for a fraction of the price? That's what people mean when they sound the death knell for the PC.

Why cook when you can eat chips and order pizza? Probably because it's better for you and because cooking has cultural significance that goes beyond simply replenishing calories.

People who cheerfully proclaim that PCs are dead forget that PCs aren't just devices; they also attained a certain level of cultural significance. If the death of PCs also means the death of PC culture (which involves things like game modding, hobby website making, and so on), then the death of PCs is a really, really bad thing.

Division of labor. Not everyone has to be a producer in every sphere; it's okay to be a producer in some, and a consumer in others.

Plenty of people are too busy with other aspects of their lives — doing things which may, for all we know, be of great cultural significance — to spend time being a producer in the digital sphere as well.

Some people devote their lives to cooking for others; others devote their lives to other pursuits, and only ever consume food produced by others.

That's where the analogy falls apart: people who only eat out are a small percentage of the overall population, whereas people who only need a tablet are purportedly the norm.

The numbers don't have to be the same for the analogy to work.

In any case, I think you're underestimating the number of people who never cook, or cook very infrequently. In your average family household, one person typically cooks the vast majority of the meals.

To clarify, by cooking I mean real cooking — beginning with raw ingredients, going through numerous stages of preparation requiring some degree of skill, etc.

It’s awesome to do cool, fulfilling things and for some of those things you need a computer. For others you don’t.

I really don’t see why everyone should use a computer, given the wealth of possibilities out there.

Also, I’m pretty sure the death of the PC doesn’t mean any of what you are insinuating. It will be more like the death of horse riding after the advent of the car. (If I want to go horse riding there is a club that offers that not five minutes from where I’m living. Horse riding is dead – but that doesn’t mean it’s impossible or even hard to go horse riding today.) Only that PCs will probably be an order of magnitude more relevant than horses are today and, while not always as relevant as in the past in certain contexts (at home), they will still be relevant in others (some work, academia, …).

I’m still pretty confident in the prediction that the PC at home will die. (Which will not mean that no one will have one at home. Just far less people than today.) I’m also pretty sure that the PC at work and in academia will not die.

Sounds like a great thing for job security in two decades when almost nobody but people born during the PC era know how to program.

That stuff is never going away. "Death of PCs" doesn't mean the complete disappearance of them, just their death as a dominant consumer item. Professionals and prosumers will never stop needing PCs, and they're the ones who constitute the groups you mentioned.

You took the cooking analogy too far. Programming a hello world app isn't any better for you than writing the great american novel.

It is if the future of humans on Earth is predominated by computer programming skills, even if basic.

A fraction of the price? Not hardly.

A 64GB model of the iPad costs $700 (because 48GB of extra storage should cost $200, to pad those juicy margins).

I bought an amazing desktop from HP last year on a black friday sale for $779. For what's in it, you couldn't have assembled it from Newegg at that price.

In another generation or two the typical Chromebook will be superior to the iPad on performance, while being half the price.

You should buy a desktop or laptop because you get drastically more computing power at the same price.

It's as if the PC is some sort of professional tool, like a truck.

Production Computer?

I think it's more like a set of tubes.

Nicely said. I don't think we'll see the full conversion for a few years, but this is more or less how I think of it.

As for me, I'd much rather have my personal chauffeur carry around my full kitchen and always fresh ingredients so I can eat in luxury any time I wish.

Thank goodness for tablets with full XWindows support to my desktop and the university supercomputer. I like broken metaphors.

Does a tablet really constitute a full kitchen, with no compromise?

To me a tablet is a cramped working space (small screen, limited memory), difficult to use (requiring additional utensils like a keyboard, mouse to make certain tasks bearable), with limited storage space (no cupboards), limited processing power (more like a camp stove than an oven), and only usable in short bursts.

Edit: In any case, the analogy is as much about the time, effort and skill required to cook as it is about having the relevant equipment at your disposal.

Attribution for that analogy should be given to Citrix http://blogs.citrix.com/2012/01/09/do-ultrabooks-mean-ultra-...

Great analogy.

Very well put!

The PC market isn't dead, but then again, the Mainframe market isn't dead either.

The post-PC devices[1] (tablets / smartphones) are it for the majority of folks from here on out. They are easier to own, since the upgrade path is heading toward: buy a new device, type in my password, and all my stuff loads onto it. If I want to watch something on the big screen, I just put a device on my TV. Need to type? Add a keyboard.

The scary part of all this is that some of the culture of the post-PC devices is infecting the PCs. We see the restrictions on Windows 8.x with the RT framework (both x86/ARM), the requirements placed on all ARM machines, and Secure Boot. We see OS X 10.8+ with Gatekeeper, sandboxing, and App Store requirements with iCloud.

The PC culture was defined by hobbyists before the consumers came. The post-PC world is defined by security over flexibility. Honestly, 99% of the folks are happier this way. They want their stuff to work and not be a worry, and if getting rid of the hobbyist does that then fine. PC security is still a joke and viruses are still a daily part of life even if switching the OS would mitigate some of the problems.

I truly wish someone was set to keep building something for the hobbyist[2], but I am a bit scared at the prospects.

1) Yes, I'm one of those that mark the post-PC devices as starting with the iPhone in 2007. It brought the parts we see together: tactile UI, communications, PC-like web browsing, and ecosystem (having inherited the iPods).

2) I sometimes wonder what the world would be like if the HP-16c had kept evolving.

> I truly wish someone was set to keep building something for the hobbyist

I really don't understand your concern.

Hobbyists have a wider selection of computing tools than ever before (although that statement has been true at any time since the '50s). We have the entire Arduino ecosystem for hardware hobbyists, throwaway PCs like the Raspberry Pi for embedding real computers everywhere, several different standards of desktop-capable parts for more powerful systems, and the server ecosystem for the real beefy ones.

Most of those computer types aren't even able to run Windows or OSX. iCloud and Secureboot won't make them go away.

> Hobbyists have a wider selection of computing tools than ever before

I don't think that's quite true. We had Heathkits and a much wider variety of computers from the late '70s to the early '90s. There is no under-$200 computer sold at major retailers like there was in the '80s.

Hobbyism is not trendy* nowadays. There is nothing aimed at hobbyists for sale at any big retailer. It's not a problem specific to computers.

* Well, there is a perfectly rational reason for that, and it is not really a problem for hobbyists. But that's the fact.

It's a big problem for the starting hobbyist. A kid will probably receive an iPad rather than something to start them on their way to being a programmer or EE.


There are LOTS of under-$570 computers sold at major retailers today.

I don't consider inflation a valid excuse when we continually hear computers get cheaper every year and the Mac mini is selling at the Apple II's old price point. The PC industry abandoned the sub $200 because of Windows licensing, Intel, and Apple taking the old Apple II price as a floor.

$570 is a lot farther out of reach today for many than $200 was in the '80s.

Computers DO get cheaper every year. That Core 2 Duo machine you bought in 2006 might have cost $2000, but those components (if you can even find a place to buy them) would likely cost about $300 today.

It's unfair to say that they don't get cheaper considering that a Mac Mini is an entirely different class of machine from an Apple II, in so many ways that it's ridiculous to even try to list them here.

As for your statement about $570 vs $200... what do you base that on?

No, computers in the same price range are more powerful. The price range for computers under $200 has disappeared. The Mac mini and Apple II inhabit the same price range. You can get more power in the same price range next year but that same power never trickles down to the price ranges that disappeared in the 90's.

I built a dev/gaming machine back in early 2010. It's stout (~$1,000), but not a ridiculously expensive behemoth. The only thing I've done since then is toss in some more RAM so I could have two sets of triple-channel DDR3 instead of one. I can still run just about any modern AAA game at the highest settings.

The only time I felt like I've needed an upgrade is while playing Planetside 2, which is/was very CPU bound for my setup. However, when it was initially released, Planetside 2 ran like a three-legged dog even on some higher end rigs. It's much better after a few rounds of optimizations by the developers, with more scheduled for the next month or two.

I dual-boot Linux on the same machine for my day job, 5 days a week, all year. For this purpose it has actually been getting faster with time, as the environment I run matures and gets optimized.

As good as it is now, I remember struggling to keep up with a two year old machine in 2003.

> I can still run just about any modern AAA game at the highest settings.

AAA games mostly target the console. Look at GTA5, which isn't even out on the PC. Most AAA games will run on a PS3, which came out in 2006, and has 512MB of RAM (combined system / graphics).

That said, there's a point of diminishing returns - making games look much more realistic will take obscene amounts of resources.

I expect there to be a system requirements explosion for PC games now that the new consoles are right around the corner and AAA game developers can finally target much higher specs.

An interesting opposite is the previously mentioned Planetside 2. It was initially released for PC only, and was a resource hog. Now they're working on bringing it to the PS4, so they're having to do an extremely aggressive round of optimizations to make it run decently. The optimizations will get a lot of testing on the PC version (think it's hitting the test server next week). PC players will benefit as a result of the console port.

Planetside 2 is a weird example, though. I don't think there will be many games that have a 1-year+ delay from landing on PC before they hit consoles.

GTA4 was actually a sore spot for PC gamers. The game was not optimized, so it ran like crap if you didn't have a higher-end set-up.

~$1,000 sounds quite cheap for a gaming machine that can still run any modern game at the highest settings. I built mine at about the same time (2010) for $2,500 (top-notch video card, fastest CPU, lots of RAM), but it's 2013 now and I cannot say that it runs any modern game at the highest settings.

It was surprisingly easy. I'm not saying you over-paid, but for $2,500 I could have built something pretty ridiculous. Most of my money went into processor and GPU, which are typically your two big ticket items.

I trolled around Newegg looking for upper-middle tier components with a higher quantity of good reviews. A lot of the times you won't see a lot of the recently released stuff with useful reviews, so some of the parts were actually circa 2009'ish instead of being latest and greatest (2010).

I didn't splurge for a super expensive case, and my power supply wasn't modular (making it pretty cheap). i7 with a decent mobo. Went AMD for the GPU since (at the time) they were the best bang for buck. Got some cheaper G.SKILL 1600mhz DDR3 RAM (which has worked awesomely for me) for next to nothing, and I was ready to roll.

Post your specs to give your post some legitimacy… I hope you didn't skimp out on your motherboard…

2010 $1000 rig running all 2013 AAA titles at the highest settings sounds unlikely to me…

Not the parent poster, but I also bought a computer in 2010 for about $1,000 and am very happy with it today. I still play games at max or near-max settings with no problem. This doesn't include a monitor, though; also, I only bought a 1680x1050 monitor, so I save a bit on graphics power by not having the full HD pixel count (or more).

EDIT: don't know my full specs offhand; i5, gtx 460, either 4 or 8 gigs RAM (4 I think), a 120gb SSD and 2G HD

My i7-875K (overclocked, but not exhaustively; my Gigabyte motherboard has an auto-overclock function that raised the multiplier without killing stability) with 16GB of RAM, entirely too many SSDs for any one person, and a Radeon HD6870 is capable of playing every new game I've bought this year (Dishonored, Saints Row IV--I don't buy games from EA so the new title list is pretty short right now) at max settings at no worse than 35FPS--I personally can't detect a difference between 30FPS and 60FPS, so I don't care.

Neither Dishonored nor Saints Row IV is a particularly intensive game. I get in excess of 100FPS in both, and my cards (GTX 670s in SLI) barely hit 30-35% usage…

I don't doubt the OPs rig is powerful enough to play modern games, but I seriously question the "any modern game at highest settings" statement.

FWIW, I built a great rig in 2010 which I still use from time to time. i7 920, 12GB DDR3, SSD, SLI GTX 460 2GB.

More like impossible. The Internet has many PC-building communities and none of them could do it. Hell, an i7 and a good GPU at the time easily brought you over $600, which leaves very little money for a motherboard that can keep up.

Well now you should just post some core specs so we can test your claim. Exact CPU, GPU, and RAM amount?

Looking at the Tom's Hardware System Builder Marathon challenges from a year or two ago, and comparing them to the $500 and $1,000 machines they build now, should showcase the tradeoffs in performance vs. price more starkly.

You overpaid. You're better off spending $800 and upgrading 3 times as often.

Particularly with GPUs. I wouldn't touch something like Nvidia's Titan unless I just had gobs of money lying around.

Well yeah, something like the Titan has no price / performance at all. Usually the best valuation is around $200, ie, an r7 270x or a 760. And if you want a premium card, 280x / 770 are both devouring modern titles. Anything past that is "I want the best there is right now, screw the cost".

Just picked up an r9 280x ERRRRRRRR... Asus CUII 7970 from Fry's for $279

Happily ran BF4 beta on High/Ultra 2x MSAA at 1080P.

That's not the point. He said he can run any modern game on high settings on the computer he bought 3 years ago, I doubt you can play any game in 2016 on high settings if you buy a computer for $800 today.

I can do the same, also with a $1000 computer bought in 2010.

Why does no-one ever factor in the costs of transferring systems in this upgrade process? It's not trivial to set up a new system and bring your stuff over.

In terms of gaming performance, when you go from mid range machine to high end machine, you are often spending 2-4 times as much without 2-4 times the performance. You hit diminishing returns big time.

I go with generally mid range components with my gaming machine. Even then, I upgrade every 3 generations for CPU and every other generation for video card. CPU performance doesn't impact gaming as much as it used to.

This gives me reasonable performance in most games around high to ultra on a 1920x1080 monitor.

Bingo. And there's now some pretty fierce competition between Nvidia and AMD again, so you can often catch some pretty good deals on a GPU or a combo if you pay attention.

I still use an AMD processor on my gaming/development rig, because upgrading to a better Intel CPU would have cost me nearly twice as much (I already had an AMD motherboard, so that helped). I don't notice a bottleneck in most games.

I think the best strategy is to get mid-range components and just upgrade more often. You get way more bang for your buck, and you can always sell the old components on eBay or whatever.

IMO, you don't need the fastest CPU for gaming. I saved a ton of money by getting an AMD FX-8350 (I already had an AMD motherboard on hand, so that helped too), and although it does bottleneck in really CPU-intensive games, I can run almost all games on the highest settings at 2560x1440. The graphics card is really the only component that you need to splurge on for a gaming rig these days, because it's the bottleneck for the vast majority of games.

2010? If you got an i5-2500K and a 580, you'd probably get a PC that came in around $1200-$1300. If you could reuse components like cases, hard drives, PSUs etc. you could easily get that in under $1000. That should still run 90% of modern games maxed out at 1080p, with the exceptions being pretty much just Metro 2033 and Planetside 2.

If you're doing multi-monitor or >1080p resolutions, then you might need to get something better than the 580 however.

I'm running a 2500K at ~4.3GHz with a CM 912+ in push/pull and happily devour pretty much anything. I don't foresee a CPU upgrade anytime soon. I had a pair of 6950s (the older 1GB ones) and just put in a 7970 and another 4GB of RAM for less than $400, and only did that because BF4 is a resource monster. Everything else was fine.

I bought a ~$1,000 setup in 2009 and it still runs most games acceptably (not necessarily at the highest settings). It was a Dell Boxing Day deal: a $500 computer + $350 video card (GTX 260) + a new case and PSU.

Maybe the OP doesn't play some of the most demanding games?

> Maybe the OP doesn't play some of the most demanding games?

I play some pretty demanding games, but I was able to get a lot of performance per dollar building custom. I knew where it was OK to spend more/less and did a lot of research/shopping around.

Modern games are gated by GPU, not CPU. Except a rare one like Dwarf Fortress.

It's not even GPU today. RAM is pretty much the bottleneck for everything. CPUs have been so much faster than RAM for so long now that they have to keep doing more and more tricks just to keep utilization up at all. With gaming, it's always texture memory and transferring from RAM to the video card that's expensive.

I'm looking forward to the first CPUs or GPUs that are double-layer with memory integrated on the second layer...
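Even a naive benchmark shows the gap: a pure memory copy does almost no arithmetic, yet it is still limited by how fast bytes move through RAM. A toy sketch (numbers vary wildly by machine and Python adds overhead, so treat the result as a rough lower bound):

```python
import time

def copy_bandwidth_gbps(mb=256, repeats=5):
    """Estimate memory-copy bandwidth by timing full passes over a buffer."""
    size = mb * 1024 * 1024
    src = bytearray(size)
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        dst = bytes(src)  # one read pass + one write pass, no real compute
        best = min(best, time.perf_counter() - t0)
    assert len(dst) == size
    return (mb / 1024.0) / best  # GB of source data moved per second

print(f"~{copy_bandwidth_gbps():.1f} GB/s sustained copy")
```

A modern CPU can retire far more arithmetic per second than it can stream bytes like this, which is why caches, prefetchers, and all those "tricks" exist in the first place.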

Not true. ARMA or BF3/4 multiplayer on large maps can choke your CPU with ease.

I built mine in early 2007 for around $2000 (+-$300, can't remember exactly) and it's just now really starting to show its old age. It could use more RAM and an SSD (maybe), but for 99% of what I do, including gaming, it's plenty fast enough. I can't run the very absolute latest AAA games on highest, but if I turn the resolution down a hair or turn off antialiasing they run fine.

In fact the only thing I really want a faster machine for is some of the latest emulation techniques (Higan) and a vague desire to play around with some virtualization odds and ends.

I guess this will change for a little while as updated consoles come out and games can improve their graphics as a result, and also when 4k monitors start coming out. But yeah, until those two things come into play, older computers still play games just fine.

Just for the hell of it I checked out gaming at 4K (http://www.tomshardware.com/reviews/pq321q-4k-gaming,3620.ht...) and noted that it takes Titans in SLI (at $1,000 each) to get good framerates in many modern games.

Like another poster said, with "game" terms replacing "data": " But, if you can run a [game] and it's [good framerate], it just means you should really be doing better [gaming] over more [pixels]."

Don't worry, PC manufacturers are currently selling machines that are already obsolete.

My dad went to Walmart and bought a computer (why he didn't just ask me to either advise him, or ask if he could have one of my spare/old ones I don't know) and monitor for $399.

It's an HP powered by a AMD E1-1500. It's awfully slow. Chokes on YouTube half the time. My dad is new to the online experience, so he basically uses it for watching streaming content.

I could have grabbed him a $99 Athlon X4 or C2D on Craigslist and it would be better than this thing. I'm not sure he'll ever experience a faster computer, so I don't think he'll ever get frustrated with this machine, but it's amazing that they sell an utter piece of shit like this as a new machine.

> AMD E1-1500
> Bought a computer _and monitor_

Did he buy a notebook? I've never even heard of the AMD E1-1500 before today. Everything I see says that it's a notebook processor, and a pretty terrible one at that (2 cores/2 threads, and only 512KB of L2 cache!?).

What's worse is that it's a BGA package, meaning it can never be upgraded. If that's really a desktop machine (and not some form of "all in one") that's vendor lock-in at its absolute worst. They've ensured that instead of buying a $99 processor he has to go out and buy a brand new machine.


That's just awful. In 2013 you shouldn't be able to buy a computer that can't stream 1080p movies with ease.

It's a desktop. I've browsed the Sunday ads for Best Buy and Walmart and it's common. Example:


The future is distributed unevenly. Some retail outlets are still in the past.

A tablet is a PC, especially as x86 processors start taking over from ARM processors.

Just because it doesn't sit in a big box doesn't mean it's a different class of system. The difference is really the openness of the platform, comparing something like iOS to Win 8 pro.

That said, many tablets are basically what we would have thought of as PCs before. Consider something like the Samsung 500T or similar, or the ThinkPad Helix. Components are small and cheap enough that they can be packed behind the LCD, and you have essentially a laptop that doesn't need its keyboard.

Will iPads take over PCs? No. They are too limited, not because of hardware, but because of OS limitations. Will tablets take their place, though? Quite possibly. The portability is quite handy. That I can dock a tablet with a keyboard for a normal PC experience, yet carry it as a tablet when I need portability, is a real selling feature.

The obvious caveat is that a limited OS is fine as long as the majority of data is cloud based. In that case even development can be done on a closed platform, and the tablet becomes something more akin to a monitor or keyboard. More of a peripheral than a computing device. We might get to that point, but that's not the cause of the current trend.

A tablet is a PC only when attached to a full-sized keyboard and a full-sized screen.

Input and output is the major differentiator, not the processor or OS.

If everyone adopted the attitude of the author of this blog, all innovation everywhere in the world would cease instantly because, for most of us in the developed world, everything is good enough already. There are many points throughout computing history at which existing hardware was overkill for the things that we were asking our computers to do. Had we stopped innovating because of that, the world wouldn't be anywhere near where it is today.

In high school I recall lusting after a $4,500 486DX2 66MHz machine with an astounding 16MB (not GB) of RAM and a 250MB hard drive. A few months ago I spent a little less than that on a laptop with 2,000X that amount of RAM, 8,000X that amount of hard drive space, and a processor that not so long ago would have qualified as a supercomputer.
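The multiples quoted above roughly check out. A quick sanity check, assuming (hypothetically, since the comment doesn't state the new laptop's specs) 32GB of RAM and a 2TB drive:

```python
# Sanity-check the RAM and disk multiples in the 486DX2 comparison.
# Old machine: 16 MB RAM, 250 MB disk (stated in the comment).
# New machine: 32 GB RAM, 2 TB disk (assumed, not stated).
old_ram_mb, old_disk_mb = 16, 250
new_ram_mb = 32 * 1024            # 32 GB expressed in MB
new_disk_mb = 2 * 1024 * 1024     # 2 TB expressed in MB

print(new_ram_mb // old_ram_mb)   # RAM multiple: 2048, i.e. ~2,000X
print(new_disk_mb // old_disk_mb) # disk multiple: 8388, i.e. ~8,000X
```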

I for one am glad that we have continued to innovate, even when things were good enough.

No, innovation would hardly die; hopefully it would just focus on something more important. We will hardly run out of need for innovation any time soon. Rather, we have a problem of innovative people focusing on the wrong things, because of incentives.


The computer was a great invention. The Internet also is a big enabler. But the latest computers and phones are hardly innovative: Compared to what you got 5 years ago, they might be smaller and have better power efficiency. But on the grand scale, how does that matter?

If you see the latest Macbooks being introduced, you probably want to get one. It's very shiny, and the Retina screen will allow you to experience computing in a great way. People try to become happy by spending money on experiences. Tourism, iPhones, hipster coffee shops. They don't do it because it's the universal recipe for happiness and living your life, but because it's what capitalist societies expect you to do. Most "innovation" and "disruption" only leads to zero-sum money shifts inside this system. If you think that a Retina screen is innovative, I think you need to get some perspective.

The same goes for cars. The Germans (where I come from) think that they're innovative because we have a few luxury car makers here. Cars in general are great, they provide mobility and that's useful. But how are the new cars better than what was available 30 years ago? They're not even more fuel efficient.

Bill Gates seems to have got it when he stopped working full-time at Microsoft. All over the world people are using their software, but if we were using OS/2 and Lotus instead of Windows and Office nothing would change. It went very well for him and his company, but nothing they ever did was as important for humankind as what Bill Gates is doing now: giving life to millions by completely eliminating malaria and polio from the planet and supporting AIDS and TB research with huge sums.


> If you see the latest Macbooks being introduced, you probably want to get one. It's very shiny and the Retina screen will allow you to experience computing in a great way.

I do not want to get a Macbook. They are vastly overpriced and underpowered. If I ran a movie studio, I might buy a Mac Pro. But that's about it.

> Cars in general are great, they provide mobility and that's useful. But how are the new cars better than what was available 30 years ago? They're not even more fuel efficient.

Look at traffic fatality numbers over the last 30 years, and tell me that we haven't made innovations that impacted lives. How many kids were raised by parents who survived a traffic accident that would have killed them before?

> Compared to what you got 5 years ago, they might be smaller and have better power efficiency. But on the grand scale, how does that matter?

This is insane, wrong, and dishonest. How old are you? I'm 30, and during my adult life (the past decade, basically) we've gone from phones that were essentially unreliable walkie-talkies with shitty battery life to ultra-fast portable computers with 10-megabit internet connections and 4 or 5 different onboard technologies (GPS, camera, etc).

WTF are you talking about sir. Do we live on the same planet?

Smartphones and tablets are the new TVs. Of course, hardware is much faster than 10 years ago, but what is it used for? What is the impact of smartphones on humanity? They have changed communication and entertainment patterns quite a lot, and not in a good way I'd argue. Communication is now cheaper and faster than ever, which means that many people don't think for themselves before they tweet or write an email. Also, many people prefer not to think at all and use their phones to distract themselves by consuming a constant stream of meaningless stuff.

By the way, the mobile phones that I had 10 years ago (Sony Ericsson, Siemens) were quite reliable and had good battery life.

I disagree.

I think that if everyone adopted the author's attitude, we'd get innovation where it is needed most... Which is exactly what is happening: there has been a shift in the focus of processor development over the last ten years or so from "make it as fast as possible and who cares about the power consumption" to a more performance-per-watt oriented approach.

edit: worldsayshi got there first.

Wow, downandout, your post truly has it all:

- Genuine loathing for the blog post OP

- Smarmy "I'm better than you" attitude

- Examples of how you aren't like OP

- Call out to historic reasons why OP's mindset is terrible

- Multiple over-the-top statements ("If everyone adopted the attitude of the author..." is my favorite!)



I was just about to say that about your response. All I said was that if everyone adopted the "good enough" attitude, innovation might have stopped before we ever had computers. You somehow used that to call me a smarmy scumbag. Are you somehow related to the author?

>I for one am glad that we have continued to innovate, even when things were good enough.

Needing more RAM and more HD space (especially in order to offer the same feature set) is not innovation.
