[...] Intel is desperately trying to figure out what to
do to combat the phones and tablets that are eating them
alive from the ankles up. It is pretty obvious that the
company both doesn’t understand what the problem is and
is actively shutting out all voices that explain it
- Laptops will stick around and have Intel Inside for quite a while. The market may be boring, but it will be there for years. Corporate America helps.
- Servers won't be switching to ARM any time soon (I'd argue this is the riskiest bet).
- The desktop and enthusiast/gamer PC market will be around for a while, and also won't be switching to ARM any time soon.
So all of these "shoo-ins" buy them time, and I believe they think that in time they can pull off the biggest risk of all:
- Intel is betting that the biggest differentiating factor is, and will be, performance per watt. They are willing to gamble that they will eventually eclipse ARM cores in this area. In their view, if they have an x86/64 core that trounces competing ARM architectures in ppw, then phone, tablet, and set-top box manufacturers won't have a problem putting those chips in their devices.
Granted, I'm not saying I think Intel is 100% correct or that they'll succeed with their long term bets; I just don't think they are as clueless as this rant makes them out to be.
No doubt about it, though, UltraBooks DO suck.
EDIT: I'm going to revise my statement on UltraBooks. Not all of them suck. In particular, the Lenovo Yoga is fantastic.
Intel may be able to build lower-power, faster chips, but their foundries aren't cheap, nor are the thousands of engineers Intel puts on each SoC, and technology scaling is only getting more expensive.
It's a classic case of the innovator's dilemma. Intel is optimized to sell very fast chips that cost hundreds of dollars, but the performance of fast-ish chips that cost a few dollars has almost caught up, and Intel simply can't compete without a complete restructuring. A further complicating factor is that ARM is a weird many-headed chimera that Intel simply can't kill the way it did x86 competitors like AMD.
It's light, powerful, and compact. I can throw it in my backpack and forget it's even there. What's not to like about Ultrabooks?
(For anyone wondering, it's a Dell XPS 13 running Ubuntu)
The X300 was the Ultrabook before the word Ultrabook even existed :)
1. Apple and Google built an OS. Intel isn't willing to go after software. Are they smart to limit themselves? Microsoft is at least pretending to make hardware – alienating all the OEMs in the process – and Intel doesn't have what it takes to make software?
2. Cheap laptops have traditionally been Intel's enterprise play. Dell/HP/Lenovo are still selling 1366x768. Is Intel serious about their integrated graphics, while their customers are still being outfitted with 1366x768?
3. The server market is not a safe place to hide. Talk of how Intel is safe on servers ignores the consumer market and Intel is obviously not going to ignore the consumer.
So are we down to a single supplier for all our computers (Apple)? What will you do when Apple screws up? But more to the point, Intel had better watch out before Apple just ditches them entirely.
I think it would be silly for Intel to build their own OS from scratch; it would be expensive and probably lead to their demise (à la Nokia).
They do hire software engineers to e.g. make Android work on x86 and make Linux servers run well on x86.
But even Lenovo has a bad reputation compared to the fruit company. I mean, they do OK for value, but the hardware can be flaky.
I'm typing this on a T400: 2.4GHz Core 2 Duo, Radeon HD, 8GB RAM, 1440x900 screen, 9-cell battery, with a 3G card built in and a Windows 7 x64 Pro license sticker, purchased in absolutely perfect unused condition for £145 (!). The battery had done 11 charge cycles, to give you an idea. Chucked a Samsung 840 Pro in it for £102 and it's a perfect machine.
I have 2 spare ones (T61's) lying around as well so I always have a spare handy.
This puts me in a better position than a new Lenovo or fruit purchaser.
I'm a firm believer in letting someone else pay for the immediate depreciation in value! :)
And to be honest, most of the help was unsolicited, but it was friendly and helpful so I accepted it. My experience buying a ThinkPad was overwhelmingly positive, and I also love the machine (I ponied up for the 1080p screen, which is fantastic).
It has 1600x900, but the new 5" phones have had full HD screens for some time.
You mean, in comparison to ARM based tablets?
However, if you're saying that ultrabooks suck at consuming content compared to tablets, all I have to say is "well duh."
Another fun fact: the OS for tablets and phones is compiled on x86, every time, all the time. Compiling on ARM, for ARM, is just way slower.
I suspect neither you or the OP have been paying attention lately? Neither had I until I started looking. I got a Samsung series 9 earlier this year and I'm very happy with it. I think many people would love it, but they don't know it exists. Same goes for a few of the others.
I'm willing to bet some here are typing this out on a tablet, but I'm sitting here in the kitchen with my laptop.
Ugh. It's "change tack", which is a sailing reference. As for the actual content, I feel this analysis lacks nuance. Mobile is booming, of course, but the PC is not dead, nor will it be dead five years from now. There are a hundred use cases for which a desktop or laptop is the only practical solution. Fantasize all you want about businesses abandoning real machines for iPads; reality begs to differ.
I don't think it's all Windows 8's fault. The average desktop PC is just too powerful.
I've been using Visual Studio 2010/2012/2013 with an i3 and an SSD for years now and I rarely run against any sort of performance bottleneck.
To compare what sort of performance requirements I have: in the project that I work on I have a solution with 28 projects that takes about 50 seconds to build from a clean build. Visual Studio takes care of incrementally building the projects during normal development, so usually I'm looking at ~5 seconds to build then launch the debugger.
I have absolutely no need to upgrade. No need = no sale.
I'm using Windows 8 as my operating system. It takes one step forward and one step back. I'm looking forward to Windows 8.1, but there's nothing so seriously wrong with Windows 8 that I need 8.1.
When I'm sitting in front of my PC and using Visual Studio, I'm not thinking "I wish this was actually a docked tablet". I have an iPad for mobility.
PC sales are probably undergoing a bit of a course correction as people who are satisfied with tablets buy tablets instead of PCs. But I suspect PCs will be around for a long time to come and, until that day, there's nothing for them to "[come] back" from.
But each time I look at the offerings, I conclude that my $700 PC from 2010 is perhaps not superior, but is adequate enough that I'm not interested in forking out $1k+ to get something with a comparable battery lifetime.
Instead I start to look at the Chromebook with Ubuntu on it as a much less expensive option with the primary thing I want (battery life) combined with an almost-real computer that does basically everything my older laptop can do.
But even then, sunk cost and all: it's not enough for them to meet the performance of my three-year-old laptop, they have to exceed it. And amazingly, they're still not there yet.
I don't know what exactly Intel has been optimizing, but it sure isn't anything that matters to me.
The FUD surrounding Win8 is kind of crazy. You want to know the easiest, fastest way to launch an application? Hit Windows key, type application name. Same on Win7, same on Win8...hell, it's the same on Ubuntu.
It's actually, IME, noticeably slower than mousing in most cases on Windows 7.
Reminds me of a coffee cup I bought at Wal-Mart for work. It had a transparent plastic sticker on it that would not come off, without leaving a gummy stain. Didn't come off in the wash. Think about it: I buy something NEW and shiny, and the first thing I have to do is clean shit off of it? Maybe buy some other product to get the marketing crap to come off? Of course I respond emotionally, just as if I got a new car and found someone had dumped in the back seat.
What is fun is that I doubt this applies to Server 2012, even though it is based on the same codebase.
People can't make movies, edit images properly, use a compiler, debug, use a nontrivial spreadsheet, etc. on phones or tablets. Until that changes, the desktop PC won't die. PCs might not be as popular as before, nor have the same upgrade cycle, and they may have lost relevance as a growing market, but they are far from dead.
As you mentioned the future is fragmentation, certainly not a monolithic tablet-only future. There are still too many incentives to keep using PCs for many, many usages.
I do a podcast entirely on my iPad, I also record music on it - and in both cases significantly prefer it to doing the same on my computer.
I also sketch stuff (although a decent digitiser would help) and my daughter records and edits films on it as well (she says it's too fiddly editing video on the Mac). I write numerous blog posts and I've written one essay on it (using an external keyboard). I've even used it for coding (well, not really, just as an SSH client to a Linux box, but the built-in 3G plus Mosh made it extremely convenient for doing non-UI work).
I've not had to do any spreadsheet stuff on it, and I can see why that would be a weakness. Nor have I had to do any Photoshop-level image editing (I have done simple image editing).
But for most "creation" tasks I find the iPad to be competent, and in some cases (especially audio editing) to be superior to a "computer".
Of course, none of the stuff is "professional" level creation - but that still fits perfectly with Steve Jobs' cars vs trucks analogy - most people don't need that level of control.
Having GPIO pins to play with also helps with the learning. The Boca Raton model of computing device was a bit stifling for homebrew hardware extension because of having to accommodate the de facto standards of the IBM PC/AT model (RS232 serial, Centronics parallel, CGA/EGA/VGA display, DIN5/PS2 kybd/mouse) to get a hackable connection going.
There are some non-market explanations in the article and I could personally think of many more.
The PC could be way better than it is in technical terms and mostly (but not only) on the software side. The main problem is that once Windows reached the de facto monopoly, it had little incentive to innovate and instead had reasons to stay backwards compatible.
I believe I have an unpopular opinion about desktop PCs. The conventional thinking is that desktop computing is boring because a modern PC does everything it is intended to do just fine. That may be true, but the problem is that the industry is not interested in establishing new usage patterns—new things the PC should do.
At the end of last year, I started a series of rants about how modern technology sucks, with particular emphasis on the frustrating stagnation of desktop computing and the bothersome way every new portable computing device wants to be a center of attention.
I was pleasantly surprised that the author of the linked article hits the target squarely when he lists off what PCs need. The first item: better displays. He may be speaking more about laptops (and they are deserving of the shame), but allow me to rant a bit about my preferred computing medium—desktops.
The stagnation of desktop displays is, and has been for a decade, the crucial failure of desktop computing. Display stagnation is the limitation that allows all other limitations to be tolerated. It is the barrier that leads the overwhelming majority of users (and even pundits!) who tolerate mediocrity to declare everything else—from processors to memory and GPUs—as "good enough." I absolutely seethe when I hear any technology declared good enough (at least without a very compelling argument).
Desktop displays, and by extension, desktop computing is so far from good enough that it should be self-evident to anyone who observes users interacting with tablets or mobile phones(!) while seated at a desktop PC. Everything that is wrong with modern computing can be summarized in that single all too common scene:
1. Desktop displays are not pleasant to look at. They are too small. They are too dark. They are too low-fidelity. And they often have annoying bezels down the middle of your view because we routinely compensate for their mediocrity by using more of them, side-by-side.
2. The performance of desktop computers is neglected because "how hard is it to run a browser and Microsoft Office?" This leads to lethargy in updating desktop PCs, both by IT and by users ("I don't want the hassle"). In 2013, I suspect many corporate PCs in fact feel slower than a modern tablet or even mobile phone.
3. Desktop operating systems are actively attempting to move away from (or at least marginalize) their strong suits of personal applications and input devices tailored for precision and all-day usage.
4. Desktop computers--and more accurately personal home networks--have lost their role as the central computing hub for individuals by a misguided means of gaining application omnipresence: what I call "the plain cloud." This is because no one in the desktop industry (Microsoft most notably) is working to make personal networks appreciably manageable by laypeople.
5. Mobile phones and tablets are often free of IT shackles and therefore enjoy more R&D (more money to be made).
Desktop displays stopped moving forward in capability in 2001, and in large part regressed (as the article points out) since then. Had they continued to move forward--had the living room's poisonous moniker of "HD" spared computer monitors its wrath--I believe we would have breathtaking desktop displays by now. In that alternate universe, my desktop is equipped with a 50+" display with at least 12,000 horizontal pixels.
Desktop computing needs to leverage immersion (without nausea; VR goggles need not apply, yet). Large form-factor super-high-definition displays would bring all manner of new technology needs with them:
1. Gesture controls.
2. Ultra high-bandwidth wired networking (win for wired network folks) to move super high definition files.
3. Ultra high-capacity storage.
4. Extremely fast processors and GPUs to deal with a much greater visual pipeline.
Such a computing environment is a trojan horse for today's tablets: it turns tablets into subservient devices as seen in science fiction films such as Avatar. The tablet is just a view on your application, allowing you to take your work away from the main work space briefly until you return. I say trojan horse, but that's not quite right because I actually want this subservient kind of tablet very much. I do not want a tablet that is a first-class computing device in its own right (even less do I want a phone to be a first-class computing device). I only want one first-class computing device in my life, running singular instances of applications for me and me only, and I want all my devices to be subservient to that singular application host.
For the time being, that should be the desktop PC. In the long haul, it could be any application host (a local compute server, a compute server I lease from someone else, or maybe even a portable device as envisioned by Ubuntu's phone). But for now, the desktop should re-assert its rightful role as a chief computing environment, making all other devices mobile views.
Dynamic brightness range is a necessary step for writing a 3D renderer that makes you feel like you're looking out a window. 256 levels of brightness aren't nearly enough.
We don't need the ability to specify that a pixel should be brightness level 64823 vs 64824. It doesn't need to be that fine-grained. What we need is the ability to overdrive the brightness of specific pixels. That way sunlight filtering through tree leaves will actually give the impression of sunlight filtering through tree leaves.
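To put rough numbers on that point, here is an illustrative sketch (the luminance values are made up for the example, not measurements): when a scene is clipped to an SDR panel's peak brightness, a glint of sunlight lands on the same pixel value as a plain white page, which is exactly why overdrive matters more than finer quantization.

```python
# Illustrative sketch (made-up luminance values): why clipping to an
# SDR peak makes "sunlight" indistinguishable from plain white.
scene_nits = {"deep_shadow": 0.5, "foliage": 80.0, "sun_glint": 20_000.0}

SDR_PEAK_NITS = 300.0  # a typical desktop LCD peak brightness

def encode_sdr(nits):
    """Clip to the panel's peak, then quantize to 8 bits."""
    return round(min(nits, SDR_PEAK_NITS) / SDR_PEAK_NITS * 255)

# The sun glint and an ordinary white pixel both land on level 255:
assert encode_sdr(20_000) == encode_sdr(300) == 255

# To actually *look* like sunlight, that pixel would need to be
# overdriven well past the panel's white point:
overdrive = scene_nits["sun_glint"] / SDR_PEAK_NITS
print(f"required overdrive: {overdrive:.0f}x")  # -> required overdrive: 67x
```

More brightness levels between 0 and 255 wouldn't change this picture at all; only letting individual pixels exceed the nominal white point does.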
Tangentially related, it reminds me that OLED has utterly failed to become a thing on the desktop, and it breaks my heart. It was a decade ago when I read that OLED was the next hot thing and it would bring unprecedented contrast and brightness to displays.
Today, I like OLED mobile phone displays.
But my Dell U3014s are disappointing crystal-over-backlight garbage. Not only that but expensive crystal-over-backlight garbage.
I think in 2-3 years you'll see Samsung Chromebooks and Ultrabooks with OLED screens...
The price will come down with time.
It's not just the number of brightness levels. On an LCD display, if you brighten the backlight by a factor of 50, a black pixel will be about as white as a white pixel with the backlight at a brightness factor of 1. One of the most frustrating things about LCD displays is their inability to completely transmit or completely occlude light. This can be seen most obviously by trying to watch a movie in a dark room.
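The arithmetic behind that complaint is simple (the figures below are typical spec-sheet numbers, not measurements of any particular panel): an LCD's "black" is roughly its white level divided by the panel's static contrast ratio, so raising the backlight for highlights raises the leaked black level right along with it.

```python
# Sketch of LCD backlight leakage (typical figures, not measurements).
white_nits = 300.0       # panel white at a given backlight setting
contrast_ratio = 1000.0  # decent IPS static contrast

# A "black" pixel still passes roughly white/contrast worth of light:
black_nits = white_nits / contrast_ratio
print(f"black level: {black_nits} nits")  # -> black level: 0.3 nits

# Boost the backlight 50x and the leaked black scales with it,
# which is why dark movie scenes glow grey in a dark room:
boosted_black = (white_nits * 50) / contrast_ratio
print(f"boosted black: {boosted_black} nits")  # -> boosted black: 15.0 nits
```

15 nits of "black" is brighter than many dim scenes are supposed to be in their entirety, which is the dark-room movie problem in a nutshell.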
Actually, what is needed is an anatomically defined range - "true brightness" - like the 24-bit "photographic" (trichromatic) range of colors. This would be a range defined by the anatomical limits of the human eye. There is a limit of absolute darkness (relative to the eye), and there is also a limit to the brightness of light that the human eye can safely be exposed to. Within this range there is a limit to the granularity the human eye can distinguish. I am not aware whether such a definition exists, but I am aware that a display respecting "true brightness" is impossible (for the mere fact that light reflected from our faces sheds upon the dark portions of the display, and those dark portions cannot perfectly absorb that external light).
From what I've seen, displays have become a commodity, and that's a good thing. I can go buy any kind of display, choose a size, and I'm probably fitted with more pixels than I'll ever need; the panels are neat, flicker-free and flat, and the best part is that they cost next to nothing. You buy a laptop and you choose the size based on how much hardware you want to carry around — not because you need a huge screen, because all laptop screens are "good enough", as you mentioned. I haven't had a laptop with less than 1440x900 for... a little less than a decade. And the resolution has never been inadequate for browsing, coding, drawing, writing and watching movies, which is what I mostly do.
This is completely the opposite of what we had in the 90's, when a 15" CRT was the baseline, you were always a bit short of resolution, and you never had the money to buy that huge one-cubic-meter display that could do 1280x960 at 60Hz or something, for which you probably had to upgrade your graphics card and probably your PC too. That totally, totally sucked. One could live with the basic resolution and screen size, but I remember the agony of something better always being almost within reach. These days screens are something you don't think about twice. Everything is, again, dreadfully good enough, and if you need something professional you can get that too; it probably won't cost you the price of a small car.
In the last 10 years or so I haven't considered once whether I should soon upgrade to a "better" display or to a laptop with a "better" screen. That's bliss, IMHO.
My next phone will likely be a 5" phone with 1920x1080 given current prices/specs for full HD display phones - at that point I expect my laptop and desktops will annoy me even more. Current relatively low end tablets are now starting to get substantially higher resolutions than that.
I can only say that I don't see the pixels on my laptop screen either and text looks "good enough" (as in, not pixelated and I can see serifs which also look natural) and that the phone would be quite unusable if viewed from the same distance as I look at my laptop screen.
I'll put aside the size portion of the debate since I can't really fathom how anyone could argue against a desktop display that can fill their entire field of view, assuming such a display were available and economically priced. I think those who argue for small displays on the desktop enjoy being contrarians, and they bring up matters of taste and style (such as "I don't want something so large on my desk.").
But in terms of clarity, the argument posed by those satisfied by the status quo usually is composed of these points:
1. Users sit at a distance of 2 to 3 feet from a desktop display.
2. Therefore, high-DPI is not meaningful because the human eye cannot perceive additional clarity at that distance.
3. High-spec IPS LCD screens are good enough.
So the experiment is simple enough:
1. Find an iPhone 5+, Galaxy Nexus+, or Lumia 920+. Something with a high-DPI wide contrast-ratio display. Open up a web site or document or whatever.
2. Do the same on your PC.
3. Hold the phone up side by side at the same distance. Zoom the phone's text to match the text size seen on the desktop display.
4. Behold how the phone's display is considerably more readable. For most combinations of phone versus desktop display, the phone's display will be crisper, have better contrast-ratio, and better color accuracy.
My Lumia 920 (not even an OLED, just a nice high-DPI IPS display) utterly shames my Dell U3014 IPS desktop LCDs.
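The density gap in that comparison is easy to quantify. A quick sketch using the published specs of the two displays named above (1280x768 at 4.5" for the Lumia 920, 2560x1600 at 30" for the Dell U3014):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from native resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

phone = ppi(1280, 768, 4.5)      # Lumia 920
desktop = ppi(2560, 1600, 30.0)  # Dell U3014

print(f"phone: {phone:.0f} ppi, desktop: {desktop:.0f} ppi")
# -> phone: 332 ppi, desktop: 101 ppi
print(f"the phone packs ~{phone / desktop:.1f}x the linear pixel density")
```

Roughly 3.3x the linear density means the phone puts about ten times as many pixels into every square inch of text, before contrast and color accuracy even enter the picture.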
Reading text rendered with high-DPI, high-quality displays at 2 to 3 feet is an absolute delight. Not only that, you can lean to see greater detail. Yes, 2 to 3 feet is the typical distance, but sometimes I like to get closer to my desktop display to work with fine details.
I would pay a steep premium (but obviously I wouldn't break the bank) for a desktop display that matched the clarity of my phone's display. If I could just tug at the edges of my mobile phone and make it magically grow to fit my desktop, I would be so happy.
I have a 22" screen, Full HD, on my desktop and I never complain about it. Maybe you just have a crappy LCD one.
Pretty sure that's the point.
And it's not a marketing gimmick. When you read a lot, having "print quality" text is an extraordinary enabler.
What do you use it for? Some people need better monitors. I hope my doctor has a nice greyscale screen for looking at x-ray images.
Yep, I thought about that monitor when writing my comment, but it was late and didn't want to research it. ;)
We have regressed indeed.
For some reason, a few years ago the PC industry decided that 720p is good enough, as if people only watch movies on their laptops. Trying to squeeze a full-featured IDE into 720p is like looking at the screen through a keyhole.
Let's look at use cases, and focus on the average person (assuming that the gamer market is too small to justify opening a top manufacturing line for large displays).
Use case 1: movies. Even assuming there's value in 4K resolution (which is not certain), you still need to align and improve so many industries to make it work. Really hard.
Use case 2: games. Most games plain folk play are casual games, and playing Angry Birds at high resolution maybe doesn't justify spending that much money. And even if there are ideas for such games, it's still a chicken-and-egg problem.
Maybe the right strategy is to sell 10" retina displays, used at a short distance, to let people experience quality cheaply, build the relevant industries, and then offer large retina displays.
As far as movies are concerned, the jump to 4K has already begun, but I do believe you're right that there needs to be an alignment between the industries. Blu-ray still competing with DVD shows that; hopefully the new console generation will push things far enough for a serious shift in physical media.
Not to mention: when it comes to real productivity, you can't beat the desktop. It isn't the profitable sector for manufacturers, but it's still the productivity toolset.
We just need to make them exciting to the overall ecosystem.
This is one way WebRTC will be really useful (since, IMO, consumer IT is mostly Web or mobile apps). With signalling services in place, we'll be able to run hosts from desktops without registering domain names or establishing fixed IPs. The user-centric, desktop-hosted systems can grow out of that.
Are you kidding? In 2001, it was pricey to get a 15" 1280x1024 LCD monitor. Are we in the same state of affairs today? Would you be willing to wager that the quality, viewing angle, etc. of that LCD monitor compares favourably to today's models?
Yes, prices have (mostly) come down. Yes, we have IPS versus TN. Thank goodness. But in terms of the top-tier specification of displays—resolution—we've stagnated and regressed. As others in this thread have pointed out, laptop resolutions in 2001 were higher than they were in ~2011. I don't own Apple products, but I give them credit for ending the tyranny of HD. Thank you, Apple!
Also, prices are not uniformly better, or at least they were not until very recently when some Korean manufacturers decided to shake up the monitor cartel. The consumer high-end in particular had been stuck with 2560x1600 30" monitors at ~$1,100 for about seven years.
Related rant: http://tiamat.tsotech.com/hd-sucks
Personally, I had to pay $500 for a 19-inch CRT that would do 1280x1024. The screen wasn't even flat; flat-screen CRTs were out of my price league.
Yesterday I bought a 27" of much higher resolution, for $300. So yeah, right.
We can make applications which do what you describe, but it is a hell of a lot of work. It would require a custom affinity system (which desktop is the tablet attached to, how do I authenticate it), a custom server-client system (how do I talk to the desktop regardless of how I am connected, including NAT traversal, etc.), a custom user interaction system (how do I deal with two users using the same program at once?), etc., for every device. Of course the key problem is that tablets, phones, etc. use a myriad of different technologies.
So the solution today, the high-intensity one, is that the application developers spend hundreds of thousands of man-hours writing tens of different versions of each piece of code. This is how Netflix, Google, Facebook, and others do it. These applications have integration across every device: TVs, smartphones, desktops, tablets, smart-boards, cars, etc. This is not cheap, this is not easy, and this is not very useful.
The better solution (one that I am working on personally, albeit slowly) is to build tools that allow us to write one large piece of multifaceted application code against a myriad of idealistic DSLs (one for rendering, one for communicating, one for user identity, etc.), which can then be auto-magically (note the magic) ported to every device. This is what Unity does, as a domain-specific example.
The poorman's solution is of course HTML, but that doesn't work well with current desktops due to NAT.
I very much appreciate what you're attempting to do and I think it will measurably improve matters. However, where I personally differ is on the key point I made above: I don't want applications to run on multiple devices. I want applications to be singular instances available everywhere. If I begin typing an e-mail at my desktop, I want to view my e-mail application on my tablet or phone and see the exact in-progress e-mail—to the letter—available and interactive in real-time. As I type on any device viewing my e-mail application, the letters appear on all other views instantly. Presence information (for the purposes of notification sounds and the like) could follow whichever device I interacted with most recently.
A premise of MVC that we have in large part forgotten is that views can be plural and diverse.
As you point out, countless developer hours are used porting applications to myriad devices. I'd rather conserve that effort. Have the computational guts be on a high-performance, high-connectivity, well-known (e.g., x86) application host. Then only the views need to be made plural, in a manner similar to (but obviously more comprehensive than) responsive web design. More comprehensive because some devices will be touch-enabled, some will be large, some will be small, some with a keyboard, others without, etc.
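That plural-views idea can be sketched in a few lines (all names here are hypothetical, invented for illustration): one model holds the state, and each device is just a differently shaped observer of it.

```python
# Minimal sketch of "one application instance, many live views".
# All class and method names are hypothetical, for illustration only.
class EmailDraft:
    """The singular application instance: holds state, notifies every view."""
    def __init__(self):
        self.text = ""
        self._views = []

    def attach(self, view):
        self._views.append(view)

    def type(self, chars):
        # Input can arrive from ANY attached device; the model is the
        # only writer, so every view stays consistent to the letter.
        self.text += chars
        for view in self._views:
            view.render(self.text)

class DesktopView:
    def render(self, text):
        print(f"[desktop, full editor] {text}")

class PhoneView:
    def render(self, text):
        # Small screen: show only the tail of the draft.
        print(f"[phone, compact]      {text[-20:]}")

draft = EmailDraft()
draft.attach(DesktopView())
draft.attach(PhoneView())
draft.type("Dear Bob, ")    # typed at the desk...
draft.type("see attached")  # ...continued from the phone; both views update
```

The point of the sketch is that only the `render` methods differ per device; the draft itself never gets copied or ported.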
All that said, I do like what you're talking about and building. I look forward to seeing that project come to light!
Personally, however, I maintain that even at the level you describe, the tools just don't exist to reduce the time significantly unless you give up native application feel, i.e. streaming the view and just sending the input back. The web is a great start to such a system, but it still has a lot of hurdles to cross if it wants to be that platform (including performant (i.e. native) 3D rendering, NAT traversal for user-run applications (just switch to IPv6 already!), saving data client-side, threading!, different input and display methods like you mentioned, etc.).
I also have other, fundamental, problems (more like pet peeves) with the way the web is designed to work. But my main motivator is the massive problems I have with programming languages. I should probably start a blog...
It's simply 'good enough' for just about anything that I'd want to do with a PC (and then some). The problems - if you can call them problems, I'd prefer to call them challenges - are to reduce the need for all these interfaces.
The best computer would work like Siri does, only it would be really intelligent. That sort of quantum leap would transcend any mere improvement in hardware. All this eye candy and visual stuff does not let me work any more productively than I could 20 years ago with just a 15" green phosphor CRT. Displays are not the problem.
What I have seen is that it started out being computers (look at the Altair, for god's sake, with its switch panel!), then it became a computer and some 'office applications', then 'office applications' you could also develop on, and now just 'applications.' The best way to develop for a tablet or Win/RT system is on a workstation with some development tools.
What is interesting to me is that the "PC" overtook the "Workstation" (Which was very much a dedicated development device) and killed it. Now as the "PC" market moves to more turnkey solutions, nothing has yet backfilled the void being left behind.
I see that many folks believe the workstation of the future runs a virtual instance on the other side of a network connection; your "terminal" is a couple of specialized applications running on your application-running device. I can easily see Google taking the Pixel and making it work sort of like 'spaces' used to work on the Mac, except when you zoom into your 'workstation' space with your xterm windows, your debugger, and documentation browser, it's really hosted via some RDP- or VNC-like protocol by a service back in the cloud somewhere. It isn't a diskless workstation; it's a terminal with a really rich serial protocol that runs at 10 megabaud.
You claim "But for now, the desktop should re-assert its rightful role as a chief computing environment, making all other devices mobile views."
And I suspect it will, if the price of doing that is better than the price of doing it "in the cloud" (or remoted to the network).
I don't understand. I can buy a very powerful PC at very low cost from any nearby mall. I can build one custom from components I buy at NewEgg or Amazon. I can get a Mac with its great fit and finish. I can choose Windows, Linux or OS X.
What void? Nothing went away. Workstations are better and cheaper than ever. Go buy one.
I would like a better screen, I'm not interested in the Metro interface, and the new MacPro has no internal slots. There's not much excitement happening in desktops/workstations, sure. But they didn't go away.
Let's regress further. Why are margins so slim in the desktop computing market? Well, for starters, aside from Apple, there are no non-Windows desktop vendors worth talking about. And Apple doesn't license their OS.
So the Windows tax has basically crippled the PC market innovation-wise. And Microsoft has done little to promote progression in PC standards in the past 10 years.
Microsoft and Apple have given up on the desktop market (Apple because they discovered they could print money by making smartphones, and Microsoft because they're a monopoly that gets their money, innovation or not).
A sort of super-iMac, if you will.
They would probably be hugely popular among the IT/sysadmin/monitoring crowd too.
If the new Mac Pro is any indication, they might try to do innovative things on the graphics side. As they are in a position to force a boost in the graphics performance of all their hardware, they might be the only ones in a position to do so.
On the control side, I found the Magic Trackpad to be really different and a lot more usable for basic tasks and gestures (I'd use a mouse for gaming and a pen for drawing, but everything else is easier on the trackpad).
Too bad no company seems to be really advancing on the personal networking side. Google would have the hardware and software knowledge to pull it off, but as long as advertisement is their core business it wouldn't make any sense. For now a Synology-like NAS with pluggable apps seems to be the least cumbersome solution for having devices talk together.
To be honest I stopped reading pretty soon, because the valid arguments are buried under a truckload of pointless ranting and "I told you so!".
I have no problem with someone having a strong opinion, but the article is just horrible.
Also, the author's point just seems nonsensical to me. I have a big-ass powerful PC and I love it! I use it for gaming, game development, 3D modelling, and music making. I built it myself from components and set up the OS and software exactly the way I want, with Win7 and Crunchbang (Linux). No other device would be able to do exactly what I want like this.
Maybe most people don't want that, but most people have never been technically minded; if they're happier with tablets then that's fine.
Meanwhile a great desktop lasts longer than ever, is cheaper than ever, and does everything extremely fast. I have a 5 year old desktop that outperforms a lot of laptops out there, including my new Macbook Air & my work laptop (not even a month old), and that desktop pales compared to my half year old desktop (which costs maybe $200 more than the 13" cheapest Macbook Air).
I think part of the shift in the market is due to the great state desktops have become as long lasting devices (& thus declining sales), and some of the improvements on more mobile devices - I'm highly skeptical of any call that the desktop is going away anytime soon though, because the mobile experience is still seriously lacking in the sweet spot of performance, battery life, weight, and price.
I hope this trend does not lead to slowness in PC innovation.
Innovation will be driven by mobile (both phones and tablets). It requires a lot more innovation to build these smaller devices that operate all day on a battery.
That's only true because mobile started from scratch a few years ago. You see innovation just because this market (for "smartphones and tablets") barely existed 6 years ago. Obviously this will tone down very fast as performance stops increasing significantly. You can already see that the latest ARM chips are only marginally more powerful than the previous ones in terms of operations per watt.
So they will also get only "a little faster every couple of years." Do not kid yourself.
Do you actually visit your local PC stores regularly?
Because while it was once not unusual to see full-sized towers on display, these days it's rare to even see a standard ATX case. The typical form factor even for desktops has slimmed down massively, and a large number of the devices are now "all in one" models with the PC and monitor in one.
That's actually why I agree with you about mobile - the performance _most_ people need is low enough that the form factor has been steadily moving towards mobile, and yet sales are still anaemic while mobile device sales have sky-rocketed. With the rise of Android "TV sticks", coupled with increasing support for mirroring/streaming display from your mobile devices, and/or HDMI/MHL ports and bluetooth keyboards, the desktop is going to be redundant for most people within a few years.
The irony is that the desktop trend towards small devices, bluetooth keyboards and "all in one" models is accelerating the demise of the desktop more than it is propping it up, by getting people used to wireless keyboards and not having a big box PC.
Multicore is also bigger in PC's than smaller devices, but that's mostly trickle-down from server-space.
But maybe people will start looking at which jobs require using a computer to get essential work done vs. not needing one and therefore not using it. People who really think that entire generations are not going to need computers to do work are seriously mistaken, especially in BRIC/developing countries. I don't think the question really is whether the PC is dying; the question should be what the hell I can do with a ~$1000 machine other than look at cat pictures. We can thank Microsoft mostly for that. Seriously, I think people really underestimate how turned off the entire industry is by Windows 8; when people need to upgrade, the only reasonable choice is Apple.
Remember, Apple is the only company that is actually increasing laptop sales (the MBA models). Clearly there is a market; it's just not being served by the current parties.
As opposed to what... tablets? Or are you suggesting we leap forward to Computer Terminals right out of the Hitchhiker's Guide?
I couldn't read much further than that. Really bad article.
But just the other day I was annoyed with the fan running while I wasn't doing anything intensive. So I checked my power settings and was surprised that the Min processor speed on battery was set to 100% and the cooling policy set to active. So I changed it. Then a few minutes later I checked it again and it was back to 100%/active. I finally started doing some Googling and found out the f'ing track pad driver was the cause of those power settings getting reset! Several driver updates and a BIOS update and the machine is where it should have been to start with.
I can only imagine how many people out there have this exact machine, which is chock full of Intel's best frequency stepping and power management technology, and it's all completely disabled.
The computers people currently own?
However, I guess I really don't understand why everyone's getting so uptight about this. What we're seeing is not everyone in the world suddenly making a change to mobile for no discernible reason. It's not as though suddenly the winds changed and now the PC is mysteriously dying.
No, what happened is what always happens: someone made a better tool for the mass market to do what they want to do, which is to consume content and be entertained. Magic wireless Netflix screens that fit in their pockets are just better at that than PCs are, and there's nothing wrong with that. It's evolution, and even as a PC enthusiast and power user, I'm more than glad to see people better able to do what they enjoy.
That's not to say I'll be giving up my hand-picked keyboard, large screen, and heavy tower, but I'm not too worried. There's always going to be money in serving the needs of people who get work done on computers, even if it's not quite as much money as it used to be.
That comes across like you do think it is better, if maybe not by a lot. But then you write the following:
> I totally went back to PC, built my own desktop, bought a nice Lenovo laptop and called it a day.
I don’t understand. You hated your Macs so much that you bought 2 new computers? If it was just OS X that you hated, you could’ve simply kept the Macs and ran a different OS. So what was it about the hardware that you disliked so much?
Do they really? IMO most consumers couldn't care less about any of that; it's a tech-savvy minority that wants higher quality screens and SSDs. That's exactly the reason why we are seeing zero innovation in the PC monitor space: the market doesn't really care. It cares about price most of all, which leads to the popularity of low-res screens and slow HDDs in the first place.
I personally see a scenario where everyone has a nice big LCD screen, full sized QWERTY, and probably still a mouse, in their study at home but carry their 'beige box' in their pocket. Just 5-10 years out imho. Unfortunately I think Windows is still positioned best to make this happen.
Have you looked at the Ubuntu phone OS? It's designed to be the same software, hardware, and UX from phone to desktop. I don't necessarily like it, but it shows promise of doing exactly what you say Windows is best positioned to do.
As a user who has been very impressed by the combination of http://elementaryos.org/ and the fact that Valve/Steam has recommitted to Linux, I think Windows is on the downturn and Linux is on the upswing.
I really hope they won't make Microsoft's mistake and start trashing the desktop UI of Steam by giving those users the Big Picture Mode UI instead. Because, you know, "Everything must look and work identically, no matter how different the use cases may be."
Microsoft could have avoided all of this by just offering a small option during install: "Where do you want to install Win8? Tablet or Desktop." And according to that selection, install MetroUI or otherwise leave it uninstalled. But their notion that everything from phones through tablets to desktops just has to have the uni-colored tiles everywhere certainly isn't doing the trick.
A 10% speed improvement a year isn't nothing. The small battery improvement this year? Oh, we went from 6H battery life to 13H.. it's only double, it sucks! Heck, it's better than my smartphone with the screen on.
The rest is on windows, which is an OS, not the OS. (Which isn't even a _bad_ OS, despite the hate for Microsoft)
The market has pretty much plateaued. Pretty much everybody has a PC at home and work. Most households already have multiple computers. Heck, I know entirely non-technical powerwasher/gutter cleaner guys who have 2 or 3 computers. In fact, I don't know a single person older than 10 years old who doesn't have at least one Personal Computer of some kind.
Any commodity off-the-shelf PC will pretty much do whatever you ask of it (at least for most consumers). I used to replace my computer every year or two just so I could run modern software. I haven't felt compelled to do so for the last 6 years and even then I'm 50/50 on doing it. The rMBP my work issued to me is fantastic for virtualization, but unbelievable overkill for everything else I do (mostly email, word and web).
There's just not much of a reason to buy more machines outside of regular replacement rates due to failure and total obsolescence and new humans buying them as they get old enough.
It's not that PCs aren't coming back, it's that the constant growth in the market has plateaued.
Everybody was hoping China, India and Africa would explode as 3/5ths of the world's humanity moved into the middle class and needed computers, but the growth has been far slower than was hoped, and these first-time computer buyers won't be constantly upgrading like previous markets did -- the market characteristics are such that it won't be a simple repeat of the 80s, 90s and early 2000s.
Smartphones and Tablets are an entirely new segment and still growing (though showing some signs of flattening out as well). That's why they're exciting, because those markets are still building out and upgrading. But there are signs that those segments are flattening as well.
Tablets and phones are awesome, but they're definitely not a replacement for a general purpose PC. Even my mother and father, who're quite the luddites, regularly need capabilities that don't work well on a tablet -- like doing taxes. Even if those things were magically fixed and working awesomely tomorrow, they'd still want a bigger screen than a tablet affords.
PCs aren't going anywhere; it's just that the market has to shift to sustaining the market, not growing it (growth being infinitely more expensive, meaning loads more money sloshing around in the secondary markets). This is fundamentally the problem that both Intel and Microsoft are dealing with. Apple escaped it largely because they created new segments to grow into.
Heck, the one new market segment that PC makers did manage to get into, netbooks, they managed to screw up so bad that the entire segment was dead within just a few years. (If you think of where netbooks needed to go as a segment, the Surface Pro would probably be a reasonable outcome, except that market is totally hosed now and Microsoft has to rebuild it).
Also, it becomes even better if you imagine it in Bane's voice.
Netbooks didn't get screwed up, it's just that what most people want from a cheap device is primarily content consumption which is done better by tablets.
The upgrade cycle with phones and tablets might be starting to slow as well.
Oh they did. Every new generation of netbooks was actually worse than the previous one (I don't even know how that was possible, but it happened).
Hard disk space converged on 320GB, RAM on 1/2GB, display size was never seemingly available greater than 1024x768 and battery plateaued at ~5 hours.
I adored my first netbook. My second was blah. By the third I was starting to think everybody in the PC market was certifiably insane. What on earth made them think they should release computers that get worse year on year???
A netbook that gets better every year for $250-350 will have a razor-thin profit margin. If it has to have Microsoft Windows Special Edition on it, 10-30% of the price tag goes to MS. Joe Average wouldn't buy a computer that doesn't run whatever brand name he recognizes, so Windows is a requirement.
A tablet isn't a "computer", so it doesn't need to run Windows in order to sell, it needs to run Facebook and Angry Birds. It can sell for $250-350 and have a decent margin, and then there are so many of them being made that the parts cost drops and the margin gets better.
A Google Nexus 7 now has everything except a keyboard in terms of netbook hardware, but it gives up no margin to MS and has a market big enough for economies of scale.
Meanwhile, whatever Intel is pushing as an Ultrabook can go for $1000-1400 because it starts as a high-end desktop replacement and then gets features shaved away. Except nobody with any sense buys them, and the usual bulk purchasers of laptops that don't make sense (Fortune 500 companies) aren't re-buying machines as often, and when they do, sometimes it's a fruit machine in offices which would never have bought those five years ago.
(Why? because of you, dear HN reader. The alpha suits know that the alpha techs demand the best hardware, and they can see the logos glowing on the back of your screen better than you can.)
>A netbook that gets better every year for $250-350 will have a razor-thin profit margin. If it has to have Microsoft Windows Special Edition on it, 10-30% of the price tag goes to MS.
And if you count all of the payments they get for preinstalling crap, I think it cancels out most if not all of the windows license.
Or they could just install linux. Some did (not enough though).
>Joe Average wouldn't buy a computer that doesn't run whatever brand name he recognizes, so Windows is a requirement.
The entire netbook industry was kicked off by the linux brand name. So that makes little sense.
>A tablet isn't a "computer", so it doesn't need to run Windows in order to sell
Android or iOS are as much brands as windows or linux are.
>A Google Nexus 7 now has everything except a keyboard in terms of netbook hardware, but it gives up no margin to MS and has a market big enough for economies of scale.
Netbooks also had a market big enough for economies of scale too.
You heard it here first.
Bay Trail has the vast expense of cutting-edge foundries and the entire expensive might of Intel R&D behind it, and it just got beaten by a CPU produced with far less engineering and R&D expense on a last-generation foundry.
Doesn't look good for Bay Trail, then.
I cannot shake the feeling, though, that while Bay Trail is competitive and will be very interesting for many purposes (looking forward to getting a small home server based around the new Atoms) - it needs to truly trounce ARM to initiate an industry change.
A rock solid PC in the home connected to a nice big monitor and other useful peripheral devices, is a good thing to have. Be it a compact PC, laptop or desktop, Windows or something else.
"Post PC" is a stupid agenda-driven term. We live in a "post horse and cart" world, but the PC has no inherent limitations preventing it from evolving. If you bother to look, there's currently more enclosures, cases, and interesting "desktop" configuration variety for PCs than ever before, cheaper than ever before.
In short, the article sucks.
This doesn't really speak to market trends, I guess, but making your products physically unpleasant to use probably isn't helping your cause.
I should delete this comment as a pointless rant... but, I'm shopping for 7 year old laptops, dammit. I want to type quickly and pain-free, and also have some vertical context without eyestrain.
I think when a dock for tablets or phone finally happens for consumers, they'll just get it. Desktop mode is not intended for using your fat fingers on a touch screen. "Metro" mode is for that. Windows 8 is all about for when you get off the bus in consumption tablet mode and dock into your desk and your dual monitor with keyboard and mouse lights up and you go into productivity mode.
In 5-10 years, i am pretty sure that the real desktop PCs will be for professionals only, while most consumers are using some mobile tablet/laptop hybrids.
In a sane world, this would be a feature, not a bug. PCs are now mature enough that you can buy a decent machine and expect that it will not be hopelessly outdated in two years. This is a good thing.
The problem is that hardware and software manufacturers have a mutually beneficial relationship whereby new software just won't function without that extra 10% hardware capacity you get from a new computer. Even if it's a word processor or a not-terribly-impressive game. (Remember "DirectX 10 requires the power of Vista", which requires a much faster computer than XP?)
And the other problem is that doofuses like the article's author have been so thoroughly gulled by the planned-obsolescence treadmill that they actually think that's how it's supposed to be, and throw tantrums if this year's hardware isn't at least 10 times shinier and more sparkly than last year's.
Being able to get use out of older hardware is a feature. New hardware not providing any new possibilities is a bug, and that's where we've been for the last 5 years at least.
I think the core point of the article is valid. It's ridiculous that my $230 tablet has a better resolution than nearly every "high end" laptop. And apparently nobody has any ideas for using our tremendously powerful multicore CPUs and near-teraflop GPUs other than rendering increasingly bloated websites.
You mention resolution--once the pixels are smaller than the naked eye can distinguish at typical viewing distance, everything else is polish and marketing. Yes, I know, retina displays are indefinably "crisper", or something. But you can't functionally cram any more useful information on to them, because the user won't be able to make it out.
We're seeing features like this because we've reached the point of diminishing returns on desktops. There aren't going to be any more massive game-changing jumps in raw processing power. In the 90s, we went from Wolfenstein 3D to Doom in a year and a half, and from there to Quake in another two and a half years--all of them gigantic, envelope-pushing leaps in gaming technology. In this century, we've gone from Half-Life 2 in 2004 to...what? Crysis 2? The latest Call of Duty? When was the last time a PC game got the same kind of uproar over graphics that Doom did? The degree of change that used to come once a year is now coming every ten. That's not because everything got boring, it's because we're getting asymptotically close to perfection, at least from a practical home-user standpoint. That's bad if you're in the business of convincing people they need new computers every year, but it's decidedly good if you're a user.
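The "smaller than the naked eye can distinguish" point above can be put in rough numbers. A common rule-of-thumb figure for visual acuity is about 1 arcminute per resolvable feature; the function name, the acuity value, and the viewing distances below are my assumptions for illustration, not figures from the thread:

```python
import math

def max_useful_ppi(viewing_distance_in, acuity_arcmin=1.0):
    """Approximate PPI beyond which adjacent pixels can no longer be
    resolved, assuming ~1 arcminute of visual acuity (a rule-of-thumb
    figure, not a hard physiological limit)."""
    pixel_angle_rad = math.radians(acuity_arcmin / 60.0)
    # One pixel subtends the acuity angle at the viewing distance.
    pixel_size_in = 2 * viewing_distance_in * math.tan(pixel_angle_rad / 2)
    return 1.0 / pixel_size_in

# Typical desktop viewing distance of ~24 inches:
print(round(max_useful_ppi(24)))   # prints 143
# Phone held at ~12 inches:
print(round(max_useful_ppi(12)))   # prints 286
```

Under those assumptions, a desktop monitor past roughly 140-150 PPI is already near the point of diminishing returns, while a phone can usefully go much denser because it is held closer.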
The problem here is Microsoft, not Intel. They never figured out a way to make sure that things looked consistently good on a high resolution display. When the OEMs all decided to piggyback on the TV industry and declared 1920x1080 was "high end", that nailed the coffin.
Seriously though, can someone break down the article? I'm having a tough time figuring out what it's trying to say.
"Things in the PC space suck, and absolutely no one seems to be interested in making them suck less. Instead, they keep putting different shades of lipstick on the same pig and calling it 'innovation'."
Isn't this the same for most industries, and for pretty much everything? Genuine innovation takes lots of effort building things that people don't know they want yet, even IF they are in fact monetizable. And a LOT of things are not monetizable, or face serious difficulties gaining marketshare in environments with deeply entrenched players and expectations.
The history of computing has much more to do with "innovation" in monetization schemes rather than computing itself. Great things get built only when someone discovers some hackish way to get people to purchase it. The monetization itself usually makes the product/technology worse than it would be without the monetization (e.g. DRM, copyright, ads, preinstalled spyware).
For me there has not been enough improvement in CPU features since the Core 2 range was introduced for me to need to upgrade. Sure a Haswell would be nice from a battery life POV but as I am plugged in 99.9% of the time that isn't an issue to me.
But really it was just an angry rant about chips not getting better, even though Intel makes the best chip on the market.
U mad bro?
I just wanted a PC because I thought it would be easier to install a traditional Linux on it, but UEFI. Oh the humanity, UEFI is the most disgraceful scam the industry ever pulled. How could they be so wicked?
I don't want to live in a world where the only good option is a monopoly of Apple machines and software, but the PC industry is not even trying.
2. Users see laptops that run faster on desktop-relevant applications: games, spreadsheets and programs like Adobe CS and MATLAB. The rest don't count, as they aren't a motivation to buy a desktop.
As an aside, Chrome OS devices are gaining a lot of traction... Probably because they're more than adequate for most people's needs as is, and developers can always switch on development mode for a full set of Linux-y features...
That some vendors produce crap is irrelevant. But there are nice offerings out there.
I've seen a Samsung in the shop recently that looked and felt great. They only screwed it up in offering max 4GB RAM and their lineup is so confusing that I don't even know which one I would have to look at online. But that is not a general issue of "Ultrabooks".
However, it seems that this disappeared somewhere, and now a laptop ships with a 1366x768 15" TN screen and rotational storage (with a measly 16 GB "SSD cache" that does pretty much nothing to make it faster), but it is required to have Windows 8 and a touchscreen.
The brand lost its meaning; it went from "Apple quality from someone other than Apple" (so we could buy a nice computer without supporting the walled-garden approach)
This is rather hard, though; we'll see how the Haswell Zenbooks fare. Or maybe the XPS 13s.
To be fair, all OS companies (Apple, Google, and Microsoft) want to go this direction. It would probably be enough to push me back onto Linux.
It's amazing how hard it is to find a screen with a decent pixel density and how expensive these average screens are.
I'm actually considering buying iPad screens off eBay and creating a wall of them to get what I need. How deplorable is that?
The PC needs to become the main machine of the house and to do that it needs to step up its game.
Just try out http://html5dev-software.intel.com and judge yourself what Intel can do in software.
For that I need processors that don't become small suns 20 minutes after booting up.
Please stop reading this. Also the "PC is dead" discussion pops up every 5 years or so.
Have you looked at Windows on 27" Thunderbolt display?
The first time I saw that display with Mac OS X rendering on it, I thought: "Wow, that looks awesome. The glare from the glass is bothersome, but placed right in the room it's totally fine and well worth it for that size and clarity."
Then I went to the specs and saw that the PPI of this monitor was exactly 6 points above my own (1920x1080 at 21.5", 102 ppi vs. 108 ppi). Is it possible for 6 points to be such a differentiator?
Of course, it turned out it wasn't - I immediately tried Windows on the same Mac machine and there it was - everything looked almost as bad as on my own PC.
So I asked the guy who owns the machine (he is a graphic designer) why there is such a difference, and he said: "Microsoft simply doesn't care enough about detail. Check the default icon sizes in the OS - OS X's icons on a regular-density display (non-retina) are 4x bigger (512x512 vs. 256x256 on Windows), and on retina this grows to 16x more pixels (1024x1024)."
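The PPI figures quoted above are easy to check from resolution and diagonal size (the Thunderbolt Display's 2560x1440 resolution is my assumption from its published specs; note the 27" figure rounds to ~109, a point above the quoted 108):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The commenter's 21.5" 1920x1080 monitor:
print(round(ppi(1920, 1080, 21.5)))   # prints 102
# A 27" 2560x1440 panel (Thunderbolt Display class):
print(round(ppi(2560, 1440, 27)))     # prints 109
```

So the two panels really are only ~6-7 PPI apart, which supports the conclusion that the rendering difference came from the software, not the hardware.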
So factor #1, IMHO, is people ditch Desktop PCs because of Microsoft's lack of flexibility and innovation - they decided "If we have a monopoly over desktop computing, why should we care what our users say? They still buy 'it'.".
Factor #2 - Microsoft still has the numbers!!! They have locked the enterprise tight, and still don't care about "personal" user experience.
So we have this giant corporation that gets enough money and has no incentive to innovate and we hope it will bring us the great singularity of software and hardware we have all been waiting for...
And the last factor I will point out is relevant mostly to developers - Mac hardware runs it all. So having both a Windows and a Mac machine doesn't really make sense, both in terms of expense and simplicity of work.
Apple made the, arguably, "evil-corporate" decision to be totally in control of the hardware running its OS. So you, developers, have the option to buy:
- iMac, that sits tight on your desk + Macbook (probably Air) for your work on the go;
- Mini + Thunderbolt + Macbook (probably Air);
- Pro + Thunderbolt + Macbook (definitely Air after what you paid for the previous 2);
- Macbook Pro.
Well, as I see it, most developers are headed for the Macbook Pro with eventually adding a Thunderbolt.
For developers who need to support the Mac ecosystem, this is not even the #3 reason - it's #1. I had to make this transition and I am glad I did, because Mac OS X turned out to be no less than great, and although smaller in size, my 15.4" retina display got me into the 21st century. :)
My problem with Macs (I use one as my main dev platform at the moment) is that their unpolished features are extremely weak. As a technical person doing development work on a Mac, it's a constant headache vs. Windows - VS is ugly, but it's much /much/ MUCH!! more functional than Xcode (it has what... a 15-year head start, though?), even in areas where Apple traditionally excels - e.g. the tab interface in Xcode requires more, less intuitive user actions than the equivalent feature in Visual Studio, and it doesn't allow side-by-side docking of the tabs (afaik).
The other thing about Apple and displays is that they cheat a little and use a very good color profile which is vaguely related to sRGB at best. Try calibrating an Apple monitor - it's basically impossible in my experience, even if you use their 'sRGB' setting. On the other hand, my well-calibrated HP monitor does look like garbage by comparison... This is also a problem for development, because most people not using a Mac or iPhone do not see what you see when you work on one...
It may be popular and "obvious" to accept that Intel and Microsoft own the PC market and decide where it goes, but the reality is that the market makes the demands and Intel either meets them or they don't. I think this is clear when AMD pushed 64 bit first and Intel adopted it. This is also illustrated with the fact that PC sales have declined along with the stagnation of Moore's Law. That last point seems counter-intuitive but it shows that Intel can't force a market if it doesn't deliver.
Now both CPUs and GPUs have been "as fast as they're going to be" for some time now. For some reason, next-gen GPUs are joining the theoretical ranks of "Moore's Law is more a threat to economics and security than a fruit of civilization," giving us 10% YoY speed improvements but doubling up on security and management overhead, added coupling, APIs for compilers only, and dedicating more silicon to hypervisors and management that should go to the programmer and his compiler.
IT has become a completely dysfunctional market at the macro scale. The demand side doesn't know what it wants or how to shop for it, and the supply side is too scared to deliver anything new.
At the micro level, those who know what they want are still taking it one step at a time on their own feet, to their own drummer, but the mess that is the macro market is destroying knowledge and value like a wildfire. The few programmers who know what the Internet should look like, instead of one that's built to be profitable for thing manufacturers, aren't able to keep up with the complete mess big software is making of the collective wisdom of the netizens and the internet infrastructure, both physical and social.