Ok folks, let me know when you throw out your computers. I mean, seriously. I don't want to hear the 100th story about how the iPad changed your life unless you literally get rid of your computer. If the number of people who replaced a computer with a tablet exceeds the number of people who still have one, we can talk about that post-PC mumbo-jumbo. So, how's it gonna be?
Alright folks, tell me about how great this whole "computer" thing is when you throw out your pen and paper. Maybe I'll get on the internet after you all stop talking to people face-to-face.
Doesn't follow. The question is how technology is affecting people's lives, and the answer, for now, is: immediacy and ease of use. For you, the iPad is a Unix machine with no file system and no shell. For a normal person, it is a magical way to share high-quality photos and talk with people across the world. That may not be very high tech, but it's currently holding together human relationships and families that otherwise wouldn't hold together, and ultimately that's what is making an impact.
Yeah, but we don't say that we are in the 'post pen-and-paper era'...
... because pens and paper are still around, and still useful for lots of things, even though some of the things they were used for in the past are now done on the computer.
Just like PCs are still around, and still useful for lots of things, even though some of the things they were used for in the past are now done on a tablet.
But you didn't graph the phrase "post pen-and-paper era" so I don't understand your point. The previous poster's point was that we don't claim that pen and paper are dead. Are there countless uses of the word "paperless"? Sure. What is the connection? I tried to graph "can't find anymore paper because no one uses it" but it didn't yield any results.
But I would say in the 'post mainframe era,' we've gone from a time when most computers were mainframes to a time when most computers are not. Thus talking about the 'post-PC era' implies (in my opinion) that the vast majority of people will only have a tablet.
Surely that would be when the vast majority of computers are not PCs, not that the vast majority of people only have a tablet. Given the ubiquity of devices in the average home that have microprocessors, that do computing jobs that PCs can do, that aren't actually PCs, we've been there for a while.
Now if he'd said that we were in the tablet era, that's another thing.
"those calculators that printed to roles of paper" are called adding machines, or "10 keys". They are still around in a virtual form.
If you have a full size keyboard with the number pad on the right side, in Windows, you can open Calculator|View|History, and you have a digital adding machine right on your computer.
It's an interesting question. Probably won't get a lot of takers here on Hacker News, but my non-programming friends are totally in the "Great, I can toss this stupid PC" camp.
For them, they have never written a line of code, and their work doesn't require them to use a computer as such, but it does require that they communicate, schedule meetings, take pictures, exchange data. For them a 'phone' is not enough and a laptop is a bit too much.
The market for computers has many components. One component is the engineer / programmer / scientist. Their use of computers is like their use of calculators: they always have one somewhere. But there is another, much larger component to that market: the people who are forced to use computers because businesses don't use the yellow pages, or have catalogs, or answer customer support calls, or deal with people in a face-to-face way.
For that segment of people the iPad is a total win. They can look things up on the web, they can order from Amazon, they can keep in touch with their friends, and they can fill out forms etc. They will have a PC in their den or something for a while and when they haven't turned it on in 6 months they will put it out with the other junk at their next garage sale.
You are completely ignoring the MS Office market. Try typing an essay or a report on an iPad or making a good presentation. You certainly can, but it's not even remotely efficient (or a good "user experience" or "magical").
These days the only time I need Office is when some idiot sends me a document in one of their proprietary formats. But taking your point to be the more general "word processing / spreadsheet" market (since I don't feel like the database aspects really caught on mainstream), I observe that there are fewer and fewer documents that are 'word processed'.
What is perhaps more relevant is that I have tools available on the iPad that are at least as functional as using a typewriter to type up something (the pre-Office world). And if I absolutely must type rapidly on the iPad there are keyboard solutions out there. But I don't think I am compelled by the argument "You can't do Office[1] therefore you can't get rid of your PC."
What I do see is people using a device for all of the things they used to use a full blown computer for, knowing that those same people never exploited half of what their computer could do. And much of that has been driven by the notion that there is "one thing" which you use to do this stuff. A wedge was driven in that notion by 'smart' phones, and a bigger wedge is being driven into that notion by tablets.
[1] "Office" here is of course a stand-in for a variety of capabilities which one might classify as 'information manufacturing'
Way too small, requires a fidgety external keyboard... tablets don't look good for general office use, especially when compared to ultrabooks -- the real future of office computing IMHO. Why should I waste my time attaching/detaching/packing a tablet and an external keyboard, when I can just close the screen and be on my way? And all this trouble just to get... a screen smaller than 20-year-old monitors? Most office people don't care about screen resolution; if I had a penny for every monitor I've seen running at resolutions much lower than what the hardware could deal with, I'd be closing in on Warren Buffett. They want BIG LETTERS and LOTS OF SPACE, and don't care how it's achieved.
Tablets are good for certain use-cases, but when it comes to typing to save your life (i.e. your business), they are simply no match and likely never will be. Couple that with the small screen real estate (spreadsheets these days are enormous), and you see how the iPad can be taken seriously only by the detached upper echelons who play golf all day. The rank-and-file will keep asking for (lighter) laptops forevermore.
Bold statement. Keep in mind lots of us were balking at the idea of touch screens for even writing text messages when the iPhone came out. Or jump back less than a lifetime to assumptions about how long it would take to get computers the size of our bedrooms.
Take a step back, and really consider how likely it is that laptops are FOREVER. It might be true, but things change and I think it's historically much more likely that ANYTHING you point a finger at today will be transformatively replaced in due time, than it is that you'll be able to accurately predict a permanent future for whatever an ultrabook is.
I actually don't have any idea what an ultrabook is, but it sounds like a very small idea... not something I'd bet on or invest in for the very long term.
I still hate writing on a touchscreen. It's slow and error-prone. It's only better than not writing at all.
Touchscreens are fundamentally limited by the size of your finger. Firstly because you have to physically press the UI element you want to interact with. Secondly because your finger obscures whatever's underneath.
Of course, you can get around this by decoupling the physical location of the UI element and the physical location of your finger. It's called a trackpad.
Touchscreens are great for some tasks. But they are not great for tasks involving high information densities (writing, spreadsheets, mostly anything that can be classified as "editing"). For those, people will continue to use keyboards and mice until they evolve cone-shaped transparent fingers.
Ultrabooks are MacBookAir-like laptops. Technically the term is an Intel trademark, but it's a widely-accepted synonym for "very thin and light laptop with extra battery life and no optical-disc bay".
As soon as they start to get over the 11-to-13'' form factor (and lower in price), they're going to replace the current generation of work laptops. They'll get thinner and thinner; as batteries and bluetooth-like technologies improve, sooner or later they'll even lose USB ports, and you'll be left with a device that is literally a screen and a keyboard. Some models will have a detachable tablet screen, no doubt, but the heavy lifting will still be done with a real keyboard.
But interestingly, the touch keyboard does suck very badly. It's only because of other factors that I accept the "touch keyboard." But in reality, it's very, very annoying, let's be honest. Pre-iPhone, I texted happily without even looking at the screen. Now, I can never figure out why I keep hitting certain letters that I didn't mean to. I feel like an elderly person typing. And you can google this stuff, it's a very common view. We haven't really innovated with the touch keyboard, we just deal with it. If I had a technology that could give me a touch experience but also a tactile keyboard when I need it, it would be great. Although, I agree with you on the ultrabook thing, not sure about that.
I don't know. What happens when you can dictate your documents? That and gestures and the occasional keyboard might go a very long way.
I don't think it will change tomorrow. But I could see the home market making the shift, gradually. Where the business market will be fifteen years from now we'll have to see.
1. Every office in the world will buy each employee a separate office room so they can dictate their documents without being disturbed by the other employees.
2. Every office in the world will buy each employee a $10 physical keyboard.
I don't think folks will do all their work from home, but I think it will increase quite a bit for professional workers over the next 20 years -- mostly because space is expensive, and working from home is desirable for many people.
Surprised I got down-voted. It's kind of naive. Many programmers that have families prefer to go to work instead. Working at home, as a non-bachelor, you're seen as being able to take care of all the domestic issues (dishes, trash, run errands) and many people would just rather strike that balance. Sure, a lot of people like it but I don't think it's going to be that common. Even our industry, supposedly the most forward thinking, and even in SV, many require working on-premise. If working from home is too liberal for SV, it's going to be too much for the rest of the country for much longer than 20 years.
Well, I didn't down-vote you and I have a family, but my wife worked from home for quite a while and liked it. I think it would be tough when the kids are little, but when they are older it would work fine.
I don't think most folks would end up working from home 5 days a week -- there are still things that work better face-to-face. But I could see folks _only_ coming in for face-to-face time.
I do not disagree, engineers need CAD tools or simulators, architects the same. Take all the professional people and put them in the category "Need a computer for their job." Now create another group called "Don't need a computer for their job." Compare the sizes of the two groups.
I don't think anyone expects that general purpose personal computers are ever going away, but there is a pretty compelling argument that the bulk of computational assets will fall into the 'appliance' category rather than the 'general purpose' category.
As Atwood points out, Microsoft strove to have a PC in every house, and they succeeded! PCs have higher penetration in homes than TVs. However, given tablets and their capabilities, it is looking more and more likely that this trend will stop and that the number of homes without a PC in them will begin to rise again. It's an open question how many homes that will be, but it seems likely to happen.
They do, but they don't have to. Some careers are notoriously averse to evolving technology - I still see HP-12C calculators around the office because many of its users refuse to learn anything newer - and business and accounting are two of them.
There is little stopping them from using newer tools but inertia.
That's because they're exactly what a competent finance person needs for conversational analysis. Spreadsheets have their place but if you cannot talk without a keyboard you are way too deep in the numbers. And the batteries never die, at 3am on day 3 of due diligence in Toledo your phone and its finance app will be dead but that 12C will still be cranking out present values, amort schedules, target return rates and IRR. Those calculators are maybe the most fit-for-purpose white collar tool I've ever seen.
EDIT: I don't care for MSFT, I loathe Word. But Excel on Windows is a genuinely fine tool -- overused by a lot of people b/c the Windows anti-pattern keeps them from proper tools, but still a really useful piece of work. You can't do detailed modern financial analysis without it. The alternatives -- Linux, Google Docs, even Excel on Mac -- aren't even close.
I'm OK with pocket calculators. What I find weird is that inertia has kept the 12C in production long after it became a relic. There are many other business-oriented pocket calculators out there with much nicer and more efficient interfaces than the 12C.
What some courses teach is not financial math - it's "how to get numbers out of a 12C". This is just perverse.
For writing, almost anything. For spreadsheets, unless you need Excel plugins, again, almost anything. For project management? There are a good couple of tools around.
They don't need Office. They are just used to it, much like accountants are used to their 12C's
The problem here you see is that people don't want to spend time learning computers if they can avoid it except for browsing.
Now most students are being trained to use Office at school.
I used to see the world from an "Open Source" programmer's point of view: "You have the choice to use OpenOffice". But after seeing the reality for a while, things started to sink in and it became obvious that people don't want to be liberated the way I thought they ought to be.
People are OK with MS Office. It has become a piece of software that, no matter how hard geeks argue about the Pareto Principle, will stand the test of time, because people have been trained on it regardless of its complexity.
So yeah, from a "feature/functionality" perspective, there are alternatives. But from a skills and willingness-to-learn perspective, there aren't, until universities and/or schools start teaching young people to use something else.
OTOH another argument is: you don't need the other alternatives if you have Office.
I know your use cases may ignore it, but Excel, from what I've seen, is unmatched. There are things you can do in it that are either difficult to do in other spreadsheet apps or require a database.
Its limitations are outweighed by how fast you can cut and dice data, including pivot tables (ie, basic OLAP). Google Docs is still too limited (ie, try filtering and sorting on multiple columns).
If you know of a way I can do this in a desktop or web-app for the entry price of Excel (free would be nice, but I don't mind paying), I'd be happy to know.
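To be concrete about the kind of operations I mean (a basic pivot table plus filtering and sorting on multiple columns), here's a rough sketch using the pandas library with made-up data; whether a scripting tool counts as a replacement for Excel's point-and-click workflow is of course another question:

    # Sketch of "cut and dice" outside Excel: a pivot table plus a
    # multi-column filter and sort. Data is made up; needs pandas installed.
    import pandas as pd

    sales = pd.DataFrame({
        "region":  ["East", "East", "West", "West", "West"],
        "product": ["A", "B", "A", "A", "B"],
        "units":   [10, 4, 7, 3, 12],
    })

    # Basic OLAP-style pivot: total units per region/product.
    pivot = sales.pivot_table(values="units", index="region",
                              columns="product", aggfunc="sum", fill_value=0)
    print(pivot)

    # Filtering and sorting on multiple columns.
    filtered = sales[sales["units"] > 3].sort_values(
        by=["region", "units"], ascending=[True, False])
    print(filtered)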
You're missing the point. You are on Hacker News, you will probably care about such things. My mom doesn't care if you can't filter and sort on multiple columns. She cares if the thing is easy to use and portable, which the iPad wins hands down over a PC.
Most people have PCs at the office or in schools for that. And with the keyboard dock I found the iPad pretty good for word processing (and I prefer doing presentations on it).
Of course at that point it gets annoying to have to take your hand and reach over to the screen all the time, so you probably want to plug in some sort of pointing device...
Now it becomes a hassle to carry all of these docks and accessories around with you, so you think "gee, I wish somebody could invent something that combined my screen, keyboard and mouse into one unit..."
Another point to bear in mind is that whilst people will have computers at work and at school, a great many people, especially students, will want to do work from home.
People do not just base their requirements on what they need 99% of the time, but rather on what they foresee their possible needs might be.
Let's take cars for example: how many people do you know that own medium to large size cars with 5 seats and a trunk, where for 90% of their daily driving the car only contains the driver, possibly one passenger, and an empty trunk?
Jiggy, I completely respect this point of view, but understand you are arguing from the base assumption that people have to spend a lot of time doing that complex processing.
To exploit your analogy (cars are always good for that :-), how many people own trucks these days? There is always a market for a truck; it's the best-selling unit in Ford's inventory. But people who only occasionally move or carry stuff will often borrow or rent a truck when needed rather than own one.
The other interesting part of this conversation is the conflation of cause and solution. Your point 'especially students will want to work from home' is particularly salient. When I was a student the PC didn't exist, yet I still worked fine from home :-) But more importantly, as the presence of PCs in the home grew, teachers gave students the opportunity to do work on them rather than, say, writing it out longhand, etc. But as tablets become common at home and with students, and especially tablet-based textbooks, the opportunity to provide even better tools for students is possible.
Imagine a world where your history homework is to write an essay on some topic in the textbook that is also on your tablet. Writing an essay is pretty trivial mechanically: you don't mouse a lot and you can do the whole thing even from a soft keyboard.
And this comment for me really sums it up : "People do not just base their requirements on what they need 99% of the time, but rather on what they forsee their possible needs might be."
This is spot on, and there are many people who are looking at tablets and deciding that they can cover even their foreseeable needs. And that, for me at least, defines the term 'Post PC': when something other than a PC can meet the extant and foreseeable needs of someone with respect to information communication and manipulation.
I think it depends on what you define as "Post PC", whether we're really talking about "post wintel" or "post keyboard and mouse".
I think the precision and speed advantage you get from a proper keyboard and mouse setup vs a touchscreen, especially when you consider how cheap both of these items are, will mean that they'll be around for some time, even if only as an optional add-on rather than a core part of the whole interface.
I see the "post pc" as more of a rethink of the PC minus the general historical baggage that various versions of Windows etc have carried with them for last 20 or so years.
What I think people are really clamoring for is a better user experience from their software.
Let's take the filesystem as an example.
Years ago it made sense to have an OS with the concept of files and folders being exposed to the user.
Why? Because you needed to be able to group your stuff together, either things for a particular project or because they were of a particular file format (word documents , spreadsheets etc).
You would also have "drive letters" which told you whether it was on the "C drive" (inside the computer) or the "A drive" (portable). This is no longer important since we can trivially sync most things to the internet regardless of where the bits are physically stored. In fact Unix doesn't even have the concept of drive letters at all.
File formats can now easily be replaced by MIME types and meta info, since fast SSDs and indexes can be used to search either the device itself or cloud storage very quickly. This means it is easy to have an app that loads and immediately provides a list of all the documents which it can work with.
The only requirement that is left is the "grouping"; folders are pretty bad for this because you can only group in one dimension. A better system might be some form of "tagging", for example tagging something as "work" or "my project".
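To make the tagging idea concrete, here is a minimal sketch (a made-up, in-memory example, not any particular product's design) of how multi-dimensional tag-based grouping differs from a one-dimensional folder tree:

    # Minimal tagging sketch: one document can live in several groups at once,
    # which a single folder hierarchy cannot express.
    from collections import defaultdict

    tags = defaultdict(set)   # tag name -> set of document ids
    docs = {}                 # document id -> metadata

    def add_document(doc_id, title, doc_tags):
        docs[doc_id] = {"title": title, "tags": set(doc_tags)}
        for t in doc_tags:
            tags[t].add(doc_id)

    def find(*wanted):
        """Return titles of documents carrying all of the given tags."""
        ids = set.intersection(*(tags[t] for t in wanted))
        return sorted(docs[i]["title"] for i in ids)

    add_document(1, "Q3 budget", ["work", "spreadsheet"])
    add_document(2, "Holiday photos", ["personal", "photos"])
    add_document(3, "Project proposal", ["work", "my project"])

    print(find("work"))                # ['Project proposal', 'Q3 budget']
    print(find("work", "my project"))  # ['Project proposal']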
By removing the filesystem from the user's consciousness you have already massively improved usability, and I think this is probably the biggest thing that has made the iPad so easy to use, because it essentially turns the install process for an application into a one-step thing. Combine this with having updates automatically install silently in the background, and an application ecosystem built around the idea of the internet and web as a first-class citizen in the OS rather than an extension of it, and you have a very seamless experience.
The physical "input into the device" landscape though I see as something blown open and I really don't think there needs to be one dominant paradigm here.
While typing on a tablet is certainly not ergonomic, iWork for iPad has a pretty amazing user experience. It actually feels better and more intuitive than the Mac version thanks to touch and gestures.
I used to use my laptop all the time. Browsing the web, social web jerk-offery, entertainment on planes, whatever.
The iPad has pretty much replaced my laptop entirely for casual computing. I barely use it these days. Mostly it gets dusted off for looking at porn, since Flash isn't coming to mobile. And even now, that's becoming less and less an issue as more content distribution – adult and otherwise – moves to H.264.
Laptop is still good for designing things, for writing code, cutting video, whatever. But I'm quite impressed by how thoroughly the iPad has replaced it for the majority of use cases. And that's how it works. It doesn't happen overnight. It creeps. I didn't immediately stop wearing watches when the cell phone became ubiquitous. But eventually it was a no-brainer.
+1. For the most part, the only time I ever use my desktop at home is to manage images from my SLR and... that's about it. 90% of my non-working computing time is spent on an iPad.
I'm curious to see how the iPad comes together with photo management. The retina display makes photos gorgeous to explore. And you've got iPhoto now.
The big problem is the workflow sucks – importing images to the iPad from an SD card can be pretty slow. And fiddly, since you need to hold onto an adaptor. Wonder if there's an iOS client for any of those WiFi-enabled SD cards. Wouldn't help with speed but maybe the ease-of-use would make up for that.
I'm with you - I'm an amateur shutterbug with about 100K photos in Aperture, and my house was recently burglarized - they got my EOS 7D and a bunch of glass. I promised myself I would _not_ purchase a new camera until I could get something directly onto my iPad. I went to London a couple weeks ago with my (ancient) EOS-10D that the burglars couldn't be bothered to take, and I found that even though I had a 28-135 zoom on a fully charged EOS 10D with the lens cap off in my hand, most of the pictures I took were with the iPhone 4S. I knew that they would seamlessly get into my photostream, and editing/sharing them into Path could be done in real time.
Part of the "Post PC" era, that I'm starting to grok, is also entering the "Cloud" era. Data that is not connected, or highly mobile, is much, much less valuable in that world than it used to be.
Huge (HUGE!) opportunity for an SLR with dead simple and fast streaming into the cloud/photostream/iPad.
So the iPad can't be disruptive for computing unless people also stop using their computers completely? It can't be disruptive and, say, replace 80% of the usage for 80% of the people?
I've shared this anecdote several times on HN, but my parents bought iPads instead of laptops for travel.
I know people who email from iPads. It's really pretty common.
And I think the "expensive toy" trope is pretty passe at this point. I have an IDE (AIDE) running on my phone, and since it's a Droid 4 with a keyboard, it's really pretty usable for projects of modest size. And the debug cycle is ridiculously easy, unlike running a phone emulator on a PC.
The fact that you have to jailbreak an iPad to be able to write code you can run locally seems like a defect to me, but that doesn't mean the tablet form factor is a waste of time, just that the iPad is not my particular cup of tea.
And iPhoto, iMovie, and the myriad of music applications provide the opportunity to do all sorts of creative work on a lightweight device that gets fantastic battery life.
I'd say that's more than enough to take it out of the toy category, and into the "very affordable PC substitute for many applications" category.
I used to do small code fixes and push them back to Git with my old N900. This tablet has the capability as well, but somehow I haven't gotten into the habit. Maybe because this device is WiFi-only, so not necessarily online when I would like to clone and tweak some repository...
Well, while we're throwing out anecdotal evidence, my accountant constantly uses his iPad. I've never seen him use a laptop for anything but he takes notes, sends email, and even creates and edits spreadsheets on his iPad.
When you stop insisting that iPads are toys and start just using them, it's pretty incredible what you can do. Can you do everything on an iPad? No. But that doesn't mean you can't do anything.
(And I bet something you've looked at online over the past week has been created on an iPad... hell, this comment has been created on an iPad.)
You can get typographically correct apostrophes on an iPad by holding down on the ' key on the onscreen keyboard. A palette will pop up including curly apostrophes.
Perhaps not you, or many HN readers, but for many the answers to all of those are "yes".
For example, my non-tech girlfriend has had an iPad for 6 months now and the only time I've ever seen her use the computer was for word processing and creating presentations. "Office" kind of stuff.
The iPad isn't designed for users like us. It's designed for casual, everyday users: like your mom.
> the only time I've ever seen her use the computer was for word processing and creating presentations. "Office" kind of stuff.
i.e. for work. Laptops/PCs are for working, tablets are for playing -> little more than expensive toys, in the same way your TV was. Are tablets replacing TVs? Absofuckinglutely. Are tablets replacing work laptops? Not a chance.
Toys eh? Where have I heard that before? Oh yeah, pg likes to invest in things that forum trolls dismiss as toys (http://www.paulgraham.com/organic.html).
You cite content creation as a reason why we're not in a post-PC era, but I think you overestimate what most PCs are used for. What percentage of PCs are used to create online content anyway? Also, how do you know you haven't received an email from an iPad or seen a Facebook photo uploaded from an iPhone? You're extremely dismissive, but the proof will be in the numbers. Certainly post-PC is overhyped as a meme, but tablet sales growth is astronomical. If trends continue for any appreciable length of time, it will be impossible to deny that tablets are dominating every-day computing.
I'll agree that this is the key difference. I'm sure that my ideas of what computers are good for - their value to society - are different from most users'.
Perhaps what really gets me is the idea that the tablet 'revolution' is necessarily leading us to a better place. One day we might look back fondly at the 'family computer' that sat in the living room, where someone discovered a love of programming, or creating movies, or whatever. Stuff that other, less 'general purpose' devices don't do.
I write and edit blog posts and books on my iPad. Some of them have been front page, top ten and even #1 on Hacker News. Are you SURE you haven't read anything that was created on an iPad?
Interesting. In the last two years my mother hasn't sent an email on anything but an iPad, and my dad does it entirely via his iPhone. Perhaps they're just unusual that way.
Unless you're checking email headers, you just might not know if you've received emails from people using an iPad. I send dozens of emails weekly from my iPad 1, both directly through the Mail app and through online apps like Desk.com. And this is doing real work: tech support, dev communications, and more.
And TV? I think I watch video on my iPad perhaps once every couple weeks. And then it's likely just a 3 min Vimeo clip. My iPad is absolutely a work device; it's in fact my full on-the-go work solution.
My parents, who are 70 and 66, both primarily use an iPad for computing these days. My mother's main email client is her iPad. When they went on vacation late last year, both brought their iPads along to allow them to communicate, find things, etc. It's gone way beyond expensive toy.
> Have I ever received an email from someone using an iPad?
I have. My father, my mother, my aunt, my best friend, one of the most skilled software engineers/architects I know, an engineering manager I work with, several members of mailing lists I'm on... Oh, and I send email from my iPad occasionally, though I'm still more attached to my laptop than most people. I've been too busy to adapt my workflow.
Before television came along, people were going to movies at least once a week. Movie theaters still exist and people still go, but it's nowhere near the same.
For post-PC, I think the important metric is how much more time are people spending on their tablets and phones vs. their PCs rather than whether or not they still own a PC.
Exactly. I'm only using my laptop when I'm actually writing code. All the rest of the computery stuff (like HN) happens in a comfortable chair with a tablet.
Eventually either somebody invents a great way to write on these devices, or I'll finish my visual programming project (see http://noflojs.org). Then (and only then) I can see my laptop usage starting to near zero.
That looks interesting. Most of the visual programming tools I have seen or used thus far are unusable or inefficient compared to writing code, but that actually looks like something I could see myself possibly using.
When I think about a program in my head I basically construct it as a huge mental diagram anyway and then translate it into code. Perhaps detailed visual design is the way to go.
Visual programming has been around since the 70s and has never really taken off because it requires lots of screen real estate. The conciseness and semantic density of text will never be beaten.
You don't discuss philosophy with flowcharts; why should you code that way?
Well, we have vastly increasing screen resolutions, as demonstrated by the new iPad, not to mention easy zooming etc.
I'm not necessarily suggesting that your entire program will be converted to a diagram overnight, but it might be an interesting way to declare general structure.
I've designed algorithms via OmniGraffle similar to this, though had to manually write the code later. It's an interesting way to do things, and it does seem to make designing complex things easier.
Yet almost every software project I've been on has started with boxes and arrows on a whiteboard. What if these boxes and arrows were the executable application? And what if you could go to a running application, see how it is connected, and how data flows between components?
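As a toy illustration of that idea (this is just a sketch of flow-based thinking, not how NoFlo itself is implemented), the "boxes" can be small components and the "arrows" an explicit piece of data that the runtime walks:

    # Toy flow-based sketch: the wiring between components is declared as data
    # ("the diagram"), and running the program just follows that wiring.

    def read_lines(text):                 # box: source
        for line in text.splitlines():
            yield line

    def count_words(lines):               # box: transform
        for line in lines:
            yield len(line.split())

    def display(counts):                  # box: sink
        for n in counts:
            print(f"{n} words")

    # The arrows: an ordered list of connections between the boxes above.
    graph = [read_lines, count_words, display]

    def run(graph, data):
        stream = data
        for component in graph:           # follow each arrow in turn
            stream = component(stream)

    run(graph, "hello world\nboxes and arrows as a running program")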
False dichotomy. "Post PC era," at least as this article is using it, means "the time after PCs became commonplace in the vast majority of homes," not "the time when people started getting rid of their PCs." The rise of a new type of device does not require the extinction of an old type of device. No one would deny that automobiles have replaced horses for transportation, yet plenty of people still own horses.
The iPad doesn't have to replace a computer to make a big difference in computer sales. I bet there are large numbers of people who already have computers but that aren't upgrading them like they normally would because the iPad serves their needs. Or not buying additional PCs (one for the kids for example).
The iPad is sort of like buying a scooter when you live in an urban area. If you already have a car you might not get rid of it, but you drive it a lot less, and when it comes time to replace it, whether it's worth it is now a conversation instead of a sure thing.
I think this is partly because there hasn't been a compelling reason to upgrade your PC for a while unless you are really into PC games or are doing some very intensive work.
If you bought a new PC 6 years ago it probably has a dual-core CPU, 2GB+ of memory, a 100GB+ HDD and some hardware-accelerated graphics.
In other words enough to comfortably run Windows 7, Office, Chrome and even Photoshop.
There are still plenty of workplaces around where the computers are predominantly Pentium 4s.
In the last 8 years I've gone through three laptops (MBPro, MBPro), six phones (Palm Treo, iPhone, 3G, 3GS, 4, 4S) and three tablets (iPad, iPad 2, iPad 2012).
At work, for that entire 8 year stretch, I've done 100% of my network diagrams/PowerPoints/boring office stuff on a Dell Precision 650 with a grand total of 4 GBytes of RAM (that was a lot back in 2003) running (and still running) Windows XP.
I think _that_ has a lot do with why AAPL is now roughly double MSFT.
Linux has traditionally maintained support for older hardware - through driver support, forums to answer questions, and Linux installations that are snappy even on old hardware.
Actually, I see desktop computing (including laptops) hitting a plateau already just from being good enough (factor in Windows XP's decade-long lifetime).
Arguably if desktops are plateauing, embedded devices will follow a similar trajectory on a shorter time scale - all the hard-knock lessons are already known.
Ultimately the year of the Linux desktop won't happen - but businesses will face fast-shrinking margins for desktops, laptops, and per-seat software licenses.
Seriously. And on a related note, I don't want to hear another word about how the "television" is going to be a big deal unless it's from somebody who's already thrown away all of their radios.
In fairness, if anyone pronounced that the advent of the TV was going to herald the "post radio era," I would say they turned out to be mistaken. People don't listen to radio the same way they did before TV, but it certainly didn't go away.
If the PC remains as ubiquitous as the radio did, it won't really be a "post PC era", it will just be an era with a greater diversity of computing devices.
I agree, and additionally would like to know: how much tunnel vision must one have to believe that the next generation of a single successful tablet, in a sea of tablets that are failing, denotes that we're "post-PC" because the PPI has been increased?
More than one third of US households have more than one PC. It’s probably a good assumption that most US households with iPads will keep at least one PC, but it’s very possible that many households are replacing their second or third PCs with iPads. After getting an iPad many households might not replace their kitchen or living room PC once it breaks. Maybe kids will get an iPad instead of a cheap laptop.
It would actually be very interesting to look at that question more closely. I’m not sure whether there is already data out there.
I think it’s important to emphasize that for the iPad to supplant PCs, people don’t have to ditch every PC they own. Many own more than one.
In the past, I always had a Power Mac / Mac Pro. It was a needed item for some of the stuff I would do. In the last couple of years, I have switched to a 17" Macbook Pro. I expect in a couple of years, I will be down to a MacBook Air with a nice external monitor at home. The form factor of my needs is getting smaller.
My Dad is now retired. He was a big PC guy and had various PC portables not a few of which were tablet convertibles. Today, he has an iPad and doesn't use anything else. He can e-mail, look at pictures, browse the web, and keep track of his bank accounts. It also makes for some decent entertainment for his grandson (my nephew).
We both started at much different places, but the Post-PC era has arrived for him, and I am one of those people who needs the trucks just as some still need bigger iron.
What people are replacing is their upgrade cycle: they buy an iPad instead of a new PC, and just hang on to the older PC for a while longer. Many of my non-techy friends are doing this.
My girlfriend's computer broke a few months back. She bought a tablet, and has no plans to get a laptop anytime soon. Not saying this is happening in droves. But it's slowly coming to pass.
Let me know when I can sync music and podcasts from my music library to my iPhone without a desktop; then I might consider believing this post-PC mumbo-jumbo.
If you want the benefits of iTunes in the Cloud for music you haven’t purchased from iTunes, iTunes Match is the perfect solution. It’s built right into the iTunes app on your Mac or PC and the Music app on your iOS devices. And it lets you store your entire collection, including music you’ve imported from CDs or purchased somewhere other than iTunes. For just $24.99 a year.
I think Atwood is right, PCs are on their way out simply because with tablets there is so much less that can go wrong. He mentions tech support in the article: THAT is why they will win, regardless of whether or not they will fully replace PCs, or whether or not they are better suited to writing emails. Millions of people who can't be bothered with anti-virus updates and the like will just find it a much more pleasant experience: "it just works" indeed.
(I'm a Mac user and I find myself spending a lot less time on tech support type issues than back in the day when I was using Linux and Windows, still it's hard to deny that iPads are currently the epitome of simplicity, compared to their power.)
And that's what I'm worried about: "No user serviceable parts inside". I'm concerned the very reason tablets will take over the PC market will also mean that tomorrow's kids' experience is very different from, and I would argue poorer than, my experience when I got my first computer, a Commodore VIC 20.
Ironically, my VIC 20 also was very user friendly. You turn it on, it's on. You put in a diskette -- well, it didn't load automatically but making it load was a very simple incantation, and once the game started, it was started. Things were simple and worked, for the most part, quite well. No tech support needed. But the BASIC shell was there right at your fingertips -- hell, the manual came with example BASIC programs.
Quibbling over the merits of BASIC aside, it was a simple experience but it was also tremendously open.
Things got much more complicated since then; network connectivity in particular introduced previously unknown threats to users, so of course this is a simplified comparison. Still, I wonder, is giving up the freedom to tinker with the system really the price tomorrow's consumers will have to pay in order to buy simplicity?
And if you're thinking about replying with links to IDEs running on the iPad: no, something that runs in the cloud and has no direct access whatsoever to the internals of the machine does not qualify as a modern-day replacement for this experience, no matter how sophisticated.
I think the whole question of Tablet vs Computer is pretty dumb in the first place. It's not a versus, it's a peaceful cohabitation; only some evangelists and/or journalists want to push us to believe there's a war, but that's really not true.
Tablet and PC simply correspond to two totally different uses of media. A tablet is consumption-centered, whereas the PC is able to produce complex goods and services that then can be consumed on a tablet. Will the tablet push PC usage down to a very low level in some areas? Probably! Will it replace a PC for every usage? Are you kidding me: development, 3D effects, advanced audio processing, advanced photo processing. That's just as stupid as saying that Photoshop is dead because Instagram can do sepia...
PS: I will absolutely never buy a tablet, I fight every day to stay on the creator side of society, with a tablet I would just give up!
It's not a versus, it's a peaceful cohabitation; only some evangelists and/or journalists want to push us to believe there's a war, but that's really not true.
I wouldn't call it a war, but Atwood (who, by the way, is neither an evangelist nor a journalist with an agenda) makes the point that tablets are about to take over large shares of the PC market. And my point was that this way, people (non-hackers) get even further removed from the "close to the metal" experience that I and others in my generation used to have.
With all respect, I think you and dozens of other posters in this thread are missing the point by "defending" the notebook and saying it won't get fully replaced. Of course it won't, but that's not an interesting question. The interesting question, in my mind, is: how can we avoid the TV-ization of computing? How can we make sure future generations don't equate a computer with a consumption device rather than a productive device? How can we make sure they can find out that this "magical device" isn't actually driven by fairy dust? Whether or not some share of the population uses notebooks, when the default for the millions and billions of non-hackers out there is to use a tablet, doesn't matter very much at that point.
Fortunately there are initiatives like the Raspberry Pi. It's telling that one of the main initiators of that project, David Braben, is also an old-school guy who grew up with computers in the 80s. Or so I'm guessing -- at any rate, he wrote some of my favorite 80s games, Elite and later Frontier. But I fear that if you plot the Raspberry's sales curve into Atwood's graph it won't look anything like the iPad.
It's maybe a philosophical question, but who doesn't have an agenda? I mean, when you buy a device, don't you want to convince others that you haven't bought a stupid toy or a useless object? Everybody has an agenda; even someone who just buys an iPad has incentives to convince others that his device is great. The only ones who don't have an agenda are the ones who genuinely don't care.
I have an agenda, I am trying to push people to see computers as something other than a tool to consume media and slack around with. But given the current design of tablets, especially Apple's iOS devices, it's getting really hard to see them as anything other than a "pure-consumption" product.
To continue on the philosophical level, I think humans deep down aspire to do very little while receiving a lot of pleasure. The most productive of us have managed to channel this desire towards building systems that, once they are working, will provide us with a lot of rewards for little additional work (the startup!). In a way the self-driven programmers have achieved a post-slacker era in their minds, and god knows how hard it is to keep that up. The iPad/iPhone/iPod Touch/Android phones are enablers of the slacktivist part in every one of us, and that's one of the big reasons for their success.
Sure Raspberry Pi is great but, again, it's in the realm of post-slacktivism and thus it will probably NEVER be popular. It doesn't mean it won't be successful.
I think tablets will replace "PCs". But I think it won't be as bad as you think: you'll still have your dev environment, your mouse, keyboard, and touchpad all working in harmony. Here's a snippet from another post I made (discussing mobile vs desktop OS's), hope it provides some alternative viewpoint:
"I'm pretty sure this is the direction all OS's we will be using are going to end up in the coming years. Some people will cry and yell and holler that there is no way something can be made for both keyboard, mouse, and touch screen. I see this as simply a failure of vision. That's absolutely the way things are going to be going, and it's coming sooner rather than later. (I actually think the OSX launchpad is pretty close to allowing this implementation) Devices are getting smaller and smaller. I use a Mackbook air. but guess how it gets hooked up at home? That's right, wireless keyboard and mouse, nice big monitor, I never see the actual computer. It's only a matter of time before my laptop gets replaced by a tablet that has comparable hardware specs. The OS allow for my normal desktop interfaces, along with a nice touch screen interface. I'll use it to pick out movies to play on my TV, from my couch. I"ll use the same device to write code at my desk (with big monitor and keyboard). My kids will use it to play video games, both mobile, and on the TV. I'll use it to send email in the backseat of a car moving at 65mph. (As a wireless comms guy, I fully appreciate the technology it takes to perform that last action.)
But the bottom line is, It's going to happen. Sooner rather than later."
I guess my point is, mobile and desktop can blend perfectly. Those that don't need a monitor or keyboard won't get one; devs and graphics guys will. Technology will keep shrinking the size of processing power. Could you ever have imagined something as small as an MBA being a full-on computer?
Last bit, and my only reservation about this whole "movement", is that I hope they keep it more open, more like OSX, vs an overly walled garden à la iOS. But the signs are there; I think it can be done, and done very well, it's only a matter of time.
edit: It seems your biggest problem is that these devices are built for consumption rather than creation. And I'll agree, that's my biggest reservation about the mobile concept. I guess I could have been more clear about that. I'd like my iPad (not that I actually have one) to be more like OSX, where I can hook up a monitor and keyboard and go to town on my OS, while being able to switch into the "launch pad" mode while mobile.
Apple's devices (and other embedded devices) will never replace PC's - and exactly for the reasons you state. Thus the Raspberry Pi and the PC (PC platform) have a bright future.
Well, things will need to be produced somehow; I imagine this will still be done with some form of computer. Whether or not you keep the keyboard has little to do with the "freedom" of the device.
The question is what the difference will be between the consumption devices and the production devices.
Will the devices for creating content become specialized pieces of equipment, priced out of reach of the general market, so that you get to touch one for the first time when you turn 18 and start your computer science course? Or, with the increasing viability of virtual machines, will you simply download a program that gives you all of your power tools inside a sandbox, so that if you mess it up you can just re-image and go again?
There will also be as you suggested "hacker hardware" like the Raspberry Pi, I think the key for these is to make sure the costs are low and that they are available to kids in school.
The "hacker spirit" can overcome many obstacles.
Hell, in the 70s nobody had a computer at home, and the original hackers used to break the law by breaking into companies' computer systems via telephone lines just to play around with them.
Thanks for this optimistic reply. You're right, the "hacker spirit" has proven time and again that it can overcome any obstacle. Those who explicitly seek out open hardware will always be able to find it.
My concern is just that the lure of simplicity (and parents' paranoia) will mean that kids will be more likely to end up with a closed system rather than an open one, and consequently deprived of the ability (and, more importantly, the inspiration) to tinker. But maybe you're right that hacker souls will always seek out systems that allow them to do what they want to do, and it won't make a difference in the end. I hope so!
Yes, I think inspiration is the key here.
I was largely lucky in this regard though, I had very liberal parents who let me have a computer that I could mess around with as well as unfiltered internet etc.
Most people in my peer group at the same time were only allowed limited access to their home computers and often were not allowed to install any programs on them etc.
Really though, I think it is in the government's best interest to make sure that kids are inspired to tinker with things if we want to stay competitive with the BRIC countries in terms of creating and innovating.
I can only speak as a British person here, but I think that from Alan Turing to BBC computers (and now the Raspberry Pi), the "hacker mentality" is very much a part of our national DNA and it would be a great shame if that were lost.
I wouldn't call it a war, but Atwood ... makes the point that tablets are about to take over large shares of the PC market.
No he doesn't. He says that everyone now owns a PC, so innovation is moving to post-pc devices.
I think the term "post-pc" causes many geeks to project a vast ethical struggle onto the tablet market. Chillax, game consoles didn't kill off programming, this is no different.
Personally, I prefer the term "portable device" but I guess that's not as link-baity.
>I think the term "post-pc" causes many geeks to project a vast ethical struggle onto the tablet market. Chillax, game consoles didn't kill off programming, this is no different.
The iOS locked-down norm will likely migrate to the conventional computing world. Mountain Lion's Gatekeeper software establishes a default behavior of not being able to run unsigned software. This gives Apple the power to censor software and is a step towards App Store-only software on the desktop.
> This gives Apple the power to censor software and is a step towards App Store-only software on the desktop.
And that will lead to more open platforms being where consumers find the most innovative software, as well as being where commodity software is cheaper and/or better because competition will necessitate it.
Locking down your platform to prevent crapware that users don't want getting onto it is one thing. Locking down your platform when noncrapware that users do want is also available, but only on someone else's gear, is how you turn Apple back into being what they were before the iPhone: a footnote.
Of course, this does mean those developing for more open rival platforms will have to actually produce noncrapware, not the poor excuse for software many places ship today. It's about time our industry grew up and stopped pretending that shipping junk and charging for it is acceptable anyway.
I would like to see a network enabled virtual machine approved for the iOS App Store, or the basic human decency of allowing sideloading on iOS, before trusting Apple with the OS of the future.
Now, Tim Cook has shown a good track record of undoing the most extreme abuses of Steve Jobs' megalomania (supporting employee charitable contributions, paying dividends to investors, admitting imperfections in factory working conditions), so there is hope that he will do the right thing for humanity with this issue too.
>how can we avoid the TV-ization of computing? How can we make sure future generations don't equate a computer with a consumption device rather than a productive device?
Why do we need to concern ourselves with this? The Desktop already is "tv-ized" for most home users. People who are interested in producing will continue to do so but the majority most likely never will be and trying to make them will only turn them against us further. We should just get out of their way and make it as easy as possible to do what they want to do.
>Will it replace a PC for every usage ? Are you kidding me: Development, 3D Effect, Advanced Audio Processing, Advanced Photo processing...
No. But for a large proportion of the PC market this doesn't apply. They only ever used their PC for email, YouTube, etc. So for what is probably a majority of PC users (ie. not you, me or most HN readers) this is 'Tablet vs Computer'.
To call the argument 'dumb' is dumb. Make your point without attempting to dismiss the argument in such a cheap manner.
I hope the barrier to entry to being a producer will stay as low as it is now. I worry that as PC demand goes down, the cost to get started in programming will go up. I started to learn how to code web sites while procrastinating one afternoon during finals week; I discovered Apache on my MacBook. Getting started was almost accidental, because I had all this great stuff on my computer waiting for me to discover it. Those experiences will be more rare in the post-PC era.
Funny you mention that, but in a world totally dominated by iPads, the price just to have your application available by legal means is $99. When I began to code (around age 12-15), $99 was WAY above anything I could imagine. I don't know if my parents would have allowed me to spend such a sum. Hopefully you can get started and publish for Android for free!
So, who knows what the future holds for the cost of content creation... Again people often ignore the long-term externalities they produce when they make their choices, buying a PC or a tablet being one of these choices.
Are they ignoring the "long-term externalities" or is it just impossible to actually know what they are? Or maybe they just disagree and are putting their money where their mouth is. To even expect a non-programmer to consider this issue strikes me as pretty crazy.
I bought an iPad and I don't think that it's going to affect distribution of software in any meaningful way, besides making it easier for developers to reach users through the app store. For one, anyone can deploy an app to Heroku for free. You can host plain html, css, and js for free. You can learn how to program an iOS device for free with tons of great guides and materials online. When you want to distribute it, there is a hurdle to clear, but (and this might be a cultural thing), $100 as a 12-15 year old is not such a wild sum. It also makes my experience browsing the app store better.
I'm 25. Maybe I was born into a family that was conservative about spending, but I would have had to seriously push my parents for weeks to obtain $60 worth of something as abstract as an SDK (like they'd even know what it is...)!
And families that live paycheck to paycheck often have a computer, but affording a $100 license can be very painful; I was in that situation!
True, but it's only recently that there has been so much good development software available for free.
I absolutely disagree. When I was 12 (in 1994) I became interested in Linux because it had so many development tools available. I thought computers and programs were magical, and Linux/bash/Perl/gcc et al. made it possible for me to learn programming. And Slackware Linux could be obtained for 10 guilders or less.
My generation became hackers through GNU and Linux, just as the generation before used MSX, C64, or a ZX machine with a free BASIC interpreter.
IMO, the sickening development is not so much that Mac and iOS developer accounts cost $99 per year. It's that the world (Apple and Microsoft) is slowly moving to a model where there is a gatekeeper who decides what gets in and what does not. As a bonus the gatekeeper gets 30% of every purchase. I can sympathize with the need to provide a 'trusted' source of software, but it should also be possible to install whatever the heck you like.
What would the Internet be if it followed this model?
I hadn't used Linux in 1994, my first experiences with it were ~1998 but I remember that getting it to install and work correctly with my hardware as well as getting X to work was no mean feat.
So I imagine that would have been quite a large hurdle for somebody with a casual interest in learning to program to jump through in 1994, I also remember paying somewhere in the region of $100 for my Linux distribution then (SUSE I think).
Even at the point you had installed it, you would be compiling binaries with GCC that targeted Linux/glibc etc., so you wouldn't have been able to share many of your creations with the rest of the world, apart from a slim minority of Linux users.
I'm not sure what the best compromise is between having a "trusted" source of software and being able to install what you want.
Most people seem quite happy with the app store lock-in; in fact most people I know who own Android devices are not even aware of its sideloading feature, they just get stuff from the Android Market.
Of course if Apple become over restrictive then they do risk damaging their own ecosystem to the benefit of competing platforms.
Unfortunately there does seem to be more popular support that I had previously expected for the Internet to develop something closer to this model. I recently watched a documentary on cyber-bullying in which groups of parents were calling for an authority from government to be able to control the content of social networking websites as well as remove any anonymity from the Internet.
> I hadn't used Linux in 1994; my first experiences with it were around 1998, but I remember that getting it to install and work correctly with my hardware, as well as getting X to work, was no mean feat.
X was no worry - our machine only had 4 MB of RAM, so I was practically forced to use pseudo-terminals. The learning curve for Linux distributions was not that steep. Slackware, especially in those days, was orders of magnitude less complex than current distributions. I picked up a cheap bargain UNIX book, which was enough to get started.
> Even once you had it installed, you would be compiling binaries with GCC that targeted Linux/glibc etc., so you wouldn't have been able to share many of your creations
Well, sharing my creations with the world wasn't really possible anyway, since we had no internet connection. Besides, for me the magic was in creating a program.
Also, your comment is factually incorrect, since a DOS port of gcc (DJGPP) was available fairly quickly. In fact, IIRC, Id Software's Quake was later compiled with it.
> I'm not sure what the best compromise is between having a "trusted" source of software and being able to install what you want.
Me neither. I see how it is beneficial for some family members and friends to have a controlled software ecosystem. Also, app stores have improved usability a lot.
On the other hand, if kids only get their hands on devices that are controlled completely by corporations, how will the next generation of hackers learn?
I imagine virtual machines, both cloud and local, will become a commodity at that point, so whilst they may not be taking their iPad apart or replacing its software, it could potentially provide a dumb terminal to an infrastructure of disposable Linux instances, all loaded with state-of-the-art FOSS.
I can see them doing things like building mashups of their social data and possibly using the next generation of Arduino-like devices to create real-world interfaces.
They will still "hack"; just their building blocks will be different. Hell, in the 70s you probably weren't a real hacker if you weren't a whizz with a soldering iron. How many of the RoR-type hackers today practice that?
It's really weird to hear someone say that - because I grew up in the 80s with a computer that had a commercial-quality (for the time) assembler built-in! Sure you could buy Pascal or Lisp too, but out-of-the-box you got the same tools the pros were using, and all the documentation with it too.
Well, truthfully, my first computer (an Acorn BBC Micro) had a BASIC interpreter built in. I think you could also do some assembler out of the box, although I never tried that at the time.
Even my first DOS PC had QBasic pre-installed, but it had no ability to make .EXE files, which was what you needed if you wanted to submit your software to shareware libraries.
It seems that at that point Microsoft wanted a divide between "toy" programming languages like QBASIC and "professional" ones like Visual Basic which cost money.
If you wanted to create "proper" Windows software you needed to fork out some cash.
Contrast that with now, where you can download Visual Studio Express along with all the documentation, compilers, etc. for free, as well as a whole load of open-source languages and libraries.
I now make my living as a programmer and I use almost no commercial tools at all. I doubt many people were doing such a thing in ~1993.
Thanks for reminding me why I make more money now than I dreamed possible as a child, yet can't seem to afford anything beyond AmazonBasics and store-brand cereal.
Inflation is such an insidious way to pick the pockets of the working class. I wish the government would tax cash and equivalents instead of inflating the currency.
There's nothing inherent in the nature of tablets or iOS that prevents them from being used for learning programming. It's purely an issue of the availability of suitable apps.
Agree with your point overall, but look at it this way: there were lots of people who were previously using a computer just as "consumers" of media, internet, music, information, etc. They had to get a PC because there was nothing else to replace that tool. Now, for THESE people, a tablet makes more sense: it fits their needs. So the conversion we may be seeing right now is among those who were simple consumers in the first place.
Creators who need to consume and to create will of course prefer to have a desktop or a laptop on top of a tablet. And there may not be much evolution on the desktop or laptop side because there is no need for it: the desktop environment has had decades to evolve, and its concepts are solid for its intended usages.
I don't think the point is that the market will shift entirely to tablets. It is more that, given an upbringing within a "consumer" household, the individuals there will have little opportunity to experience, and gain familiarity with, something that can produce. You and I will continue to use PCs; our kids and our friends will likely be using tablets more and more.
The idea of the general-purpose mechanical vehicle died a long time ago; the idea of the all-purpose computing device should too. While these ARE general-purpose machines as far as their internals go, even with PCs and laptops different tools are required to perform different tasks.
Artists use Wacom pads and Cintiqs to "create", audio experts use mixing boards and MIDI instruments, and 3D artists use even more specialized tools (this is my profession, so I happen to know more about it than the others). A lot of what is happening with tablet computers is that the keyboard/mouse combination, once a first-class input device used by coders and office employees, is becoming a peripheral just like all the others I have been talking about.
Great comparison. If you look at things like GarageBand on the iPad, you can certainly make music with the touchscreen only. But any musician would probably still be more productive using the real instrument as the input device. Same with office work: yes, you can write on the virtual keyboard, but you can still attach a physical one if you want.
> It's not a versus, it's a peaceful cohabitation; only some evangelists and/or journalists want to push us to believe there's a war, but that's really not true.
The issue is: lots of people who never should have had desktops had to get them because there was no other option. It's not that the desktop market will die or go away; it's simply that it will contract to what it should have been all along: a very small home market and a large business one.
The thing we need to look at differently is that there are different types of creator. Damon Albarn recorded a massive chunk of one of the Gorillaz albums on an iPad; David Hockney sketches on one.
There are ways in which tablets have massive creative potential. The interface, for instance, allows a very direct interaction for some mediums, and when you look at something like GarageBand it has a very shallow learning curve (and price point) which will potentially pull in a lot of people who would never have looked at that sort of thing on a desktop or laptop.
I think it's important to differentiate between the things that allow developers to be creative and the things that allow others to be so.
To say I'm on the side of the creator and am anti-tablets is to subscribe to a very fixed and narrow definition of what enables creativity.
It's not impossible to remain a creator while using an iPad (or other tablet) - at least not when it comes to development. There are certainly people who are happily developing on an iPad (albeit with a more powerful remote backend)[1]. I think we are likely to see even more of a shift in this direction as these devices become even more capable.
Ok, granted, it seems to work but when you look at the configuration:
iPad 2 (16Gb, WiFi)
Apple wireless keyboard
Stilgut adjustable angle stand/case
iSSH
Linode 512 running Ubuntu 11.04
Apple VGA adapter
This is basically the same configuration as an iMac! To take an example: most of the programmers I know, when doing serious coding, have 2+ monitors, because it's always good to run something on the side and see the result live, or just because you need to compare two files. On such a small screen that's close to impossible. Also, for a programmer a lot of the wasted time is browsing through long files; with an iPad and, again, a small screen, that's a nightmare.
This is just not really that serious. There is not going to be widespread adoption of serious programmers doing this in anger. I promise. It's not going to happen. It's fitting a square peg in a round hole. This attitude wants me to have this overly complicated setup to have a really horrible version of what I can do swimmingly well with a little Thinkpad or whatever notebook you like. Done and done. No nonsense.
The question is what makes the thing a tablet or a PC for the purpose of what we're discussing. We can probably all agree on the point that a thing with a bigger screen and a keyboard will always be a big part of the mix.
But, something could fit that description while being more similar to a tablet. I think that's the point Atwood is making.
I apologize in advance for the dreaded car analogy but:
> I'm concerned the very reason tablets will take over the PC market will also mean that tomorrow's kids' experience is very different from, and I would argue poorer than, my experience when I got my first computer
I don't think that's any more of an issue than today's car owners having a poor experience because they don't have to worry about setting the fuel/air mixture in their carburetors like I had to with my first car. I think you are right that the closed environment is a barrier, but I live in one of the poorest cities in MA, and yet there is no shortage of computers on the curb on trash day that can happily run Linux/NetBSD/etc.
The young hacker culture will live on.
Ironically, my VIC-20 also was very user-friendly. You turn it on, it's on. You put in a diskette -- well, it didn't load automatically, but making it load was a very simple incantation.
"Simple" to who? You and me? Sure. Random guy on the street? Doubtful.
> I don't think that's any more of an issue than today's car owners having a poor experience because they don't have to worry about setting the fuel/air mixture in their carburetors like I had to with my first car.
Sorry, I didn't express myself very clearly. By "poor experience" I wasn't referring to usability but to the lack of opportunities to tinker with the system and make it do things outside of what its creators intended.
"Simple" to who? You and me? Sure. Random guy on the street? Doubtful.
Surely the iPad is easier to use than the VIC-20. Then again, the VIC-20 came with a manual and it wasn't very long. I'm confident every VIC-20 owner read it and even the dumbest of us didn't have much trouble figuring out what command to type to load the disk.
Come to think of it, I guess I was confusing things: I only had a disk drive later with the Commodore 64; games for the VIC-20 came on cartridges, so no typing was required at all, just plug it in and turn on the power.
Computers are becoming more reliable and less user serviceable in the same way that cars did.
My dad tells me stories about how he and his father used to service the family car, replace parts etc. My father and I used to do the same thing with the family computer.
I hope my future son and I get to play with the family 3d printer.
As systems become more reliable (computers, cars, operating systems), the need for expertise is reduced. That's a godsend for those of us who used to spend days (weeks?) of every year providing computer support for relatives whose systems were constantly self-destructing.
I found this (recent) article in the NYT interesting:
Little details like this are what start to make a big difference:
"“The California Air Resources Board and the E.P.A. have been very focused on making sure that catalytic converters perform within 96 percent of their original capability at 100,000 miles,” said Jagadish Sorab, technical leader for engine design at Ford Motor. “Because of this, we needed to reduce the amount of oil being used by the engine to reduce the oil reaching the catalysts.
“Fifteen years ago, piston rings would show perhaps 50 microns of wear over the useful life of a vehicle,” Mr. Sorab said, referring to the engine part responsible for sealing combustion in the cylinder. “Today, it is less than 10 microns. As a benchmark, a human hair is 200 microns thick."
"hell, the manual came with example BASIC programs."
I think part of the problem with people and computers is that most wouldn't even bother to read a manual no matter what was included in it.
People expect it to be so intuitive that they don't even have to figure anything out. As though this highly complex machine was purpose built for their own expectations. How far would today's typical computer user have gotten on your VIC 20 if they put a diskette in the drive and nothing happened? The incantation that seems so simple doesn't just summon itself.
So there's an entire market for highly intuitive and trouble free systems that is now being catered to. But there are tradeoffs in building these systems. They're not as flexible and can't be used for all the things that a PC can be used for.
I just bought a tablet and have been trying to figure out a use for it that justifies the price (lying in bed watching YouTube doesn't cut it). It doesn't replace my PC, and it doesn't even replace my phone. I don't see it replacing anything. It actually does the opposite: overall it increases the ubiquity of computing devices in people's lives.
Computers of that era were very simple to use.
I never had a VIC-20, but in Britain "BBC" computers made by Acorn were popular, especially in schools.
If you wanted to start the computer up, you pressed the button on the keyboard and it was booted and ready in under a second; ditto for switching it off (no hibernate/shutdown/suspend etc.).
If you wanted to run a program, you put the floppy disk in the drive and typed "RUN", then pressed enter.
I literally learnt to use one at age 6 without much trouble, as did my school peers. Even our aged school teachers could operate them.
They were pretty much impossible to screw up, since the entire OS was loaded onto a ROM chip: no viruses, config files, etc.
Even BASIC was very easy to use, the shell and the basic interpreter were the same thing. I don't think there were many people who didn't know how to do:
10 PRINT "SCHOOL SUXXX"
20 GOTO 10
They were also very extensible machines: you could add joysticks, mice, and modems, and it even had an analog output board you could plug into the back and control robots Arduino-style (we were doing this at age 8!).
The idea of being "computer literate" didn't really spring up until Windows 95 (or maybe 3.1) and people suddenly had to worry about C drives and "programs" etc.
I remember moving from programming text games in BASIC on my BBC to trying to understand OOP and Visual Basic for Windows and thinking "why does everything have to be so complicated?". To be honest, I think if I hadn't had that taste of magic from the BBC I wouldn't have had the will to carry on learning to program.
Yup. I remember back when I was a kid, and I wanted to make the jump from writing console/text-based games to using the full graphical power of the Mac. The transition was brutal: going from BASIC with no pointers to MPW Pascal, which used handles (1), not even pointers, was just impossible. It required a deep understanding of how computers work which the BASIC interpreter available with most 8-bit home computers carefully hid from the budding programmer.
1. Handles, for those that don't know, were, to use the C idiom, double pointers, e.g. char** or Window**. The entire MacOS API was based around them, because it allowed the memory manager to compact memory without having to tell you - it would change the value of the pointer to the memory, but as you only had a pointer to the first pointer, you wouldn't even notice the change.
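For those who never ran into them, here is a minimal sketch of the idea in C. The helper names below are made up for illustration (this is not the real Mac Toolbox API); the point is just that code which always dereferences through the handle never notices when the underlying block moves.

/* A handle is a pointer to a master pointer. The memory manager may move the
   block and update the master pointer; anyone holding the handle is unaffected. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

typedef char **Handle;                 /* handle = pointer to a master pointer */

Handle new_handle(size_t size) {
    Handle h = malloc(sizeof(char *));
    *h = calloc(1, size);              /* the relocatable block itself */
    return h;
}

/* Pretend the memory manager compacts memory: the block moves, but only the
   master pointer changes, so every existing handle stays valid. */
void compact_memory(Handle h, size_t size) {
    char *moved = malloc(size);
    memcpy(moved, *h, size);
    free(*h);
    *h = moved;
}

int main(void) {
    Handle h = new_handle(32);
    strcpy(*h, "hello");               /* always go through the handle */
    compact_memory(h, 32);             /* block moves behind our back */
    printf("%s\n", *h);                /* still prints "hello" */
    free(*h);
    free(h);
    return 0;
}

Locking a handle before taking a raw pointer (and unlocking it afterwards) was the extra discipline the real system demanded, which is part of why the jump from BASIC felt so brutal.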
I remember studying Fortran in college in 1993 and deciding to write a program (using no dynamically allocated memory, because I think we were still studying Fortran 77) that would solve simultaneous linear equations with arbitrary numbers of variables.
The Microsoft Fortran compilers we were using on, I think, 386s, would allocate all memory at compile time, and therefore the more memory requested, the larger the executable. I discovered that this led to serious issues on the 5.25in disks, and I was better off writing BASIC on my C64 for that task...
People are perfectly capable of learning all sorts of fancy computer stuff if they want to. Generally they don't want to. My mom taught me all sorts of stuff about DOS (if you have autoexec.bat, it will get run automatically on boot). She knew that because she had to know it. But now, she doesn't have to know it, she just wants to do something and has my dad fix it. So she barely knows how to use it.
I think the current "PC" paradigm needs to die because the things that we do with our computers are just so different now.
Regarding the "no serviceable parts" issue, there is nothing stopping manufacturers from building systems that are more open. Of course, it may be the case that kids grow up without access to one of these, but that was probably also the case in the 80s for the most part: most of the kids I knew growing up didn't have a computer, but they probably did have a SNES or Sega Genesis.
Of course, what might happen is that all the geeks and hackers buy the "open" hardware and build cool things with it; once the general population sees these things, then perhaps they will want the open version too.
Free-market economics dictates that competition will create innovation, after all, and there is only so far you can lock a platform down without stifling that.
I've been thinking about this for a long time:
"And that's what I'm worried about: "No user serviceable parts inside". I'm concerned the very reason tablets will take over the PC market will also mean that tomorrow's kids' experience is very different from, and I would argue poorer than, my experience when I got my first computer, a Commodore VIC 20."
And I found the answer in automobiles. They are as essential to us as communication devices will be to everyone tomorrow. And yet only a few know even the fundamentals of an engine.
It's a problem society has to tackle by embracing STEM, but it's not as problematic as it seems, IMHO.
I think this is a classic example of us old timers wishing our kids got the same experience as us.
Back then, you HAD to spend time fiddling with the hardware. Today's kids are much more empowered. Since everything just works they can spend their time creating much more awesome stuff than we ever had the chance to do.
The BBC makes a big deal of "digital natives" (kids who grew up with the Internet) and "digital immigrants" (the old folks).
What this model completely overlooks is that the digital immigrants built it, and the digital natives merely use it to watch videos on YouTube and poke each other on MySpace. Kids these days are consumers, not producers.
Atwood's thesis is that the new iPad improved on those things that make tablets useful.
I agree with him that the updated display is the killer feature. But you have to give props to a high performance wireless network architecture too.
That being said, given two tablets with identical specs, I prefer an Android tablet and a more accessible ecosystem. The problem for Android right now is the R&D and supply-chain investment competition: ten manufacturers each investing $10 million in their tablet designs is not nearly as efficient as one company investing $100 million in its tablet design.
I keep hoping Google will address that issue by providing hardware/manufacturing R&D, but it isn't one of their core competencies. When I worked there, folks would call me to get an introduction to the folks designing the next Android phone; I'd chuckle and explain they had to go talk to Motorola or HTC. Perhaps now, with their Motorola acquisition, they will be in a place to make that investment; time will tell.
"But until the iPhone and iPad, near as I can tell, nobody else was even trying to improve resolution on computer displays"
The original Droid came out in Nov '09 and had a 265ppi screen (same as the new iPad). When I upgraded from my 1st gen iPhone to the Nexus One shortly after that and put the two side-by-side, I simply could not stop admiring how incredible the screen was.
Sorry, but it really was just simple economics and we would've gotten high-DPI displays regardless.
I don't think it's about iPhone vs Droid vs Nexus: it's about phones and tablets versus desktops and laptops. Most laptops today aren't sold with a 200+ DPI screen. I think I'd have a hard time finding any. And yet most phones (and tablets, with the new iPad) do have these high-density, better screens. Jeff's point is that the switch to phones and tablets is what enabled that improvement to occur.
The IBM T220/T221 were 22" desktop LCD screens with a resolution of 3840×2400. The T220 came out in 2001. At an original price of $18k they certainly cost an arm, a leg, and half your firstborn. But there you are.
That's true of phones, but we haven't really seen anyone try that with traditional PCs. Even Apple hasn't achieved anything there.
In a broader sense, there seems to be much more innovation in tablets and phones versus traditional PCs. Some of that was sparked by Apple, some of it, such as Droid's high-res screen, was created first by someone else. But where is the comparable innovation with the PC?
"""The slope of this graph is the whole story. The complicated general purpose computers are at the bottom, and the simpler specialized computers are at the top."""
This is a terrible graph. The Mac has always been a premium niche computer. 28 years ago computing was in its infancy; computers were expensive, unnetworked, and of limited benefit to the average household.
Compare this to the iPad / iPod Touch and iPhone. These are mainstream devices that achieved immediate traction. It is unrealistic to compare Mac sales with those of mainstream devices.
You then have the desktop market as a whole. If you compare any single company or brand of desktop against iPad sales, desktop sales would look in trouble. However, if you compare desktop sales to tablet sales it is clear that tablets are still only in their infancy.
BUT TABLETS ARE SELLING FASTER THAN DESKTOPS!
I don't know if this is true, but it doesn't matter, and it doesn't say much about the state of desktops. Everyone has a computer; the market is saturated. No one has a tablet. It makes sense that tablet sales would rocket.
Buying a tablet also makes a lot of sense for people who just consume content. The fact is, though, that the iPad isn't great for productivity. It is far better for consuming content. This is what most people do.
However if you want to program, edit images, write a novel, maintain spreadsheets, make movies etc a desktop / laptop is what you need.
The PC isn't dead and it isn't dying. It has reached a point where people only buy replacements. Now yes, some people may switch permanently to a tablet. That's fine. For the foreseeable future, though, there will be a large market of businesses and consumers who require more than a tablet can provide.
------
The article also touches on how great the new iPad screen is. I don't think it's all that. I walked past the demo of 'the new iPad' twice before asking a sales guy to point out which one was 'the new iPad'. Yes, if you put the screen close to your face you see less pixelation on icons. For general web surfing, though, I saw no perceptible difference. Hell, I don't really see any pixelation on my iPad 2. Perhaps I am not holding it close enough to my face...
As has been pointed out in the replies about 30 times already, PostPC doesn't mean the PC is dead, it only means that the PC market is mature and innovation slows way down as everyone focuses on newer more dynamic markets. The fact that people feel no need to upgrade their PCs anymore yet they are still rushing out to buy the new iPad is symptomatic of that. Surely I'll buy a new PC in the future to replace my old one, but I don't expect my sister or dad to do the same.
I once had a workmate in 1999 whose sentiment was similar to your post, just focused on a different device. We got into an argument about whether the desktop PC market was over or not. He didn't really believe that laptops were more than niche devices: they were too slow, the displays were too small, and most people would use desktop PCs for decades to come. You could never do more than surf the web on one; you couldn't hack serious code with it.
That tablets are for consumption only...really? Diagramming (e.g., OmniGraffle), vector graphic production, sketching, mixing and producing music, editing images, updating a spreadsheet...why not? Many are using tablets for production already. Every year someone says iPad can't do X, the next year someone releases something that does X and it actually doesn't suck.
I often use my iPad 2 in bed (~1 foot distance); the screen is really close and I can see the pixels. The new iPad is absolutely frigging amazing; I'll never look at my crappy DELL/HP monitors at work in a positive way again (yes, I can see the pixels!). Why have we been stuck at the same crappy 1920x1200 resolution for at least 5 years now? Could it be that no one cares about innovating in the PC market because they can't make any money? That a 10" iPad has a higher resolution than my 24" PC monitor is totally Post PC.
Simply wait ;-) With age your eyesight will probably decline, and at some point you will simply stop seeing the difference.
I guess that 70% or more of people in their 30s will have trouble seeing the difference (except for the brighter colors). Of course, in one or two years everything will have screens with more pixels, but past some point it will always be just a marketing thing.
For now, if you look at a screen 22 cm wide with a horizontal resolution of 1280 pixels, and you keep the screen at a distance of 0.5 m, one pixel subtends about 1.17 arcminutes, so some, and maybe even most, people aren't able to distinguish single pixels.
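That figure checks out. A quick sanity check in C, assuming the numbers above (a 22 cm wide panel, 1280 pixels across, viewed from 0.5 m):

#include <math.h>
#include <stdio.h>

int main(void) {
    double pixel_mm  = 220.0 / 1280.0;            /* ~0.17 mm per pixel */
    double angle_rad = atan(pixel_mm / 500.0);    /* viewing distance 500 mm */
    double arcmin    = angle_rad * (180.0 / 3.14159265358979) * 60.0;
    printf("one pixel subtends ~%.2f arcminutes\n", arcmin);   /* ~1.18 */
    return 0;
}

It prints roughly 1.18 arcminutes, within rounding of the 1.17 above and close to the ~1 arcminute usually cited as the limit of normal visual acuity.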
Some people are using tablets for work, and some are doing the same on phones, but for many a computer is much more comfortable. It's a very personal thing: for me a tablet is cool for playing addictive games like Cut the Rope or Where's My Water?, but I will kill anybody who proposes swapping the computer I use for development for any tablet. Likewise, after reading several books on a tablet, I much prefer reading on a Kindle; it doesn't have that ugly glossy screen where I see myself instead of the text ;-)
Higher PPI is more important, not less, as your eyesight degrades. It turns out pixelation is bad for your eyes. The advantage of a higher PPI is that things are clearer, not smaller as some Windows aficionados have come to expect.
Nobody is significantly investing in desktop PC display technology right now as far as I know, not Samsung or LG; it's very stagnant. That's the point: PC R&D seems as good as dead. It's a big shame, I would really love a decent display for my workstation.
Back in The Day, they had some nice quality dot-matrix printers. They were pretty nice, much more flexible than my parents' letter-quality daisy wheel printer. You could print out cool banners with Print Shop! You could even play music on them. I'd guess they were about the same resolution as the iPad1. I don't remember anybody complaining about being able to see the pixels.
When 300 dpi laser printers arrived, nobody looked back. You could almost imagine that your school reports were typeset. It looked so professional! (If only the writing had been professional...) With 300 dpi iPads you can almost imagine you're reading a book. People who are visual will care about this. I can see the subpixel coloring on my 20" monitor. If I turn it off, my antialiased text still has blurry vertical lines. It drives me nuts!
Or maybe the best example (again, back in The Day) was when my friend got his 32-bit Nintendo. He demoed a game and I said, "meh, it doesn't really seem all that better." Then he brought out Super Mario 3 on the 8-bit Nintendo and wow, was it painful.
I'm assuming prewett meant 16 bit, which would be the SNES. The only comparison I've seen people make between the N64/Gamecube and NES/SNES is that sprite-based games aged better than early 3D games.
He might be talking about the N64 (on which, IIRC, most games used the 32-bit instructions because they were faster and "accurate enough" at the time - not sure, though).
> This is a terrible graph.
I suppose it depends on the perspective. From an investor's perspective it's actually a very interesting graph that visually (partly) explains the ongoing explosion of AAPL profits.
The day I can code an iPad app on the iPad is the day I enter the post-pc era.
Or code, test & deploy web apps from an iPad.
Or design & layout web pages on an iPad.
We live in a post-mainframe era even though mainframes are still alive and kicking. But a mainframe isn't required to create, test, and deploy PC software. Once PCs are no longer required to create, test, and deploy code for mobile devices, we'll be in a post-PC era - that is, once PCs are no longer a required part of the general-purpose computing ecosystem.
In that way, I think of the iPad a lot like I think of game consoles. They're not self-hosting (or whatever you want to call it), and they're not designed to be anytime soon. They're designed to be a software/content distribution channel for a hardware manufacturer who takes a royalty of that in addition to selling the device.
I think you're missing the point. The post-PC era means that only tech savvy people need/will use PCs, not that PCs will disappear completely. What I'm really waiting for is for some company to realize this and make a device that caters exclusively to developers.
Part of the claim was that for most people, the tablet (with the retina display) is enough for all the computing they want to do. I think "most people" excludes us - the programmers/makers. We will always need a full general computing machine.
But Steve Jobs certainly saw the Post PC era looming as far back as 1996:
The desktop computer industry is dead. Innovation has virtually ceased. Microsoft dominates with very little innovation. That's over. Apple lost. The desktop market has entered the dark ages, and it's going to be in the dark ages for the next 10 years, or certainly for the rest of this decade.
If I were running Apple, I would milk the Macintosh for all it's worth – and get busy on the next great thing. The PC wars are over. Done. Microsoft won a long time ago.
The fact that Steve Jobs could see this coming back in 1996 is incredible. Add to that that Steve also knew, all through the Tablet PC initiatives of the early 2000s, that it was too soon: the technology wasn't able to properly support post-PC form factors yet, and styluses were a loser proposition. Bill Gates also saw the same forces at play, but he responded in the early 2000s with a failed attempt to move the PC into the next era. I guess that's a natural reaction for the winner of the PC era -- to try and carry it on.
You can see the same forces at play in the audio industry. Audio has gone beyond the early DIY tinkerer days, through a period of standardization and ubiquity, through an era where big, complicated, elaborate machinery was a status display, to a time of maturity where convenience and design prevails and the technology fades into the background.
The problem with audio, is that it doesn't let us extrapolate to the future. What's next? I can see a period where feature-phones increase in capability to the point where they're like the 1st generation smartphones, but with networks that are 1000X more capable -- just in time for widespread adoption by the developing world.
Will the developing world leapfrog the developed world in much the same way that it skipped over the era of hardwired networks straight to mobile? Will they be coming to economic power just as the digital realm has obsoleted many forms of material wealth?
I agree the quote from Steve Jobs was impressive, but it also fits the well-known quote that goes "the best way to predict the future is to invent it".
As happens all too often at HN, these comments are devolving into a pedantic argument about definitions and irrelevant technicalities.
Atwood does not say "hey throw out your PCs". In fact, he opens by stating that PCs are already ubiquitous. This article examines the idea that major innovation is currently happening in the post PC area. He posits that the new iPad display is a killer feature.
This is a much more interesting talking point than "o noes but my Linux netbook is kewler".
It's sort of weird to separate them out. Laptops were for the early adopters. They make up the first 10% of the area under the curve from zero to the adoption plateau of general purpose computing devices.
Pads will constitute the remaining 90%. They are bringing a useful machine to the $500, $400... even $100 market. They, like cell phones did for cameras, are what will make the technology ubiquitous.
So, it's hard to say what is "more revolutionary". If you are more concerned with the invention of the basic concepts, the PC is. If you are more concerned with what were the last big changes required to unleash the technology in a ubiquitous way, pads are.
It might seem like laptops are revolutionary, or were, but useful ones have been around for 20 years, and to what end? They've become progressively more useful in terms of screen quality and battery life, but they have consistently failed to produce any kind of revolution. And why should they? They're just a more expensive version of your desktop computer, that's small enough to carry around.
I'm happy to class this as useful, but the laptop's persistent insistence on being merely a small version of the desktop falls somewhat short of "revolutionary".
Laptops made new types of work and socializing possible. Affordable laptops enabled people to cowork, work nomadically, come together for "data potlucks", etc. Laptops largely replaced turntables and hardware instruments. *pads merely iterate on this initial revolution, making it more convenient and providing training wheels for those not born into technology-influenced culture.
Sure, but "post" implies replacement, implying that PCs are somehow obsolete. It would be akin to heralding the introduction of compact cars as the "post-sedan" era. "Mobile computing" is a more accurate and less marketing-driven way to say "post-PC". Mobile computing hardware, like the Palm Treo, has been around for a while... 3G is the technology that has really helped it come of age.
>> iPad 3 reviews that complain "all they did was improve the display" are clueless bordering on stupidity. Tablets are pretty much by definition all display; nothing is more fundamental to the tablet experience than the quality of the display.
While I agree that the display improvements are noteworthy, I would say that tablets are not "by definition all display." The tactile interaction is as much a part of a tablet's worth.
Yeah, I went to Best Buy to take a look at the new iPad the other day. They didn't have it clearly labeled (iPad 2 vs. the new iPad) and so I didn't even realize I was using the new iPad at first. Granted, I've only used an iPad on 1 or 2 occasions in the past, but if there were some kind of new touch feedback functionality, I would have noticed instantly.
The iPad 3's display is notable mostly because of what you don't see: pixels. If you saw two of them running side by side, I expect you would have seen the difference immediately. However, a technology which simply removes a problem, even when it's a big ugly problem, is often invisible except by comparison. It's still a big deal even though, nay because, it's not immediately noticeable on its own.
iOS devices have had their tactile interaction down since the beginning. They are so responsive compared to Android devices (in my experience) that I don't see where Apple could have improved in this area.
I don't know which Android devices you've used, but I simply haven't found this to be the case. In my experience, the touch response on high-end Android devices (such as the Galaxy Nexus) is indistinguishable from iOS devices.
I'm on a Nexus One at the moment. The touch is good when it works, but it's buggy. The more quickly you tap it, the more likely it is to go into a buggy mode where a big dead spot develops in the lower center part of the screen. It's absolutely infuriating, and it really makes me much more envious of iPhone users than any of the apps do.
If the Galaxy Nexus is really on par with the iPhone (and I'm highly skeptical of that) then I might stick with Android for my next phone, but otherwise I have to go iPhone just based on touch characteristics alone.
Didn't the Nexus One ship with a notoriously poor capacitive sensor? I recall it being blamed for the N1 struggling with multi-touch compared to other phones.
I have an LG Optimus, and the touch response is horrid. The phone works well in other areas, but UI responsiveness and text input accuracy aren't among them.
Not only the tactile interaction. The OS, the API, the user interface in general, the networking and communication devices, the battery, etc. The thing that is "by definition all display" is called a monitor.
Jeff is making the point that [Apple is] "fundamentally innovating in computing as a whole...".
I see the iOS devices (iPhone/iPod Touch/iPad) as just the beginning. Apple has made astounding progress on these mobile devices in areas where their competitors haven't really caught up yet:
1) The display, amazingly higher-res than HDTVs now (2048-by-1536-pixel resolution at 264 pixels per inch (ppi))
2) Connectivity (Wifi/multiple carriers/LTE even)
3) Battery life to weight ratio (9 hrs using the cellular network, at 1.46 lbs) [1]
4) Graphics/Processor punch (Dual A5X processor with quad-core graphics)
5) An ecosystem of applications that are on the whole, of good quality, and generate a lot of revenue (25 Billion apps sold [2])
When I say "just the beginning", I really mean that. And I don't mean that Apple will always be at the top, probably not by a long shot. What I want to ask, and I wonder why everyone else isn't asking is, what's next? Apple won't be at the tippy top forever, so won't the next generation of innovators please stand up?
[1] Anyone remember lugging around their 8+ lbs Dell laptop, that had a battery that lasted maybe 2 hours in 2005? I do. My back still hurts.
I realize I'm straying dangerously close to Get Off My Lawn territory, but I worry about the post-PC era. As a programmer, I worry about turning into a sharecropper on a closed platform [1].
As a consumer, I worry about "buying" a closed, networked device and discovering that I don't really own it, but merely have a licence to use it under draconian terms of service. I don't want to "buy" a book or a movie or a piece of software, only to have it yanked off my device remotely.
Meanwhile, the government of my country (Canada) is rolling out a new copyright law that will make it a criminal offence to circumvent "digital locks", even if it's for legitimate, legal personal use.
However, I still hold out hope that as the technology matures, costs fall, and competitors catch up to Apple's tight vertical integration, it will become more feasible to run a full-featured open-source device with a healthy ecosystem built around it.
I love my iPad 2, don't get me wrong. The thing is, for me the iPad supplements my MacBook Pro: acts as a second monitor, I use it for reading, and it is just large enough to be good for watching movies.
The thing is, most of my computer use is programming and writing. I need a keyboard, support for a term window, Emacs and IntelliJ, TeXShop, etc.
I might be able to earn a living using just an iPad by running a term app and doing development using a remote Linux server with SSH, Emacs, etc. and do writing using Pages, but that would be like running a race hopping on one foot.
I think that "iPad 5" might do it for me however: a larger physical screen size, about half the weight, and great IDE support for doing Lisp, Java, Clojure, etc. I'll wait 2 years and see. I think that this will require new paradigms in writing and programming tools. Mind is open.
My wife is not an advanced user of computers by any stretch of the imagination, but I would never buy an iPad for her.
She has a DSLR and 60k photos in iPhoto (~320 GB). iPads will probably be able to store that much in 3 years, but you will want it backed up somewhere (and in 3 years the "cloud" still won't handle that much (personal) data at a reasonable personal cost compared to a physical device). And she will not be tolerant of doing edits or looking at photos on a 10" screen.
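(For what it's worth, that works out to roughly 320 GB / 60,000 ≈ 5 MB per photo, which is about right for DSLR JPEGs, so those numbers hang together.)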
Not to mention that once you increase the screen size, portability goes out the window -- you will have a monitor. And guess what? Touch input doesn't really work ergonomically on a 24" monitor.
There are solutions, Airplay will obviously be involved, as will BT input devices. But it's not quite ready for prime-time yet.
Everyone's case is different of course, but it's very possible that soon, your wife will be on iPad only.
For large monitors, AirPlay to the TV works great today. I can snap a picture on my iPhone, step inside my house and the next second, show the picture on a 50" TV to all my guests. Instantaneous. I don't have to connect to a computer at all. It even works for videos.
Combine that with a card from Eye-Fi and cloud storage, and I'd say that in two years tops, you'll feel like it's a completely normal and safe way to manipulate your pictures. Use the iPad to arrange them in collections. Show them on the TV. Store them online.
The only thing missing there is editing, if you are trying to do something beyond adding a few filters you are probably going to want either a mouse or something closer to a proper graphics tablet (which is quite different from a capacitive touchscreen).
Yep, the article is right, BUT it's for consumers. The iPad has allowed my grandmother (who is so old-fashioned she doesn't believe women should drive) to pick up a device and video call me on the other side of the world - we told her "It's not a computer, it's a new invention" and she picked it up and invested time in learning it.
Most people here are actually producers when it comes to computers, so most people say things like "I still need to produce (spreadsheets | programs | lolcat images)" and the PC will never go away. But I think this article was talking about your web-surfing, Facebook-updating, web-reading consumers.
Workstations killed mainframes, because you didn't need that stupid command line. You could point and click. But real software engineers would install UNIX tools, or SSH (or telnet) into a server, and take advantage of the pretty display for richer reports.
PCs killed workstations because they were cheaper and their OS was more user-friendly. Unfortunately, "user friendly" meant "dumbed down". You could take advantage of innovations in IDEs, word processors, email programs, and web browsers, but it was harder to install UNIX tools. You could still SSH into a server (which would be a workstation, if you were on a tight budget) and get UNIX stuff done. You could also configure X and do workstation stuff.
Now, tablets are killing PCs, because they are cheaper, more portable, and more "user friendly" (dumbed down). There's probably no way to install UNIX tools. But you can get a virtual server for peanuts (or set up a home PC), hook up a USB keyboard and an SSH "app", and everything is fine. The display is a bit small, but many tablets these days can power an HD display (look at the Raspberry Pi, which uses mobile phone components). Apple may or may not include the right port, but Android tablets can. And you can take advantage of innovations in touch interfaces.
"Very, very small PCs – the kind you could fit in your pocket – are starting to have the same amount of computing grunt as a high end desktop PC of, say, 5 years ago. And that was plenty, even back then, for a relatively inefficient general purpose operating system."
I would argue that it's more than they need, that a service like OnLive is before its time but not more than a few years from hitting prime. I would say get ready for the Post Hardware Era.
That's going to need some serious network infrastructure to back it up. You will essentially have every computer user in the world constantly streaming very high-resolution video for 8+ hours a day.
The 1 GB/month limit most carriers provide is going to be nowhere near enough; you will need more like 1/2 TB per month, and it needs to be provided reliably.
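As a rough back-of-the-envelope check, assuming a desktop stream of about 5 Mbit/s (my number, nothing official): 8 hours a day for 30 days is 864,000 seconds, and 5 Mbit/s is 0.625 MB/s, which comes to roughly 540 GB a month per user. So half a terabyte is the right order of magnitude.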
I think that's completely possible if we change the model of our current infrastructure.
Here in Portugal, our main (wired) ISP has been putting Foneras[1] in every subscriber's home for a couple of years or so, which means that every subscriber they have can connect for free anywhere there's a home with their service. A couple of months ago they launched an iPhone app which can use their network of Wifi routers to make calls - they essentially became a mobile provider using Wifi.
I can see this approach taking place on a larger scale; not only related to connectivity, but to actual computing: imagine that each router was also a beefy server, particularly in terms of CPU and GPU capabilities; these would provide the computing needs for wireless devices connected to them.
Using existing ways of moving the "state" of an application between machines, you could have your wireless devices hopping from server to server as you moved around, without having a great impact on the rest of the network.
This would still mean hardware in each home, of course, but it'd be a black box hidden from sight and chosen, bought and managed by some company.
The crux of the problem is that bandwidth is currently oversold since it is largely unused most of the time.
Let's compare my session here on HN with how it would be if I was streaming my web browser from somewhere else.
Currently I pull down a few KB of ASCII every time I load up a comment thread and then spend maybe 20-30 minutes reading it, during which time no bandwidth is used.
If I were streaming this from a browser in the cloud, I would be pulling down orders of magnitude more data, since scrolling down the page could easily be a few MB versus no data at all.
You could have hardware "basestations" as you suggest, which would be an interesting idea.
Of course, locating these geographically would be a difficult task. You might end up with rural areas where it would be impossible to do any computing at all because no basestation was in range, or you would end up with urban areas where there would be massive demand.
You would also have to account for what happens if somebody decides to have a big party or event somewhere and suddenly the local basestation computer goes from only having to service 10 - 100 clients to suddenly having to service 1000+?
> You could have hardware "basestations" as you suggest, which would be an interesting idea. Of course, locating these geographically would be a difficult task. You might end up with rural areas where it would be impossible to do any computing at all because no basestation was in range, or you would end up with urban areas where there would be massive demand.
Rural would probably fallback to mobile networks and real datacenters, possibly with degraded service. Urban would just be adjusted based on trends - places with higher demand would get more powerful boxes; the same as scaling any other servers, really.
> You would also have to account for what happens if somebody decides to have a big party or event somewhere and suddenly the local basestation computer goes from only having to service 10 - 100 clients to suddenly having to service 1000+?
Assuming a decent infrastructure, you'd only need to have enough wireless bandwidth; computing resources wouldn't necessarily have to be tied to local devices. So a party would just mean that more machines from the neighborhood would help via wired connections (I'm assuming basestations in the same neighborhood could speak to each other using local routers, without having to do a round trip to the ISP datacenter).
Of course, this still means that machines shouldn't be running at 100% capacity, but that's just common sense.
It's certainly an interesting concept with some merit.
However, it would change the game from "damn, I can't get any signal, no internet for me, but I can still work offline" to "no signal, I literally can't do any computing at all!"
So what I could see happening is people at that point carrying around "personal compute servers" with them for just such an event, or at least having one in their home/office.
Also bear in mind that CPUs/GPUs appear to be getting pretty cheap as well as pretty small and you would need one anyway to do the video decompression so having something on your device capable of doing the basic UI etc isn't much of a stretch.
Perhaps you end up with 3 tiers of processing, i.e. your device, your local node, and "the cloud" (i.e. a proper datacenter). The issue is making all of this transparent from a usability perspective.
Oh, I don't think we'll ever return to actual dumb terminals - I think the difference will be that some services will be available and others won't. Or some others will just be scheduled until you get near a basestation.
> I would say get ready for the Post Hardware Era.
That's what they said in the mid-90s as the web was catching on (remember the "network computer"?). And then came Flash and Java applets and Javascript and various other things and it turned out that a dumb web client had to be pretty smart to keep up with the web.
I try never to say that something is impossible, because I don't really know what the future will bring, and convincing myself that I do would be foolish. But I will have to be shown a lot more than what we've seen so far with OnLive before I'm convinced that the future of mobile devices is as dumb terminals.
We've come a long way since then and still have a ways to go before the dumb terminal model comes back... but I think it will happen. I think the problem for that model has been that the only people who have found a use for the power of a full size computer on a mobile device have been gamers, but I think this is probably because developers have never had the ability to build applications that use more than the mobile device's native hardware. OnLive needs to make a development platform on their service.
I was so inspired by this article that I actually went to the Apple store in San Francisco and purchased the iPad this evening. After spending a couple hours with it, I'm certainly impressed by the hardware, but I've already decided to return the device tomorrow.
It's a great couch device, but it's too heavy to carry with you. I don't carry a murse, and I'd rather sling a Macbook Air than this for the extra functionality.
Minor things include it running too hot, and the flip cover leaving striated streaks on the display. In my opinion, iOS doesn't scale properly at this size - there is too much white space between the icons on the home screen, for example, and I would prefer to see widgets to show more information without having to open an app.
Ultimately, whether you're happy with the iPad comes down to how much you buy into the Apple cloud versus the Google cloud.
In all truth, there is very little difference between the iPad 2 and the iPad HD. If you squint and peer very close to the screen you can see how high quality the screen is, but looking at the device from where it rests in your hands it's almost impossible to tell the difference.
Just today in the IT department (this is why it's good to occasionally work in the office): "So my old PC at home finally bought it. I've seen you're all using Macs (this guy is observant) and I've been wondering if I should get one of them. Or could I just get an iPad instead? I've been tempted to get one of them anyways".
Turns out the iPad does everything he and the wife want, which is mostly surfing the web, listening to iTunes[1] and getting pictures from their point-and-shoot camera to someplace safe. Remaining issue: printing the odd document. But you can get an AirPrint enabled smearjet for £35.
Our recommendation: Get the iPad now, see if it does everything you want. If it does, get another iPad for the wife instead of that Mac you'd share between you.
[1] Explaining iTunes Match made his eyes light up: "So I won't need to back it up[2] to a USB disk anymore? And I don't need to buy the 64 GB iPad to fit it all on?" (Hey, it's like someone designed this service for people like him).
[2] Most casual computer users seem to do this these days because they've already learnt the hard lesson of losing all their stuff once.
Two years ago I bought my mom an iPad. I was hesitant to do so because I wasn't sure if she would actually use it. A year ago I bought her a new iMac (upgrade from a previous gen. iMac). She likes it, but now considers the iPad to be her main computing device.
My 5 year old son spends 90% of his supervised 'computer time' on an iPad. He doesn't understand the concept of a pixel, or why it's good that we don't see them. He has an older iMac in his room, but still prefers using the iPad as his 'main computing device'.
Talk about burning a candle at both ends! The term 'desktop' is archaic when thinking about a computer as a tool. Designers and engineers on the forefront of technology are spending their time creating devices for the 'Post PC' generation... not creating a better 'desktop'.
Reading some of the comments here, I cannot help but picture the accountant who still keeps his numbers in a [paper] notebook. At least they still make pencils!
PCs will stick around, of course. The worry for me is that when PCs serve only engineers and content creators the production scales will diminish, and prices increase.
For me, the time per day I spend on a PC has not changed since getting a tablet. However, I can count on my hands the number of times I have bothered to turn on the TV since getting the tablet. The tablet has replaced the TV for me.
Apple already announced the Post-PC era. It'll be a slow transition for a couple more years. We really need better input options. Fingers are great, but sometimes I want precision.
What's interesting about Jeff coming around is that he's a pretty hard-core Windows guy. It was funny to have Macs come up on the Stack Overflow podcast. Joel would recommend them for family, etc., and you could hear it in Jeff's voice that he didn't see the need. We'll find out in 6 months if it's going to be a 3-horse race.
Now Jeff has a reason to learn C. Not to worry, ObjC on iOS is not your father's C. It's a lot more fun.
As a hacker, I feel like I am missing out on the wonders of tablets and the Ipad that I see non-hackers getting extremely excited about. I can get super excited about my smartphone, but that's because the device(s) it replaced were infinitely less exciting (my dumbphone, my mp3 player). But for Ipads, I just feel like everything I could do with them can be done on my laptop better. My laptop has a bigger screen so movies is a more pleasant experience. I can play more fun games on my laptop (The many many years of flash games is probably more vast than the games on the Ipad). I can compose documents more comfortably than i can type on a touch screen. I can CODE on my laptop! Granted, my laptop doesn't have 10 hours of battery, or that level of pixel density (where i completely agree with Atwood, displays need to get better), but I have yet to find a reason to buy a tablet. I think that non-hackers who don't really use that many features of a PC should get VERY excited about tablets, but until I can program on a tablet comfortably, I don't think I could bite the bullet and buy one.
Email has been around (in some form) since 1965. However, I think it's a bit too early to say that we're living in a "post-mail era". Will such a thing ever truly happen? I don't see that happening until we invent teleporters for all the physical items we need shipped.
By that same token, are we truly in a post-PC era, or is it simply an era where tablet use is on the rise and PC use is in decline? Nifty charts and screen resolutions aside, I don't see any evidence that we're past the age of the PC. My parents still haven't replaced their computer with an iPad. Neither have any of my old friends from school. I use my tablet for more and more, but I still can't avoid using my laptop for certain things. And I most certainly don't have a tablet or phone hooked up to my television. And how many offices have you seen where all the PCs have been replaced by mobile devices?
In short, while I see things moving more towards tablets and mobile devices, I'll be convinced that we're in a post-PC era when I see it. Until then, it's all arbitrary speculation.
However, I think it's a bit too early to say that we're living in a "post-mail era". Will such a thing ever truly happen? I don't see that happening until we invent teleporters for all the physical items we need shipped.
Well, we are definitely past the start of the post document-mail era. As for the rest, it's just a matter of implementation: is the advanced fabrication tech in your living room, across town, or across the ocean? So long as you get your product in a timely fashion, what's the difference? (Answer: manufacturing and supply-chain costs and how these impact profits and the consumer.)
True, fewer and fewer documents get mailed. But I think it's still a bit early to say that we're in a post-mailed-document era. I still get documents mailed to me every day! There are still some uses of physical mail that haven't been replaced by email.
There are still some uses of physical mail that haven't been replaced by email.
Yes, but it's clearly just a matter of time. There's no infrastructure or technology we need to wait on. There's no document transmission function that email couldn't fulfill for more than 90% of the populace, no barriers to that becoming a norm.
Cars didn't cause 100% abandonment of horses instantaneously. Before the development of the largest trucks and most capable offroad vehicles, there were certain uses only draft animals could fulfill for quite a while. Still, I think the post horse-and-carriage era did begin with the first commercially sold cars.
Screen resolutions are not growing because Windows doesn't handle display scaling well.
Real world: many people, and a business I know, run HD resolution on Full HD displays (and 800x600 on 1280x1024 displays) just because the letters are bigger.
I really hate the current standard resolution, 1366x768: it fits about 20 lines of code in Visual Studio. Even 10 years ago we had better monitors.
Required DPI is roughly inversely proportional to the distance between the eye and the printed material or monitor (the further away you are, the less DPI you need). Therefore I'm not sure retina screens are such a great idea for big desktop screens; in a usual desktop setup you probably just wouldn't notice that much difference.
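To put rough numbers on that, here is a minimal sketch assuming the common rule of thumb that the eye resolves about one arcminute of detail; the viewing distances and the helper name retina_ppi are just illustrative, not from the article:

    import math

    # Rule-of-thumb visual acuity: about one arcminute per resolvable detail.
    ARC_MINUTE = math.radians(1.0 / 60.0)

    def retina_ppi(viewing_distance_inches):
        """PPI at which one pixel subtends one arcminute at the given distance."""
        pixel_size_inches = 2 * viewing_distance_inches * math.tan(ARC_MINUTE / 2)
        return 1.0 / pixel_size_inches

    for label, dist in [("tablet at 12 in", 12), ("desktop at 24 in", 24), ("TV at 8 ft", 96)]:
        print("%-16s ~%.0f PPI" % (label, retina_ppi(dist)))

By that rough estimate, ~300 PPI matters for a tablet held a foot from your face, a desktop monitor at arm's length already looks "retina" at around 150 PPI, and a living-room TV needs far less.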
He recapitulates Tufte's argument that information density is crucial to the speed of finding information, and observes that stagnation in the improvement of screen resolution is holding us back.
Updates. Toolbars. Service Packs. Settings. Anti-virus. Filesystems. Control panels. All the stuff you hate when your Mom calls you for tech support?
Sounds more like a Windows problem than a PC problem. Since converting my family to Macs, I haven't had to handle any of this stuff, but when they used Windows it was a monthly annoyance.
Let's not go overboard: both the Mac and iOS have updates and settings and service packs; it's just that these are simpler to use than on Windows.
Also, the first call I got from my Mum after buying her an iPad 2 was to ask how to increase the size of the Notes font (it turns out it couldn't be done).
What happened to web apps? Five years ago they were The Next Best Thing because they didn't have updates, toolbars, service packs, filesystems, anti-virus...
It's not like the days of web app development are over. Just look at Google Docs and Gmail, or Amazon's Cloud Reader. With the push for HTML5, constant JS optimization, and new technologies like Dart emerging, it is getting more feasible every day to develop web apps. Especially in the post-PC era, because web apps are cross-platform and cross-device.
I think the (not-too-distant) future of home computing will have a lot in common with today's PCs. Articles like this one always seem to underestimate the amount of real creative or productive work people want to do -- something which is still a challenge on the iPads of today. Have YOU compared typing on a small screen to a big one?
With custom manufacturing becoming affordable, as traditional computers start dying, open source hardware will start replacing them. I think we are seeing the first steps in this direction with the Raspberry Pi, hobbyist open hardware people, 3D printing (in the future: cheap chip printing?), CyanogenMod, etc.
I wonder if you will be able to use the open one to create stuff for the closed one?
More disturbingly, I wonder if the open ones will be allowed to have a fraction of the functionality. Hollywood would love to only allow Internet access from locked down devices, and if 98% of the population ends up on appliances, things like that actually become feasible to enforce.
Perhaps, though, the two devices will have massively different use cases.
This is already happening for me to an extent: I run a dual boot of Linux and Windows 7. Essentially Linux is my "programming workstation" and Windows 7 is my entertainment.
I would quite happily replace my Windows 7 boot with something more like iOS since the only applications I really use there are Steam and Netflix.
I'm actually quite glad that Linux does not support most of my games, since this keeps my productivity high.
The only irritation is when I want to watch something from Netflix in the background whilst working on some code, but even that is easily overcome by running Netflix from a PS3 on the TV beside my computer.
Regarding restricting internet access to only certain devices: at a government level that would be difficult legislation to pass.
I can see domestic ISPs only supporting specific devices perhaps to keep their support costs down, but I'm sure there will always be ways around this. I remember the guy who came around to install my first broadband connection insisting that I would not be able to access the internet at all from my Linux laptop until I showed him how I did it.
I'd argue that there is one last barrier for these slate-based computers (smartphones, tablets), and that is to make their screens sunlight-readable (reflective) while keeping color, the same retina resolution, and the same refresh rate.
Once the screen no longer requires a backlight, and can be read any place a book can, computers will truly be ubiquitous. As for creation tools on these machines, they are beginning to arrive now, and they will become more and more mature as time goes on. Hopefully a less controlled ecosystem will win out, but that much I don't know I can predict.
I usually agree with most everything Jeff writes, but I'm highly skeptical that what pushed the "post-PC" era over the top was better display support.
I've seen the screens, they look nice and crisp and clear and...I kind of don't care. Oh, the views on it are great, but it offers me nothing new.
Quite frankly, as product releases go, I question whether Steve would have let this out the door. It's a nice product, but insanely great? In comparison to prior developments, I don't think it meets that bar.
It is probably not so much the hardware itself as the concept. Everybody can figure out how to use a tablet, the App Store, etc., but not everybody can figure out how to use Windows or Ubuntu or OS X. We are close to the point at which everything a 'normal' person (not a developer) may want to do with a computer, such as listening to music, watching videos, receiving and sending email, or buying plane tickets, can be done with an iPad.
I've been netbooking it for my chillin' computin' for the past couple of years. Since getting an 800x400 AMOLED screen on a smartphone, I haven't even thought about getting a new netbook. There are all sorts of useful form factors. The iPad _IS_ a PC. It just uses a different CPU and proprietary hardware.
Now... a small and light PC that you can interact with completely via a touchscreen, wireless data, etc...
I think the demand for tablet-like-PCs is potentially huge. If I'm right, their first couple of years will probably tell us where the frontiers between PCs and post-PCs are and what the world will look like.
I'm curious how that chart would look taking into account US and international population growth in the last few decades. Perhaps devices per capita is a better ratio?
The difference between the MBP and the iPad is night and day. The italics, symbols, and math in the document render much more nicely on the iPad. Take a look at the parentheses and the script letters in particular. (I did this experiment.)
Incidentally, the T220 is no longer available new.
I'm sorry, but what's the difference between DPI and PPI? Is it just 'dots' (print) versus 'pixels' (displays)?
As for the T221: sure, we had the technology 10 years ago, but that isn't really important. There's a quote in this article saying that similar displays are available today, but for thousands and thousands of dollars, not in a device that costs $500. The T221 was $18k when it was introduced. If there are no ~$500 displays with similar PPI, it might as well not exist, since maybe fifteen people in the world will buy it, as opposed to the 3 million people who bought the new iPad over the weekend.
I think the article is correct except for its timing. The iPad 1 did all of this; the updated display resolution really doesn't change the device fundamentally. Sure, the fonts and icons look great to Jeff, who just bought his first iPad. But guess what? They looked great on the iPad 1 too.
What irks me the most about the "post pc era" is its genesis. It is portrayed as a revolution that will fundamentally change computing.
However, by dubbing the tablet the "post-pc", its very definition still depends on the pc, being a definition ex negativo. Many of the arguments here are about whether the tablet can do everything a pc does.
The tablet does not revolutionise cultural artifacts or cultural goods; it merely allows for a consumption that is somewhat different from the consumption the pc offers. The consumed goods do not change: social media, media, art and science remain exactly the same, namely consumable goods.
The established dichotomy between the pc and the tablet is a fake one, used to artificially create demand that can then be met through new supplies.
The tablet does not change the mode of production; it is merely another facet of consumption, this time even more openly advertised as such by being marketed as a "consumer device".
Calling it innovation is, in my opinion, a misrepresentation, because it basically reinvents the wheel once more: it undergoes the same syntheses and evolution as the pc did, just accelerated thanks to already available knowledge.
The tablet is a clever remix of technologies already available. It combines different modes of consumption previously divided, but it only changes the mode of reception, not production.
True innovation, on the other hand, has to change the modes of production first. The tablet cannot achieve anything other items cannot also do, and is thus not innovation or revolution; it is merely evolution, or even just another playing field added alongside the others.
Film, recorded music, books and the radio changed the means of production, as did the pc. The walkman, the tablet and the tv may have changed the modes of reception, but nothing new can be created through them that wasn't already available beforehand and is merely refined, not substantially altered.
The "war" of tablets vs pc is just another sales pitch to make either more appealing to their individual audiences.
>> edit (for ease of understanding, an example)
As an example that is less abstract, take the "Hipster". Wearing clothing that is tattered might be "cool" to you and you wear it "ironically, as a statement". If you are poor however, you might have to wear tattered clothes because you lack the means to afford something else. Ironically or not, you are fundamentally still wearing tattered clothes. Irony does not influence the plane of action, but instead the plane of ideas, or in other words: ideology. It's the same in the debate of tablet vs pc: Whether you type a blog-post on your touchscreened tablet or with a keyboard on your pc doesn't change the underlying action, just the mode of action. Whether you prefer one over the other is ideology, the basic action remains unchanged.
Let's play the clairvoyant: the post-tablet era is going to be glasses-PCs (circa 2025), and after that it's going to be micro holographic touch projectors (circa 2040); and those are going to be more like terminals, with everything in the cloud.
Really? Where did you read that?
I thought he wrote how people who never wrote a line of code in their life and never intended to do so now have their needs fulfilled by this device.
The author conflates 'PC' with 'desktop' and 'general purpose computer', which I take issue with. Three points:
Firstly: The iPad is a PC, a 'personal computer'
Secondly: The quotes refer to the 'desktop computer' era being over (though one uses the term 'PC' in the context of the 'PC wars'), so why not use 'the desktop computer era is over' as the title? (More accurate, though less shocking.)
Thirdly: Through all this conflation, the author also tries to say that the 'general purpose computer' era is over. Personally, I believe there is a big future in 'general-purpose computer' tablets/phones/etc. (non 'desktop'), as well as in desktops and there is not a shred of evidence in the article which challenges this view...
Although you appear to have been marked down for this, I kind of agree. Not that Apple paid him, of course, but that there is no real news in this article. If any other person (not mainstream) had written this, I would argue no one would read it. It certainly wouldn't have made the front page of HN. It doesn't teach me anything I didn't already know, and even if I didn't know it, it offers no real value apart from being very sales-heavy.
No, I don't think Apple paid but I understand your point. This sort of fare is par for the course for Atwood. Honestly, it's pretty cliche stuff; people have been throwing "post-PC" around a lot lately.
There would have been a cost involved in recently giving a group of journalists private briefings on OS X Mountain Lion, which would have been done to get more press.