For what it's worth, PCs are subject to the exact same slow decline, even if the author's romantic notions of repurposing an old machine ring true. While they may be suitable for old games on an unnetworked, phone-activated WinXP install, or as a power-hungry Linux machine that will perform admirably as long as web browsers aren't involved, both fates are the same: frozen in time, locked out of the dizzying pace at which online services, and the portals through which they are viewed, evolve. Stallman's right yet again: the offline programs will run just the same, but the online programs are subject to change at a moment's notice, from Apple shedding support for 32-bit hardware, to application makers shedding support for old OS versions, to online backends forcing mandatory updates.
It feels worse with the iPad because its marketing created the impression of a sleek, "just-works" kind of device. Of course, its utility is tied exclusively to applications from a captive marketplace, and often to remote services frontended by those apps or the browser, making its elegance a leaky abstraction. Today's Chromebooks, another "just-works" device, will be in the same situation in five years.
But with Chromebooks, or Android phones (if you're lucky), or x86 or x64 PCs, you can replace the stock OS and repurpose the device on the hardware's own remaining merits. An iPad is just stuck.
>For what it's worth, PCs are subject to the exact same slow decline
Nope, they aren't. My home PC hails from 2008, runs the latest Windows 10 update, and a range of software from the early 2000s to 2018. It runs modern games, too (I like playing indie games that aren't too demanding).
The secret sauce? A modern SSD and a GPU from a few years back, plus an HDD to serve as a repository for all my data.
But even without the benefit of running Win 10 and hardware upgrades: I also spun up my old 2006 laptop the other day to rip a CD (my other machines don't have a drive), and guess what, FF Quantum runs well on it. Bring on your web apps; that XP machine will run 'em.
My point is that PCs really don't suffer from the level of planned obsolescence that devices like the iPad do.
PCs, ultimately, have one large window onto the online world: the browser. As long as that runs, a PC doesn't lose any utility: whatever task it has enough computational power for today, it will be able to handle in a decade. Mobile devices, with their shift to the app-for-everything ideology pioneered by Apple, have thousands of small windows that are easily broken, prompting you to upgrade.
It's not about planned obsolescence. Mobile has seen the same type of dramatic performance increase you used to see in PCs back in the day. Can you imagine using a computer from 1996 in 2008?
Yes, and I am definitely seeing this curve flattening out for mobile already. My current iPhone 6s has stayed usable, with very good speed, far longer than my 3GS did.
I have no issues with my 128GB 6s running iOS 11. The only reason I'm thinking about "upgrading" this year is to hopefully get an upgraded SE. I want to go back to a four inch phone.
The only real obstacle is the lack of NVMe support. SATA3 is a bit of an upgrade bottleneck on these old systems, but even software development on this box was just fine with a fast SSD.
PC CPUs have not really got that much faster in the last ten years, compared to the previous decade. Each new generation of Intel's i-series was maybe 10% faster than the preceding one.
A fast Core2 or 1st gen i7 is perhaps half the speed of a mid-range system today, and will probably beat some of the budget chips.
Upgrading an oldish system with an SSD can make a big difference. Obviously, GPUs have got a lot faster too.
Hopefully, the recent competition from AMD will make us see better progress again.
I did exactly this to my PC, and it feels almost like new again. I think my i7-2600k might give up soon, but for a seven year old CPU, it is running surprisingly well. $500 was all I spent altogether for RAM (16 GB Corsair), a GPU (GTX 1060 Ti) and an SSD (Samsung 500 GB 850 EVO).
"For what it's worth, PCs are subject to the exact same slow decline"
Maybe eventually, but not within 5 years. I'm currently typing on a 2011 Dell with an i3 proc and 6GB of RAM, running Ubuntu. I have 19 other tabs open in Chrome. It dual boots Win7.
I can use this 7ish year old PC for social media, email, HD videos and streaming music with few annoyances due to performance, in either OS. Not so with the 5 year iPad he describes. While I can do several of those things at once, it seems he has trouble with even just one.
My i7-2600k, which is 7 years old by now, still does everything I need it to do (including gaming on newly released titles) and more. I did upgrade the GPU a couple of times, but the basic system is still the same and performing like it always did.
In that regard, I really don't understand what's happening with smartphones/tablets that suddenly makes them incapable of performing the same tasks, at the same speed, that they always did. It's not like I'm suddenly trying to use some performance-hungry app/game I never used before; it's the same apps I always use, but they simply perform worse.
If it weren't for this simple fact I'd probably still be on my iPhone 4s, but after a while, that became too slow even for simple stuff like using WhatsApp. It can't be that the apps have become that much more demanding; WhatsApp still does what it always did: send and receive messages. So I really don't understand where that extra demand for performance comes from, which suddenly turns whole generations of portable hardware into fancy paperweights.
A lot of performance is lost to feature creep in the underlying OS. This is easy to see in Windows: open up services.msc on anything all the way back to Windows 2000 and you'll see something like a tenth of the number of running services (or less). (Now I wish I had a Windows 2000 disc to test that theory out with.)
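If you'd rather count than eyeball services.msc, here's a rough sketch in Python (Windows only, and it assumes the third-party psutil package is installed) that tallies the running services on the current machine:

    import psutil

    # Tally services currently in the "running" state.
    running = [s for s in psutil.win_service_iter() if s.status() == "running"]
    print(len(running), "services running")
    for svc in sorted(running, key=lambda s: s.name()):
        print("  {:<40} {}".format(svc.name(), svc.display_name()))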
Plus, factor in that the app frameworks themselves get more complex, along with the current trend of using the browser as a frontend (plus all the bloat of the web), and one can see why old devices don't cut it as much. And since each iteration gets bigger, the longer an OEM maintains support, the worse the device performs.
It would be interesting to compare how well old devices function by how soon OS support is cut -- I'd bet that the earlier a given device is left behind, the better it ends up running.
> what's happening with smartphones/tablets that suddenly makes them incapable of performing the same tasks
Competition, that’s what happening.
On PCs we had no competition for many years; that's why performance stalled. For example, cpubenchmark.net gives 8739 points to the i7-2700K from 2011, and 12078 points to the i7-7700K from 2016. A very modest 38% improvement over 5 years and 5 generations of CPUs. Same 4 cores/8 threads, same 8MB cache.
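For anyone who wants to check the math, the compounded per-generation gain implied by those scores lands well under the ~10% figure mentioned upthread (plain Python, scores as quoted above):

    old, new, generations = 8739, 12078, 5
    total = new / old - 1                           # overall improvement
    per_gen = (new / old) ** (1 / generations) - 1  # compounded per generation
    print("total: {:.1%}, per generation: {:.1%}".format(total, per_gen))
    # total: 38.2%, per generation: 6.7%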
You think WhatsApp of today is the exact same app or backend server logic from a year or two ago?
Any online service is constantly adding functionality on the server side, even if the app code is constant (given it's a mobile app, this is simply unlikely).
Unless you have an app that hasn't been upgraded and runs in offline mode, it's definitely not doing the same thing as a year ago.
> it's definitely not doing the same thing as a year ago
To me as a user, it does exactly the same mundane thing it always did: sending and receiving messages.
It's not like WhatsApp has now become an awesome 4K video player; it still does the very same thing it always did.
In that regard, "added functionality on the server side" does literally nothing for me except make my client perform worse, so what's in it for me besides being forced into buying new hardware? As far as I can tell: nothing at all.
Not anymore, but it's only recently that that's been spun as a good thing.
Tell someone around 2000 that a 7ish year old PC would be adequate and you'd get laughed at. 1993 to 2000, for instance: you'd be comparing a Pentium at 60 MHz (some quick googling says 60 MHz was first hit that year) on the high end to a 1 GHz Athlon. Probably a >20x real performance improvement (the clock alone is 16.7x).
Mobile devices will catch up and stagnate too, sadly.
There was an odd hole where machine lifetimes got really short. They didn't used to be. An Apple //c (manufactured 1984) with an ImageWriter printer and a second floppy drive was happily in use for school papers, spreadsheets, correspondence, basic accounting, casual games and kids learning to program through about 1996. And it was passed on to another family after that, where it was used for another 3 years or so. Mainly as a Number Munchers/Carmen Sandiego/Oregon Trail/ChopLifter/Moon Patrol machine, I think. And while I'll grant that was a longer-than-average useful life for the time, we weren't the only ones I knew to keep an 80s machine with a 70s core in service well into the 90s.
Then somewhere around when people started to expect Windows on a computer the lifespans got quite a bit shorter, and that seemed to stick into the late 00s. And it's swung back the other direction now... my late 2011 MacBook Pro is good with current Mac OS or current Windows 10 installations, and is only being retired because its soldered-on discrete GPU has started to flake in a way that makes it unusable. (And a replacement motherboard would be stupid expensive.)
I think part of that is the abstraction layers. The Apple ][ had a really long lifespan. The IIe had a production run of 11 years and was a slight evolution of a 7 year old design when it was released. During that time you were free to run nearly any Apple ][ application (excluding the IIgs ones) despite many of them being 10+ years old. Their functionality wasn't reduced, and you didn't have to deal with an OS that changed under the applications and dropped support for them. People understood the target, and applications were tuned for it (similar to game consoles).
The advent of hard disks and installable OS environments changed the model. Outside of z/OS, and maybe IBM i, there simply aren't any OSes that have provided a fairly stable API over time. Even Windows has decided they don't care about backwards compatibility, despite the fact that for most purposes the core Win32 API present in Win2k provides basically everything needed to host the vast majority of modern applications. Games might be the exception, but people forget that DirectX was an installable component until MS decided to tie it to the OS with Vista.
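As a small illustration of that stability (my own example, not from the thread): this Python snippet uses the standard library's ctypes to call user32's MessageBoxW with the same signature it has had since well before Win2k, and it still runs unmodified on Windows 10:

    import ctypes

    # MessageBoxW(hwnd, text, caption, type); MB_OK == 0.
    ctypes.windll.user32.MessageBoxW(None, "Hello from a Win2k-era API", "win32", 0)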
> Even Windows has decided they don't care about backwards compatibility
I can run 25-year-old Windows 3.1 applications on a 32-bit Windows 10 box and they work correctly. If I have upgraded to 64-bit, I am merely constrained to 22-year-old Windows 95 applications.
You say "recently" and then compare computers from 18 and 25 years ago. That's just about a generation in human years. The comment I replied to said, "For what it's worth, PCs are subject to the exact same slow decline". ARE subject, not WILL BE or USED TO BE.
"Mobile devices will catch up and stagnate too, sadly."
Uh, if they catch up to my point, then they will still be useful at 5 - 7 years, thus the opposite of stagnant.
My parents kept using their 1989 Mac SE to do most of their typed work up through at least 2005 (a job for which it arguably worked a lot better than many later machines, with simple predictable software and a fantastic keyboard).
Plenty of people used 7-year-old computers in 2000.
Just because someone did it doesn't mean it wasn't horrible. You are comparing people who are obviously way behind in technology. There was, and is, a large portion of people who aren't tech savvy but aren't tech dinosaurs either.
My grandpa never got a smartphone. But he was never more than 5 years behind in home computing, well into his 80s.
It wasn’t horrible. It was a very well made little machine, and it did just what they needed it to do. I would argue it was quite a bit nicer for what they wanted than a typical PC of 2005. (For one thing the keyboard was dramatically better.)
Gadget nerds have criteria for computers which are not necessarily related to predictability, sturdiness, or fitness for a particular narrow purpose. Instead there is a mix of excitement about technology qua technology, new toys to experiment with, status signaling, etc., in addition to functional improvements for doing real work.
Not everyone shares a gadget nerd’s sensibilities. It’s fine for people to be happy watching TV on a 18" CRT, listening to music on a cassette player, reading paper books instead of a screen, rolling down their car windows with a little crank, playing card games instead of VR games, etc.
Well, yes and no - the Mac SE had a really tiny screen; the native resolution was 512x342[1], which I would say is pretty small for composing text of any length (though I wrote plenty of undergrad papers, including a 110 page thesis, on mine).
But that's largely because today's improvements have gone into lower power consumption rather than core count. It could have gone into core count; it's just that 99% of users wouldn't know what to do with more cores and would rather take the battery life.
I think the vast majority of the time, an unresponsive GUI is caused by blocking network or IO calls. A faster CPU won't do anything; I rarely max out my CPU.
A UI shouldn't be unresponsive in that case. It should present a progress bar, or some other visual indicator that the UI [thread/actor/process] is waiting on the IO [thread/actor/process]. A UI should only become unresponsive when a single [thread/actor/process] is trying to do everything (i.e. both IO and UI), which should only occur if your application is not written with concurrency in mind.
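A minimal sketch of that pattern, using only Python's standard library: the blocking call runs on a worker thread, the UI thread polls a queue, and the window never freezes:

    import queue
    import threading
    import time
    import tkinter as tk

    results = queue.Queue()

    def slow_io():
        time.sleep(3)  # stand-in for a blocking network/disk call
        results.put("data loaded")

    def poll():
        try:
            label.config(text=results.get_nowait())
        except queue.Empty:
            root.after(100, poll)  # check again soon; UI stays responsive

    def start():
        label.config(text="loading...")  # immediate visual feedback
        threading.Thread(target=slow_io, daemon=True).start()
        poll()

    root = tk.Tk()
    label = tk.Label(root, text="idle")
    label.pack(padx=20, pady=10)
    tk.Button(root, text="Fetch", command=start).pack(pady=10)
    root.mainloop()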
Mobile SoCs still increase core count, but that doesn't have to increase power consumption.
E.g. the Qualcomm SDM845 has 8 cores in total: 4 faster ones that are only used rarely, and 4 power-efficient ones.
They also offload an increasing amount of work from the less power-efficient general-purpose CPUs to other on-chip hardware: DSPs (for many years), and now GPUs (marketing folks call them neural engines or AI chips).
19 tabs open in Chrome with 6GB of RAM? Are you sure? Last time I used my friend's PC with 4GB of RAM, it froze after I opened 12 tabs. RAM was fully used at 7 tabs.
The laptop I retired four months ago (an Asus 1215B) has 4GB of RAM and an AMD E-350 CPU (slightly better than an Atom, back in 2011). Still, I can open 100+ tabs in Firefox (on Linux).
I would love a screenshot with 100+ open tabs in Firefox on a 4GB machine; sorry, I am skeptical about that. Not just empty tabs but tabs with some website loaded. I have 3 tabs open with some websites right now and it's already eating up 1GB of RAM. https://cl.ly/2y3k0I1T2i09
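Rather than trading screenshots, this can actually be measured; here's a rough sketch (it assumes the third-party psutil package) that sums resident memory across all browser processes, since each tab often gets its own process:

    import psutil

    BROWSERS = ("firefox", "chrome")
    total = 0
    for proc in psutil.process_iter(["name", "memory_info"]):
        name = (proc.info["name"] or "").lower()
        mem = proc.info["memory_info"]
        if mem and any(b in name for b in BROWSERS):
            total += mem.rss  # resident set size, in bytes

    print("browser processes: {:.2f} GiB resident".format(total / 2**30))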
People like to shit on MS, but Microsoft's commitment to backwards compatibility lets you keep the same OS and keep using all the software you paid for, for FAR longer than any Apple or Linux PC.
In Linux world, the problem is mitigated by availability of specialty distributions and desktop environments. Even older but open software can usually be salvaged with a few patches too.
And once you have the hardware new enough to run hardware virtualization well, there are truly no limits other than raw RAM and CPU.
Assuming I have the source code (a big if), I will still need authorization from the vendor to modify the source. And even if I do get that, as an end user, all I want to do is double-click on the icon. I am not going to be writing any patches, or wanting to pay someone to write patches. I prefer to simply pay the $100 Microsoft tax when I purchase my PC and let MS worry about it.
Though RAM and CPU can hit you. I'd add disk: swapping spinning rust for an SSD should make an improvement.
I've revived an iMac 5,1 under Debian, and it runs. But with 1 GB RAM, it's good for only a small handful of web browser pages before starting to swap heavily. The system should be good for memory and HD upgrades, though I've yet to look into that.
Compared to a newer Retina iMac, it's actually the screen that's one of the elements making me hesitate on an upgrade. The resolution (and brightness) are sufficiently poor that even an upgrade wouldn't give the machine all that much utility. Given the cost of modest new generic PC hardware, I've got to question the validity of that.
For non-Web use, basic apps, shell, and server roles, it's just fine. It's browsers that are absolute resource monsters.
The decline is similar but not nearly the exact same. PCs can generally still do the tasks they once did years later without compromise; iPads get slower, and the hardware is forced onto a newer OS just to keep running existing applications, making tasks like simple note-taking slower and more of a process than before. With an iPad you can't preserve your past experience, because the product's ecosystem isn't your own vacuum: you can't keep it away from forced updates.
I fail to see how that is the case. Apple neither forces OS updates, nor forces app updates. An iPad will perform exactly the same, given the same software, after 5 years or 10 years, and the user is in control of that software.
If you are speaking of services, well, that's not unique to iPads, or Apple, but is a consequence of the Internet.
If you stick to an old iOS, you lose the ability to install usable applications at all.
Apple will let you download the most recent version of an app supported by your OS... but only if you'd installed it previously. If you try to install an app for the first time, it won't offer that option, and will instead tell you to upgrade your iOS. And then there's the fact that many of the old apps simply won't work, because they rely on out-of-date APIs no longer supported by servers.
Due to the inability to choose versions of apps to install, I'd say that in iOS the user is not in control of their software.
This isn't entirely true, and only applies to really old versions of iOS like 6, on truly old devices (iPad 1, iPhone 4, etc). I just installed some new applications on my iPad 3 running iOS 9 this weekend with no issues. It grabbed the newest compatible version, and I was able to start plugging away almost immediately.
The only thing that hamstrung me was having to sign out and back in to the iTunes Store once, but I was able to install several games and Hulu (which worked!).
I do wish you got a popup with a few options of which point release to install, but I think "newest compatible point release" is a good system for 99% of users.
> and only applies to really old versions of iOS like 6, on truly old devices (iPad 1, iPhone 4, etc).
I find this amusing; iOS 6 is just 4 years old (at least, the newest revision of it). My iPod Touch is about 9, and started losing access to new apps about 6-7 years ago.
And of course, Youtube, Netflix, and such stopped supporting the relevant codecs years ago too.
Hi! I'm not talking about a device introduced after 2010; I'm talking about a device introduced in 2007, and which I bought about mid-2008. You have 3 devices that I wasn't talking about (beyond being amused by iOS 6 and iPhone 4 being called really/truly old).
Back to the iPod Touch that I was talking about: The Netflix app that's on there (the newest available for iPhone OS 3.1.3) apparently can't authenticate with Netflix's current backend. Their site says that there's a version for iOS 5 that will still work, and that the current version requires iOS 9.
That's true. I also have a first gen functional iPod Touch from 2007. It's not a codec issue, Netflix doesn't support that version of the authentication API.
As I said in another post, comparing the longevity of a mobile device in its first decade to a PC is Apples and Oranges.
Today, I have a few 10 year old computers that are still being used, either by me or my parents. My 10 year old PC, which I'm using as a Plex server, can use modern USB 2 peripherals, has gigabit Ethernet that takes full advantage of my gigabit internet, runs a modern OS (Windows 10), etc. It has a 1920x1200 display. It has 4GB of RAM but can be upgraded to 8; 10 years later, most consumer PCs come with 8GB.
Now in 2008, my Gateway Solo from 1998, with an 800x600 display, 16MB of RAM, no USB, and a 4GB hard drive, would have been useless. It definitely couldn't run modern software.
But going back to iPads. Compare the longevity of the 2010 iPad (which ended up on iOS 5) with the 2007 iPod Touch that came out three years earlier. On the other end, compare the longevity of an iPad 2 that got upgraded from iOS 5 to iOS 9 with the first gen that only got upgraded from iOS 3 to iOS 5. I fully expect the useful life of mobile devices (at least on the Apple side, where you actually get OS updates) to continuously lengthen.
> It's not a codec issue, Netflix doesn't support that version of the authentication API.
And you'll notice that I corrected the statement from my first post, in the post that you replied to.
> As I said in another post, comparing the longevity of a mobile device in its first decade to a PC is Apples and Oranges.
Where in this thread have I compared the longevity of a mobile device to a PC? If you'd like, I'll add a device comparison. My 40 year old Atari 2600 (from the first decade of home gaming systems) still sees use, and works as well as it ever did.
Time becomes much more of a factor when we rely on constant 3rd party maintenance (app stores, online services, and so on). The age of the device category is less pertinent than how much it relies on off-device services to provide functionality.
> Now in 2008, my Gateway Solo from 1998, with an 800x600 display, 16MB of RAM, no USB, and a 4GB hard drive, would have been useless. It definitely couldn't run modern software.
A couple of things here. First, I hope you didn't spend much for that computer; mine at the time had 6x the RAM, USB, and double the hard drive space. Second, I don't see why it would need to run modern software for the sake of comparison. I'm not asking the iPod to run modern software; I'm asking it to provide the capabilities it did when I bought it, which it can't, due to excessive reliance on an app store that refuses to carry older software and other 3rd party services that like to move forward in incompatible ways.
Was helping someone out recently with getting their iPhone set up to download apps. They had the phone for a while but only used it as a phone (they aren't tech-savvy). We couldn't even install the Kroger (grocery store) app because it said it required iOS 11, and she was still on 10.
She was not given the option to download an older version compatible with iOS 10 and the device is old enough that it does not support iOS 11.
When I was growing up, my parents would give me their hand-me-down computers, which were already 4-5 years old or so. The iPhone 4 is 8 years old, so it would already be DOA for my teenage self.
I'm using an iPad Air 2, and I've found it sluggish for the past year or so. Still usable, but I've switched to preferring to use my phone for quick searches and 50% of idle surfing/shopping.
My dad is happy with his iPad 4th generation, which is one iOS version behind. But he is more patient than I, and also it's his only internet-connected device, so he's not comparing it to a fast PC or fast smartphone.
Yeah sorry, I wasn't really objecting to the parent comment. My sister used an iPad from 2007 until earlier this year, when she upgraded to an iPad from 2016.
You can, if you jailbreak. I'm not defending apple, I think it's incredibly shitty what they've done but what you're describing is absolutely possible. Beyond that, having modified those files, it's possible to install the modified program on a non-jailbroken device.
Apple uses dark patterns to trick you into updating. Not to mention that if you accidentally do update, they purposely make it impossible to go back to the factory state. Updating would be no issue if updates were beneficial, and most of them usually are. However, once you're 1-2 years in, newer firmware slows down the device, and can also cause apps you paid good money for to stop working or be rendered incompatible. In other cases, the only way to fix a messed-up iOS device is to do a full system reset, and guess what, you're forced by Apple to use the latest firmware. It's a complete shitshow as far as I am concerned.
In India, I got my iPhone serviced for a camera issue. They updated the device to the latest iOS before returning it to me. I was given no choice in the matter.
Wouldn’t be surprised if they do it for battery replacements as well.
> Apple neither forces OS updates, nor forces app updates
Apple does not, but vendors of networked apps often do in practice (as they don't want to maintain backcompat APIs on the server side). So, you may be perfectly happy with iOS7 on your iPad2, but one day your Skype just stops working, and it demands an update to its newest version - which in turn requires a modern iOS, installing which will effectively brick your tablet. This way, little by little, your tablet's functionality is taken away from you.
> Apple neither forces OS updates, nor forces app updates.
A daily (?) annoying prompt to update the OS that can't be permanently banished isn't technically a "forced" update, but it's a clear degradation of usability.
> Apple neither forces OS updates, nor forces app updates
In practice they do. I recently deleted a timelapse app from my new phone (SE) because the app wouldn't work after the new update. That may not be an airtight example, since I did 'choose' to install the OS update that somehow broke the app, and the creator never released a version for the new OS, but in a real-world scenario, are you going to forgo OS updates on a device you purchased months earlier for the sake of a few apps? Meanwhile, there are decade-old computers running Windows XP that have no problem starting Photoshop 7.0. There's also this video: https://www.youtube.com/watch?v=PH1BKPSGcxQ showing Windows upgrades from the earliest versions to the newest, with software like Excel staying compatible from the earliest versions onward.
Except Microsoft officially stopped supporting Windows XP in 2014[1], and they had already extended support much longer than they had wanted to because XP was so long-lived and customers pushed back against having to upgrade to Vista. Vista support completely ended last year, mainstream support for 7 ended in 2015, and for 8.1 last month[2]. If you upgraded that decade-old computer to a secure OS then it will almost certainly run terribly. It's fine to run Photoshop 7 on an old PC that's not connected to the internet, but how realistic or useful is that?
Under the old desktop upgrade model it's easier to run an unmaintained OS, but that doesn't mean it's a good idea.
Edit: Obviously running a lightweight linux distro is an option for the more technically minded, and it would be great to see projects like https://www.postmarketos.org become more widespread for extending the life of tablet and phone hardware.
> If you upgraded that decade-old computer to a secure OS then it will almost certainly run terribly.
If I chose to put Windows 10 on my 9 year old desktop, it'd run better than Windows 10 on the laptop I bought a year ago as a netbook-style machine. Anything that can run Windows 7 well can run Windows 10 well, and machines that would be well-suited to Windows 7 weren't uncommon a decade ago.
But that's all beside the point. The point is that having a single source for software to run on a piece of hardware, and not having control over access to your software, puts a hard upper limit on how long a device will be useful. It's more practical to do something useful on the 18 year old Windows 98 machine I've got at home than on my iPod Touch that's half that age. My Android tablet is just a couple of years younger; it's still a useful device because I can keep backups of APKs around, download software from project pages, or add one of several app stores that still provide working software.
There's lots of software still being used on machines running Windows XP; some aren't connected to the internet, and others are, gasp, risking the lack of Windows support (which was generously offered for nearly two decades) and exposing their Photoshop to vulnerabilities. You can be pedantic and alarmist about my specific example and ignore the general point about software compatibility if you want, I guess. IRL there are 3 year old, dated, useless iPads everywhere, and that's not the case with even most 5+ year old PCs (which would have the specs to upgrade to a new OS without breaking your existing software, if you choose to do so).
> But with [...] x86 or x64 PCs, you can replace the stock OS and repurpose the device on the hardware's own remaining merits.
That's exactly the point. You know all the bazillion things that people usually do with their Raspberry PIs? An old PC can do all of that, and often much more. My old PCs are used for:
- HTPC (Home Theatre PC)
- NAS (Network Attached Storage)
- Retro gaming (this is the same box as the HTPC)
- Network-wide ad-blocker (same box as the NAS)
- Print server (same box as NAS)
Now back to the iPad. I am thinking of repurposing my old iPad Mini as an external screen using Duet Display (I would use it as a GPS display in a flight sim cockpit), but in general, finding a purpose for an old iPad is much, much harder than for a desktop PC.
Well, a top-end PC from 5+ years ago can still easily outperform a lower end PC made today. This is true even if you don't have an SSD in either.
A "top end" iPad will lose in performance to the "bottom end" of the very next generation of iPads. Granted, that isn't the same thing, but the point stands.
How long has it been since we saw large performance increases year after year in the PC space?
>Intel's Kaby Lake Core i7-7700K is what happens when a chip company stops trying. The top-end Kaby Lake part is the first desktop chip in a brave new post-"tick-tock" world—which means that instead of major improvements to architecture, process, and instructions per clock (IPC), we get slightly higher clock speeds and a way to decode DRM-laden 4K streaming video. Huzzah.
I'm still using the PC I built in 2008. I've had to replace a few things, but not much. I can still browse the web. I run the latest packages in my linux distro. I can still work with all my old files. The apps I rely on are mature, well supported, and won't be going away any time soon. And in 10 years I'm betting it will still run. I might have to replace a component or two. I might have to give up web browsing, if sites keep getting fatter and fatter. But it'll still run, and I could find a use for it.
"For what it's worth, PCs are subject to the exact same slow decline"
PCs have a much broader choice of operating systems and software to install whilst maintaining compatibility. Take Voyage Linux as an example (http://linux.voyage.hk); it's a stripped-down Debian system for embedded devices which can fit in as little as 256MB of storage, but nothing prevents you from apt-getting more software and making it grow into a full-featured desktop system, depending on the underlying hardware's capabilities. Mobile devices as of today cannot even dream of a fraction of this modularity, and we already know why: they're black boxes intended to be consumed then thrown away, and their OSes reflect that nature. Mobile manufacturers should be forced to publish all hardware source code after a reasonable time (say, after 3 years) so that "old" devices can be repurposed instead of being littered into landfills.
Yes, it could happen. But today is February 8th, 2018. Every iPad that ever existed can view videos on Netflix, Hulu, Crackle, CW, etc. They can all connect to a standard mail server or Exchange server, sync calendars using standard protocols, read ebooks from Apple, play music with Spotify, AirPlay to the latest devices or use Bluetooth, print to any AirPrint-compatible printer (most consumer wireless printers), etc.
I know because I own a first generation iPad that does all of the above - after a reset last year and downloading older versions of apps.
The only thing I can't do is view the modern web with a tablet with a measly 256MB of RAM.
The capabilities of the first generation iPad in 2010 compared to a modern iPad is like night and day.
I have a 2008 Core 2 Duo 2.66Ghz laptop with 4GB of RAM, Gigabit ethernet, that runs the latest version of Windows. The average consumer laptop only comes with 8GB of RAM and gigabit ethernet is still the fastest consumer networking protocol. It serves as my Plex Server.
A modern iPad has 8 times the amount of RAM, a processor that's orders of magnitude faster, etc. You can't compare the improvements in mobile to the lack of improvements in desktop computers.
Back in 2008, could you imagine using a 10 year old computer and still expecting support from a new OS like you can with a 10 year old computer today?
Yes old PCs can be unusable too, but I can still use desktop PCs released in the same era as the first smartphones, and with upgraded graphics, memory and SSD they last even longer.
As a counterpoint, I recently got a 12 year old Mac Pro to fix up, and it is almost impossible to make usable even though the specs are better than some old PCs I've given away. So it's a design issue, and Apple is a real crook in this regard.
As a counterpoint to your counterpoint, I've used a 2010 MacBook Pro as my primary laptop for years. I upgraded the RAM and added an SSD, and it still runs great at most tasks.
A 2006 Mac Pro, though? What issue is it giving you? Admittedly, the processor will no doubt show some age, and it hasn't been qualified for the latest releases of Mac OS in quite a while. But it had user-upgradable RAM and disk drives. Throw your preferred distro of Linux on there and I imagine it would be a sufficiently capable machine.
My Athlon(tm) II X2 250 processor was released on Jun 2, 2009. It goes at 3 GHz. The 3 GB of memory comes from the same era. It is pretty much the only thing I ever do web stuff on. It is fast and I have no intention of replacing it any time soon.
What has happened is that maximum hardware performance has levelled off. Modern web design is slowly converging to a place where everyone needs maximum hardware performance everywhere.
When I had an iPhone it stopped being usable after 2 years and the last 2 non-Apple phones I had lasted 3 and 4 years and both got replaced on my terms, while still usable. You can't just handwave and declare that Apple doesn't do a worse job than others when the evidence is mounting that it does.
The same would have been true if your first phones were Android and your most recent ones were iPhone.
In other words, your experience has nothing to do with Android vs iPhone, rather that earlier devices simply didn't have the same amount of resource headroom as more recent ones.
Ever since the iPhone 4S, they've been usable for at least 3 years before expectations catch up to them. And now I think since the 6S and SE—which have the same hardware generation—iPhones can claim to have 4+ year lifetimes.
(At 3.5 years old, the iPhone 6 is still a great device today, particularly with a cheap battery replacement.)
It's worse than that. My gaming PC was top-of-the-line six years ago, and it still works great for browsing and work tasks, and runs many modern games effectively on lower settings.
My Samsung Galaxy Nexus phone was top-of-the-line six years ago, and is essentially useless now. Just switching from an app back to the homescreen takes multiple seconds, and launching or switching between apps often takes upwards of 15 seconds. It takes more than half a minute to start Kindle Reader and open a book. These are all things that happened almost instantly when it was new. I've wiped it and reinstalled Android multiple times, and it only helps a little.
Very dodgy. I'm still using my S3 Mini and it's fine. I have however seen some apps that just won't work 100% and will drain your battery because of bugs etc.
However, if the vendors on your alternate OS marketplace chose to go the same route as App Store providers, there's not much to gain. It's not so much about the OS, but about the ecosystem and the culture of that ecosystem, including online services. (As far as I know, there's no reason why you couldn't support legacy iOS versions and hardware via the App Store. It's just not part of the predominant business model. The same is true for most commercial websites.)
I have a 15 year old Windows 2000 machine running a PCI based piece of test equipment. No problems or performance issues. It has not been on the network for at least 10 years, if not 15. On the rare occasion a data transfer is needed it's either a floppy (yes!), CD or thumb drive. Works fine.
Stallman may have been right about the dangers, but he was also a key impetus behind moving everything behind online APIs — the harder it is to keep stuff closed source (legally and socially) the more services will switch to an online-only model, which then can optimize for frequent updates.
Wait. Not sure if I understand you correctly here. As far as I can tell, keeping stuff closed source is one of the biggest reasons people like SaaS model (you can't even in principle have Stallman-style free software in SaaS model), another big reason being full control over access and user data. I'm not sure how he was a key impetus behind that - in fact, all of his teachings were strongly against this, but people ignored it and now here we are.
I believe the insinuation is that GNU, the GPL, and FOSS force the hands of many organisations that would traditionally release closed source software. Additionally, it proposes that if something is not open source, an open source alternative will quickly begin competing with it.
"a late-2012 iPad Mini, model number A1432, black, with 16 gigabytes of storage. It retailed, at release, for $329."
So the cost of owning this iPad has been less than 20 cents/day over its 5 year usable life. That doesn't even factor in the slight energy savings vs. using a full laptop, if you want to be nitpicky. Nor does it factor in the current resale value. I just sold an original iPad 1 for $80 on Craigslist; I'm sure this one would be worth at least as much, if not more.
"I still use my old iPad for passive consumption: reading, watching videos, checking feeds ...formerly easy tasks have become strained. Social apps have become slow, videos take longer to load and Safari can’t seem to handle the most important and fundamental services of the modern web."
This is all also true of my maxed out 2012 MacBook Air. Final Cut Pro X runs more smoothly editing HD video than browsing Facebook. And, while I agree the uses for an old iPad are limited (photo browser, music player / media controller, eBook reader, kitchen recipe browser), they are more elegant than keeping an old computer around.
The mixed blessing and curse of something like the iPad is that it still looks beautiful and functional long after it isn't. I'm sure it would be easier to part with the device if it was a 5-year-old $300 Celeron netbook.
My Panasonic 'Smart TV' is from 2014 and its app marketplace doesn't have Netflix. It's just not there. My laptop is from 2011, and since it runs the latest Chrome browser, it can play Netflix just fine. Made me think: the fact that laptops & desktops are consumer devices sold with hardware and operating systems that do general-purpose computing is underrated.
In general, accept that the TV is never going to be really smart or keep up to date with the latest and greatest in content and services. Given the fragmentation, it's just not going to happen. (Maybe a day will come when there's a common platform for all TVs, but I don't see it.) On the other hand, any of the major "TV" platforms - Chromecast, Apple TV, Roku, etc. - receive regular enhancements, and no major content provider can choose to ignore them. So, depending on your preference, pick one and make your TV just a dumb display (which is exactly what it should be, to be honest - it allows you to upgrade the hardware while keeping the display).
I was in the market for a TV late last year and was astounded to discover that “dumb” TVs are no longer available at your typical retail outlet. Every one at Walmart, Sams, Target, Costco, etc. has some sort of OS running on it.
I just want an HDMI port and a decent display, but it's quite obvious that isn't what makes the TV manufacturers money these days.
> ...it’s quite obvious that isn’t what makes the TV manufacturers money these days.
Or adding in smart capabilities costs < $10. And without those features you lose some number of sales. I'm sure they do the cost analysis and find that adding features 90% of people won't use still helps differentiate the product.
For example, say I saw 2 identical TVs, one with smart capabilities and one without, at the same price. Which TV would the average Joe buy? The smart TV. I'm sure people would also buy the smart TV if it were only $15 more expensive than the dumb one, netting the manufacturer an additional $5.
A few months ago, I couldn't watch TV on my TV because the TV-watching app kept crashing. It didn't even occur to me that this problem was possible in theory. I had to factory reset it.
Exactly my thoughts too. I was "forced" into buying a smart TV this year. The app store sucks and they serve occasional ads too. I pulled the LAN and wifi off the TV. I am happily using an after market custom android box plugged into HDMI and a Bose TV sound bar. The TV is just for the display.
Have you seen an OLED set? They look awesome because the blacks are so black. They are coming down in price and will eventually replace my plasma set.
Actually, no. Genuinely glad to hear from another plasma owner that there's current technology that can replace it, performance-wise. I'll follow OLED prices; they're still a little expensive for the time being.
And that's despite the high-end games industry doing its damnedest to stop it, trying to get people to switch over to consoles. And that's beyond the basic problem that popular AAA games being upper-bounded by the current-at-release console generation holds PC gaming back.
My Panasonic 'Smart TV' is from 2013, and its app marketplace doesn't have Netflix. It did have Hulu and some others that were discontinued a few months ago. So I had a smart TV capable of watching Hulu, and now I don't. The pre-installed app was even forcefully removed. And so have a few others. So the app wall now has holes in it. No notifications, by the way, I had to google what happened with my disappeared icons. I wouldn't even be surprised if people are still being charged for the Hulu service they can't use anymore. (I wasn't subscribed anyways, but found out when I finally decided I wanted to check it out for a trial period)
Why would you ever use your TV to be connected to the internet? Just a matter of time before your TV gets hacked. I use either a chromecast or apple tv, and see the tv as a monitor with speakers, nothing more.
It's all but impossible to buy a high quality dumb TV these days. I bought a 4k TV this past year with the intention of never connecting it to the internet but I guess I didn't do enough research because I can't even access all the functions of the TV without pairing it with an Android "remote" over a wireless network. Even better it's from a manufacturer that sells users' viewing habits.
Even now they make it opt out. To disable it I had to go into some obscure submenu and disable an option called "smart interactivity," something I never would have found on my own. I had to look up a guide on how to stop my TV from selling data about everything I view on the TV to advertisers!
The way "smart" TVs are going makes me wary of what other "smart" appliances will do in the future.
These still have crappier panels and features (HDR, etc.) than the smart models, though. You literally cannot buy a top-spec "display" like you used to be able to in the mid/late 2000s with plasma and earlier LCDs.
I made the smart TV mistake. Not only are they bad for the reasons you state, they are also utter junk. All I really want is a good panel with plenty of ports and good speakers.
Why would TV need speakers? It's almost the same argument as with the smart TVs. I have a nice set of standalone speakers, why would I want to pay for the crappy speakers in a TV?
A lot of people don't care about using fancy speakers, are perfectly fine using the TV's crappy speakers, and would be much more pissed that the TV they bought is useless out of the box unless they buy speakers for it.
A lot of people don't care about using fancy streaming boxes, are perfectly fine using the TV's crappy smart features, and would be much more pissed that the TV they bought is useless out of the box unless they buy a streaming device for it.
I've dreamed for years to have a single cloud OS, and then multiple "dumb" monitors and speakers controlled by that OS, Chrome seemed to be aiming for that, but privacy concerns could kill that dream.
Beyond that, nothing on the hardware side stops you from running any software you want to.
Expect the PC to morph into something more akin to a tablet or "smart" TV if the MAFIAA get what they want (and Intel and crew seems all too willing to give in).
And at that point your GPC will be relegated to a developer workstation that may well require a verified employer and regular visits from an auditor to own.
You’re getting downvoted, but while what you describe probably isn’t going to happen, you’re not too far off. We’re already “there” with walled garden app stores, and outside of that app signing where you need to pay to be in a “developer program” to be able to sign your apps [1], DRM baked into the browser (EME) and hardware (HDCP), mandatory logins to cloud services for appliances, etc.
[1] (Thinking of macOS here. I’m sure a non-centralized way of signing would also work.)
Yup. Cory Doctorow warned us about it for years (Google keywords: "war on general-purpose computing").
And the sad thing, I'm not sure we can escape that. MAFIAA wants it. Large businesses want it. And to top it off, computer security specialists want it too. Ideas like sandboxing, or trusted computing, or hardware crypto modules - all provide security while simultaneously taking control away from the user.
I seriously fear that soon, having a general-purpose computer connected to the Internet will be considered a public safety issue ("because botnets!"), and eventually you'll need a professional license to be allowed to work with a Turing-complete language ("because langsec!"). I very much don't like it.
Yeah, he followed it up with "the civil war on general-purpose computing", where he basically started to favor "DRM", with the caveat that it would be the owner of the computer who controls the keys; frankly, I do not see that happening.
Security people don't tend to ban these things. You can use Linux if you want to. You can sideload Android apps if you want to. But the vast, vast majority of people will never do this and never want to do this. Therefore, we provide an environment that works well for those people and takes advantage of that design.
Common things should be easy. Rare things should be possible.
Sadly, this is a direction Microsoft is taking with the rumored upcoming "S Mode". If this works for them (and I hope it doesn't), even PCs will be locked down and no more useful than a tablet. Some decision makers are forgetting that hardware is for running software, not for blocking everything under the sun.
My mother-in-law is in the same boat. She is not a demanding user. She basically uses her iPad for 3 things: buying things on Amazon, looking at pictures of her grandson on Facebook, and checking her stocks with the Fidelity app. Well, the Fidelity app recently informed her that she could no longer login unless she installed a required upgrade. It turns out the upgrade is not compatible with the version of iOS that runs on her hardware. So just like that her iPad became obsolete.
Given her level of tech-savviness, that's not as easy as it sounds. Also, it seems more secure to have her just use the app - less possibility of phishing attempts and the like. She already bought a new iPad anyway.
One of my biggest fears nowadays is that I will someday mistakenly press "OK" on the annoying "Update to iOS 11" popup without thinking, and my iPad will be practically bricked.
My iPad is a relatively recent iPad mini that I bought a couple of years ago, but I know from my iPhone 6 that upgrading to iOS 11 will slow everything down, with each app taking about 10 seconds to launch, including Apple's own apps like Mail and Messages.
Is there a way to completely block that upgrade to iOS 11 popup so I don't make the mistake?
Also, I marvel at how watered down the term "bricked" has become. From 'my thing shows absolutely no signs of life' to 'my thing is a little bit laggy'.
Thanks, but I wouldn't say "a little bit laggy". I am not exaggerating when I say every app takes ten seconds to load.
I actually measured them and that's the average time it took for all apps to load after I updated to iOS 11 (Before iOS11 it loaded immediately).
If you can sympathize with me because you have more recent phones, please try to close your eyes and count to ten and imagine, hopefully that may give you an idea of how frustrating it is for peasants like us.
My first smartphone was like that, pretty much out of the box. It would often hang on calls, applications took forever to start, and I had to keep syncing off because otherwise if I turned on Wi-Fi, I would have to remove and reinsert the battery to get the device to work again.
That experience made me never again buy a cheap phone. I now aim for the high-end, and can live for 2-3 years in relative comfort (though even expensive phones slow down quite visibly over time).
Then there's something broken with your iphone 6, because until the 8 came out I was using a 5S and it was observably slower but not '10 seconds for every app' slow as you apparently describe without exaggeration. I had upgraded it to iOS 11 before moving to the 8, too.
iPhone 5S isn’t subject to the battery-related CPU throttling.
My iPhone 6 took 10 seconds to load apps, but after a battery replacement it takes me about 4 seconds. Also, I noticed that apps stay in memory longer which means that I don’t have to do a fresh app load that takes 4 seconds.
My kid plays games on an iPad mini 2 which I occasionally borrow it for reading, and I haven't experienced apps taking anything like that long to load under iOS 11. Are there any particular ones you'd want me to try out?
Maybe it is different in other circles but as a software engineer, bricked still means you unplugged it while flashing the ROM, literally unrecoverable.
It should not take apps 10 seconds to launch on an iPhone 6 running iOS 11. Something probably went wrong with your upgrade, which does happen. Have you tried taking a local backup, doing a factory reset of the phone, and then restoring the backup?
On every computer I own and every non-cloud server I manage, Windows Update has broken at some point in its lifetime (often silently). 5 out of 5. So it's not just Apple. And on quite a few of them it's happened more than once.
One server I noticed the other day because of Meltdown. My laptop got stuck on the Creators Update. Before that, I'd noticed the laptop had got stuck trying to upgrade a driver, because its fan kept going mental every day if I left it unattended for 5 minutes; Windows Update is that poorly written. I had to hack the .NET optimization service to stop it running the 32-bit version, because it would constantly fail and then try to re-run itself. I use the .NET Framework every day, and the optimization service seems to be entirely devoid of purpose.
I swear more CPU and electricity has been spent on Windows Update and badly written maintenance tasks on that laptop than on any of my programming, building, compiling, etc. The fan is moderately noisy, and whenever it spins up, it's always a Windows update or maintenance task. Or Skype. What on earth they've done to that program I do not know, but it's rubbish.
My refurb X220 came with Win10, and it would just sit there, burning 25-30% CPU for apparently no reason, doing something probably update related. The battery isn't exactly new, but it's a 6-cell with about 40Wh capacity remaining. That should be good for more than just 1 measly hour.
Apparently Win10 sometimes needs to sit like that for a couple of days after installing, for reasons.
So I ditched it and installed Linux Mint. And whattayaknow, it installed faster, it didn't need any additional drivers and it doesn't idle at 30%. Everything Just Works™.
A 2011-vintage laptop with an i3 (plus 8GB RAM and an SSD) is still a perfectly usable machine today, if you pick the right OS.
I'm still getting that 9-cell battery of course, because you can still get brand new ones.
Yeah, if you investigate, which admittedly takes a lot of effort, you'll find one of your maintenance tasks erroring. The memory one seems to be one of the worst, and a lot of people see big reductions in the task's runtime when they fix it.
As someone who tested this extensively: it really needs to be treated like a major OS upgrade on a computer. Back up important info (pictures, etc.) without using iCloud, do a full format (DFU restore), and reinstall from scratch.
I have an iPhone 6 and a 6s here that are still flying along great on iOS 11. I had previously had so many problems with the 6s that I thought this upgrade really was awful. It took until 11.1 or so to truly be OK, but the performance increase from doing a true "clean install", like I would with Windows or OSX, was really night and day.
Combine that with a cheap battery replacement from your local Apple Store and your 6 should be great for another couple of years, I'd predict at least through iOS 12.
Historically speaking, we would expect the 6 to be supported by iOS 12, possibly by iOS 13 and unlikely by iOS 14.
The current trend for current iOS devices is five generations of software support with the iPhone 4S, 5 and 5S all getting five years.[1] The only recent exceptions are non-flagship models that used "old" parts at the time of release (iPhone 5C and iPad mini 1) which only got four years.
Yep, it's my backup/testing device. It actually runs surprisingly great; I'd say it's still a better experience than an equivalently aged Android phone, honestly, even a Nexus 6.
Yes, you can block it by installing the tvOS 11 public beta profile on your phone or iPad. I had to do this with a broken iPhone 6 that would literally be bricked by an update due to a baseband fault. Works perfectly; the kids can play games on it without the risk of updating it by mistake.
It is ridiculous that there isn't a setting to disable update nagging though!
I have a 3rd gen iPad that I feel like there should be a 3rd-party logic board / OS update for.
There are approximately a bajillion 3rd gen iPads out there. Imagine walking into the local screen repair place, having them stick in a new Android board that reuses the digitizer, screen, and case. Maybe they pop a new battery in while it's open. Bam! Renewed iPad.
Given how cheap various SBCs have gotten, I have to imagine there is an economic business model for someone in retrofitting all these old, homogeneous, plentiful devices.
The main issue with this is that Apple doesn't allow downgrading of iOS. Thus, one might be able to maintain a device on an older version of iOS that is most suited to it, but if anything happens that requires a restore, then that device is effectively dead in the water. It seems to me that at the very least, Apple should backport relevant security fixes at least one major version, if not two (for example, a bug found in iOS 11 should also be patched in iOS 10, should it be present), and they should sign the last revision of one or two major versions back, so that these things aren't thrown out en masse.
The new version of iOS broke some of my apps, so I went to submit feedback on those apps, to note that Apple broke them with its new OS, and it turns out that providing feedback in the App Store is now broken too!
I literally cannot do anything now because I cannot downgrade my OS nor can I tell the developers that Apple broke their app with the new OS version.
I have 2 original iPads that are in excellent shape. I spent a weekend looking for some way to repurpose them into photo albums that sit there and cycle through photos.
I couldn't find a working solution.
None of the apps work properly, and the browser doesn't work with things like Flickr or Google Photos.
Other things it still does (to varying degrees of success) are playing games (e.g. Where's My Water, Fruit Ninja), taking notes, web dashboard (though sadly not general browsing), music player and alarm clock.
Some sort of jailbreaking/software modification (I haven't done this in years, not really sure if it's still possible?) might allow you to do what you were trying to do in the first place.
I think most versions of iOS have had a built-in slideshow function in the Photos application. Try loading up some photos onto the iPad, add to an album, and see if there's a slideshow option somewhere.
I got my iPad in May 2010 and it still works fine, including (amazingly), the battery. Except of course, the software has become useless. YouTube doesn't work unless I do special URL mangling in the browser, which crashes every 2 seconds anyways. The latest OS release is so old it's from the days when Apple had a YouTube app, which Google later shut off the API for.
I really feel it deserves a second life running some kind of alternate OS and can think of several uses for it, but last I checked there wasn't much of that happening.
By way of contrast, I'm still using the PS3 I got 7 years ago (when it had already been out a few years) every day and it runs speedily and mostly smoothly - my sole annoyance with it is that it doesn't adjust for daylight savings time automatically. That said I never use it for web browsing as the browser has always been rather crap. Netflix and similar apps do need a few seconds to load, but then they always have as it's using an actual hard drive rather than a SSD.
I actually felt this way about my Asus Nexus 7 (2013) - I didn't use it daily or even monthly, just enough to remember it. It largely collected dust and now it no longer gets updates.
I rooted it a few days ago, installed Lineage 14.1, and I have to say, it's like a brand new device again. I was very surprised. I plan to mount it next to my thermostat and use it as a "solar" dashboard showing consumption and production.
I have a 1st generation iPod Touch, still working as music player in an old JBL dock. It's just for that, and that is enough for me. An old iPad like the one described here is nothing more or less than an old laptop that's getting too slow.
So what has changed? Nothing much except excellent quality hardware. That hardware can still be used by the way, to display videos, play (old) games. Great toys for little kids?
FYI the old iPhone/iPod docks can easily be upgraded to bluetooth for about $10 by buying a bluetooth adapter (powered by the dock, so apparently a few docks don't work).
I'm not exactly sure what the chief complaint here is.
Is it that iPads are not upgradeable? Well, Apple has an entire system for recycling and credit. Also, an argument could be made that an upgradeable iPad would also cease to be as useful as an iPad, as it would necessitate some level of form-factor change.
Is it that software keeps moving forward? Well, that is not new, and is not unique to an iPad. A five year old computer running a recent browser will also struggle with new websites.
Is it that the author can't give the iPad to someone else who could use it and wouldn't be as frustrated? I fail to see how that is the case.
I'm sure using a five year old computer can get frustrating. But I think the fact that an iPad is still quite functional after five years of use is rather remarkable. No one writes op-eds in the NYT lamenting the slowness of their five year old netbook, because those devices are even more disposable than an iPad.
Edit - that is not to say that there is not an issue with aging iOS hardware. I just don't think the article makes a clear argument, or even statement, of what that issue is.
> A five year old computer running a recent browser will also struggle with new websites.
My _gaming rig_ is 5+ years old and still runs new games fine. I don't even know where you came up with this. My wife owns a 2012 Macbook Pro and it still runs everything fine, even the newest OS and MS Office applications.
In the 90s five years made a huge difference and some people still have that mindset. In the 2010s five years is nothing. I kept my last desktop PC for seven years with only an SSD upgrade and it was still running new games on medium graphics settings. In seven years the new CPU had not quite doubled in performance vs. the old one[1].
Sorry, I should have said "low-end laptop," "netbook," or "chromebook." Those are all in the same class as an iPad, with the same portability requirements, etc.
I've been using my $300 Acer C720P Chromebook for almost 5 years. Keys are falling off, it has quite a few cracks and smudges, and the battery is less than half of what it used to be, but it still boots up in about 8 seconds, and there's no new noticeable lag. Right now it sounds like my Chromebook was a better decision than an iPad.
Yep, my early-2011 Macbook Pro is still a daily-use machine that works great, the SSD upgrade really helps. The main thing I wish I could do on it is play Civ VI.
The article doesn't clearly state the problem, but there is a problem, and it's none of what you said (although the article does talk a bit about repairability).
The problem in iOS (it also happens on Android, to a lesser extent) is that older devices are locked out of OS upgrades far faster than they break down, and in turn, because they run an older OS, they are locked out of installing new apps, including older compatible versions of those apps.
Simple fact: a 10-year-old computer (and probably some 15- or even 20-year-old ones) can get the latest Windows/Unix of your choice, but any iOS device released more than 2-3 years ago cannot get the latest iOS.
Really, it's pretty simple: Apple doesn't care about backward compatibility. Either you get a new device, or you make do with what's already installed on it.
"Any iOS device released more than 2-3 years ago cannot get the latest iOS" is no longer correct, though it used to be. Apple is now providing iOS support for five years. Consider that when buying your iDevice, especially the expensive devices like an iPad Pro or an iPhone X.
It's more than that: Safari could be fine on these devices, but the original iPads had very limited memory, and websites these days are resource-intensive. It's hard to browse the web, when otherwise it would be a great web and ebook reader (it's just the latter now).
So, it sounds like the author is asking for: upgradeable, portable hardware, with longevity of 5+ years, that can have a "maintenance" release of software, for under $350. I don't think there exists a business case for such a product.
The complaint is that a device which was once useful is getting less so year by year because of vendor lock-in and planned obsolescence, and not because of hardware failure. This is a "should" question: should a device manufacturer be able to decide to make an otherwise fully functional device less and less functional until it is useless?
Case 1: Like others here I have an old iPad 1 which is stuck on iOS 5 or so, and although the hardware is still as flawless as the day I bought it, I can use it for fewer and fewer computing tasks year after year, can install fewer and fewer software on it, and have no way to fix the problem.
Contrast that with case 2, my 12 year old Dell PC which has undergone a number of software updates (currently running Linux beautifully) and continues to be very useful and updatable to this day.
The trend seems to be to move us more and more towards case 1 and away from case 2, which is in my view an overall loss for end users.
While I understand the complaint, I have a fundamental disagreement with these types of arguments. If you were to restore this iPad to factory settings, it would function exactly the same way or just as well as if you had just purchased it. The apps that you get from the App Store are additions to the product and you know going in that they could stop working or stop being updated and get trampled in the march of technology. As long as you have a WiFi connection, though, that iPad will be useful for web browsing, notes, calendars, email and everything else that it was good for when it was purchased. In fact, I just restored an iPad 2 (Not an iPad Air 2, an original, non-retina iPad 2) and my parents are now using it as their Facebook/email machine. It's just as zippy (maybe a little slower but it doesn't feel slow) as it was when I bought it for those tasks and I can even FaceTime them on it without issue.
There are tons of applications on the PC that no longer work because of changes in Windows. Unless the manufacturer decides to give you some kind of backwards support (like Microsoft including an XP emulator in later versions of Windows, which, kudos to them, is not something they were obligated to do), I don't see how there can be an expectation of the hardware outliving the software. The software has always been the bottleneck and it always will be. I think it's great that the iPad is still the same out of the box as it was all that time ago.
It's not just hardware upgradability: some of the problem could be solved if app developers created simple, lightweight iOS 5-compatible apps. But of course there's no business case for supporting early iPads, and I don't even think many early iPad owners necessarily think there should be.
The iPad is great because it is compact and has a specific, coherent design. I could use a 12-year-old desktop tower for my evening reading, but I don't want to.
It doesn't sound like a complaint so much as an elegy.
I still regularly use an original iPad nearly every day. It can't update past iOS 5, and it's been pointless to try to install new software on it for years. At this point, it's mostly a Kindle and PDF reader that is still much more usable than our Kindle Fire.
That's no complaint! It's sad because these were great machines that are just becoming obsolete. It's the upside of such attention to hardware quality: they become obsolete before they break, instead of the reverse.
I use an early-2011 Macbook Pro too, which is far more battle-scarred (the battery swelled and cracked the touchpad before I replaced it, and I lost a few screws when installing an SSD), but it still feels like a great everyday machine.
The complaint in the article is that it shouldn't be obsolete. An iPad is touchscreen, SoC, wifi/bluetooth, battery. Quantitative improvements in hardware don't mean that the capabilities of old devices should be taken away.
Software is moving backwards (alarmingly quickly). In 2009 my 2nd generation ipod touch was able to browse the internet, run complex 3D games and stream videos with no issues. It still works and still does all those things because I never "upgraded" it. Taking away features for a given piece of hardware is a regression. Give new software features to the people who have hardware to run them.
Apple really should allow open-source OSes after a few years, when the device is no longer sexy. Release the drivers as open source, release some signing keys that would allow boot image signing, etc. (but obviously keep them somehow separate so tinkerers can't sign hacked iOS images using those keys), and give the community free rein.
Obviously they are afraid someone will make an OS that, when run in a 2012 iPad, can beat the 2017 iPad...
You should be able to jailbreak, then set up SSH. Not sure what tools exist; possibly some allow VMs, or there could be enough native software that you could do what you want. A VM would be super slow; native would be fine if you can compile and run things properly.
I feel like old ipads would be useful as the head to another box, rather than a headless box. They have decent screens, and would work well as, say, an extra screen to a movie player for kids on trips.
It's unfortunate that there isn't an easy way to do it.
The (original) iPad I received as an Xmas gift in 2010 is still alive and kicking. In fact it is the most robust piece of ("mobile") hardware I've owned since 2010; outlasting various iPhones, Nexus devices, laptops and other electronic devices which have been purchased or received as gifts.
Then again it's only used to stream movies, read news and light browser based activities. Amazingly and to the delight of the kids most games are still playable.
The only other portable electronic devices I own with similar resilience are the original Raspberry Pi and potentially a 2011 MacBook Pro. Though having to replace the logic board once and the battery twice probably disqualifies the MBP.
My parents have an iPad 2. It's still in good condition, and the battery is still very good. But they can't use it anymore for Skype: the Skype app got slower and slower as time went on, and now the last available version is too slow to be used at all; when somebody calls, the app lags so badly that it just fails.
Also, some app makers keep their apps compatible with every iOS version. Not Microsoft: newer versions of Skype aren't available for iOS 9, and the last one that was is already too slow.
From this perspective, the biggest problem is the companies which decide not to test on older hardware and which don't want to support older iOS versions.
What we've seen in the tablet space over the last decade is a yearly boost in hardware specs that has not been seen for some time in the PC marketplace.
Consider that the original iPad only had 256 megabytes of RAM and a single-core ARM Cortex-A8 CPU, while the current entry-level iPad has 2 gigabytes of RAM and a dual-core Apple Twister CPU.
If you go back to the days when PCs still saw that sort of repeated yearly performance advances and try to run Windows Vista on a machine built to run Windows 95, you're going to run into the exact same sorts of problems.
My first-gen iPad Pro, the big one from 2015 or '16, is getting really slow, particularly with Safari, and it gets all the updates. Having some trouble understanding what's going on here.
But I'd also suggest doing a clean install using iTunes; sometimes OS upgrades go subtly wrong, slowing things down, and resetting via the device itself is insufficient to get a truly clean install.
Your first-gen iPad Pro is still new compared to what this dude is packing. There's an iPad from 2012 sitting on my desk that stopped receiving updates and, for getting any kind of work done, the thing is useless.
You're using a 5-year-old tablet for work? That thing was fully depreciated financially 2 years ago. For security reasons alone you shouldn't be using it to do work.
I bought my first iPad primarily as a music creation device. I purchased almost every keyboard, recording and guitar amp sim app that came out for the first-generation iPad, and had a ton of fun with them [0][1].
But then the inevitable happened: the keyboard synth algorithms, multi-track recorders, convolution reverbs etc. all improved and became more complicated, to the extent that the processor on the old iPad simply couldn't keep up any more. So I pretty much HAD to upgrade.
The old iPad is still in service though - my wife is a portrait artist and uses it to display reference pics next to her easel that she can zoom in and scroll around in. As the author of the OP has said - no problem with its aesthetics or reliability, we just reverted to using the apps that still work within the confines of its processor and memory limits.
The iPad I use for watching cartoons in bed is a minimum-spec 1st-generation iPad. It runs iOS 5.1 and hasn't been updated in about 5 years. Most of the apps are no longer supported. I can still use Netflix, Amazon Prime Video and Crackle. I am not sure about Hulu, CBS, or HBO Go as I am not currently subscribed to those. I can watch YouTube and some other video sites in Safari, which frequently crashes because it runs out of memory.
Originally I used to charge it every 10-14 days. Now I must charge it every 3-4 days.
As a video watching device it is still fine. I would be happier if the utility degraded a little more gracefully. I can't think of a good reason why AdultSwim or CartoonNetwork shouldn't be supporting legacy iOS versions for their video players. Their target audiences are more likely to be using older hand-me-down devices. On the other hand, this device is also now the only device I can use to still play older 32-bit AdultSwim games like "Monsters Ate My Condo" and "Mole Attack" that no longer run in iOS 11.
iPad mini 1s are selling on eBay for £80. I wonder if anyone is exploring the business opportunity here for corporate software, where instead of buying £300 iPads to show the schedule of a meeting room, you can just code against an old version of iOS and supply your software with hardware at a steep discount.
Edit: had £30 before, but that was a bid, not a buy-it-now.
Anyone doing this would be opening themselves up to a world of liability, and probably a lot of customers who would say no. I know my work is dumping older ipads even though all our integrated software for meetings etc works fine just because it could be exploited and isn't receiving updates. Even the possibility of that isn't worth it to a lot of places. It doesn't matter if it's connected to a "closed" network, it still has radio(s) on.
And since so many places are so slow to upgrade, this is a known attack surface. Once you exploit that old-ass iOS, you have a compromised machine with radios sniffing inside your environment.
I'd bet only a fraction of the cost of a system like you describe is tied up in the hardware. There's the cost of installing the hardware (running power, networking), managing it (device software updates), paying for a recurring subscription to the service.
Not to mention the risk of having old hardware on your network that doesn't get software updates anymore.
And anyway, once a company is looking at something like this, most will happily justify it balanced against something like "lost productivity".
iPads were never meant to be computers; just look at the new iPad commercial with the little girl. This is made worse by the fact that the device is completely locked down: you have no way to fiddle with it. If iOS were free software, people could develop for the iPad so it could continue to do the things people expect of it, but it's not.
I took a broken hard drive to the electronics recycling bin in the basement last month. It's not often emptied, as it takes a long time to get even half-full.
Inside was a MacBook Air, a 2012 iPad, an iPod Touch, two 8GB iPods and an iPod Shuffle.
All except the MacBook are in full working order, but the iPad and iPod Touch were frustratingly slow to use.
Curious how old the author is. Those of us who had computers in the 80s and 90s remember how 3 years or so was what you got before obsolescence set in. Mobile devices are still developing at that kind of rate. Desktops and laptops last so long because for 99% of needs they maxed out on necessary specs many years ago.
My older iPad (yes, first gen iPad from 2010) is eight years old and it works great (from my perspective :)
Yes, I can't download most new apps, but last year, I released an update for 'Economy for iPad' (#1 Finance app for several weeks in 2010) and that app continues to support the original iPad and iOS 5.1
Generally, the best way to keep old devices running well is to keep the old OS. (Old major version at its latest minor version e.g. 10.3). There is a slight risk of not being able to pick up security fixes that didn't make it into 10.3, but made it to 11.0. However, as a developer, who tests apps on older OS versions, I keep old test devices with old OS versions on them.
Btw I should also add that Netflix runs perfectly on my old iPad with iOS 5.1 :)
I bought a refurbished PC in 2007 for around $400 and I recently put Linux on it and gave it to my mom. It's definitely fast enough for her to check email, watch the occasional Youtube video, and surf Amazon/Craigslist/etc.
After all these years, I finally picked up an iPad. It is an iPad Pro 10.5 with the Smart Keyboard. I like the device; it's fast and has good utility for photo editing with apps like Affinity Photo. Should I expect the device to slow down with each iOS update? Does the whole OS slow down, or is it specific apps? Is this article accurate?
What is the average lifespan of iOS devices, given hardware is taken care of? This is my first iOS device.
Edit:
Has there been a change in how older devices are impacted? For example, are iPads from before the iPad Air more likely to be impacted by updates, while any iOS device with an A7 chip or later is less likely to be impacted?
This is a red herring; the real reason we even get to have this conversation is that Apple provides full OS version updates to devices for much longer than any other mobile platform. Your 6-year-old iPad is a bit slow on the latest iOS? Uh, my 6-year-old Android tablet hasn't seen a new OS in 5 years.
5 years isn't all that long in the world of computers (and Apple is trying really really hard to market iPads as computers).
Any proper computer will be upgradable for way longer than 5 years when running Windows or Linux. iPad won't be because you're not allowed to install your own OS.
Not really a fair comparison when you consider that PCs have been plateauing in performance for at least that long, while the tablet/mobile market still delivers large performance gains year over year.
I don't know what Apple's iOS feature roadmap is, but I'd say there isn't really a rule of thumb you can apply.
My iPhone 6 struggled mightily with iOS 11 (even with an upgraded battery). It is the only time I have regretted an upgrade (since iOS 3) across 3 devices.
On the other hand, Apple has steadily expanded the number of years they provide updates for their hardware, up to about 5 years now.
I also recently got an iPad Pro 10.5", and I think that the Pro 10.5" in particular will last longer than most iPads.
The only three iOS devices with 4GB of RAM are the iPad Pro 12.9 first gen, the iPad Pro 12.9 second gen, and the iPad Pro 10.5. Earlier iPads (and iPhones) were often working with much less RAM, so a newer device with more RAM coming out would make the older ones obsolete faster.
The original iPad only had 256MB of RAM, and the first iPad with 2GB was the iPad Air 2, released in 2014.
4GB is going to be the maximum amount you can get in an iPad for a while, I think, and RAM increases matter much less now than they did when it went from 256MB to 512MB.
The processor in the 10.5-inch Pro is also the first iPad processor to have 3 high-performance and 3 low-performance cores, up from 2 cores for everything.
This advice is a bit late since you already made your purchase, but if you want your hardware to survive more software updates, usually getting the “S” or “tock” versions of things has been more future proof.
Most of what this article is talking about is people who are using iPads that are 6-8 years old. Those first iPads were very limited in their hardware, such as the amount of memory.
I have a 4th generation iPad from early 2013 that I use all the time. Ironically, it wasn't until about 2 years ago when my wife got me a bluetooth keyboard case for it that I really started to get some use out of it.
It definitely isn't as snappy as it used to be, and the battery reports that it has lost 20% of its original capacity, but I use it all the time.
I primarily use it in meetings to take notes with OneNote, I also use it to SSH/RDP/VNC into other computers to write some code and such. I also have some means to work on code using a git app and Textastic. Yes, there are such apps available for iOS.
GarageBand and iMovie get some usage when I'm feeling "artistic" though I will admit they are starting to get too sluggish (GarageBand especially).
As far as games go, I mostly stick to the knockoff bargain-brand versions of big-budget games by Gameloft: Order & Chaos 2 (Guild Wars 2), Modern Combat (Call of Duty), Gangstar (GTA), Minecraft Pocket Edition, and retro game collections (Atari, Activision, Vectrex). I have a number of Bluetooth controllers that are compatible with those retro game collections, which helps. I used to have my iPad jailbroken and would play tons of emulators on it as well.
The main thing that's nice about something like an iPad with a keyboard is that with a bit of work it can completely take the place of your laptop and do it better. When I go away for a long weekend, I inevitably have to ask myself if I want to take my laptop or my iPad. It always boils down to how much I think I'm going to have to work, and usually it's the iPad I take. It does 90% of the day-to-day things I need to do, still has a relatively great battery life, and it's the size of a notebook even with the keyboard-case. And if things come down to it, I can still access my work computer or my webserver at home and do the whatever else I need to do (albeit in a less than ideal manner, but I've never had to say "sorry, I can't look into this right now cause I only have my iPad").
I will admit, it's the addition of a physical keyboard that really does it for me on the iPad. But keyboard or not, even within the Orwellian App Store I can be extremely productive with my 5 year old iPad.
I've been using my Nexus 7 2012 model, and work commissioned a bunch, so I ended up with 4 of them.
So I hit XDA, grabbed Slim OS 4.4.4 and Lineage 7.1; both run fine. Gave my extras to my kids; they run YouTube at 720p fine and browse fine. Only had to buy 2 batteries for 12 bucks each.
We had an old iPad gen 1 that became totally useless with Apple's market. Apple doesn't appear to handle older versions the way the Google market does, offering the app versions that are still supported on your OS. Kinda sad.
iPads also lose app support significantly faster if you don't upgrade to the slower, newer versions of iOS - app developers drop support significantly faster than they do on Android.
For the more tech-savvy, this is where an open-source operating system would make the difference (an "iAndroid"). Apple should, though, provide some sort of alternative for the less tech-savvy. It's one thing to say "we don't support your browser because it sucks and you have alternatives"; it's another to say "the £500 you spent 5 years ago are gone; we don't care that the hardware is in good shape, just buy a new one".
The author mentioned repurposing old computers for non-frontline work. Can't you do that with old tablets? Load a bunch of PDFs, pictures or whatever into internal storage, take the tablet offline, use it like a bookshelf in the form factor of a single book?
That's basically what I'm doing. But even for PDF browsing it feels sluggish.
I mainly use my iPad as a third screen, placed on a dock to read through PDF books/documents. It feels more natural to read on it compared to the computer screens.
Old iPads can definitely be repurposed as book readers, MP3 players (without Spotify) or movie players (with older movie formats).
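If live PDF layout is what's sluggish, one workaround is to do the heavy lifting on a PC and hand the old iPad flat images instead. A minimal sketch with PyMuPDF, assuming a hypothetical book.pdf and the iPad 1's 1024x768 screen:

    import fitz  # PyMuPDF: pip install pymupdf

    # Pre-render each page of a (hypothetical) book.pdf as a PNG sized
    # for the iPad 1's 1024px-wide display, so the device only shows images.
    doc = fitz.open("book.pdf")
    for i, page in enumerate(doc):
        zoom = 1024 / page.rect.width  # scale page width (in points) to 1024 px
        pix = page.get_pixmap(matrix=fitz.Matrix(zoom, zoom))
        pix.save(f"book-{i:04d}.png")  # sync these over via the Photos app

Swiping through pre-rendered images in the Photos app is far lighter work than laying out a heavy PDF on a 256MB device.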
I have an "iPad with Retina display" in the same boat. It is my media player; I use it to play locally hosted songs. It makes me wonder if I can configure Xcode to target it anymore (it's on iOS 8 or 9, whichever was last for that model).
The hardware is not really stabilizing; if anything, the rate of change in CPU and GPU has been accelerating. Those two things are probably the largest contributors to the treadmill of obsolescence.
Apple has tried to keep the connectors standard, but Lightning will eventually go away. But since they have not moved to USB-C yet (on the iPad side), there is a chance that they will skip that and we will have Lightning until the next major change (probably to USB).
What I did for my iPad 2 was to disable JS in Safari, which makes a big part of the web fast and usable again, in combination with reader mode. I had Brave installed for using websites more interactively, and for sites like YouTube; it has ad blocking without any hardware requirements. If Brave still supports iOS 9, you could try that. It will still lag, and certain apps were simply unusable for me (ProtonMail), but I increased its useful lifespan this way.
Some services (like Opera Mini) basically rendered web pages server-side (including JavaScript) and served them to low-end phones. I wonder if one can just make the iPad a dumb display/input device, using a PC on the local network to handle all the logic...
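You can get surprisingly far with that for read-only browsing: run a small render service on a LAN PC and have the old device fetch page screenshots. A rough sketch in Python, assuming Flask and Playwright are installed; the endpoint and port are made up:

    # Toy "Opera Mini"-style renderer: the PC runs the real browser,
    # the old tablet just requests a screenshot of each page.
    # Assumed setup: pip install flask playwright && playwright install chromium
    from flask import Flask, Response, abort, request
    from playwright.sync_api import sync_playwright

    app = Flask(__name__)

    @app.route("/render")
    def render():
        url = request.args.get("url")
        if not url:
            abort(400)
        with sync_playwright() as p:
            browser = p.chromium.launch()
            page = browser.new_page(viewport={"width": 1024, "height": 768})
            page.goto(url, wait_until="networkidle")  # all JS runs on the PC
            png = page.screenshot(full_page=True)
            browser.close()
        return Response(png, mimetype="image/png")

    if __name__ == "__main__":
        # From the iPad: http://<pc-ip>:8080/render?url=https://example.com
        app.run(host="0.0.0.0", port=8080)

Input would still need extra plumbing (mapping taps back to clicks), but for reading it keeps all the JavaScript off the old device.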
TL;DR comments in this thread: People use their own anecdotes as proof that 1) PCs don’t suffer the same decline, 2) Macs don’t actually decline that badly
I kind of understand the author's point, but at the same time, there comes a time when any kind of technology becomes obsolete. I still use my 2012 Nexus 7 - awesome tablet when it was released, and it was an entry-level model even then, so to have eked 6 years of life out of it is pretty impressive. I know the OS is no longer maintained and it's stuck on Android 5.0; this is the price you pay for embedded systems. You own the hardware, but have zero claim over the software that runs on it. I still have up-to-date Firefox, for the time being, but I know at some point I'll lose that and more, and my device will be unsafe to connect to the internet.
In much the same way you wouldn't try to play 4K video on a computer made in 1995, at some point the device's processing power maxes out. It's difficult to strike the balance between power, battery life and cost; Apple like to play the Rolls-Royce card of 'it's enough for you' in this regard, giving you a bit of room to expand in the future, but it's in no company's interest to make your device last forever. With a device that cannot be upgraded, the best approach for them is to sell you another one.
I think the field of computing is particularly skewed by Linux, how most hacker-types will dig through their box of parts every so often, slap together enough components to build an old-generation system and install Ubuntu on it and voila, a second (third, fourth, tenth) lease on life for an old machine. And that's mostly true; Linux doesn't much care what it's running on, the OS can be adapted to run on anything. But at the same time, the applications running thereon will quickly prove why the hardware is previous-generation; sure, you might get the latest Ubuntu running, but what then? Add Chrome and a few tabs and it's going to start to creak; a few things running in the background and eventually you'll give up. Embedded devices are much the same; at some point, you will reach the limits of the hardware through no fault of your own. It's questionable how much by-design this is, with Apple admitting they silently force the CPU clock down as the battery ages, but even with fresh batteries, application developers will be taking advantage of newer hardware features that don't work efficiently or at all in previous models. Some of it may be progress for progress' sake, but try running a machine with full-disk encryption on a Core 2 Duo, without the AES engine - the new hardware makes a lot of difference.
It is a shame that the device doesn't visually age and when powered off has no indication of its age, but such is the way of the industry; it isn't Apple's way to make devices that last longer than their release cycle, and few companies have ever made much profit that way.
> Some of it may be progress for progress' sake, but try running a machine with full-disk encryption on a Core 2 Duo, without the AES engine - the new hardware makes a lot of difference.
I have been running full-disk encryption on my Linux boxes since 2003, and the performance hit has never been significant. With an old computer, the most important thing to do with your hard disk is replacing an old hard disk with an SSD. That alone will significantly boost performance, and compared to that, encryption isn’t a significant factor.
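For what it's worth, you can check whether a box even has the AES instructions before blaming encryption for slowness. A quick sketch (Linux-only; reads the CPU flags from /proc/cpuinfo):

    # Linux-only check: does this CPU advertise the AES-NI instructions
    # that accelerate full-disk encryption? A Core 2 Duo will report False.
    def has_aes_ni() -> bool:
        try:
            with open("/proc/cpuinfo") as f:
                for line in f:
                    if line.startswith("flags"):
                        return "aes" in line.split()
        except OSError:
            pass
        return False

    if __name__ == "__main__":
        print("AES-NI present" if has_aes_ni()
              else "no AES-NI; disk encryption falls back to software")

Even without the flag, the software fallback often still outruns a spinning disk, which matches the point that the SSD is the bigger upgrade.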
I dunno. He seemed to catch onto the fact that Apple's business model likes to screw over their users via planned obsolescence. That is more than most people learn.
Why pick on Apple here? Are there any Android tablets from 2010-2012 that still see regular use, get software updates, and new/updated apps? Is anyone still using their Blackberry PlayBooks from late 2010? It looks like the Galaxy Tab 2 from 2012 can only be upgraded to Android 4.2.2 Jelly Bean, roughly as current as iOS 5.
https://en.wikipedia.org/wiki/Samsung_Galaxy_Tab_2_10.1
Why NOT pick on Apple here?! They're the most successful tablet manufacturer, pretty much leading the direction of the industry. Why are you so interested in deflecting blame from them? Why do you think it's OK if others do it too?
I think it's clarifying to look at tablet manufacturers as a whole. The reflexive complaint about Apple is that their system is too locked-down and limited by their approach to software. However, at the same time, there were Android tablets being made in the same era, and it appears that they also suffer from similar problems (being limited to Jelly Bean). If so, perhaps the commonest complaints about Apple's approach to software are off the mark in this case.
All hardware products involve tradeoffs, and I don't necessarily find it reasonable to think that tablet computers should always be hardware-upgradeable if it comes at the expense of handle-ability. The author's lament is that the iPad went obsolete before it broke. The PlayBook went obsolete nearly as soon as it was released. How did those Galaxy Tab 2s do? If the problem is that "it went obsolete before it broke", that's infinitely better than "it broke before it went obsolete".
The problem is that it went obsolete artificially in a horribly short timespan, with no alternative way of keeping it going. If Apple didn't lock down their systems, we could keep it going quite easily. I have plenty of Android phones that, while they don't get updates from the manufacturer directly, still get patches by running LineageOS or one of the other variants.
I'm not saying Android is great here, I think it's a terrible system, but it's not nearly as artificially hampered as Apple's stuff is.
Hmm, if we're talking tablets, at least an Apple tablet will be useful for some time starting when you buy it. Can't say the same about many Android tablets...
What's frustrating though is that I can pick up a 20 year old general purpose computer from the trash, set it up with software from that time (never mind the legalities) and play around with it.
Not so with the one iPad 1 I keep around... I think you can't install any new apps if you try now.
Planned obsolescence is the business model for most major electronics manufacturers today. Even non-"smart" devices are not really designed to be repaired (or upgraded).
I think something that is missing from this is that it is no longer safe to use old software. It is an existential threat to your own information, which is increasingly tied to a personal device like a phone or tablet. You will be hacked by a drive-by exploit or a mutant ad network. It's no longer about risky behavior; simply existing online today with old software can put you at grave risk.
How long should apple support these devices? 5 years seems reasonable, and very few vendors of hardware offer anything like this.
I agree it sucks, and there is no reason a 5 year old iPad needs to be dead -- but in context this is fine. If you had a 5 year old car that allowed random drivers to stab you in the face remotely, people would replace their cars and not complain that this was a problem.
> If you had a 5 year old car that allowed random drivers to stab you in the face remotely, people would replace their cars and not complain that this was a problem.
Ironically, some cars in the Takata airbag recall are 15 years old [1] and basically stab you in the face, yet contrary to what you said, the recall continues and is paid for by the car manufacturers. People didn't just say "oh well, I guess I'll buy a new car".
It is no longer safe to use old browsers. Old software works as well as it ever did, subject to your OS being able to run it. There is a world of offline desktop software that surpasses the cloud in features and functionality.
> If you had a 5 year old car that allowed random drivers to stab you in the face remotely, people would replace their cars and not complain that this was a problem.
I would absolutely expect the car manufacturer to fix that for free. In fact we just got a recall notice two weeks ago for a part on our 2009 car, which we took in to get a free fix.