> *If the screen, camera, and speed were good enough in 2016*
You say it's a rhetorical question, but I'm not clear on why, or why this sentiment is so persistent. After all, the screen, camera, and speed absolutely WEREN'T good enough in 2016, any more than regular computers were good enough in 1986, 1996, 2006, or 2016. They were simply what could be managed with the technology of the time.

The only aspect of consumer electronics that is "done" for typical audiences [0] is audio, where microphones, recording, and reproduction can easily exceed the biological limits of human hearing. Exceeding human visual acuity in capture, storage, and reproduction, by contrast, remains a work in progress (though it's conceivable we'll hit it in the next decade or so, which will be a very interesting change for our industry). That in turn drives some demand for computation, storage, and processing, though more fundamentally it's hard to say whether there is any real limit on how much computation might be put to use. Storage has been on a fast enough upward curve that regular people, merely in the course of normal upgrades, are approaching the point of always having enough not just for the moment but for an entire lifetime.
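To put some rough numbers on that (every figure below is my own back-of-envelope assumption, nothing rigorous):

    # Rough back-of-envelope; all figures here are assumptions of mine.

    # Audio: per Nyquist-Shannon, sampling at twice the highest frequency
    # of interest captures it fully. Human hearing tops out around 20 kHz,
    # so the 44.1 kHz rate CDs shipped with in the 1980s already clears it.
    hearing_limit_hz = 20_000
    print("audio: need", 2 * hearing_limit_hz, "Hz; CD audio is 44,100 Hz")

    # Vision: acuity is roughly 60 pixels per degree, and the horizontal
    # visual field is on the order of 200 degrees. A display matching both
    # (think VR) needs ~12,000 horizontal pixels per eye, far beyond any
    # current panel.
    acuity_ppd, fov_deg = 60, 200
    print("vision: need ~", acuity_ppd * fov_deg, "horizontal px per eye")

    # Storage: a lifetime of photos at, say, 30 shots/day at 5 MB each:
    photos_per_day, mb_per_photo, years = 30, 5, 80
    lifetime_tb = photos_per_day * mb_per_photo * 365 * years / 1e6
    print(f"storage: ~{lifetime_tb:.1f} TB for a lifetime of photos")

The audio bar was cleared decades ago; the vision and storage ones are within sight but not crossed yet, which is the asymmetry I mean.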
So yeah, come back in 2032 maybe.
----
0: Scientific applications, of course, are frequently interested in sounds that well exceed human limits, though even there we have the tech for it, albeit only in specialized devices.
The big difference I see comparing computers from 1986 to today is that our demands on computers are vastly different now. The way we use computers today might have some similarities to 1986 for some people, but for the average person it's massively different.
However, comparing a computer from 2016 to today, the gap isn't nearly as wide. A desktop at home that I use pretty consistently runs a Core i5 from _2012_ and works fine. Other than VR gaming, there's rarely a task the old machine can't handle, except running Windows 11 I guess. Everything I do today on a phone, I did on a phone in 2016: messaging, phone calls, email, calendars, apps that are largely interacting with web services to render images and text on a screen, streaming video, etc. All things I do today, all things I did in 2016.
Honestly, my phone use cases haven't evolved much since 2010, maybe even several years before that. Things maybe look a little fancier, and the cameras are for sure fancier, but the fundamental use of the device _for me_ hasn't changed.