Moving from a PC-like ecosystem to a tablet-like one is a net loss, in my opinion. People are willing to absorb this loss and enter a walled garden because, right now, building iPad-like hardware is not easy. I hope that doesn't remain true in the future.
Can you go more into why the difficulty of creating the hardware leads to willing adoption by users who will most likely never see "how the sausage is made"? I would expect the opposite: commodity hardware and easy app creation would lead to wider adoption by the hobbyist crowd.
I think it will mostly boil down to what you grew up with. I'm 41; I came of age building my own PCs, getting excited about an OS (Win95), with the tools of creation (the PC) differing from the tools of consumption (the smartphone/iPad). My son and daughter (3 and <1 yrs) will be exposed to PCs due to my job, but I bet a lot of their friends in daycare will see a tablet as "the way computing has been and always will be". (Honestly, my kids may think that too; daddy just hasn't accepted how much of a dinosaur he is yet.)
Even for the others, an open platform means a greater variety of software for end users. Creating toll gates on the way to software development, or suppressing software that reduces the platform owner's profit, is not good for end users. Many people who will never touch a command line still benefit, because a Finnish computer science student could decide to start writing a PC operating system of his own, and that OS now powers their phone or lets others provide online services on servers that run it.
If building iPad clones with an open hardware specification like the PC's becomes possible for multiple vendors I'm sure we will see a software ecosystem that will easily beat the walled gardens. The problem is that currently the number of firms that can build the hardware is small enough that they find it in their self interest to use closed specifications.
As you may remember, PCs were the outliers; everyone else was mostly offering the same hardware/software integration as tablets do.
In fact, Apple is the surviving company from that era of integration, and most OEMs want to get back to it, because they hardly make any money nowadays just selling parts.
Had it not been for Compaq and a failed lawsuit, building our own PCs would have happened only in an alternate universe. Not that I ever bothered to do it anyway (it was more expensive than buying a complete new PC in my region).
As a developer, of course, there is the other big limitation: the constraints of developing software on the iPad itself. For me, this is the main distinction between a device and a computer: the ability to program it. Due to the App Store and its rules, iOS is a very secure and stable environment. This makes the iOS platform very attractive, and I think it was the right approach to start with a restricted environment and remove some restrictions over time. So I do think it makes sense to limit distribution of apps to the App Store, but the restrictions on what apps can do on the device should be reviewed, even if that blurs the lines of the App Store model. An environment like Termux, which could still be completely sandboxed, would increase the usability of the iPad a lot. Or even, going completely crazy, a whole Linux VM in an app sandbox. Beyond that, having a full-blown IDE for creating and locally running iOS applications would be a huge step forward.
The hardware itself leaves little to wish for, except perhaps a small rework of the Smart Keyboard to add that extra row of function keys (less urgent) and, of course, an Escape key, plus small ergonomic improvements to the cursor keys and the Return key. While non-programmer users can probably live with the current layout, I am really surprised that Apple is ignoring the needs of programmers so thoroughly.
Are you aware of iOS dev environments such as Pythonista? It's even possible to take a Pythonista app, copy it into an Xcode template, and release it on the App Store, though of course that has to go via a Mac.
This generation will be poorer and dumber than the last.
Edit: and it's likely Gassee knew exactly what he was doing by using that term, since that article includes a paragraph about domain optimizations that don't translate well to other domains, specifically calling out certain foreign words that have no direct translation. Whoosh to me...
Anyways, in many regards the growing segment of the market is more like sedans/hatchbacks on stilts than it is like trucks.
As for pickups as a "status symbol", it's irrelevant, because the very symbol is the off-road "real work, real duty, real macho men" factor. Given that Ford seems (to an outside observer) not to care about electric and self-driving, focusing on bigger Lincolns and the F-series makes total sense as a business plan:
1) Let the others spend their billions developing their hipster tech
2) Use reliable incomes that will outlive any trend
3) Get mature technology on the cheap once it's commonplace.
4) Ignore "carmaker as a service", because it's not their core competence and there will be profitless races to the bottom among the "data warehouse" players (rather than profit for the carmakers)
The use case you describe is best served by having a manual override button and retaining a traditional steering wheel, as opposed to offering a separate, more manual product.
When they sell a pickup to someone who doesn't need one (>85% of sales), they sell the myth and the social status. So, as my last paragraph pointed out, they will pitch "not self-driving" as a "status symbol", and the trucks will sell like hotcakes.
Plus, considering the current price premium, the advantages of self-driving for cars you own won't be worth it for many, many years. For cars you /rent/, fine: they can "work" 24/7. But spending $15k-25k on LiDAR, sensors, data, engineers, and (carmaker) insurance isn't worth it yet for a car that spends 98% of its useful life parked.
When you buy a self-driving car, the manufacturer has to put a lifetime insurance policy on it. That is baked into the retail price, unless they decide to rent you the right to use your own car (which won't be good press). So in effect you pay more for the insurance through hidden fees. In that imaginary $25k there is:
* The hardware $$
* The engineering (past, present, lifetime codebase maintenance) $$
* Recalls, because the LiDAR turns out to miss important things $$$
* Maintaining the mapping data forever $$$
* The forever/lifetime insurance in case the car kills someone $$$$$$
* That thing called profit margins $ (or no $ at all)
* Buyback cost (because they really can't support it forever) $$
That is, of course, if they do the "right thing" and contractually agree never to turn your car into a very large brick (an agreed buyback if they decide to retire the tech). All of this assumes a real self-driving car, with no steering wheel or human fallback.
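To make the arithmetic concrete, here is a toy breakdown of that imaginary $25k premium. Every dollar figure below is invented for illustration; the list above only ranks the components with $-signs, so these numbers are assumptions, not real costs:

```python
# Toy cost model for the hypothetical $25k self-driving premium above.
# All figures are invented for illustration only.
costs = {
    "hardware (LiDAR, sensors, compute)": 6000,
    "engineering + lifetime codebase maintenance": 4000,
    "recall reserve": 2500,
    "lifetime map maintenance": 2500,
    "lifetime liability insurance": 8000,
    "buyback reserve": 1500,
}

premium = 25_000
total = sum(costs.values())   # lifetime cost the maker must cover
margin = premium - total      # what's left as profit

print(f"total lifetime cost: ${total}")                # $24500
print(f"margin on the ${premium} premium: ${margin}")  # $500
```

Even with generous guesses, the lifetime liability line dominates, and the margin left over is tiny or negative, which is the point of the list above.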
Selling high-tech full self-driving cars seems like madness. Lane tracking and "cheap" camera tricks, with "the driver has to be alert at all times; if he dies, go blame his corpse", are the only way to turn a profit on driver-assistance tech.
In short: a small group of people will require higher premiums to insure profitably, plus sin taxes for insisting on endangering their fellow citizens, versus falling rates for everyone else.
It's a heck of a lot better than the MacBook Pro one which really has to be Apple's worst technological innovation in the last 20 years.
But I would still call the new MBP Apple's worst keyboard, because the arrow keys are totally useless. If you type emails all day -> great. If you're a developer -> rubbish (yayaya, I don't do vim).
A lot of dedicated vimmers will use Ctrl-[ instead of Escape to avoid the inanity of reaching up there regardless of their keyboard, which is also a good approach if you're in the camp that maps Caps Lock to Ctrl.
My only problem with the touch bar now is that it has crapped out on me more than once requiring a restart.
Maybe this whole thing was just a big troll by the Apple engineers to coax more people into doing it "the right way" ;)
It's sort of like the nerdier version of the Jobsian "you're holding it wrong".