Any article that speculates on Apple's future viability in the market would be remiss not to mention Apple's unprecedented cash hoard. Currently, Apple is sitting on $126B in net tangible assets.
With 100,000 full-time equivalent employees, that means that Apple could pay every one of their full-time equivalent employees (which, as a reminder, includes a large number of retail employees) an average yearly salary of $100K for 12 years while making $0 in revenue and still not deplete their cash reserves!
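That arithmetic is easy to sanity-check (a minimal back-of-envelope sketch in Python, assuming the $126B and 100,000-FTE figures above):

    # Years of payroll covered by net cash, using the figures cited above
    net_cash = 126e9                    # net tangible assets (USD)
    ftes = 100_000                      # full-time equivalent employees
    avg_salary = 100_000                # average yearly salary (USD)
    annual_payroll = ftes * avg_salary  # $10B per year
    print(net_cash / annual_payroll)    # -> 12.6 years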
Or, to put it all in perspective, based on Wikipedia's cited estimate (https://en.wikipedia.org/wiki/International_Space_Station#Co...) the International Space Station cost $150B to construct. Personally, I'm looking forward to visiting the iSS at some point in the future.
That's not actually how things unfold when a company gets into trouble. Your scenario is the perfect, idealized, never-happens outcome. We have dozens of historical examples at this point of elite, massive corporations unraveling, and what they basically never do is follow the strictly rational, idealized scenario.
What actually happens is closer to chaos loaded with self-interest and stupidity by management and the board.
The board gets attacked aggressively by large investors. Existing management and the board go into bunkered, wartime bribery mode.
The board bribes shareholders, further depleting the company's cash position. See: IBM's most recent idiocy, continuing to buy back stock aggressively as their revenue melted for 23 straight quarters.
The cost of servicing their staggering $100+ billion in debt climbs over time as their financial and earnings situation erodes. By the time a serious problem erupts, they'll probably have $200+ billion in debt (at the current pace, that's just four or five years away), continuing their existing mistake.
They intentionally avoid paying off all of their debt while they can, to hold onto the cash (because hey, debt is cheap today, forget about tomorrow, tomorrow is Valhalla and rainbows), which they then make the mistake of expending to bribe shareholders as the stock plunges. Because, you know, things are going to turn around any day now.
Then they end up with less cash than they have debt, and down it all goes from there, in a spiral they can't pull out of.
If we're talking about what happens in a bad scenario, that's more like what commonly happens in reality. That net cash position will not be preciously saved to protect employees; the employees are the first ones to go. That net cash will be used as a bribery slush fund.
Their earnings haven't increased in years. They're facing a zero-growth future, if only because of their immense size. The smartphone market contracted by ~9% in 2017 (-16% in China). They're facing a PC-market scenario, where consumers start replacing devices a lot less often (while smartphones simultaneously face increased competition from all sorts of cheaper AI-focused devices). Their cash accumulation is far slower now than it was in the past, because it's all going to shareholder bribery. Meanwhile, they've added $50 billion in debt in about two years, with that pace continuing in the latest quarter. Their income-to-debt ratio has been eroding for five straight years.
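To make the debt trajectory explicit (a rough projection, not a forecast: it simply assumes the ~$25B/year pace implied by $50B over two years continues):

    # Projecting debt forward at the recent pace of borrowing
    current_debt = 100e9    # the $100B+ in debt cited above
    pace = 50e9 / 2         # $50B added over ~2 years -> $25B/year
    for year in range(1, 6):
        print(year, (current_debt + pace * year) / 1e9)
    # crosses $200B at year 4, matching the "four or five years" estimate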
Consider: right now Apple is making the incredibly foolish mistake of aggressively buying back stock with its capital, while its AI efforts are an embarrassment and it gets its ass kicked by Amazon. Spend $50 billion catching up in AI, etc.? Or buy back stock? Apple has made its choice, and it shows.
Just because you have a hundred billion dollars doesn't mean it's worth spending it to try (and quite plausibly fail) to topple an entrenched competitor in a tangential line of business. Monolithic corporations are massively inefficient because the combination of wealth and bureaucracy is the recipe for waste. Let them give the money to shareholders -- isn't that why they were supposed to be in business? Then if it makes sense to invest the money in some other line of business, the investors will invest it in some other company doing that thing.
The fact is that smartphones are now a mature product. There will continue to be incremental improvements, but nobody is really expecting a future iPhone to be to the current iPhone what the original iPhone was to BlackBerry. The market is now a cash cow. They have an existing technology and customer loyalty that they can milk for multiple years until increasing competition eats away at their margins; doing that is a profitable medium-term strategy, so it's a completely reasonable and expected thing for them to do.
They could invest their cash in some speculative technology and maybe get lucky, but so can their shareholders.
Apple makes more revenue and far more profits than Amazon, and a much higher percentage of those earnings come from products with AI baked into them--compare the iOS business to the Alexa business.
I'd be inclined to give more weight to your argument, but you've conveniently left off the part where Apple has used debt to buy back shares primarily because so much of their cash hoard is held overseas due to the US's wonderfully archaic tax code...which has just been overhauled. I fully expect 2019 to see Apple repatriating large sums in order to pay down that debt.
You do make a couple of valid points, but on the debt side you're wrong. The only reason they carry such high debt is US tax policy on repatriation. That has now been removed by the new law, so there probably won't be any more borrowing. The CFO even mentioned they intend to become cash-neutral. Net net, Apple will carry very negligible debt moving forward, nowhere near today's level or your $200B+.
If you look back just a few years, Apple has responded multiple times to saturated markets by introducing new product classes. We don’t have iPods anymore for instance, now we have watches (even though at one point you could have argued that the music player market was full too). If phones really do start to lose steam, Apple should have plenty of opportunity to create a new interesting product.
Alexa (Applexa?) could have been a natural Apple product. HomePod is clearly a catch-up, and not a particularly interesting one given that Alexa has already moved to built-in screens, and creating an AI home-hub in a variety of form factors, both big and small, is a trivial update now.
Alexa gives consumers utility/service computing without all the tragic time-wasting updater/installer/management nonsense forced on them by desktops, laptops, and (to a smaller extent) tablets and apps.
Cook's Apple has missed the point of this, which is the subtle but huge difference between consumer utility computing and content-and-hardware-consumption computing.
Jobs was a genius at making everything fit together, and I suspect he was reaching for a utility strategy. The iPhone and iPad were the first generation of utility devices, while Siri was a first gen utility service. They were wildly impressive for their time, but still limited by elements of desktop legacy thinking - which is why we have to wait while apps launch and close, instead of just being able to do whatever we want to do.
Cook sells SKUs, not synergies or utility. I see no evidence that Apple is able to think of the future with a unified vision that doesn't involve selling things - even if those things are music and video content units - enhanced by some very constrained AI.
Right now Amazon is more of a synergy company than Apple. The aim seems to be total domination of retail, but there's going to be some overlap in services and in hardware too, and Amazon seem better placed culturally to innovate in ways that win that race.
The iPod was a catch-up product in a crowded music player market. The iPhone was a phone in a world of massive, dominating phone makers. Heck, the Apple II was a computer in an IBM-dominated world.
Apple is not “first” that often so who cares if they didn’t make an Alexa before anyone else?
Instead, Apple tries to get it “right” (and sometimes they still don’t but that is their strategy nonetheless).
Alexa has plenty of flaws. It is functional but not intelligent.
You end up spending a lot more money than you make. To counter that, you invest $20 billion in a game-changing innovation, but it fails, leading to a write-off. Then you acquire a company that also becomes a dud. There are 100 different ways to go broke. Take a look at how GE messed up.
My Windows devices since '97 were consistently top-of-the-line and rarely lasted more than 2 years before they slowed to an unbearable crawl.
In 2012 I switched to Mac and still use the same MacBook Pro, which still runs as quickly as it did 6 years ago. It was a huge investment at the time but turned out to be cheaper than the 4+ Windows machines I likely would have cycled through, not to mention the boot time and performance gains.
Just a pet peeve: Moore's law isn't tied to performance, it's tied to transistor count. Also, it has been pronounced dead every year for a very long time, but it wasn't actually dead until the last few years, and some people might even argue that it's still not dead. Your comment is still completely relevant, though, since performance increases affecting average users began to slow around then.
That was true in the late 90s. It's not really true anymore.
I built a Windows PC using parts from MicroCenter back in 2014. Core i5-4690K, 16GB RAM, 512GB SSD, GTX 970 running Windows 8. It was around $1400 USD at that time, if I remember correctly.
It's now 2018. That same PC, with all its original hardware, is now running Windows 10 just as quickly as always. Not only is the performance still great, but my 4-year-old mid-range PC still outperforms many brand-new PCs today. (It's still slightly faster than the current 21in 4K i5 iMac, for example.)
If you had bought a Windows machine in 2012, it would not have "slowed down" the way your 90's computers did, and you could still use it at its full original performance today (just like your Mac).
Indeed. I've been using home-built PCs since the 1990s, more than a dozen in total, each with lifespans of 4 or more years (many overlapping with one another in periods of usage). Most using some flavor of Windows NT starting with NT 3.5.
In all that time, I've never experienced a computer getting slower during its usage lifetime. It's a story I have read and heard from others, but I don't know whether it's real (caused by some usage behavior I don't exhibit) or imagined.
I do know that in the 1990s and even more recently, but to a lesser extent, when I examined other peoples' Windows computers I would find they had installed legions of third-rate applications that were on the precipice of being malware. And this was a cascading problem, because people would install something harmful to their PC such as iTunes, QuickTime, or RealPlayer and then attempt to resolve the resulting performance problem by installing a snake-oil system-tuner, which in turn made matters even worse. Or they would install an anti-virus tool and their system would slow to a crawl. Unraveling all of this was never fun.
My decades of experience with Windows tell me that if you simply avoid harmful software, performance will remain essentially uniform. The systems do not decay by some natural process, as many popular stories imply.
I don't know if I can agree with you on this. In ~2016 I built a PC (i7, 24GB RAM, 1.5TB SSD, GTX 1080), and my 2014 17-inch MBP has slowly been closing the gap. This is probably more due to software optimizations than hardware, but things just open and close more smoothly and quickly these days on the laptop.
I think the real issue is that Windows hasn't been effectively utilizing the RAM I've given it.
Windows 10 has a nice feature to "refresh" which supposedly re-installs Windows while keeping programs and settings. I haven't tried it, so I have no idea how well it works, but it might be worth your time.
IMO, that's more to do with Intel than to do with Microsoft and Apple. Sandy Bridge, released in 2011, was the last Intel processor that was more than 10% faster than previous generations. We used to see ~25% generational speedups from Intel. Since 2011 we've only seen 5-10% speedups.
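Those generational percentages compound, which is why the slowdown feels so stark (an illustrative sketch; the seven-generation count since 2011 is my assumption):

    # Cumulative speedup over 7 CPU generations at each pace
    old_pace, new_pace = 1.25, 1.075   # ~25% vs. ~7.5% per generation
    gens = 7                           # roughly one generation per year since Sandy Bridge
    print(old_pace ** gens)            # ~4.77x cumulative at the old pace
    print(new_pace ** gens)            # ~1.66x cumulative at the recent pace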
Capitalism hates it though if there's no reason to sell new devices. Although I think we still have a lot of headroom in terms of memory performance, which will matter just as much as the previous CPU performance increases did.
> Capitalism hates it though if there's no reason to sell new devices.
"Capitalism" (the free market) is us. And I for one do "hate" that CPU performance is not increasing as rapidly as it did in the past. I would very much like to see a return of significant generational performance improvements. I would happily upgrade my PC more often if it resulted in the significant gains of yesteryear.
I find it a curious asceticism to cherish stagnant technology because it means my 5-year-old PC is still relevant. Yeah, that's the silver lining. But let's be real: if a PC that was twice as fast in every performance dimension were affordable today, I would buy it.
Companies that get a competitive advantage or can command a premium from selling long lived products don't hate it at all. It's the core of their business model.
If the longevity of a product is a selling point, can provide a competitive advantage, or can help capture a premium, then Capitalism is absolutely fine with it.
I can see it now: "Facebook now offers faster news feed rendering for PCs equipped with Intel 10 series FPGAs!"
Kidding of course, but I think the FPGA/CPU combos will be more relevant to servers than end users, at least initially. FPGA high-level synthesis is miles away from being useful for consumer-level application acceleration, and GPUs are much more accessible.
Well, when I was using Windows more than 10 years ago... formatting your computer every 2 years was a good way to maintain performance; otherwise it would just slow down due to disk fragmentation + crapware accumulation. On Linux I could just install the OS once and use it until I upgraded the machine... like in 6 years.
When I was using Linux more than 10 years ago I didn't even format my disks when upgrading my machine, I just moved the boot drive from the old machine to the new one.
OTOH, I've reformatted my 2012 Linux machine twice since I purchased it, which is twice more than I've formatted my Windows machine over the same period. Granted my Linux box is much more heavily used, but...
Yep - in the long run it has been much cheaper for me as well. I moved full time to Mac in late 2008 with an aluminum MacBook, and used that MacBook daily until May of 2015, when I picked up a loaded MacBook Pro 15. I got almost 7 years out of that first aluminum MacBook (which I actually still use to serve media to my Apple TV), and I bet I get that (or more) out of my MacBook Pro. Prior to 2008 I was getting a new Windows laptop pretty much every other year, as I always encountered the same slowdown you describe. I think Windows laptops now last better than 2 years due to the slowdown in CPU speed improvements relative to what we were seeing from 2000 to 2010, but the build quality of the MacBook was something no one else could touch at the time.
I think the performance improvements have drastically changed due to the adoption of SSDs in Apple's lineup. The MacBook Pros started to incorporate them around early 2012. Windows laptops took a while to catch up, starting with 'ultrabooks'.
Apple’s growth is almost entirely inwardly-focused when it comes to its user base: higher prices, more services, and more devices.
I find myself increasingly stuck in an Apple 2011 - 2013 universe.
For starters, Apple has not yet provided a viable upgrade for my primary desktop machine: a 2011 Mac Mini (2.7GHz dual-core Intel Core i7 with 8 GB RAM and SSD). The specs on the current 2018 Mac Mini lineup are considerably lower than yesteryear's. No idea why. I'd love to upgrade to a faster Mac Mini, but there are really no options, even 6 years later. So nope.
Then there's my 2013 MacBook Air (1.7GHz dual-core Intel Core i7 with 8 GB RAM and SSD). This thing has been a workhorse. That said, I'd love to get a similar form factor with Retina; however, the only somewhat viable upgrade option seems to be the 12" MacBook. Too small, so nope.
Finally, there's my iPhone 5S (2013 as well). It's showing its age (sluggish at times); however, I've replaced the battery, and to be honest, it's still got some life left in it. Admittedly, the iPhone is the only Apple product I own that actually does have a viable upgrade path.
Unfortunately, with the lack of upgrade options for my primary Apple devices, and issues I've had with iTunes and Apple TV, I'm hesitant to buy in again ...
Most other Apple devices and products I own simply collect dust:
Various iPods and thousands of purchased iTunes songs collecting dust - as I switched to Spotify. I switched to Spotify when iTunes + iCloud (perhaps user error, not sure) somehow deleted all of my ripped MP3s from a long time ago. Generally speaking, the iTunes app itself got to the point where it was unusably bloated and convoluted. Not sure why.
Apple TV collecting dust. Replaced by an Android TV due to not making it easy to play all file formats.
AirPort Extreme collecting dust - had multiple issues and it eventually stopped working (bricked).
Finally, MacOS itself has been lacklustre the past couple of releases. I hesitate to upgrade now, due to various bugs and lack of quality control. Apps I previously purchased on the Mac App Store have stopped working as well.
So while Apple embarks on a new chapter in its "middle age", it seems like the previous generation has been left to rot ... with not much in the way of upgrade options. How is this an inwardly-focused strategy?
Agreed. I switched whole-heartedly in 2006, when the combination of Unix, mainstream UI, professional apps, and "It Just Works" showed a lot of promise. The fact that my 2006 MacBook Pro lasted for six years earned a lot of loyalty.
But there doesn't appear to be any strategy or overarching vision anymore. Changes are introduced haphazardly, almost schizophrenically. There is zero focus on software or hardware quality. "Design" at Apple is no longer about the marriage of tech and artistry, but about tech being trumped by artistry.
- Make the keyboard work better? Maybe marginally with Gen 2 butterfly switches, but it's still mostly about the "wow" factor of the Touch Bar.
- Make iTunes work correctly? Nah. Just change the UI.
- Extensive pre-release regression testing for High Sierra? Nah. Ship with root access bugs and a new wallpaper.
- Keep making best-in-class wireless routers? Nah. Not sexy enough.
Apple has abandoned the values that led it to become so rich. It is still profitable, and probably will be for some time, but eventually its wellspring of good will from past successes will run dry. Features don't matter if they don't work.
> Apple TV collecting dust. Replaced by an Android TV due to not making it easy to play all file formats.
The previous Apple TV incarnations were not great, but the current Apple TV is fantastic.
If you need to access your own media, you can run an app such as Plex, Kodi or several of the other alternatives. I've only used Plex, which covers all of my needs.
With the current streaming offerings I'm increasingly less dependent on BitTorrent. Amazon Prime became available last year, but I especially appreciate niche players like FilmStruck (which has a small but rotating subset of the Criterion Collection) and Fandor.
Yes, I used Plex for a couple of years, but it was not really "standalone" ... I had to have a media server (my Mac Mini) running constantly. I looked into Kodi on the AppleTV but it was not trivial to install.
Overall it was easier to just purchase an Android TV device (Nvidia Shield).
I too find that Netflix, a few niche players and the odd movie rental from Google Play covers virtually all of my needs, but there's always something that comes up (someone brings some USB stick over or whatever) which Apple TV doesn't support.
It's odd that Google can develop a TV platform that supports all file formats easily, multiple players etc. and with Apple you still have to jump through hoops to play a file ...
> iTunes + iCloud (perhaps user error, not sure) somehow deleted all of my ripped MP3s from a long time ago
I heard stories that Apple Music or iTunes (I don't remember which one) would scan your local library to see what you owned, add the songs to your Apple account, and delete the local files to save space. See [1] (the title is misleading; you can still lose access to your music, as explained by the article itself).
The Mac mini at one point had a quad core option. It actually had the maximum spec option reduced some time ago. The current Mac mini is less powerful than one you could buy several years ago.
As Apple moves into its middle age, the rational move is to become more like the established vertically-integrated companies.
* Intel
* Samsung
* The British East India Company
If Apple leadership decides to accept this, they will begin to act like their peers:
* Recognize that the smartphone market will not grow like it has in the past. Create and then dominate standards using IP - a barrier to competitors that simultaneously blocks a lot of regulatory burden.
* Recognize that Apple will not be the "innovator" that it was in the past. (Apple will still innovate, but not home runs every time.) Nurture the ecosystem around their product - allow small companies to find exciting new products, then acquire them.
* Recognize that their financials will need to adapt. Diversify their revenue stream beyond just one or two plays.
Apple still has the baggage of its own proprietary reinventions that they try to tightly control. iCloud. Lightning port. AirPlay. MacOS. The net negative impact comes as they gradually end up falling behind nimbler companies, like Amazon.
And yes, this will lead to the ultimate death of Apple as all behemoth conglomerates eventually choke on their own red tape and drown in their own largess. But Apple's days of youth are gone; no amount of dieting or exercise will bring them back.
I don't actually think Apple for the next 5 years will make the dramatic changes needed to realize these ideas. But perhaps a more interesting question is:
Is there any other avenue for a company that has reached middle age?
> Recognize that Apple will not be the "innovator" that it was in the past. (Apple will still innovate, but not home runs every time.)
I don't know where Apple gets this reputation, but they have hardly hit home runs every time. I mean, does no one remember the Newton? Final Cut Pro X? The original Apple TV? The trash-can Mac Pro? Does anyone think the Touch Bar is a home run?
Which is good, because innovation necessarily means not hitting home runs every time, and Apple knows that. That's why they hoard so much cash, so they can afford to miss sometimes.
I question whether you understand Apple's core business. They can't vertically integrate like old school conglomerates because they don't have anything to integrate. They own very few assets apart from their IP. Intel and Samsung own factories; Apple does not. The East India Company owned land and ships; Apple does not. Amazon owns warehouses and many $billions of inventory... Apple does not.
From an IP perspective they are already vertically integrated, from chip design to UI to services to retail. But then you ding them for that...
Absolutely - Apple is vertically integrated as much as they need to be.
They don't need to own Foxconn or Samsung. It doesn't give them any advantage, and threatening to drop a supplier gives them leverage.
They do engineer capacity shortages to slow down their competitors. From 2011 [1], to 2014 [2], to 2017 [3].
I put "innovator" in quotes for exactly the same reason. Apple "innovates" as much as they need to, such as branding it #courage to omit the headphone jack.
If I "dinged" them, I don't see it. I predicted their future strategy, with the caveat that it might take them years to begin executing on it.
Apple's earnings calls are the most obvious signal that their "core business" is what all the chatter is about. Apple knows they can't ride iPhone and iPhone alone forever.
Apple Watch Series 3 is roughly equivalent to the original iPhone. It's a weird blind spot for all the Apple critics.
Best as I can tell, Apple is 2 - 3 years ahead of everyone else.
--
My singular and persistent disappointment remains iCloud. I still want seamless sync and handoff and backup and unlimited storage... That just works. That doesn't require me to do tech supp for my family members.
It's getting incrementally better, but isn't quite the whole enchilada yet.
No one has cracked the ubiquitous computing nut, so maybe it's just a hard problem.
I am on the same page as you... I think the Apple Watch is going to be the replacement for the iPhone. In fact, this will be the largest segment as we slowly see most functions move from the phone to the Watch (yes, I realize there are some limitations due to screen size), but in general you will see much of it move over, while Apple Glass will take over the functions for which you need an image.
> Even iPhone X shows they can still bet big with their flagship.
Does it? The iPhone X (which is a great phone and one that I own) is a Galaxy S8 with a Face ID scanner. They didn't "bet big" on it, it was the only logical place to take their market.
Also doesn't have that Bixby shovelware or exploding battery feature. I get that a lot of Samsung phones are shipped and sold, but there's a difference in shipping shit at scale and shipping quality engineering at scale, and it's a tale told in decades, not single years.
I generally agree that they didn't "bet big" on iPhone X, but I really hate equating it with the S8 because that was a terrible device in my opinion.
First off, just from being in my empty pocket for 2-3 days, the screen was covered in scratches that made a portion of it difficult to see in direct sunlight. I've never had this experience with any other phone.
Then I dropped my S8 from less than a foot, and it cracked the glass on both the front and back of the phone. I haven't dropped my iPhone X yet, but based on my experience with recent iPhones it can probably take a lot more abuse.
The S8 also had some pretty annoying design/UX flaws. The fingerprint reader placement next to the camera was amazingly stupid, and everything related to Bixby has been somewhat of a disaster. I had tried to reprogram the Bixby button to do something useful, but it was still slow, and Samsung insisted on breaking that functionality repeatedly in updates. They ought to know well enough that nobody wants to use their stupid assistant. Or cover their camera with fingerprints.
> based on my experience with recent iPhones it can probably take a lot more abuse.
I closed a taxi door on my iPhone X just before Christmas - phone fell out of my pocket, door caught it horizontally. Cracked the (thin) screen protector but no damage to the phone itself.
In hindsight, knowing how well Face ID works, yes; but going all in on Face ID and ditching Touch ID completely early in the design process was a real risk. I don't see how anyone can deny that.
They could have easily compromised their vision by including touch ID and/or a button. Samsung does this all the time: include every option, just so no one is left unhappy.
Alternatively, "include every option so everyone is left slightly unhappy."
I can't imagine ever going back to a phone with a button. Every time I pick up my daughter's iPad, I try to use gestures on it before remembering that I have to press that button, which already seems like a relic from the past.
It took minutes to get used to swiping up instead of pressing a button. It now feels like the most natural thing in the world, and I wonder why they ever started with a button. (Not really, but that's how natural the swipe gestures feel)
Personally I don't think it's ugly. I think having a solid bezel across the top but not the bottom would actually look worse. And it actually makes the new gestures more intuitive: swiping down from the left of the notch does the same thing that swiping down did before, and swiping down from the right of the notch opens control center.
Unless having a notch was their vision (which you are implying, but I highly doubt), they compromised. And as I said, the compromise was much bigger than having a discreet sensor on the back.
That you got used to it is irrelevant. People got used to not having control on their own hardware too, doesn't mean anything.
Not who you responded to, but I think that it's clear Apple intended on using the notch as a visually distinct identifier to an onlooker, "This is the new iPhone." Multiple times since I got mine at launch, people have noticed me using it and commented/asked about it. The hardware designers at Apple are clearly smart enough to be able to work out something if they didn't want the notch, but they (and marketing) decided to use the size of the sensor package to their advantage. Without the notch, there's no easy way to distinguish at-a-glance what brand the featureless-slab-of-glass is.
Regarding the second part of your comment ("that you got used to it is irrelevant"), during normal usage the notch doesn't obstruct any content on the screen. The screen real estate was gained by the removal of the bezels on the older models, so there's been an effective increase in vertical resolution despite the notch. In 16:9 content viewed in landscape, the notch is hidden by the black of the OLED, and is not noticeable. There's nothing to get used to other than there being a black notch at the top of the screen when used in portrait mode, which is made less abrasive by apps not normally taking control of the upper-left and -right corners (and so that vertical section would be unused anyway).
> The hardware designers at Apple are clearly smart enough to be able to work out something if they didn't want the notch
I don't think it's clear. Do you have anything to support this assumption? I haven't seen an iPhone that is bezel-less without any encroaching notch. They haven't yet been able to eliminate the camera bump, either. This is basic stuff: don't have a crappy cut-out on the screen, don't have a protruding camera on the back. Also, even with the notch, the side bezels aren't nearly as thin as they are on competitor phones.
Pretty clear that they weren't able to reach a design they truly wanted.
The most interesting points of the article were these:
1) iPhone unit sales growth has essentially stagnated, because the overall market has reached saturation.
2) Apple has successfully managed to maintain growth in the face of 1) by increasing revenue per device (ASP).
I think the question of "What does Apple do, now?" is a hugely important one. However, I disagree with the author that the only reasonable approach Apple has is to "settle down" in its middle age position.
That question is important, because it implicitly asks, "What is next?" I don't see Apple deciding not to create a product that attempts to answer that question. We will have to wait and see what that product looks like (AR?), whether it is successful, and when it will come. Apple is definitely not going to rest on the iPhone's success forever, though.
I'd add to this that Apple's current approach to AR and wearables is not one that necessitates the cannibalization of the iPhone. They are taking a constellation approach (as noted in the article), with the iPhone in the center.
I think that approach is not one based in defense, but one that actually plays to their strengths: personal products with an excellent experience, vertically integrated. Because of AR's computational requirements, it will be a long time until we have an AR experience that is untethered to a mobile computing device. A vertically integrated, constellation-based system will offer a better user experience, at least initially. Intel's recent Vaunt glasses [1] could be much more powerful if Intel also controlled the entire device the glasses co-ordinated with.
I actually think a deliberate cannibalization is being done here, but they are doing it slowly so as not to be a shock to the system. Look at the Apple Watch... my gut says you will start to see more and more of the iPhone's functions move over every year. The beauty of this is people will not balk at it, as the Watch becomes the yearly replacement cycle while the phone stretches to every 5 years.
>I don't see Apple deciding to not create a product that attempts to answer that question.
It's funny you say this and then bring up AR. Apple has been ignoring VR and AR for a few years now. Sure, they released an SDK, but it felt like a box they had to check more than an attempt to innovate.
Apple is focused on products, and is only interested in technologies as they can be leveraged into products.
>Apple has been ignoring VR and AR for a few years now.
I strongly disagree with your assessment of ARKit. However, Apple has indeed ignored AR and VR from a _product perspective_ because right now, the technologies cannot be leveraged into compelling consumer products.
From a _technology perspective_ I think Apple has very much not ignored AR. There is a lot pointing to the fact that they are pursuing it, including Tim Cook's own words and the existence of ARKit in the first place.
I really like stratechery, but I don’t get this post. Example
Instead, I think the order goes like this:
Customer owns an iPhone
Customer subscribes to Apple Music because it is installed by default on their iPhone
As an Apple Music subscriber, customer only has one choice in smart speakers: HomePod (and to make the decision to spend more money palatable, Apple pushes sound quality), from which Apple makes a profit
Well, I don’t know. Maybe it’s just:
Let’s make exceptionally good products that people love
I have read somewhere, that Apple has a prioritized list of products to build, which they update yearly at their top 100 (?) leaders’ workshop. Maybe someone from Apple reading this could comment :-)
[A]s a general rule, challengers pursue interoperability while incumbents strive for incompatibility.
This is Strategy 101: seek to fight battles where you have the greatest advantage.
If Apple bought a mobile carrier, it would be competing with its customers. Most phones are sold through carriers - meaning the carriers are buying iPhones, not the end user.
Besides, Apple sells phones worldwide. What would be the benefit of just buying a carrier in the US?
Netflix made a similar calculation years ago. Netflix actually created what is now the Roku, but decided to spin it off to a separate company so it could more easily make deals to have Netflix installed everywhere without being seen as a competitor.
In a completely separate market, another example is that PepsiCo used to own KFC, Taco Bell, and Pizza Hut. But that made it harder for them to do deals with other fast food companies, because those companies didn't want to make deals with a competitor.
> What would be the benefit of just buying a carrier in the US?
Buying a carrier would indeed be a bad move, both for competing with customers and maintaining a lot of infrastructure, but I could see Apple becoming an MVNO. Wireless companies are right down there with Comcast and the airlines in terms of customer satisfaction, so I would guess there are plenty of people willing to pay a bit more for a less hostile carrier.
100% yes, Apple should own this part of the experience. Increasingly, their devices require constant connectivity. The Apple Watch comes into its own with cellular - but the experience of adding it to my carrier (EE in the UK) was absolutely horrific.
A carrier should just be dumb pipes. Slowly, Apple has been eroding the value that a carrier adds (the iPhone Upgrade Program being the single biggest shot fired towards carriers). I would love to never, ever have to deal with a carrier and their scammy sales tactics ever again.
What can Apple gain from being a carrier? Seamless set up of devices and international roaming. In Europe, we already have data roaming across borders and it is incredible to have a device "just work" when I step off a plane. Bringing the "it just works" mentality to the service layer makes a lot of sense for Apple now. Importantly, unlike carriers, Apple doesn't need the cell network in and of itself to make money. It just serves to push the model this article articulates so well.
The biggest barrier they are going to face is the connectivity issue - they can make the best augmented reality glasses or cellular AirPods, or wireless, portable outdoor speakers, but without seamless connectivity, they won't be able to "just work". The rollout of 5G might just be the opportunity they need to execute on this.
Seems to me that, when you boil it down, Apple's primary strategy now is to focus on convenience and simplicity in a purposely anti-competitive way.
What advantage does Apple music have over a service like Spotify besides being pre-installed? What about iMessage? What about Facetime? What about the MacBook with a single port?
Apple is not worried about competition or more importantly innovation, it instead focuses on creating a walled garden that can keep its users happy. The HomePod is not trying to be innovative it is trying to keep Apple users in their ecosystem.
The author seems to praise the vendor lock-in that Apple imposes upon its customers as being a good business move and a sign that Apple has "matured", without really offering any hard facts to back it up.
I'm going to offer an opinion of my own without any facts to back it up. Vendor lock-in may be successful in the short term, but interoperability is successful in the long term. The internet would never have taken off if it weren't for the fact that every website author had access to the same APIs as every other one.
It's bullshit that I can't tell Siri to play music with Spotify instead of Apple Music. It's bullshit that I can't change my browser from Safari, and therefore can't use any Progressive Web App features like push notifications because Apple is afraid of web apps cannibalizing App Store sales. It's anti-competitive and a scummy business move. Android is weak in a lot of areas but at least they got this part right.
That is not at all true. That was certainly not the philosophy on the very successful Apple II series. And it was not how Apple operated in the non-Jobs Mac years in the late 80s and early 90s.
And it's not how they operated for the early part of the 2000s when Apple went on a major open source / standards binge: adopting the standard Intel/PCI architecture, using Unix as the foundation of their OS, open sourcing their kernel, using open source WebKit, etc. etc. Hell, the first version of the iPod even did it right and was basically an external USB hard drive.
But there's been a major turn to the proprietary as the iPhone business has eaten Apple. They've locked that platform down in a big way, and they've failed to embrace the web and cloud in a functional way. There's less and less embrace of open standards there, and more lock-in in both hardware and software.
In a way Apple has always had two hearts -- a Wozniak heart (the open Apple II) vs a Jobs heart (the closed Mac) -- long after those two left the company.
Apple announced support for PCI in 1993 with their original PPC road map. The first gen of PPC Macs were NuBus for backwards compatibility, but the second gen in 1995 were PCI. Apple allowed third-party Mac clone makers to make hybrid NuBus/PPC Macs.
The first-generation iPod was FireWire, not USB, and always had a proprietary file system for actually adding music, forcing you to use either iTunes or MusicMatch (on Windows) to add music.
Apple is no different from Google or Microsoft. They use open source when it suits them, but none of the major companies open-source their crown jewels.
But even if you go back to 1984, all of the other computer makers that made non-generic clones are out of business except for Apple. So that's still 35+ years of Apple going their own way.
People have a soft spot for companies that are on death's door. Case in point: when a startup is closing its doors, it's the biggest marketing boost it ever had, and people complain they didn't already know about it.