It was as if they said, OK, here's the supermarket, now vendors, create stuff and we'll stock the shelves.
Except everyone publishing software today that isn't a game developer is targeting the web. Which means the best software you're going to get for Windows that isn't long-lived incumbent software like Creative Cloud is going to be... running in Google Chrome (or Microsoft Edge).
OK, but Microsoft is in the best position to create great, integrated, first-class technologies that build on how great Windows is. But they don't. I have mixed feelings about this, because when Apple does it, they basically put people out of business.
But here's the thing, in 10 years time, you're going to have Windows, which is still just, and will continue to be, Windows. And macOS, and iOS, and tvOS, and every other Apple OS, is going to be so much more. And they already are so much more.
For those who haven't experienced it yet, or can't afford Apple products, they're missing something that just hasn't happened in computing in any other period of time I can immediately think of.
Or, idk, I'm blind because I think Apple products are so great. The latter is probably more likely.
I think Apple is better positioned, with total vertical integration, to lay the groundwork for the next platform (AR), and has been doing so for years now. They'll ship some hardware when the time is right.
They've successfully repeated this approach since the iPod: never first to market, but laying the groundwork before shipping the best-in-class product.
FB (Zuck specifically) recognizes the next platform, and Oculus is a bet to win it. Their issue is they don't have their own phone OS; they have to build the Oculus platform up from scratch (which they've done a decent job of). They also have a brand issue (I personally dislike their ad-driven business model).
It'll be interesting to see what hardware Apple ships - the UI potential for AR is enormous and very cool. I know Zuck sees this and is public about it; Apple is acting in a way that makes clear they see it too - they're just quiet about it until they ship.
Looking at tiny glass displays will be a funny anachronism of our time.
It's Android, right?
In fact, none of the major platforms we use today was simply predicted and constructed. They've evolved and shown their benefits naturally, and not entirely in expected ways. No one actually planned for the web to be an application platform - it started as a way for universities to exchange papers. And the Internet before it was a military communication network.
Your predictions about the grand future of AR remind me of the excitement around VRML a couple of decades ago.
"Looking at stale 2D web pages will be a funny anachronism of our time" we thought. Turns out making existing content more fancy in 3D wasn't that useful, it actually was more cumbersome both to create and to use, so 3D web pages died before they even had a true chance to live.
The iPhone was a UI step change improvement over previous 'smart phones' and the app ecosystem came out of that.
The ground work being set in the OP's post is about getting things ready for hardware that can then take advantage of it.
It's possible to make predictions based on trends and on the capability of hardware that becomes possible when it previously wasn't: https://www.youtube.com/watch?v=sTdWQAKzESA (Also see Douglas Engelbart's Mother of All Demos: https://en.wikipedia.org/wiki/The_Mother_of_All_Demos). Xerox PARC too - computing history is filled with examples of people pulling the future down because they recognized what was possible.
Just because 3D websites are a bad UI doesn't mean looking at little hand-held glass displays is the best one. Likely in AR we'd still pull up flat 2D websites a lot of the time, you just wouldn't need to pull out a little glass display to do it.
Michael Abrash used to have a blog post about the hardest AR problems from when he was at Valve (he's at Oculus now): drawing black, and latency. The latter is mostly a hardware constraint; I'm not sure anyone has solved the former (the Magic Leap sucked).
If the hardware is possible, the UI benefits seem big.
Google Glass didn't die because it wasn't AR, it died because wearing glasses all the time wasn't practical.
Not practical technologically, in terms of battery life and weight, and not practical in the simpler sense that you don't need to interact with some digital UI every waking moment of your life.
Also while in our imagination we can conjure up virtual displays in AR and use them for complex UIs, actually waving your hands in empty air, aside from being super weird, is also super inconvenient, compared to handheld multitouch glass.
You're not making AR predictions based on "current trends". You think you are. Instead you're trying to draw a straight line from the present reality to your favorite sci-fi movies that have shaped your idea of what the future is going to be like, while also skipping over all the pesky details that can trip up that idea from concept to realization.
In other words, the same reason why everyone was dead set that flying cars were coming. And yeah, the generic "no one believed in Xerox PARC, no one believed in trains and car engines, no one believed in airplanes" argument was brought up about flying cars too. Turns out that this argument is not an automatic win for believing whatever you wanna believe is coming.
Just because someone didn't believe in airplanes doesn't mean I can't roll my eyes at predictions that faster-than-light travel is just around the corner.
The timing has to be right and the hardware has to be possible - if you're too early it won't work.
Flying cars is a bad comparison - they mostly don't exist in widespread use because of reasons not related to computing. Risk, fuel, control, etc. - even then rich people do have helicopters (though that's mostly different).
Pointing out failed predictions does not imply that all predictions are similarly wrong. In computing - the examples I showed (and there are others) are more relevant.
And nothing has changed about that.
> Pointing out failed predictions does not imply that all predictions are similarly wrong. In computing - the examples I showed (and there are others) are more relevant.
Most predictions are actually wrong. Let's see the hardware that "doesn't suck", let's see the UI and utility that "are there" and then I'll tell you if we have a winner or not.
Right now we have nothing except bold fantasies powered by sci-fi movies full of cheap hologram VFX.
Yet - it's a prediction based on the capability of future hardware that seems plausible.
> Let's see the hardware that "doesn't suck", let's see the UI and utility that "are there" and then I'll tell you if we have a winner or not.
Yeah sure, it's way easier to make predictions in hindsight after other people have already built it. Even then - when the iPhone launched in 2007 it was largely panned in a similar way to what you're doing now.
Making accurate predictions is hard - I agree with that. If you dismiss everything you'll be right a lot of the time, but you'll also miss every big and interesting change until someone else builds it.
And nuclear fusion reactors are only 20 years away. That's a concrete goal and it's more likely than some vague new "metaverse". I promise, you'll still be posting on HN when anything resembling a shadow of this is released.
I do this every day of my life.
I think Microsoft does far more interesting research than Apple, which largely (but not completely) just buys new ideas, like Cisco. An effective strategy, granted. I feel MS has a better sense of where development is going than Apple, as shown by VS Code and the purchase of GitHub.
"missing something that just hasn't happened in computing in any other period of time I can immediately think of" is just hyperbole. To me, Apple has always been the company that takes other people's ideas and polishes the hell out of them - which is amazing, but when I look at my 30+ year tech career, I don't get very excited about Apple.
The word “garden” suggests bespoke curation for appreciation and abundance, while “walled” suggests within this boundary the experience is purposefully tended and safe.
The Japanese put them in the middle of their homes. The British write children’s literature set in them. Just look — the very idea of a walled garden is lovely:
In film: https://i.pinimg.com/originals/5b/52/41/5b524125c48b35bfe0ca...
In life: https://en.wikipedia.org/wiki/Great_Maytham_Hall#Gardens
In my experience with the non-metaphorical variety, most anyone who can afford one prefers it.
> bespoke curation...purposefully tended and safe
These ideas are antithetical to what many on HN seem to want. I personally love Apple products, but walled gardens are great when you don't bump into the walls all the time, which a lot of other people seem to do.
In these cases (wall bumping) I think it's reasonable to want alternatives. But there are fanatics and anti-fanatics that I think skew any discussion about Apple.
Personally I think they should have that freedom. If Apple really had no competition, I would absolutely change my mind. But so long as Android exists, my view is that consumers deserve the right to choose Apple’s garden if they want. And they can choose to leave it. Millions of people do every year. Let the market decide.
Don't be embarrassed to admit that you're the one wanting to make up new rules. It's okay to want new rules. Be honest about it.
If you are interested in having your perspective challenged, the commentator I most agree with on this is Hoeg Law, who has published extensive commentary on the Epic vs Apple lawsuit. If you are interested in hearing a view that differs from your own, I can commend his publications to you:
Yes, after a couple of decades of large companies consolidating platforms into a handful of sites and services that the majority of the population uses, it's time for some additional rules. I completely support the Cicilline bill.
If you'd like to get out of your bubble, I'd read the open markets institute's briefs in the Epic case and the Cicilline bill.
This belief suggests that perhaps you might be the one stuck in a bubble. I've been greedily consuming alternative views on this case but seen very few legal analysts describe the case as being anything other than an uphill climb for Epic.
I have read numerous amicus briefs from the Open Markets Institute, but I'm not aware of them issuing an amicus brief in the Epic vs Apple case. If this exists, can you please offer a link to that?
The latter is a proposal to change the rules and is therefore not relevant to the question of whether Apple were "follow[ing] the same rules," as you put it. Needless to say, I don't agree with the Cicilline bill or any other attempts to frame competition on a platform level. Most egregiously, they almost always carve out special interest exceptions and protections for video games on televisions, but not video games on mobile devices.
Have you got anything else you think I should read?
Rules can be better defined and updated, so I don't think the Cicilline bill is "changing" the rules. It's just giving the FTC an explicit mandate instead of the hands-off position it has taken so far wrt Apple. The other platforms get some new/updated rules too.
I don't see the "special interest" distinction you're making at all, Apple is free to set the rules they want in their App Store. They just can't disallow alternate app stores like they have been doing.
Apple themselves argued "people buy devices" so why shouldn't we frame competition at the platform level? Why shouldn't people be able to do what they want on their phone instead of needing two phones?
The open markets page I linked to has a link to the Epic brief they filed: https://static1.squarespace.com/static/5e449c8c3ef68d752f3e7...
That article is utterly rudimentary reportage. There is no legal analysis, or really any depth of analysis at all. You really should seek out specialist analysts like Richard Hoeg who have in-depth, intersectional understanding of both competition law and technology platforms.
If you don't like the idea of watching long-form commentary, his speaking style is clear enough that I find him easy to follow at 2X speed. Use the YouTube shortcut keys shift-comma (<) and shift-period (>) to quickly change the playback speed.
> Rules can be better defined and updated
In the excruciatingly pedantic context of law, "better defined rules" and "updating the rules" are synonymous with changing the rules. You can spin it however you like, but as the law stands right now, what Apple is doing is legal.
Out of curiosity, do think that Sony, Microsoft and Nintendo should also be forced to dissolve control over their platforms as well?
> They just can't disallow alternate app stores like they have been doing.
Until such time as a law is established to disallow walled gardens of technology platforms, they absolutely can. You are welcome to argue that they shouldn't be allowed to disallow alternate app stores, but arguing for the status quo to be changed is an entirely different argument.
It's also worth noting that the law is absolutely on Apple's side when it comes to licensing Apple's intellectual property, which all iOS developers must do in order to use the software development tools they supply. So even if Apple is forced to allow side-loading of alternative stores, there is absolutely no question that Apple would be allowed to require a percentage cut of revenues from apps developed using their tools—just as Epic is entitled to ask of developers who use Unreal Engine.
> a link to the Epic brief they filed
That is not a brief filed in the Epic vs Apple trial. It doesn't have anything to do with Apple. As far as I can tell, no such amicus brief exists. Yet you seem convinced it does exist and that you've read it.
That's an interesting prediction. So you are claiming that the judge in the Apple vs Epic case will not rule against Apple in any way, on any of its behavior.
That is a strong claim for you to be making. And since you have made this prediction, I will be sure to come back to your comments in a couple months.
Because I am pretty sure that the judge will rule against Apple on at least some issues. Probably not the whole thing. But there will be some things that Apple is doing, which will be ruled illegal.
The real question is: if this happens, will you back off from your extraordinarily strong claim that absolutely nothing Apple was doing will be deemed illegal by the judge? Will you admit that maybe you didn't think this all through?
It is not inconsistent for something to be legal today and then become illegal after a Judge issues a ruling. Right now, what Apple is doing is legal. It may become illegal in future.
My prediction of the outcome is, I think, a coin flip between the Judge ruling entirely in Apple's favour, or a ruling that is mostly in Apple's favour but requires a minor relaxation of certain rules around messaging inside apps about alternative methods of payment. (However even in the event of the latter, this would not represent any impediment to Apple requiring a percentage license fee when payment occurs through an alternative method. In which case it would be moot.)
For anyone following the trial directly—not just consuming other people's interpretations—it was clear that the Judge was repeatedly seeking some sort of bone to throw Epic's way, slicing the thinnest possible piece for Epic. It all hinges upon what the Judge determines to be the relevant market, and what is considered an acceptable substitute under the Sherman act. Somewhat ironically, Epic is a terrible plaintiff in this respect since their games are available on many platforms.
No, actually. It would mean that it was already illegal. That's why the judge would rule it that way.
Judges interpret the law. So yes, if the judge rules this way, then this means that you were wrong to claim that Apple's actions were not illegal.
And I look forward to coming back to this post when this happens, so that I can show you the explanation of why the actions were illegal on some fronts.
> Right now, what Apple is doing is legal.
Well if the judge says otherwise, on some fronts, we will be able to look back on this post, and re-evaluate then, won't we?
> become illegal in future.
No, becoming illegal would be if a legislator changed the law. Instead, this case is about interpreting existing law. So if the judge rules this way, then it means that Apple's actions were illegal. They did not become illegal. They were always illegal.
But, of course, I can already see people making up excuses to ignore the judge if the judge disagrees with them. If the judge disagrees with them, and she says that the actions were illegal, they have already made up an excuse for why the judge is wrong.
Personally, I trust the opinion of the legal system. And that means that if the judge rules that some of the actions were always illegal, then that means that the judge is probably right, and you were wrong.
> it was clear that the Judge was repeatedly seeking some sort of bone to throw Epic's way
Ah, so then you agree that some of Apple's actions are illegal, and you take back your previous statement. Got it.
Glad you agree that the judge will likely rule that some of Apple's actions were always illegal.
The matter in question is distinctly novel and any ruling from the Judge will establish a new interpretation of existing law—laws which date back to 1890 and cannot possibly predict the creation of computer software ecosystems. A ruling could make Apple's past actions retrospectively illegal. But that doesn't make them illegal today, before the ruling has been issued.
> And I look forward to coming back to this post
As do I. I've bookmarked this thread and we shall reconvene when the judge returns her verdict.
> But, of course, I can already see people making up excuses, to ignore the judge, if the judge disagrees with them.
On both sides.
> Ah, so then you agree that some of Apple's actions are illegal
No, I don't. In my opinion, the correct verdict should be entirely in Apple's favour. I disagree with the argument made by Epic that Apple has done anything illegal. But even if the Judge makes some relatively trivial concessions in Epic's favour, it would be a bit rich for pro-Epic commentators to crow if none of Epic's substantial demands are rejected.
To the extent I think Apple should change any of their policies around the App Store, it is for the sake of PR and not law—to protect Apple's business model from the coming onslaught of badly conceived "anti-monopoly" legislation.
You literally just said that you think that the judge is going to rule that at least some of Apple's actions are illegal. That is what your previous comment means.
If you think that the judge is going to rule against Apple, in any way at all, no matter what, it means that you agree that some of Apple's actions are illegal, currently right now.
So, if the judge in the Apple case takes any action at all, no matter what, on any issue, against Apple, then that means it is a true fact that Apple was engaging in illegal conduct.
No matter how much you dismiss the judge's ruling, if there is any order at all against Apple on anything, then that means Apple's actions were indeed illegal on the points the judge identifies.
> it is for the sake of PR and not law
Well, since you said "not law", then that means that if the judge does anything at all, on any issue, against Apple, then that means that you were wrong about the "law" part of it.
> and in my view the correct verdict should be entirely in Apple's favour
Oh, the excuses are coming out already! Before the verdict has even been made, you are coming up with reasons to dismiss it. As expected, no matter what the judge says, you will come up with an excuse to ignore the evidence.
What is even the point of pretending like you care about the legal system, if you are going to make up excuses, already, about why the judge's opinion is wrong?
I find it difficult to believe that you can't see a distinction between what I predict the Judge might do, and what I believe the Judge should do. This is so beyond disingenuous that I'm no longer interested in this conversation. I have deleted the bookmark to this thread and will not be returning in future.
Don't worry, I'll remind you. But, I fully expect that you will come up with an excuse, when the judgement happens, as for why the judge is wrong to declare that at least something that Apple has done, is illegal.
You are being disingenuous in your interactions. It appears you have form in this regard. As this other person said, you seem pretty intent on deliberately misinterpreting as strongly as possible.
I'll leave you to it.
Like I said, this is already disallowed under the Sherman act, and regulation will make it explicit. If an OS maker had sued Apple for tying a device to the OS, it would be illegal under the Sherman act too.
No argument on Apple's tools, there are other tools available.
I'm kind of taken aback by your misrepresentation of the Open Markets Institute links I posted. You are being disingenuous when you say there's no such brief.
Wow, did you seriously paste a link without reading it? Open Markets Institute did not file an amicus brief in Epic vs Apple. What you linked to is an amicus brief in the matter of Shah vs VHS San Antonio, a substantially different case which pre-dates the Epic lawsuit. Seriously, click on your own link.
OMI did release two paragraphs about Epic vs Apple in the form of a press release. Did you think that was an amicus brief? I'm starting to think that maybe you don't know what an amicus brief is.
> this is already disallowed under the Sherman act
No, it isn't. You do know that the Sherman act doesn't make it illegal to operate a monopoly, right?
> If an OS maker had sued Apple for tying a device to the OS, it would be illegal under the Sherman act too.
You've just demonstrated that you don't understand what "tying" is in the context of antitrust.
If you've followed the case, it's clear what anti-competitive harms Epic is alleging from the iOS app distribution market.
You seriously don't have anything more substantive than that?
I have listened to hours upon hours of the actual court proceedings in Epic vs Apple. Have you listened to even a single minute of it? I found Epic's case to be muddled and oftentimes contradictory. Most of their ambitious claims fell quite flat when tested in court. It certainly isn't helped by Tim Sweeney making some fairly ridiculous claims on Twitter, e.g. that Apple's commission should be more like credit card rates. Fairly ridiculous when Epic runs their own store, which they admitted isn't yet profitable at a 12.5% commission. And I'm really confused why they couldn't make money at 12.5%, since the Epic Games Store is barely more than a shopping cart and game downloader.
Hoeg Law's final video includes a contextual refresher on the Sherman act. It's really worth listening to. Just six minutes, or three minutes at 2X speed.
I'll leave it here.
I'll take the wild, chaotic natural environment over a walled garden any day of the week. So much more interesting, even though there is danger.
Which makes perfect sense, because you can't take the tree from an existing garden and plant it in your own without taking the roots and soil with you. Or put another way: you can take the tree, but you'll be leaving a hole for everyone else to fill in.
It's a solution to a problem and everything is integrated.
Not every solution has to be super modular, plug-and-play, work with everything everywhere forever, and allow the end user to self-repair. Not everyone wants that.
Which makes all the difference.
That being said, iOS is absolutely a walled garden, though I rarely find myself in need of something that it doesn't accommodate anymore (with the exception of WebXR, which isn't exactly popular yet). But Microsoft doesn't have its own mobile OS anymore, so not sure this is analogous.
I was all in on the Apple ecosystem the past couple of years, except for the Watch. And I'm struggling to think of something that's impossible to do with Windows and non-Apple devices besides AirDrop - and that didn't even work half the time between my iPhone and Mac mini. Sidecar is a cool feature to have built in, but apps like Duet and Astropad do the same thing...
And I don't think it's a bad thing that MS doesn't Sherlock devs either...
If you have Apple Watch, you can authenticate services on your computer just by double-tapping the Watch side-button. It also unlocks your nearby iPhone if you have the Watch on.
You can trigger Do Not Disturb across the ecosystem from any of your devices.
Find My lets you keep track of all devices in the ecosystem with a global lost and found mesh network built in.
Apple Pay works seamlessly across iPad, Mac, Apple Watch, and iPhone. My Mac uses my iPhone's FaceID or a confirmation on my Watch to authenticate payments - no setup or weird configuration required.
AirDrop works perfectly for me with all my devices. I can beam documents and data from device to device without thinking about it.
Handoff works across all my devices. I can start an e-mail on my iPad and continue on my Mac instantly.
I can see the charging/battery state of my iPad, iPhone, Apple Watch, and Apple Pencil from any device in the ecosystem, including Mac.
AirPlay works across my iPad, iPhone, HomePods, Apple TV, Mac, and even Apple Watch. AirPlay 2 is incredible and has no delay. Rock solid connection - much better than any other protocol I've used.
If I get a new device, setup and integration with the rest of my devices is instant - no configuration required. This is especially great with AirPods - I don't have to pair them with each device. I just pair with 1 and I'm good to go on the rest.
Apple Store support for everything in my ecosystem.
etc etc etc
I can configure Windows to lock my PC if my phone is not nearby. Which makes a ton more sense than unlocking it if it is.
As for the rest, I've never had much success with all that being "seamless". I could hardly ever get my AirPods to show the battery level of the case by opening it and holding it near my iPhone.
Well the Watch has to be on your wrist, unlocked, authenticated with your iCloud account and close enough to trigger NFC.
It's actually one of the best non-fitness related things about the Watch imo.
Point taken on some of the features not always being seamless but I've had much more success than not and I really appreciate all the ways the ecosystem works together - I can't imagine not having some of these features.
You know that if you don’t want it, you can simply choose to not set it up that way, right?
Remember this feature competes with unlock mechanisms that can be defeated by a photo held in front of the camera, not with Fort Knox 6 factor identification.
On a windows laptop, you have Windows Hello, which is very reliable and seamless. Most of the times you won't even notice it. No $400 watch required!
> You can trigger Do Not Disturb across the ecosystem from any of your devices.
This is just a single click. I don't see the big deal.
> Find My lets you keep track of all devices in the ecosystem with a global lost and found mesh network built in.
Cool, but I don't see much use for it. I have literally never needed to track my laptop or phone or keys.
> Apple Pay works seamlessly across iPad, Mac, Apple Watch, and iPhone. My Mac uses my iPhone's FaceID or a confirmation on my Watch to authenticate payments - no setup or weird configuration required.
My credit card works seamlessly across Apple, Android, Windows, real-life. My browser stores the card number and pre-fills it for me, no matter the device. No setup or weird configuration required.
> AirDrop works perfectly for me with all my devices. I can beam documents and data from device to device without thinking about it.
Very useful in some situations. However, I hardly need to use it since my android phone is auto-syncing photos and videos to my Onedrive, which is syncing to my laptop. This works across all ecosystems, unlike airdrop. When I do need to send a particular file, Your Phone app works beautifully.
> Handoff works across all my devices. I can start an e-mail on my iPad and continue on my Mac instantly.
I can do the same across all ecosystems using gmail and/or outlook.
> I can see the charging/battery state of my iPad, iPhone, Apple Watch, and Apple Pencil from any device in the ecosystem, including Mac.
Cool ability, I haven't thought about it, but I think I would enjoy it. Your Phone does show my phone's battery on my laptop.
> AirPlay works across my iPad, iPhone, HomePods, Apple TV, Mac, and even Apple Watch. AirPlay 2 is incredible and has no delay. Rock solid connection - much better than any other protocol I've used.
I am not sure what AirPlay does. Is it something like Bluetooth for streaming music? Bluetooth works fine for me most of the time, and Chromecast also works well. Neither of them is locked into any particular brand, either.
> If I get a new device, setup and integration with the rest of my devices is instant - no configuration required. This is especially great with AirPods - I don't have to pair them with each device. I just pair with 1 and I'm good to go on the rest.
Same with Galaxy buds. I don't even need to have bluetooth on. The phone will still detect the buds.
> Apple Store support for everything in my ecosystem.
Not sure what you mean by this.
As you can see, I don't find any great benefits to the Apple ecosystem compared to other, more open systems. And it doesn't hurt that almost everything is much cheaper than iStuff too.
It's the details, integration and quality that you can't capture in words that make all the difference.
It's easy to downplay the features like you're doing, but the magic is all of them working together in one ecosystem. Even the tiny features - like being able to ping my phone from my Watch - I use that one every single day to find my phone in my house (if I left it in a jacket pocket, for example).
Sure if price is an issue, Apple's ecosystem is not cheap.
It's almost unthinkable that there's some in-software feature a lot of people like that can be done on an Apple OS and not on a Microsoft/Android OS.
The question is not whether it can be done, but how easily it can be done. If you have to jailbreak your phone or cobble together a collection of different applications to simulate the experience, for 80%+ of people that's basically equivalent to impossible.
I own both. From my POV, I'm able to do more on Windows, even if I prefer macOS for day-to-day use and iOS for my phone (though again, I can do more on Android).
For a few years now there has been some noise that Apple’s Pro hardware lineup isn’t Pro, that macOS is becoming more locked down and more iOS-like, that the software is buggier than ever, and - until they dropped their App Store split to 15% for small developers - that 30% was too high a price.
With WSL, VS Code, and GitHub, Microsoft are making a (messy, so far) play for developers. But as you point out, they didn’t take it to the nth degree and nail the execution. I wonder if they’ve missed the opportunity now.
I develop for web on macOS, and have all Apple hardware, so maybe I’m missing some of the other things Microsoft have done.
Maybe it’s the lack of verticality MS have, and the integration between devices - something they can never match Apple on - that means there hasn’t been the transition they were aiming for. That’s why I’ve not switched.
I’ll be very interested to hear what people think about this.
What happened was open source software, linux dominating the server side, and android dominating mobile (iOS as well, to a lesser extent).
> Except everyone publishing software today that isn't a game developer is targeting the web.
Except for nearly all backend development... which mostly targets linux first.
> OK, but Microsoft is in the best position to create great, integrated, first-class technologies that build on how great Windows is. But they don't. I have mixed feelings about this, because when Apple does it, they basically put people out of business.
For example? Apple software is average, and Microsoft has a long history of 'putting people out of business' (not in a good way).
> But here's the thing, in 10 years time, you're going to have Windows, which is still just, and will continue to be, Windows. And macOS, and iOS, and tvOS, and every other Apple OS, is going to be so much more. And they already are so much more.
You forgot... Android and Linux. How does Apple 'already have so much more'?
> For those who haven't experienced it yet, or can't afford Apple products, they're missing something that just hasn't happened in computing in any other period of time I can immediately think of.
That just sounds like an opinion of the average non-tech user who lives inside an apple bubble and likes icon shapes and default keybindings. You're not saying anything specific, just poetry about apple being 'first-class' and 'something that just hasn't happened in computing in any other period of time' (???)
Some of those, perhaps, but macOS has been on a steady decline for the past five years. Apple doesn't seem to care about the Mac platform except as a lifestyle accessory/devbox for iPhone owners. Windows may be limiting itself by merely trying to provide a decent daily-use desktop environment, but Apple isn't even doing that anymore.
And what is wrong with this? A stable, long-lasting platform is a beautiful thing.
Maybe I'm an old fart (at 36...) but I don't really want "so much more". I want something that I can count on and bank on now, and that will be there for me for as far into the future as possible. None of this lofty futuretech seems to promise that, at least without bleeding me dry in pointlessly recurring interactions like subscription fees.
All pro or semi-pro creative applications target native.
live streaming / video editing / animation / DAW / image editing
There are some toys for parts of these workflows in the webspace, but no serious tools for people who do this stuff seriously.
I am not an old-timer like some here with a Mac SE, but in 20+ years this is the biggest comeback Apple has made, and one many of us were once hoping for. For me it was a simple thesis: something that is so much better deserves to be in a better position. Maybe it was more of a hatred of Microsoft, in a way, since they were shipping crap after crap.
And here we are, Apple dominating and set to continue doing so. But since 2016 Apple smells exactly the same as Google in 2003: "Don't be evil" hypocrisy.
Except this time around Apple has the better product, services, and brand. From a strategic point of view there are no major flaws or weaknesses. Google's only major weakness was relying on the browser for access. That is why they created Chrome. They also saw what could happen if everyone were using iPhones: their access would be dictated by Apple, so they needed to build Android.
But Apple has no access problem. They are the final end point (or starting point would be more appropriate) of all technology and digitalisation.
> I'm blind because I think Apple products are so great.
Not really, objectively speaking it is hard to argue Apple's products are not some of the best on the market.
And they are the only ones who have this.
It is called: Mesh.
As a fellow cult member who hasn't drunk enough kool aid, Apple products have been losing their luster in recent years.
Apple TV still doesn't have hands-free voice control. Fire TV has had this for years now. Using your iPhone isn't great because it doesn't have the microphone setup to consistently hear your voice well. Do Siri and Apple TV finally work together with the HomePod, like how Google TV uses Google smart speakers?
Apple MacBooks have suffered immensely from the near-useless Touch Bar and terrible keyboard.
Thinness as a feature seems to override everything else. I don't care about having a thin desktop. I want something I can open and replace parts in without paying about $10k.
The list goes on. Including how there's still no word on the VR / AR unit, while Facebook mops the floor.
The only light in the darkness is the M1 chip.
Yes it does. I use it everyday. Which Apple TV do you have?
> Apple MacBooks have suffered immensely from the near-useless Touch Bar and terrible keyboard.
Still the best laptops in the market by a mile.
> Thinness as a feature seems to override everything else.
Many recent products have actually been thicker. New iPhone is thicker/heavier. New iPad is thicker/heavier. AirPods Pro are chunkier than regular AirPods. AirPods Max are some of the heaviest headphones in the market. Their new Mac Pro is a massive, modular beast. etc etc
> Including how there's still no word on the VR / AR unit, while Facebook mops the floor.
Apple is never first to market. But when they do enter, they tend to wipe the market clean. They also never talk about unreleased products.
> The only light in the darkness is the M1 chip.
This chip and associated tech will power their entire ecosystem and make every product better.
The 4K, one model older than the M1 generation. I don't mind buying a HomePod to enable it (I'm not happy about it, but less unhappy than being forced to do it with iOS devices not designed explicitly to hear you anywhere in the room), but to my knowledge commands from the HomePods won't make it to the Apple TV. Am I wrong?
> Still the best laptops in the market by a mile.
They are still considerably worse than older MacBooks, since thinness and removing the ability to upgrade hardware were the priorities. The only thing that keeps me in the Apple computer ecosystem is macOS.
> New iPhone is thicker/heavier. New iPad is thicker/heavier. AirPods Pro are chunkier than regular AirPods. AirPods Max are some of the heaviest headphones in the market.
That's a great trend since Ive left. Looking at the new iMac line was disheartening.
My brain didn't register the "hands-free" part of your comment. I think you're partially right but not completely.
I have a pair of HomePods hooked up to my Apple TV 4k and I can do basic hands-free commands like "Hey Siri, pause", "Hey Siri, continue", or "Hey Siri, go back 20 seconds".
But more advanced commands like "Hey Siri, open YouTube" don't work - you still need the remote to navigate around afaik. I agree with you though, it would be cool to have much more advanced support with HomePods and Apple TV. I want to be able to navigate the interface with no remote at all.
I set up a Google Nest speaker thing for a friend's parents (they were all Android users, so it seemed like the best fit). I connected it to their network, plugged it in, then said "Hey Google, turn on the TV" — their smart TV turned on. I said "Hey Google, play Cory Carson on Netflix", and it opened Netflix on the smart TV and started playing an episode of the kids' show.
I was blown away because I had no idea what TV they had, how it was connected, and didn't do anything to set it up other than connecting the Google Nest speaker to their WiFi.
I really wish my HomePod + AppleTV could do that! The best I get is "Hey Siri, turn on the living room TV" and even that only works by the grace of HDMI CEC (randomly, it will just turn on the AppleTV and will fail to turn on the TV)
The Google Nest speaker was able to turn on a random smart TV in the same room with zero setup, and not even a direct connection. Play specific shows and more. My HomePod + AppleTV can't do anything close to that
All that said, I refuse to allow any smart TV in my house to be connected to WiFi. So I doubt the Nest solution would work for me. It was just impressive to see it in action
I’m hoping that Apple steps up their game, but it’s been 8 years already, and only now are they releasing a less expensive version of the HomePod. I don’t feel that Apple knows what people want anymore, which leaves Amazon to continue dominating the space with their cheap products. In Amazon’s defense, not all their smart home products are bad-cheap, and no one is still close to really touching them in that space.
> The lattermost is probably more likely.
Isn't Microsoft already shipping AR headsets to the US Army? If anything, they are the leaders in the space.
Anyway, as far as "Metaverse" goes, I'd argue we already have multiple, and the presentation layer is just gradually changing.
Apple has cloud services, but they are, for the most part, intended to help sell more iPhones. Can't argue with Apple's approach since it works well revenue-wise, though.
Android hasn't really faced many obstacles in cloning things. Consumers in China, for example, still buy Android phones, despite iPhones and despite having enough money for those iPhones. I don't know if anyone is missing out on all that much. I wouldn't say they are like, unenlightened. If anything the Android ecosystem has led to more digitization of life in China than in America, far faster - it looks way more like the metaverse in terms of computing taking over daily life than here, which is to say that you're really far off the mark in terms of what really matters.
Anyway, this metaverse stuff. It blows. Roblox blows. Second Life blew. World of Warcraft is great, but you can hear about why it's great direct from the source (https://www.media.mit.edu/videos/conversations-2014-05-07/) and it's all about really carefully curated and purposeful design choices, the user driven parts of it are not why the game was so good.
Existing metaverse experiences blow not for lack of immersion! So the technology will change little of that. If Roblox was photoreal, it would still blow. Minecraft AR didn't matter. None of that shit matters.
A company that barely supports 1 external video card vendor, that releases shit graphics APIs, that sues game companies - they're not going to break ground in the "metaverse." Apple Arcade games are good, but that's because they are fighting the real antagonist, the real horror show: free to play gaming. That is the antagonist of the metaverse, having to be free and monetize people via Robux or V-Bucks or whatever it is that activates neurons in a 10 year old's brain. It's got nothing to do with the technology.
And given that they don't sell hardware, there isn't money in it. And if there isn't money in it, it isn't worth getting sued over trying it.
So here we are.
Even your example (Creative Cloud) has an iOS and iPadOS app.
In Ellen Ullman's book Life in Code, she writes about Whitfield Diffie's (of Diffie-Hellman fame) speech at the 2000 Computers, Freedom, and Privacy conference in Toronto.
> "We were slaves to the mainframe! he said. Dumb terminals! That's all we had. We were powerless under the big machine's unyielding central control. Then we escaped to the personal computer, autonomous, powerful. Then networks. The PC was soon rendered to be nothing but a "thin client," just a browser with very little software residing on our personal machines, the code being on network servers, which are under the control of administrators. Now to the web, nothing but a thin, thin browser for us. All the intelligence out there, on the net, our machines having become dumb terminals again."
I've been thinking about this a lot lately. Smart people like Diffie saw this happening more than 20 years ago.
It's really made me rethink web apps that don't need to be web apps (including stuff like electron). Like you said, Microsoft has thrown in the towel and it seems now Apple is really the only platform making a compelling argument for native apps and keeping the computer smart.
People got busy with computers when computers were the size of houses. They carried mobile phones when phones were the size of suitcases. Is the proto-AR revolution happening somewhere out there without anybody noticing?
It's possible that there is no low-hanging AR fruit. That somebody has to do the difficult job of climbing up the AR tree, so to speak (refining the technology until it feels like magic).
It just seems that this would be an exception in how things have played out so far in the "digital transformation" journey. It's the nature of the human brain to fill in the gaps and overlook the rough edges when it really has an incentive to do so.
Investment in the sense of personal time by users, creators, business people, etc., to really learn and use the technology to scratch whatever itch they found it scratching...
For one, it arguably requires even more CPU/GPU than VR because unlike current phones and current VR you really want ALL the apps to run concurrently in AR. You want your social media app to show the names and/or interests of the people around you. You want your mapping app to show you directions. You want your restaurant app to point out which restaurants have seats available or a low waiting time and price / cuisine / style. You want your virtual pet to run around your feet and body, you want to see each person's virtual fashion, you want your virtual monitors showing your older 2D apps running, you want your virtual cooking instructions, and while sometimes you'll want to turn down one app over another, usually you'll want many running at the same time highlighting different things. You can even argue you want the various apps to share 3D spatial data (sans privacy issues) so your virtual pet can interact with your friend's virtual pet and the virtual furniture they have placed in their house via some furniture app.
One app at a time is just not good enough for the general population to get iPhone-level interested in AR.
Getting full-view glasses with depth sensing fast enough that things can be correctly occluded, running all apps simultaneously, and having the batteries last all day seems many years away.
AR doesn't have to have those same requirements, because you're not going to get nauseous looking at a buggy label flickering in the distance. You don't have to spend compute cycles to trick your brain into believing what you're seeing is real, because it actually is real (minus the overlaid 3D models, of course).
Now, sure, you will have to spend compute cycles on depth sensing and mapping your immediate area. But that's something the OS should/will do, not every app simultaneously. If you think about what an AR operating system would be responsible for doing, mapping your surroundings and providing that data via API is probably one of the first features it would have. It's no different from Windows or macOS communicating with your monitor so that your applications can draw on it. Similarly, every app likely won't be responsible for drawing its UX onto the user's vision - it will probably submit some graphic or model to the OS, which will then anchor that model "onto" reality and handle the user moving their head around it. Much like how every Windows app is not responsible for resizing or moving its window, that's the window manager's job to do. In AR we would probably have a reality manager or something.
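The division of labor described above can be sketched in a few lines. To be clear, this is a hypothetical toy, not any real AR API: every name here (`RealityManager`, `submit`, `compose_frame`) is made up. The point is just that apps hand over a model plus a world anchor once, and the OS-side "reality manager" re-projects everything each frame as the head moves, the same way a window manager moves windows without asking each app to redraw.

```python
from dataclasses import dataclass

@dataclass
class Anchor:
    """A fixed pose in OS-mapped room coordinates (meters)."""
    x: float
    y: float
    z: float

@dataclass
class Submission:
    app: str
    model: str    # handle to a mesh/graphic the app handed over once
    anchor: Anchor

class RealityManager:
    """Hypothetical OS-side compositor: owns the world map and
    recomputes every model's pose relative to the user's head,
    so individual apps never draw onto the user's vision directly."""

    def __init__(self) -> None:
        self.scene: list[Submission] = []

    def submit(self, app: str, model: str, anchor: Anchor) -> None:
        # The app's one-time call, analogous to creating a window.
        self.scene.append(Submission(app, model, anchor))

    def compose_frame(self, head_x: float, head_y: float,
                      head_z: float) -> list[tuple[str, Anchor]]:
        # Run by the OS every frame: re-anchor all content onto reality.
        frame = []
        for s in self.scene:
            rel = Anchor(s.anchor.x - head_x,
                         s.anchor.y - head_y,
                         s.anchor.z - head_z)
            frame.append((s.model, rel))
        return frame
```

So a navigation app that anchored an arrow 10 m ahead never hears about the user walking 3 m forward; the compositor simply reports the arrow at 7 m on the next frame.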
What I will agree with you on is that we probably need desktop-level rendering power to solve AR, not mobile-level. However, with the release of the M1, it does seem like we're pretty much there already, and I would not bet against Apple's chip team making a smaller M1 that fits into AR glasses.
I don't think this is true. The problem with an app just giving 3D data to the OS is that then no apps can compete on quality. There's a reason Cyberpunk 2077 (or whatever the top graphics game is) looks different than Wii Bowling. Even with 2D windows, each app does its own rendering using the GPU and provides the results as a rectangle of pixels that the app computed. Some apps look amazing because they used 100% of the GPU power. Some look so-so, or maybe they have a great aesthetic (Animal Crossing?) but they aren't spending power on rendering. Now, add that across apps. Your virtual pet app wants to render photorealistic fur where the fur appears to bend in relation to things in the real world, your virtual fashion app wants to show your friend's outrageous auto-reactive clothing, but your navigation app just wants to draw some mostly solid-colored lines. The virtual pet app will hog all of the GPU just to draw its fur, bogging the entire OS down.
It won't be as simple as it was for desktop PCs, because you can't let any app slow down the system (which is what happens on PCs) since you need the system to always run at 60 or 90 or 120 fps. At best you could maybe make a preemptable GPU (I'm not sure any current GPUs are preemptable), or use 2 GPUs: one for the OS, and one for apps.
I think Apple or someone will ship an AR device without solving these issues. I think it will fail just like the Apple Newton failed. Too early. Running AR glasses with an iOS-like OS that runs one app at a time is just not a useful paradigm for AR, and solving the problem of letting 5 to 20 apps all run at the same time, each using as much GPU as possible for its specific idea of quality, AND having it all run fast enough to keep latency low so the virtual stuff doesn't swim over the real stuff, still has a long way to go. Maybe an M17, but not an M1.
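One way to make the "can't let any app slow down the system" point concrete: a toy frame-budget allocator (entirely hypothetical, not any real driver or OS API) that reserves the compositor's GPU time first and then scales down over-asking apps proportionally, so quality degrades per-app instead of the whole system missing its refresh rate.

```python
def allocate_frame_budget(frame_ms: float, os_reserve_ms: float,
                          requests: dict[str, float]) -> dict[str, float]:
    """Split one frame's GPU time (in milliseconds) among apps.

    The OS compositor's reserve is taken off the top so the system can
    always hit 60/90/120 fps; if the apps collectively ask for more than
    what's left, each gets proportionally less rather than being allowed
    to blow the frame (the 'fur app hogs the GPU' failure mode above).
    """
    available = frame_ms - os_reserve_ms
    asked = sum(requests.values())
    if asked <= available:
        return dict(requests)          # everyone fits; no scaling needed
    scale = available / asked
    return {app: ms * scale for app, ms in requests.items()}
```

At 90 fps a frame is about 11.1 ms; with a 3 ms compositor reserve and apps asking for 9 + 4 + 1 ms, everyone gets scaled to fit the remaining 8.1 ms. The scheme only works if the GPU can actually be preempted at these granularities, which is exactly the open hardware question.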
The main problem is that it's so personal and low latency, but it also takes so much processing power. You need to spend a lot of time on performance and power work before anyone will use it without their phone melting.
Apple also makes good use of Bluetooth for airdrop & wifi password sharing, something else google is light-years behind in.
Good writeup, interesting, even if it seems a bit (quite) oversold to me. Makes me think of Benjamin Bratton's The Stack, the many tiers that compose the digital.
The section has good stuff in it, but to me it came off as me-centric, as about high end experiences like VR, accurate spatial systems.
And that feels like it's a very different piece of the platform than what I was talking about, which is beacons that we can leave for each other. Getting to a bar & having the menu advertise itself. Art installs that come with interesting mini-sites on the browser, or which are cross-media. QR codes are the closest thing we have today, but QR codes require very explicit intent to use, and I've long been interested in the promise of more ambient computing, of seeing the numerous micro- / edge- clouds about me, & seeing what they might offer.
I didn't get any of that from the Find My. I'm looking for Find Your, I guess. Or Share Mine. Which Physical Web did, which Apple's iBeacons continue to do.
AppClips. Drive-by installs. What could possibly go wrong?
Well-made, cool video though!
It is bad enough that it reminds me of Carbon and the related legion of missteps that hurt Apple badly some 20 years ago and pissed off a lot of loyal developers.
Honestly, SwiftUI combined with Apple’s now complete market dominance is enough to make me reconsider my career path.
Just because they’re so big doesn’t mean they can do no wrong and several of their recent technologies have been regressive.
The magic involved is not really magic at all, just Swift syntax sugar. Nothing about SwiftUI is proprietary; you can make your own SwiftUI variant by copying and pasting its code. It's just another library after all. Granted, some Swift features do seem like they were added explicitly to make SwiftUI prettier, but it's not like you can't use those features in your own code or libraries too.
It also means there’s a lot of hidden determinism where there didn’t use to be.
Storyboards were rubbish, I agree, but SwiftUI is more fundamental. Apple is abandoning the PostScript model which permeates every graphical computing paradigm (WinForms, GTK, Qt etc.), and if you don’t like programming with cotton gloves on, you’re basically not welcome.
All very much a shame from my point of view, because UIKit is easily the most powerful UI toolkit around. It’ll be interesting to see how long third parties will continue to be able to link against it directly.
Jokes aside. What is the mission statement here, what is the dream? Make you never want to interact with the real world again? Help you manage your to do list better?
I feel there was much more of a mission statement for mobile back when everyone was stuck on public transit and at the office for hours every day. You could finally make your life on transit and at the office more meaningful by integrating your personal life with that through your phone.
What's the dream now? I don't get where all this is going unless it's like you're never going to leave your house again, so escape into VR.
Anything that currently has a paper label or barcode in the real world, could be a candidate for an AR app that helps users find stuff quicker. And rather than having to look through aisles or buildings to get to what you want, you could just use a search bar or virtual assistant to look up the thing you want and get a persistent marker to its exact location.
Basically, just imagine the kind of UI a video game presents to a player, and how a well-executed UI blends in subconsciously to the point that you can navigate menus and issue commands without having to think about it. Now imagine that experience overlaid on real life, where an app can help you remember someone's name or birthday as they approach, and you can take a note with your location and current view attached to it. Something as silly as "funny car" while looking at the funny car, or as serious as telling the nurse after your shift where the patient's pain point is.
I'm unfamiliar with SwiftUI, but fwiw, I found the extensions needed for CSS3D to support AR to be surprisingly small.
A few years back, having a custom browser-based VR stack with passthrough AR, I sketched a talk for the BostonVR meetup, to give around April Fools. It would have purported to be an introductory onboarding demo of newly available CSS3D extensions for AR, with support for placement in realspace, billboards, HUD overlays, integrated multiple displays, and position-aware and 3D displays. The extension needed was surprisingly minor. The talk would have been basically "introductory CSS positioning, in a slightly enriched context", which I expected to be quite believable, followed by a "surprise! - the making of the demo spike". I don't quite remember why it didn't happen.
> 2(.5)D experiences
Shallow-3D UIs seemingly have a lot of potential, but don't get much discussion.
Meta: I wish HN discussions around AR were of higher quality. It's been clear for years that Apple was pursuing this. Being unfamiliar with the Apple ecosystem, I'd have liked to see more discussion of what those pieces and their characteristics might suggest about the future. Or, for instance, of whether Apple is doing any CEP (complex event processing), or retraction of app state changes, to support input with diverse latencies.
And most displays have some privileged focus depth(s). Vergence-accommodation conflict, and eye strain from long use, can be reduced by staying close to that. Ahem, and as my eyes have grown old, the annulus where I can focus well has narrowed.
And maybe, it might be possible to have text in layers, close enough to avoid vergence-accommodation conflict eye strain, but far enough to be tolerably readable, and ideally to allow eye tracking to determine which layer is being read.
I was layering a keyboard-cam-with-annotations over desktop for a bit, to see the kbd in an HMD, but also to explore "graphic artist looking at screen containing processed view of tablet input"-like UX for software dev. The layered clutter was somewhat annoying, so I'm unclear on viability, but it did seem intriguing.
Also, "shallow" still permits some depth. A cm or few at 50 cm (laptop screen) seemed comfortable for me (for some limited frequency of focus switching, and I've only a little experience, and ymmv). That's depth enough for more than buttons and text, but still a very different design regime than "fill the room with XR UI". ILM effects, hmm... someday I'd like to find a stereo drone straight-down view of beach surf, shallow but 3D, to use as desktop background. :) Maybe.
Like jfc people, 'End users will experience the metaverse through the default device delivered experience.' sounds like the most doublespeakiest doublespeak ever to have been written.
I love tech and this lofty shit makes me want to gouge my eyes out. Speak plainly and simply, folks.
What is trying to be communicated here is psychographic content will be delivered via standard device substrates existing in well aligned usage patterns that will no doubt delight the end-user with value. Metaverse here simply means they will deliver an experience that is parallel to user expectation yet orthogonal to the digital representation of the physical model of the prevailing social zeitgeist.
Or maybe someone just read Ready Player One recently, and is just thinking: We're going to build the Oasis, but nobody could possibly understand what that is, so I'll have to break it down into easy-to-misunderstand corpobabble.
The Black Sun was there first.
Can i interest you in a loose collection of extremely janky perl scripts? The bar’s owner hates it but, who cares? If he kicks us out we can just race bikes for a while until the fuss blows over.
On a more serious note, I think these are different metaverses. Ours is pretty much locked in VR as far as I can tell, and Apple is building an overlay on zeroth-order reality, aka AR. Arguably harder and more powerful if done properly. Which I'm sure it won't be.
My response is kind of stuck between "that's overly cynical, isn't it" and "statistically speaking, that's pretty likely".
(I was actually working at an early AR project at Nokia a bit over a decade ago, at the time called Point & Find, later called CityLens. It gave me a feeling that good AR may be one of those things that remains "just a couple years away" for decades.)
I don't know what the "killer app" for AR is really going to be. CityLens and comparable apps like Yelp Monocle on smartphones clearly haven't cut it. (Show of hands: how many people remembered Yelp had an augmented reality mode? I don't think it's there anymore.) I think the big challenge now is thinking of applications that aren't games where using goggles/glasses give you more than incremental improvements over putting the UI on your wrist or in your ear(s).
I wonder if it really is a navigation HUD. For example, I’d love this, with some auditory interface, for things like cycling and motorcycling, and probably normal driving too. Safely presenting salient information in the same depth of focus as the road, and giving me tools to interact with it by voice, would be good AR in those situations, and I’m in them pretty frequently.
Is that a big enough market? Will it inevitably descend into the inverse of They Live?
We’ve given the hardware a shot at least a few times now, and wearable glasses tech has not yet been useful. But I can’t think of anything else.
- Low latency, ideally imperceptibly low
- "Delta-zero", meaning the overlaid virtual reality should never drift from reality itself. This is probably theoretically impossible, so let's just be imperceptible here too.
This should boil down to a completely continuous interface over the sensory domains of all humans (other animals?)
For now I assume we are only talking about the visual and audio senses.
All eyes and ears.
> Of course, I think I can help. (…)[jibberish]
Lol, well done!
If you can't explain what you mean in simple, easy to understand language, and you're not talking about quantum physics, you're just bullshitting. Should be required reading: https://en.wikipedia.org/wiki/Politics_and_the_English_Langu...
But it’s also not totally unfair. I think much of the metaverse & AR ecosystem doesn’t have great terminology.
I mean ‘slap some goggles on and touch floaty doo-hickeys that you see but they aren’t really there’ doesn’t quite have the same curb appeal…
If we’re pursuing a Metaverse, VR Chat is closer to a Metaverse than Apple.
Also, Apple will try to make useful things appear in your VR goggles - like showing you how to get home on a map, when in fact you're planning to go out for a drink, or go grocery shopping, or anything but.
This is a terrible article, but the idea is - if not sound, at least potentially interesting.
It's basically enhanced location-aware cognition, taking input from everything around you - sound, video, tactile input, GPS location, event history, AI-enhanced memory - and combining it in real time to produce (checks notes...) "a new class of life-enhancing interactions."
The problem? It needs a lot of moving parts working together seamlessly with better-than-human performance and reliability.
Otherwise it will be a kludgy distracting nightmare of failed meta-everything - like the most annoying and useless intern anyone has ever had, only everywhere.
/s (I hope)
Personally, I dislike the term and I believe that it causes people to lose credibility when they use it. However, I can define it to some extent.
Consider the term 'universe' in the context of 'a particular sphere of activity, interest, or experience'. For example, the 'Harry Potter universe'.
The 'metaverse' is simply such a 'universe' that you can't really opt out of, because it's a layer on top of, rather than distinct from, the regular old world. This is why AR is often considered to be such an intrinsic aspect of it.
Now, you might say that 'the internet' is just such a metaverse, but it isn't because it doesn't benefit the people who peddle the buzzword.
Metaphysics = the physics of physics
Metascience = studying science itself with science
Metaphilosophy = Philosophy about philosophy itself
Metadata = data describing data
Gaming 'meta' = the game of how to best play the game
'Meta' is one of those ultra-abstract terms that obfuscates the meaning of whatever word it is attached to unless the listener gives or has given the word a fair amount of thought. Ah, then maybe that is why it is being used here.
Metaphysics isn't really the "physics of physics" though. It is part of philosophy, not physics. It isn't even the "philosophy of physics", since that names a branch of philosophy distinct from metaphysics (albeit they do have some points of overlap.) This is an example of where the meaning of a term is different from the sum of its parts.
The philosophical implications of quantum theory (and its interpretations) are one part of the philosophy of physics, but the study of those implications goes beyond just metaphysics proper and extends to other areas of philosophy such as the philosophy of mind, epistemology, and the philosophy of logic. Conversely, there are some debates in metaphysics with which contemporary philosophy of physics doesn't concern itself much, such as the relationship between existence and essence, or the problem of universals.
A "metaverse" is a "universe" in the sense that it's a shared 4D spacetime that you can explore (using some technology to give you a window into it). It's "meta" because it's layered on top of our "base universe": It's there to describe and augment the physical universe we exist in.
For example: If, in your vision (via glasses, or a HUD, or a pane of glass or a brain implant), there's a big 3D arrow hovering above a road then the arrow exists in the "metaverse" and the road exists in the physical universe that the metaverse is describing.
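The "layered on top" part can be made mechanical with a small sketch (my own illustration, not anything from an actual AR platform): the arrow is stored as a latitude/longitude/altitude anchor, and the device converts it each frame into an offset in meters relative to the viewer. This uses a flat-Earth (equirectangular) approximation, which is only reasonable for nearby objects, but that's all an overlay needs.

```python
import math

# Approximate meters per degree of latitude near the Earth's surface.
METERS_PER_DEG_LAT = 111_320.0

def geo_to_local(viewer_lat: float, viewer_lon: float, viewer_alt: float,
                 obj_lat: float, obj_lon: float, obj_alt: float):
    """Convert a geo-anchored object's position into (east, north, up)
    meters relative to the viewer.

    Equirectangular approximation: longitude degrees shrink by
    cos(latitude), and curvature is ignored, so this is only valid for
    objects within a few kilometers of the viewer."""
    north = (obj_lat - viewer_lat) * METERS_PER_DEG_LAT
    east = ((obj_lon - viewer_lon) * METERS_PER_DEG_LAT
            * math.cos(math.radians(viewer_lat)))
    up = obj_alt - viewer_alt
    return east, north, up
```

An arrow anchored 0.001 degrees north of a viewer at the equator and 5 m up comes out roughly 111 m ahead and 5 m above; the renderer then just draws it at that local offset, which is the whole sense in which the metaverse object "hovers over" the physical road.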
I thought metaverse just meant Second Life 2.0.
Some examples: Roblox, Facebook Horizon, VRChat, NEOS
"I’ve been less sure they understood the full scope needed to build a compelling AR ecosystem – a metaverse."
Ok, so a metaverse is somehow the realization of the "full scope" needed for a compelling AR ecosystem? And what exactly does that mean, and how does that define this particular usage of the (heavily overloaded) term 'metaverse'?
Meta meaning 'about', so a universe about a universe?
edit/ugh: googling the term says 'a virtual-reality space in which users can interact with a computer-generated environment and other users.'
Like wtf does that have to do with the meta root word here?
There is far too much abstraction in this vocabulary defining a thing that is very real.
Another use of the term (not here) would be a way to jump between VR systems, but those never generalized well enough to make that a significant thing. Right now Steam would count.
Imagine a city park where a handful of people have glasses with cameras on them, continuously sending a geo-located feed of images to a map server that processes them into a 3D reconstruction. You could watch the park being reconstructed and updated in real-time 3D, add in virtual content, make content "layers" for different digital assets, track interactions, etc...
It's a globally crowdsourced, consistently updated real time digital twin of the world.
> I’ve been less sure they understood the full scope needed to build a compelling AR ecosystem – a metaverse.
Seems this is about augmented reality being integrated into the regular experience of a significant population.
I think it's what Apple's calling their new Augmented Reality platform...I think...or what the author is calling what could be Apple's potential AR platform...maybe?
See flying cars 20 years ago.
My own prediction, using many of the same data points as the author, is that Apple is trying to create a suite of features that act as a sort of Software Personal Assistant. For years Apple has consistently put in better sensors and larger TPUs than strictly necessary for the expected lifetime of the device. We're already seeing some of the results of this with the Health app.
Apple understands the profit potential of platforms, so they'll make some of the data that enables these features available to app developers, and that in turn may enable AR, but I doubt any Apple executives are seriously focused on bringing AR capabilities to developers.
A thought I should have fleshed out in the above post: the kind of sensor data that enables AR can also be used to deduce a great deal of personal information. With Apple taking a strong position on privacy I suspect they will make only a limited subset of data available to developers.
More likely, they will allow developers some mechanisms for leveraging the data on device, without being able to exfiltrate it.
SwiftUI 2 is so much better. I am taking a 50 hour course, and spent a few hours hacking this morning. So much better than when I tried it 14 months ago.
Spatial Audio works really well. They are probably using something called head-related transfer functions (HRTFs), which I used at my company's VR lab in the mid-1990s.
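For anyone curious how HRTF-based spatialization works at its core: you convolve a mono signal with a pair of measured head-related impulse responses (HRIRs), one per ear. A minimal sketch in Python/NumPy — the HRIRs below are synthetic placeholders (real ones come from measured datasets like MIT KEMAR), and this is in no way Apple's implementation:

```python
import numpy as np

def spatialize(mono, hrir_left, hrir_right):
    """Render a mono signal at one fixed virtual position by convolving it
    with the left/right head-related impulse responses for that position."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right])  # 2-channel output for headphones

# Toy inputs: white noise source, decaying synthetic HRIRs.
rng = np.random.default_rng(0)
mono = rng.standard_normal(1000)
hrir_l = rng.standard_normal(128) * np.exp(-np.arange(128) / 16)
hrir_r = np.roll(hrir_l, 8)  # crude interaural time difference

stereo = spatialize(mono, hrir_l, hrir_r)
```

A real head-tracked system re-selects (or interpolates between) HRIR pairs as the listener's head moves relative to the virtual source, which is what makes the sound appear anchored in space.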
The AR Swift Playgrounds examples are also pretty cool.
I agree with the general premise of the article!
The Shazam stuff seems very limited in use case, and I'm annoyed at "AppClips" because they should just be websites.
Notes+Spotlight+Shared with you all seem like they're inventing new paradigms to avoid ever adding a user file system on iOS, since Apple is opposed to that. It's possible that those new paradigms will be great, but I think they'd be better if Apple gave up a little control.
That's existed for 4 years. A user file system is effectively available on iOS, even if it's not exactly what you might expect coming from a desktop OS.
I'm not convinced that this is true. Round-trip latency has gotten a lot lower, especially if you take edge computing into account. 20-30ms round-trip is not unusual. If your mic feed is being streamed in real-time, and the program is able to achieve a high-confidence prediction at (or before) the moment you finish speaking, I think it should be possible to deliver an experience that feels instantaneous, at least for users with cutting-edge connectivity. For visual interactions, 100ms feels instantaneous; I bet tolerance for spoken interactions is even higher.
I wouldn’t say /too/ excited. Maybe ‘had his brain pulling a Charlie from Always Sunny in Philadelphia with that Pepe Silvia meme thing’.
Sent from my iPhone
And people with prescription glasses won't be able to afford the Apple Prescription(tm) anyway.
Edit - ah: I see the author is present. Ok then!
Edit: TIL about the color-scheme (draft) CSS property. FF doesn't support it.
If you're not using dark mode, it's #EBEBEB on #121212, which has plenty of contrast.
If you're using dark mode, it doesn't simply do nothing (which would be the reasonable approach with those colors); it gives you that light gray text on a #FFFFFF pure-white background. It's basically the opposite of dark mode.
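For reference, WCAG 2.x defines contrast as a ratio of relative luminances, and the light-mode pair above does indeed pass comfortably. A quick sketch (function names are mine, not from any library):

```python
def relative_luminance(hex_color):
    """WCAG 2.x relative luminance of an sRGB hex color like '#EBEBEB'."""
    def linearize(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (int(hex_color[i:i + 2], 16) for i in (1, 3, 5))
    return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)

def contrast_ratio(fg, bg):
    """Ratio of the lighter luminance (plus offset) to the darker."""
    hi, lo = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (hi + 0.05) / (lo + 0.05)

print(round(contrast_ratio("#EBEBEB", "#121212"), 1))  # → 15.7
```

Roughly 15.7:1, well above the 7:1 that WCAG AAA requires for normal text — whereas light gray on pure white would fail badly.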
However…I’m sorry, the article as a whole just leaves me with the taste of Apple butt-licking.
So many of these parts of this “metaverse” are typical recurring revenue and ecosystem lock-in encouragements.
Find My: Let’s sell some high profit margin trackers and keychains.
SharePlay: let’s sell more TV+ subscriptions.
Universal Control: platform lock-in; make you question using a Windows desktop instead of an iMac if you already own an iPad.
Spatial Audio: sell more headphones
Notes features: sell more News+ subscriptions, more vendor lock-in.
ShazamKit: data mine advertising info. Yes, Apple has ad platforms, and with each passing day their business model creeps closer and closer to Google’s (e.g., when they inevitably drop Yelp in favor of their own review system, or if they launch their own search engine).
(Apple can still mine data and use it to sell you things and understand what you’re likely to buy even if they “respect” your privacy and leave all the data on-device. They can still get you to click an affiliate link or buy an app without leaving the realm of on-device ML.)
GMail -> Let's sell some ads
Android -> Let's sell some ads
Android Auto -> How can we put ads in cars?
YouTube -> How can we show people more video ads?
Search -> Let's sell some ads.
Google Music -> Did this sell enough ads? If not, let's rebrand it and try to sell more ads.
Anyway, not a slight against Google, but you can do this with any large company probably. I'm not sure how meaningful it is.
The App Store has sponsored apps.
Apple News has ads in the app, before you click on any articles.
Apple Music/iTunes Store/Books has promoted artists/albums/movies/TV. It’s unclear whether any of these are paid promotions by the content producer, but they’re in a banner, so I have to assume Apple gets something out of it.
Apple Podcasts also has a featured section, likely paid ads.
And, as I mentioned, it’s extremely likely that Apple is working on a replacement for Yelp for Apple Maps, which will potentially be a big source of revenue.
And of course, there are ads for Apple’s own in-house TV+ content. Not an ad network, but if they know what you like they can better promote their TV shows. By default the TV app sends random push notifications.
The more Apple gets into content the more knowing your personality and interests is important to their business model.