Hacker News
Apple Vision Pro: Apple’s first spatial computer (apple.com)
2608 points by samwillis 3 months ago | 2844 comments



All: there are over 2700 comments in this thread - to get to them all, you need to click More at the bottom of each page, or a link like this:

https://news.ycombinator.com/item?id=36201593&p=2

https://news.ycombinator.com/item?id=36201593&p=3

https://news.ycombinator.com/item?id=36201593&p=4

There are also a bunch of other threads: https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que....

Sorry that our server has been creaking today. Perf improvements are on the way (fingers crossed), but alas not today.


It's interesting that HN is completely overloaded right now...with people coming to announce how unimpressed they are and how it isn't for them.

The displays in this device are crazy. I honestly didn't think they'd be able to put together a value proposition, but I think they legitimately did. It's super expensive, and some of the cost of the device seems kind of silly (if I heard correctly, the display on the front is 3d and gives different perspectives based upon the viewers), so obviously they're going to have a lot of room to improve value in subsequent generations.

But it's going to be a hit. HN is going to be swamped with "How I used Vision Pro to..." posts when it comes out.

One element that didn't get a lot of play (if any...though I was distracted with work) -- did they talk about using it as a display for a Mac? I'd love to use a real keyboard and mouse while interacting with flexible Mac displays.


> It's interesting that HN is completely overloaded right now...with people coming to announce how unimpressed they are and how it isn't for them.

Agreed, polarization is a good sign that this is going to make an impact. Ironically "unimpressed" is communicated by a lack of response, not by a negative one (which more likely indicates people's beliefs are being challenged). The only way this would be a flop is if they shipped something really buggy and worse than the competition (which at the time will be the Meta Quest 3). Otherwise...

> it's going to be a hit. HN is going to be swamped with "How I used Vision Pro to..." posts when it comes out.

100%!

> did they talk about using it as a display for a Mac? I'd love to use a real keyboard and mouse while interacting with flexible Mac displays.

Looks like it's going to be a standalone device that you can pair with a Magic Keyboard and trackpad. Considering it ships with an M2, I expect iPad/Air-level performance (assuming the spatial stuff is handled solely by the R1). I can totally see myself using it as "the one device" (pun intended) and getting rid of my MacBook, assuming there's an easy way to share content with someone who's next to me, e.g. on my iPhone.

I can't wait for it to be publicly available.


> Agreed, polarization is a good sign that this is going to make an impact.

Virtually every new Apple product is going to generate this sort of response, and while many Apple products have had a large impact, just as many haven't. I don't know how much predictive strength "this new Apple product generated a lot of conversation on HN" has.


Exactly. Especially in this case, where we knew this was coming for months. It's generating a response because people have been waiting to talk about it since the rumors started.

For myself, my "unimpressed" reaction is because the experience they're selling is the same as what Meta has been trying and failing to sell for years now. It's definitely typical Apple—wait for the tech to mature and execute better than anyone else—but I'm unconvinced there's an actual need being filled here.

The iPhone took a market that had already taken off in business—PDAs—and blew the roof off of it by revolutionizing the tech. The VR-for-productivity market is practically non-existent, and even in gaming it's still very niche. Neither is anywhere near where PDAs and Blackberries were when the iPhone made it big.

I'm just not convinced the "execute better" strategy will work when there is no proven market.


The YouTube stream I watched mentioned it can detect when you are looking at your Mac and offer the screen up in the goggles with full sizing and layout control. Your Mac appears just as another app and you can multitask as usual.


Yup -- I was a bit disappointed that it can only simulate a single monitor, but I guess since it's working wirelessly there are bandwidth limitations.

Ideally I'd love it if I could simulate a 3 monitor workstation. Maybe for the next iteration.


If you can put up all the windows from your 3 display workstation, why would you want to simulate displays?

There’s a similar approach available for the Meta Quest 2 (and I’m sure the Quest Pro and Quest 3) but it takes a little reorienting to stop thinking in terms of “screens”


Wow, this makes me think of the fabled zooming interface[0]. Why limit yourself to a "monitor" or a set of windows when the sky's (literally) the limit? With a ZUI you could have the entire world at your fingertips. Browser history (or git commits) could just be further away from you in the Z direction. Or maybe it's behind you and you just have to turn around to see it.

[0] https://en.wikipedia.org/wiki/Zooming_user_interface


An older term for it is "Spatial UI". The window you need is right where you left it in the other room; navigating between your apps becomes like navigating around your house. Your coding apps are in your office and your social media apps in your bedroom, and now you won't get the two confused and "accidentally" scroll social media while working.

In some ways this is particularly great, because humans evolved to have a lot of spatial memory in this way.

(It's an interesting footnote here that the early pre-OS X Mac OS Finder was sometimes much beloved [or hated, depending on your OCD predilection and/or personality type] because it was a Spatial UI. Files and folders would "stay" where you placed them and you could have and build all sorts of interesting muscle memory of where on your desktop a file was or even a deep tree of folder navigations, with scenic landmarks along the way. Apple discarded that a long time ago now, but there was something delightful in that old Spatial UI.)


Have you ever tried using https://www.switchboard.app/?


On that ZUI - have you ever tried out https://www.switchboard.app/? If so what are your thoughts? I thought it was okay.


oh hell yes

The way I mentally organize projects would make this both deeply compelling and useful,

and a total disaster lol


From what they showed, you can’t break the windows out of the screen mirror rectangle.

You’re right though, if they allowed windows to freely float, it would also solve the issue.


Check out the “Platform State of the Union” video stream for more detail on that. They discuss windows, volumes, shared space, spaces. https://developer.apple.com/wwdc23/102


I assume they would be apps running on the device, rather than on a remote machine.


Ah yes, but those apps appear to be just some kind of iOS type thing, so at least for me I couldn’t really use them for productivity. (The lack of coding tools on iOS also really kills any possible productive uses of my iPad in this way, unfortunately)

(That’s why I was focusing more on the mirror-your-Mac functionality)


We might be at a point where that could change. This device seems like it could be close to providing the performance needed to start running productivity apps, and it also provides the screen real estate. Those types of apps have to be coming to iOS in the next few years.


iPads have been able to do that for years, and yet nothing. Hopefully with the EU laws it will change though.


Exactly -- my iPad has the same SoC as my laptop, and my laptop is amazingly useful while my iPad is useful for...watching movies and Safari.


That doesn't make sense though. You don't need to render/composite what isn't visible.


Maybe. They told an easy to understand story in 10 seconds. Apple is amazing at educating the customer.


I don't think it can simulate any app. More likely it's a feature akin to Continuity, and you have to have the corresponding app installed on your Vision Pro to pick it up from the Mac and continue working on the headset.


I’m fairly certain that you can just create a virtual monitor in the vision pro that just mirrors the MacBook like any other display would.


While they did confirm Continuity would work across visionOS they also showed direct footage of your Mac monitor being displayed as an app while using your Mac.


Yeah I had missed that. It's such a neat feature!


I don't get this. There is no live demo so far, only a pre-rendered ad. So you have no idea what the actual experience will be like (in an industry famed for over-promising and under-delivering; remember Magic Leap?). The use-cases are also dubious: you can... watch TV alone? Scroll through photos alone? Take pictures? Only the virtual desktop thing was something that I thought "that's useful".

I'm unimpressed so far, maybe that will change maybe it won't. But right now I don't see anything worth being impressed by.


They gave 30 min demos to WWDC attendees the following day.

I'm excited mainly for two reasons: fantastic eye and hand tracking (according to reviewers such as MKBHD) and replicating my office/entertainment setup wherever I am (except for shared experiences, that is).

I think Apple tried to nail the seamlessness of the experience, rather than give you some amazing use case nobody ever thought of. That will be a good challenge for developers.


>Agreed, polarization is a good sign that this is going to make an impact. Ironically "unimpressed" is communicated by a lack of response, not by a negative one (which more likely indicates people's beliefs are being challenged).

To quote Elie Wiesel: "The opposite of love isn't hate, it's indifference." It's an extremely good barometer.


It's interesting that every single top HN thread is mostly unanimous praise for this device (which presumably no one has yet seen or used), while also painting themselves as the minority opinion.


Techcrunch concluded "The price reveal turned any ‘would buy’ in the room into a ‘definitely not’ without hesitation."

Anyways, bookmark the threads of folks calling an Apple product dead on arrival for a revisit in a few years.

The iPod, the iPhone, the Watch, the AirPods... they've had a pretty good record and almost all of these have had harsh criticism out of the gate (while then going on to absolutely PRINT money for Apple).

Apple is sitting on lots of cash and investments, with operating cash flow of something like another $100B a year. Why aren't they allowed to take some risks on products like this? Facebook certainly has burnt billions in a similar space.


I remember hardly any significant negative criticism of the iphone, watch, or airpods.

Someone below brought up "when the iphone first came out it was 2G, was only on AT&T" - well, yeah, and those were very valid initial shortcomings that Apple pretty quickly rectified.

With the Vision Pro, I see very few comments putting down the actual technological achievements here. Comments seem to be pretty universal in thinking this is the best VR device there is. But the valid question is that people are still having a difficult time imagining real, extended use cases where it doesn't feel like a novelty.

Personally, I think it's great Apple took a swing at this. I wouldn't be willing to bet one way or the other on its success, I think there are lots of unknowns, but I don't really have anything but high praise for the folks that built this.


The Watch was criticized for its poor battery life and lack of uses other than health/training management. Now battery life has improved a bit and more health features have been added, but I think the original criticism is still valid. The reason it sold well seems to be that many more people care about health devices than we expected.


Also the Watch launched with a bunch of expensive ultra-luxury options that were mocked. Ive tried to lean heavily into fashion, which was quickly dropped in later revisions.


Yeah it wasn’t as clear where wearables were headed.


Apple Watch also isn't really the game changer that something like the iPod or iPhone (or various Macs throughout the years) was. Sure I see people wearing them, but not a tremendous amount, and not completely out of line with something like a FitBit or a Garmin.

Apple created a very competitive product in an established market with the Watch, they didn't change the game.

Which is where I could see the Apple Vision Pro ending up, but I'm sure that's well short of Apple's expectations.


AirPods were called 'Q-tips in your ears' by people who thought they looked stupid, but that faded pretty quickly once the utility became clear.


well, there was that time [0] Rudy Giuliani wore them like a space alien, that was kind of funny.

[0] https://duckduckgo.com/?q=rudy+guiliani+air+pods&ia=images&i...


The iPhone wowed everyone but its price was heavily criticized. Apple later got into the exclusive AT&T deal which "subsidized" the price. People just ended up paying more over time.


The iPhone didn't have a pen. It didn't run Symbian as its OS. This is what I remember people complaining about.


Let's not forget Steve Ballmer laughingly claiming that no serious business user would ever use a phone without a physical keyboard. People here are negative for the sake of it.


He was negative for the shareholders


I bought one after using a blackberry and I was instantly sold on it because you could browse websites as if it was a computer, zooming in to the text section with a double tap. I remember my daughter wouldn't entertain the idea because it didn't run blackberry messenger which was the killer app for kids at the time.


It didn’t have a keyboard. Serious smartphones have keyboards.


> I remember hardly any significant negative criticism of the iphone, watch, or airpods.

Sounds like you have a memory problem. I’m sure you can find the threads archived if you need reminding of the criticisms.


Criticism of the iPhone on its debut was absolutely vicious.


Criticism of the iPad was even worse.


I just want to point out: at the time of the iPhone launch, AT&T's business model (and every other telecom's up until this tipping point) was to sell "minutes", which were essentially micro-charges for consumers who wanted to make calls or send texts.

This was mostly an infrastructure problem that Apple innovated on and helped AT&T solve -- carriers would no longer need to sell "minutes" but could instead sell data, which was a much better value proposition. There's a quote in the movie BlackBerry along the lines of "the problem with selling minutes is that there's only 60 of them in a minute to sell".

I can only assume this contributed to the global adoption of the "data sale" model (and the iPhone with it), since the profit ceiling was exponentially higher for every carrier.


This is 100% wrong. The original iPhone plan from AT&T included unlimited data but was still capped on minutes: https://www.apple.com/newsroom/2007/06/26AT-T-and-Apple-Anno...


Data plans existed before the iPhone. And Europe was much more hardcore about minutes than the States at that time (I remember that AT&T's standard plans were not unlimited talk at that point, something that was unheard of in Switzerland where I was living in 2007). In fact, I think the innovation was something like unlimited data?


I've never spent more than $400 for a smartphone, always bought second hand Android phones. My income went up in the last couple of years and a few months ago my phone broke. I bought a $900 iPhone.

If it's good people will buy it. I will buy it. No doubt about that.


I've tried $100 phones from Walmart, and I've tried the top-of-the-line Pixel phone a few years ago. Nothing comes close to iOS or the iPhone.

I just wish they made a printer. I'd buy an Apple printer in a heartbeat, I don't care what it costs.


Hear Hear! A printer and a WiFi AP!


Apple used to have the AirPorts (Express and Extreme) as WiFi APs. They were pretty good.



Oh wow. Dammit.

Your link helped me find the Snow White Design Language: https://en.wikipedia.org/wiki/Snow_White_design_language


People paid $550 for a pair of headphones. They'll buy this if it's good


to be fair, $550 for a pair of headphones is a lot, but it's not even close to top of the range


It was definitely in the upper range of prices for over-the-ear Bluetooth headphones. Not that it even matters, because people just _did not_ pay $500+ for headphones before the Maxes dropped.


>people just did not pay $500+ for headphones before the Maxes dropped

are you sure? why do you think this?


I think they’re saying that apple is selling $500 headphones to people who would otherwise not buy $500 headphones


Just wanted to piggyback on this. I was just commenting today to my coworker on the number of people I see walking around Manhattan/Brooklyn with AirPods Maxes on. I swear I see at least a pair or two every other minute while walking down the street.

Most of these people, like you said, were likely not blowing $500+ on headphones before Apple made that concept mainstream.


Sure they were, or at least close.

Beats: $400 out the door. Bose Quiet Comfort, whichever is the most recent: similar. Sony also sells ~$350-ish noise cancelling headphones.

Going from $350 to $550 is roughly the normal apple premium.

This is $3k.


Honestly, nah. I don’t think the majority of these people were blowing 400 or 350 or whatever on alternatives beforehand.

Granted, this is 100% anecdotal, but I’m seeing way more people rocking AirPods Max around the city every day than I remember ever seeing rocking over the ear headphones, let alone expensive ones.


I see AirPods more often, but sure, Apple did discover a huge market here (and not so kindly pushed people towards it by removing jack ports). But I don’t fault them; the AirPods Pro are really cool, the comfort of noise cancellation especially on public transport is a godsend, and it makes sense that many people actually cough up the price for that.


this is $3k, but then how much are other premium VR headsets?


They're also selling much worse headphones than what an audiophile would buy for $500+, but trendier.


some people obviously did pay $500+ for headphones. We don’t know how many sets Apple has actually sold…


> $550 for a pair of headphones is a lot, but it's not even close to top of the range

It's top of the range for typical consumers. The people who wear Apple's $550 headphones aren't people who are buying Sennheiser HD800s. Before, people would've spent $200 or up to $300 on the Bose ones. Apple got them to spend an extra $200-250.

I'm surprised by how often I see these headphones. They were basically nonexistent in the Bay Area but I see them often enough in NYC.


People were paying similar amounts for high-end headphones for years.


Not the same people though


Except no one bought those headphones.


Not true, I bought them and really dislike them.


I owned two Boses (QC35II and 700) and two of the best that Sony has to offer (XM4 & XM5).

The AirPods Max blow both out of the water in comfort, usability and ANC.


I went on the same quest and I'm happy with the AirPods Max. I remember reading that Apple originally wanted to make them better, but they would have been more expensive so they didn't. I wish they had.

I have more expensive headphones than the AirPods Max but these are what I use the most.


You'd be surprised how popular they are. Certainly they're overpriced, but the noise cancellation/sound/build quality/etc is very good. They've also apparently become something of a celebrity "it" item: https://www.vogue.com/article/are-the-airpods-max-the-latest...


I see people in the gym with them all the time. I think your “no one’s buying them” might be rooted in a personal bias.


I personally love those headphones even despite their price.


I love mine and would rebuy them without blinking.


Wow, fascinating. How much better than the AirPods Pro are they? I tend to like the minimalism of the AirPods line. Super discreet, can easily stash them in your pocket, can be listening to music anywhere and no one even really notices, etc. Oh, and they work just as well on the treadmill or while running.


You would just have to try out the AirPods Max to find out :)

Personally, I cannot say as I have never owned nor used any of the other AirPods. If you are looking for mobile usage, the Max aren't the best choice. They are large and heavy and there is nothing discreet about them. Also, I wouldn't even consider running with them.

I use them at work or at home where all of this is no issue and I just want to enjoy the best music experience.


If this is your use case, some Audio-Technica ATH-M50xBT headphones would probably have way better sound at a third of the price. For over a decade it's been the most used headphone set in professional studios for a reason.

P.S. not saying there aren't better headphones, just that the price ratio is great with these ones and the sound to my ear is better than on airpods pro. No noise cancellation though.


I don't doubt that there are other great headphones. However looking closely you will find that the AirPods Max offer an interesting combination of features. Like noise cancelling, which can be quite a big help in certain situations. Even with modest background noise, the noise cancelling can just increase the music listening experience, as you just hear the music and no background noise. Also nice is spatial audio. Especially for movie watching. And I have grown fond of the build and looks of the Max.


> No noise cancellation though.

Then it's a completely different product...


I read reviews that actually say that the 2nd gen Pros have better ANC than the Maxes.

So depending on what’s important for you they may be the better choice.


Whether or not it's the right device, it's definitely being introduced to the wrong economy.


iPhone 3G was released in summer 2008, right in the middle of the biggest financial crisis since the Great Depression. Arguably, this was the beginning of iPhone's rise in popularity. The original iPhone was released in 2007, and the cracks in the economy were beginning to show then...


But it cost $500 ($700 in today's dollars) and the day-to-day utility of cellphones/blackberries had already been established for a decade. Your example doesn't seem that comparable.


All those phones cost $200-300 IIRC; $500 was outrageously expensive. This device is $3,500, the non-Pro version will be much cheaper, and production of critical components will scale up significantly. Sony can produce fewer than a million displays for this thing per year, so it's understandable why they are expensive.


Sure, but again… it’s like 10x the price of a competing piece of equipment which is still regarded as pretty niche. In the case of the iPhone, 2x the price for 1000x the functionality was a clear “buy”. There’s a reason that basically everyone I know (middle-class millennials) scoffs at watches and iPads but is a complete iPhone addict - the value proposition is just that good. For all the talk of these ancillary/luxury devices, the fact remains that the iPhone (or its Android knockoffs) is still the absolute crown jewel of tech that cuts across demographics in a way that their other products do not.


And to be honest, a handheld magic cube that fits in your pocket and can display anything and be interacted with in any way really is as magical as it sounds. Plus it is a quite good camera as well.

I really think that smartphone design is close to the optimal sci-fi tech for humans, exactly because it is handheld. We rely on vision and touch the most and I think it combines those well. I am almost sure that VR would not, even in theory, get as popular as smartphones, all else being equal.


Another data point for you, the Quest Pro was $1500 on launch and is now $1000.


The people that can afford this aren’t impacted by the economy. It’s a professional tool and the expense can be justified. It’s not a product for ordinary consumers yet. On top of that it’s not out until next year - who knows what the economy will be like then.


> It’s a professional tool

That you use to look at family photos, use iPhone apps in a giant window, watch movies, and play with VR Mickey Mouse? The presentation seemed to lean more towards the consumer than industry applications.


They did, but that mostly seemed silly to me. Multiple monitors was the main thing that jumped out as an actual good use case. They need to market all aspects of it, but they’ve named it “Pro” for a reason and I feel like there was a lot of focus put on productivity uses (conference calls, browsing, multiple displays, 3D models).


Apple's "Pro" naming is somewhat random. Here's my ranking of Pro-ness.

Mac Pro >= Pro Display XDR >= ProRes >= Logic Pro > FinalCut Pro > Vision Pro >= MacBook Pro >= iPad Pro > iPhone Pro >>> AirPods Pro


Marketed to the general public but will be used by pros.

The goal is excitement and investment in the app ecosystem so, when they figure out the form factor, the cheaper/lighter/more useful future device is a bigger hit.


It acts as an infinite screen extension of your computer...

Many professionals would be thrilled to have a portable multimonitor setup that they can use from the couch, bed, airplane, train, Uber...


Yes, I feel a lot of people are too tied down to their biases and social bubbles. I'm working in the area, and you see great uses of these devices in fields from medicine to architecture and mechanical engineering.

I understand the skepticism, but sometimes our perception of the world is quite narrow. Given that most of us are developers, even more so.

I don't mean to be condescending, I just feel that way a lot with both myself and my colleagues when exposed to fields and constraints that we haven't seen before.


It is the first version of the Vision Pro and I would expect it to fail due to its price.

The second or third version may be something worth the consumer having a look at. This is directly competing against the Quest Pro, and the Vision Pro is still at prices like the HoloLens.

Apple will probably announce a 'Lite' version which will directly compete against Meta's cheaper Quest VR headsets.

> Facebook certainly has burnt billions in a similar space.

And their Quest VR headsets already outsold Xbox Series X/S. [0]

[0] https://www.thevirtualreport.biz/data-and-research/65297/que...


How do you define failure? I reckon that if people start to make apps for this device, then it’s served its purpose. The next generation, or “lite” version will arrive to an already-populated ecosystem. Meanwhile Apple will have a lot of data about what worked and what didn’t to tweak their direction.


> The iPod, the iPhone, the Watch, the AirPods... they've had a pretty good record and almost all of these have had harsh criticism out of the gate (while then going on to absolutely PRINT money for Apple).

Looks like you and I have completely different memories of this? The iPod and iPhone were almost unanimously praised at the moment of announcement, thanks to Steve's magic. AirPods also received generally positive reactions. The Apple Watch had a genuine issue with its product positioning, and its success came after fixing that issue.


The reaction to the iPod that everyone remembers was "No wireless, less space than a Nomad, lame", never mind the criticism of an Apple-only device, or the cost (honestly, the Mac mini and maybe the M1 Airs are the only two devices I can think of that Apple has released where people didn't complain about the price).

The iPhone, in addition to pricing, was also widely panned for being 2G only, for being AT&T only, for requiring a data plan, for not having a physical keyboard, for not having a stylus, and for being something no one needed because our phones and iPods already did all of that.

The iPhone did get a better reception than the iPod, but that's probably owed to the success of the iPod in proving Apple might just have an idea or two about how to make a new piece of cool tech, but it had plenty of poo-pooing by the tech class too.


> never mind the criticism of an apple only device

The first iPod was predicated on FireWire and iTunes, which were basically only available on Macs at the time.

(iTunes - Jan 2001, iPod - Oct 2001, iTunes Store - Apr 2003, iTunes for Windows - Oct 2003.)


iPad was definitely mocked.


Not claiming it's a minority opinion, but early on there were multiple submissions that were dominated by people rushing to proclaim that it was DoA. One claimed it was the end of Apple. There is a huge disparity between people who click an arrow and people who comment.

And you are absolutely correct that the enthused haven't used this device, or even heard from a non-Apple employee that tried a beta. I am hugely concerned about long term comfort, particularly in the eye fatigue realm, for instance, and will be watching to see what the sentiment around that is.

If it were many other companies I would honestly be much more skeptical about it, but I mean Apple has a pretty good track record of actually delivering products that meet or exceed their promises. And they really promised the moon with this reveal.


I completely agree with you about eye tiredness and fatigue. It's really hard to imagine someone wearing this kind of device for a very long time without feeling any pain, and I'm not sure exactly where that pain comes from. But I think the question we face is whether the next big thing for humans is going to be connecting all those sensors directly to the brain without using the eye at all. That's kind of a science fiction thing, though, and I'm not sure I'm going to have a chance to experience it.


Ditto. I can’t see this being used portably, so I do wonder if the 2-hour battery life is a clue to how fatiguing the experience might be.

2 hours I guess covers a commute, but it’s hardly a handheld form factor - how much bigger would it need to be to get “all day wear” battery life? It doesn’t feel like a real spatial constraint, so I can only presume >2hrs is not required in actual use.


I see a pattern here like the one with C: it gets complained about by most people, while there are other languages that nobody ever talks about at all. Just by talking about something, whether positively or negatively, there is tension and expectation; it's our collective will for this kind of device or technology to come into being. So eventually it will become part of our lives, and I hope that day comes sooner and this company doesn't disappoint us.


There's a famous macrumors forum post of people raging against the iPod, saying it will be a massive failure. We've seen the same reaction from every Apple hardware announcement since.

The original post in 2001 is still live. Read it for a laugh: https://forums.macrumors.com/threads/apples-new-thing-ipod.5...


TBF, there's an equal amount of "I haven't touched the product, nor even read reviews of people handling it in their hands, but I'm totally gonna buy this based only on the marketing material."

I kinda loved how the Accidental Tech Podcast hosts joked about not having even heard of the product yet, but that they'll probably buy it for personal use either way.

The pendulum has fully swung the other way for a sizeable chunk of people, I think.


Apple also released https://en.wikipedia.org/wiki/Apple_Newton

Which was a great idea and a very innovative product, literally ahead of its time by 15 years.


I bought a used Newton from the lead engineer on the Newton. I loved the device and used it regularly until my then-girlfriend stepped on it and broke the display. Needless to say, that relationship didn't last long after that ;-)

I later had a Palm. It was garbage compared to the Newton, even though it was 1/8 the size. I'm glad to see the Newton essentially return as the iPhone/iPad.


> There's a famous macrumors forum post of people raging against the iPod,

Back in the day, it was the Slashdot take: "No wifi. Less space than a Nomad. Lame."

https://slashdot.org/story/01/10/23/1816257/apple-releases-i...


> Read it for a laugh

Steve Ballmer also laughed a lot at the iPhone not having a keyboard :) , and it turned out to be one of the most innovative products in history.


I'll admit to being quite skeptical of the iPad and was wrong about that.

That said, despite owning a Quest 2 and eagerly awaiting the Quest 3 release, nothing in this headset particularly appeals to me. (I'm mainly into rhythm games and am guessing those wouldn't be nearly as fun without the haptics in other headsets' controllers, which this seems to lack.)


That was actually quite funny, thank you. Reminds me of my friend in high school who was a Zune fanboy.


> NO!

> Great just what the world needs, another freaking MP3 player. Go Steve! Where's the Newton?!


Personalities have not changed.


> But it's going to be a hit. HN is going to be swamped with "How I used Vision Pro to..." posts when it comes out.

I'm not going to predict whether or not this is going to be a hit, I just don't know.

However, remember when Google Glass came out there were tons of these "how I use" posts and I remember people even changing their LinkedIn profile pictures to be with Google Glass. And, we all know how that turned out.

So, early posts by self-styled influencers or wannabes are in no way a predictor of success, or failure, of a product.


I wonder how intentional Apple was about picking a name that can’t be turned into a schoolyard insult like “Glasshole”?


If you ignore the “vision”, you can just call them A(pple) Hole Pros.

You can pretty easily make fun of Apple products. We just don’t do that because their products are good.

The moment a bad iPhone comes out someone will start calling it an iSore.


Pission


Most Google hardware products turn out similarly bad


> The displays in this device are crazy.

I'm actually curious about this, and how the displays will actually feel. The ads/keynote all talked about how they're "more than 4k for each eye", which sounds like a lot when you're talking about TVs or monitors, but... stops sounding quite as impressive when you realize you're talking about IMAX-sized screens (which is the main "wow" draw for watching movies in VR), or when talking about augmenting reality.


Yeah, 4K per eye stops being impressive when it's five inches from your retina and you're trying to read fine text. Pimax has had a 4K/eye device for years already: it's nice but still nowhere near good enough to do things like replace your computer monitor. They're planning to ship a 6K/eye device next year, which will probably still not be enough. The real world has a very high pixel density!


They have eye focus tracking for sure in this, so maybe they can render in an adaptive resolution mode, i.e. only the highest res in the center of vision? Who knows?


Adaptive resolution rendering doesn't add more pixels to the display -- if you want high resolution for the spot the user is currently looking at, you need that resolution across the entire display.


Theoretically, you don't really need it across the entire display - this is achievable with eye tracking paired with fancy actuated curved micro-mirrors (so-called DMDs) that can dynamically make part of the matrix concentrated in a smaller focus area (viewed through a curved mirror) at cost of peripheral picture quality. But it's extremely complex tech and I'm not aware if it's available in any consumer-grade devices. The alternative is liquid lenses, but I think micro-mirrors are more researched topic (I'm no expert in either, it's been ages since I last studied physics and I wasn't good at it even then).

It is remotely related but different from "classical" foveated rendering (which is just a way to get better framerates), as it's an actual optical system. With DMDs you also need foveated rendering (and fancy transformations, as displays are no longer projected uniformly over time), but foveated rendering alone is not sufficient.


They already listed foveated rendering in the features (which I believe is what you're describing). It uses the graphics performance budget efficiently, but it can't physically add more pixels.

It's really cool technology anyway, and according to PSVR2 reviews, it seems to work well.


Hopefully focal adjustment tracking too. I've got a feeling it's just for the selection UI.


I think resolution will be important the smaller (or further away) the movie you’re watching is. And for things like text in apps.

If you’re watching an IMAX-size screen in AR, the resolution of the content will be the main factor, I think, rather than the density of the goggle displays.


Each pixel is 7.5 microns. Assuming RGB, that's 22.5 microns. That's at the maximum limit of detail an eye can see.


I have 2 4k screens in front of me right now. I can close one eye, and without moving my head make out the entirety of both screens. They cover most of the non-peripheral horizontal field of view, but you could easily fit in another 4k screen on top of each vertically. I can make out individual pixels (when there is a gradient, like with a small font) on the screens. Higher resolution screens of the same size at the same distance would let me read slightly smaller fonts.

That is, at a resolution in which pixels are still perceptible, I can make out more than 33,177,600 pixels (4 4k screens, equivalently 1 8k screen) per eye. This device has less than that. Less than half that per eye. It's not "at the maximum limits of detail an eye can see" even assuming they just have no wasted pixels in your peripheral vision.

7.5 microns means nothing without knowing what lenses it goes through.

That said, I think it might be enough pixels to be useful for reading text. Unlike the Index I own, where that is just unpleasant.


That's not enough information. It's behind a lens that spreads it across your entire field of view.


Assuming they're square and roughly calculating (23 million pixels between the two, 7.5 microns each with no space between), that's about 25.4 mm on a side per panel. They've said they're the size of postage stamps, so this ties in.

I think it's near safe to assume there's no real gap between pixels and thus they're indiscernible. The lag might be a thing.
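For what it's worth, here's the back-of-the-envelope arithmetic behind that figure (a rough sketch only; the 23M total pixels, square panels, and 7.5 micron pitch with no gaps are taken from this thread, not from a published spec sheet):

    # rough panel-size check, assuming 23M pixels total, square panels,
    # and a 7.5 micron pixel pitch with no gaps (numbers from this thread)
    pixels_per_eye = 23e6 / 2            # ~11.5M pixels per panel
    px_per_side = pixels_per_eye ** 0.5  # ~3391 pixels on a side
    side_mm = px_per_side * 7.5e-3       # ~25.4 mm per side, i.e. postage-stamp sized
    print(round(px_per_side), round(side_mm, 1))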


Once again, the absolute size is irrelevant - postage stamp or otherwise. It's optically scaled to fit your field of view - essentially under a microscope. There are VR devices with 4k screens already, and it's still not enough to be indiscernible to the eye - especially for things like text.


Not having visible gaps between pixels is a necessary but woefully insufficient condition for high visual fidelity.


This is an overconfident audience, very sure that their experiences and perspectives are representative of the mainstream. See the rsync vs. Dropbox meme.

The execution is all that matters here not any speculative flaws. If it’s a delightful, polished, responsive experience for the stock applications, other use cases will come. I don’t want to bet against Apple achieving that bar. They’ve done it over and over again before.


> If it’s a delightful, polished, responsive experience for the stock applications

IMHO this is a perfect description for the Apple TV.


I’m still amazed Dropbox is making money. Doesn’t windows come with a Dropbox clone built in?


Every. Single. Apple product launch post. “Meh”, “I can’t see the use case for this”, “it’s all already been done before”. Like clockwork. Then they’ll sell a million of these, and by v3 it’ll be much smaller / better / cheaper, and gain mass adoption. It’s like people have an “apple event reaction” algorithm going, and it never changes.


It sure can be used as a display for a Mac. Just stand in front of a Mac, and the screen will go dark and the windows will be moved to your Vision Pro.

Here's the point in the Keynote showing it: https://www.youtube.com/live/GYkq9Rgoj8E?feature=share&t=552...


I'm wondering if a similar trick will be used with iPhones and Apple Watches that are within view. Will they bother with the camera reading the screen and then rendering to the visor, or will they just seamlessly talk to the phone and watch to get the screen imagery. I'm assuming that would improve the quality.


There was one moment in the presentation when a guy at the office opened his Macbook Pro and the screen popped up above it much larger.


He also used a keyboard and trackpad.


I agree 4K in each eye sounds insane. But eye strain is going to be the big determinant. I initially thought it was transparent OLED, but to my disappointment it's just screens. Perhaps they've got the focal adjustment thing Magic Leap was trying to do right.


4K is not much if you consider that these pixels have to cover the entire field of view, not just a relatively small screen.


It’s certainly a generational jump from the Quest series at least. Of course the price is completely ridiculous


The best an eye can discern is roughly 20 microns, but generally far higher at 100 microns. They said 7.5 microns per pixel (×3 for RGB is 22.5, so roughly there without space).

Assuming they're square and roughly calculating (23 million pixels between the two, 7.5 microns each with no space between), that's about 25.4 mm on a side per panel. They've said they're the size of postage stamps, so this ties in.

I think it's near safe to assume there's no real gap between pixels and thus they're indiscernible. Lag and focus might be a thing, but this might actually not be a problem.


> The best an eye can discern is roughly 20 microns

The size of an object doesn't matter. What matters is how it gets projected onto the back of your eyes.

There are 120 million rods (black and white) and 6 million cones (color) in a single eye. You would need at least as many pixels. But photoreceptors are not evenly distributed, so to account for moving your eyes across the screen, you would have to have even more pixels.


What if they would move the screens? E.g. similar to how camera optical image stabilization slightly moves the sensor array. They could make a screen with non-uniform pixel density, denser in the center, then do eye tracking and shift those screens mechanically depending on where the eye is focused. Probably not as easy to pull off as camera optical stabilization (you need bigger movements, and screens are heavier than a camera sensor), but maybe not completely impossible? OLED screens are very tiny and flexible, just probably hard to make them non-fragile.


The pixels may be 7.5 microns but you’re forgetting that they are viewed through a lens. The point stands: 4K pixels for the full field of view, which is a lower density than 4K for a small screen.
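To make that concrete, here's a quick pixels-per-degree comparison (a sketch only; the ~3,660 horizontal pixels per eye and the ~100 degree horizontal field of view are assumptions, since Apple hasn't published either figure):

    import math

    # headset: assumed ~3660 horizontal pixels spread over ~100 degrees of FOV
    headset_ppd = 3660 / 100                                     # ~37 pixels per degree

    # 27-inch 4K monitor (~59.7 cm wide) viewed from 60 cm away
    monitor_fov = 2 * math.degrees(math.atan((59.7 / 2) / 60))   # ~53 degrees
    monitor_ppd = 3840 / monitor_fov                             # ~73 pixels per degree

    print(round(headset_ppd), round(monitor_ppd))  # 20/20 acuity is roughly 60 PPD

Under those assumptions the headset lands at around half the angular pixel density of a desktop monitor, which is why fine text is the hard case.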


The lens can be directional, focusing your vision onto a certain point, and your peripheral vision cannot discern as much detail. They've stated it is on a chip the size of a postage stamp. So we'll have to see how the lens directs it when it's released.

Edit: sort of a Magic Leap type thing. The further out you look from the centre of the lens, the more the lens curves back to focus your eye on the centre, with the eye tracking changing the image to compensate for your eye movement.


Unless they're magically changing the shape of the lens in response to eye movement that doesn't seem physically possible.


What if they shift the screens mechanically in response to eye movement, similar to how Apple's camera optical image stabilization works by slightly shifting the sensor array? And what if the screen pixel density is not uniform but denser in the center? Hard to pull off, but I guess not impossible.


Even if that were possible, I doubt that the accuracy of the eye tracker is sufficiently high for this approach to work.


>The best an eye can discern is roughly 20 microns

That's not how it works. You need an angular resolution.


There’s Sightful’s $2000 device you can buy right now: https://www.sightful.com/. I’ve used an early demo of this and was very impressed. After the demo I had the strong feeling Apple was going to build something similar, and I was right.


4 million pixels is so terrible for an AR/VR headset. 23 million pixels will be indistinguishable from reality for all intents and purposes.


The human eye has an approximate pixel resolution of 120 million pixels per eye. On top of that, our brain constantly processes and integrates the output of our eyes. This creates an even higher perceived pixel resolution of about 480 million pixels per eye. Some estimates are even higher.

I'm not saying Apple created a bad product...but I wouldn't expect a mere 23 million pixels to be indistinguishable from reality.


The human eye actually has terrible resolution. We only see in high resolution in the fovea, in the very center of our vision -- basically the single point of primary focus. Resolution beyond that drops off dramatically (to 1/7th and much worse).

I've seen people claiming on sites like Reddit that people who watch with CC on simply read it in their peripheral vision while focused on the action, and that just isn't possible in most situations for the reason I mentioned. You actually only see high resolution in the middle 1 degree of angular view.

So to come up with such a number, someone took the entire FOV of the human eye and assumed that you focus your fovea on each and every angular degree of it.

That's neither here nor there, as your point is still valid -- where you're focused will have a pixel density below "reality" for your fovea. However, it presents lots of optimization potential in software (e.g. no need for fine rendering outside of the focus) and in hardware. There are already devices which use tiny mirrors and optics to basically concentrate the pixels wherever you're looking and render a distorted view to match.
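As a concrete illustration of that over-counting, here is roughly how the headline "hundreds of megapixels" eye estimates get produced (a sketch; the FOV numbers are loose assumptions):

    # naive eye "megapixel" estimate: take the whole field of view and
    # pretend foveal (~60 pixels-per-degree) acuity applies everywhere,
    # which is exactly the over-counting described above
    h_fov_deg, v_fov_deg = 200, 135   # generous per-eye field of view (assumed)
    foveal_ppd = 60                   # ~20/20 acuity, true only in the central ~1 degree
    naive_pixels = h_fov_deg * v_fov_deg * foveal_ppd ** 2
    print(f"{naive_pixels / 1e6:.0f} million")  # ~97 million, same ballpark as the figures quoted upthread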


It will definitely not be indistinguishable from reality, but it might be good enough to fool us after a short adjustment period, similarly to how even 24fps is enough for continuous motion. Of course you will see 60fps as more “fluid”, but only in comparison. And after that the differences quickly plateau; not many people can see any difference between 120fps and higher.

It is probably similar with this as well; the question is where Apple stands on this scale.


I think it’ll be 5 generations before it’s a real product. I’d note the first iPhone was kinda garbage as was the first iPod. For the iPhone the App Store was empty and the apps that existed for years were pretty rudimentary. It couldn’t hold a phone call open. It was clunky and comparatively terrible hardware. Apple has the ability to invest and innovate on an idea for decades incorporating advances, fostering investment, and building an ecosystem.

The jaded take to my ears sounds a lot like the LLM / generative AI take - looking at the first real generation and claiming it’s an evolutionary dead end of hype monsterism. I feel sad that people that likely got into this field as a dreamer of what can be are stuck seeing what simply is.

Will this usher in rainbows end within the next 20 years? Maybe. Maybe not. But I’m always happy to see there are still nerds that can dream of what can be, even if they’re often drowned out by the chorus of what today isn’t.


I do see your point, and it is true that every product is going to be more mature and more complete in later releases. A first-generation product like this is going to be a huge risk for a lot of people. But the question I ask myself is: can I pick out one or two things that this device can solve that probably don't have a good solution in the market? If so, just go for it, and if it is affordable, definitely do it. The upside of doing this is that you get to shape your workflow in the early stages, so if you consider the time you put into the product and this new workflow, the productivity you gain from the early experiment can make it worthwhile. But again, it's a risk.


Yeah I think first generations of apple products are for the curious, the rich, and the engineer seeking to build the next generation of apps on their new platform. I never look at them as “a good deal,” or a mature product. I think that’s foolish for any 1.0 of anything. Generally 3.0 is where maturity begins, and 5.0 is where incrementalism starts.


Wow a reference to Rainbow’s End! IIRC that novel was set around 2026. I don’t think we’ll have Vinge-style AR/VR contact lenses for many years to come. Certainly not by RE’s fictional timetable… :(


It’s ok. Error bands on SF are wide and shifted right. Mostly because jaded skeptics that cling to the constraints of the present kill the dreams until someone has the wherewithal to ride out the skeptics. Say what you will about Musk, apple, etc - they set absurd goals and fail half way, but that half way is the stuff of science fiction.


The cost is prohibitive, but I can't think of anyone who I trust more to introduce a cutting edge consumer device.

I won't be a user, but I hope they succeed.


The fact this thing has an M2 makes me surprised Apple didn't try to sell it as a Mac.

I feel like at the price point, this device makes much more sense as the kind of thing that could replace a laptop/desktop than as a companion to it.

If you can connect a Magic Keyboard/mouse to it, this thing could conceivably be an MBA and a badass multi-monitor setup rolled into one. And to me, that's really the only way this form factor makes sense.


It is a viable first entry as an AR computer. Does it need to be anything more than that?

In 10 years with GenAI video creation and GenAI NPCs it could be bonkers cool.


> It is a viable first entry as an AR computer. Does it need to be anything more than that?

It needs to do what HoloLens and Google Glass didn't.

Sell well enough to attract developers and improve manufacturing economies of scale.

For what it's worth, I think Apple has a chance here - there were smartphones before the iPhone, but Apple made the first one good enough to take off. Perhaps this will be the same?


I don't think it'll take 10 years. GenAI NPCs are like 1-2 years out. GenAI video is about 3-5 years max.

A little scary bringing a kid into this world. I've seen how my nephews and nieces get completely absorbed by screens.


I too have trouble thinking this is truly “cool” - it’s basically a self-contained Plato’s Cave. I feel like the “cool” of the next decade will be distinctly luddite-inflected, but who knows.


Until all of the cool stuff is hidden behind paywalls.


Yes, the presentation shows it used as a display for a Mac. Incidentally, you can also do this with the cheaper Quest Pro headset (or any headset in the Quest line, so $300-$1000 price range - but you don't get as many pixels). There are a few options for the software, VRDesktop (https://www.vrdesktop.net) being one.


> the presentation shows it used as a display for a Mac. Incidentally, you can also do this with the cheaper Quest Pro headset

You may be technically be able to do it on Quest, but it's mostly useless because text at non-massive sizes is completely illegible on current headsets.


> But it's going to be a hit.

Well, if nothing else, the influencer / celeb culture will make it so. Apple, unlike other tech companies, almost has a monopolistic grip over it.

I mean, they sold AirPods for the most ridiculous price and yet they beat sales numbers of just about everyone in the audio industry.


This, still?

Do people like you think that people like me buy AirPods because influencers do?

Might it just be that they’re astonishingly good wireless headphones? I mean is that possible in your mind?


The product may be good, but I am talking about its price, and how it sold like hot cakes anyway. Consumer goods don't sell as much without marketing, which influencers / celebs provide for free to Apple.


I'm predicting right now that it's going to have performance problems with that display. While they haven't released exact resolution numbers per eye, 23M would give it a slightly higher resolution than the HTC Vive Pro 2, a headset which requires a GPU. While mobile chips have really impressive CPU performance, I don't think they're nearly as competitive in the graphics space.

Knowing Apple, they're also not going to support anything else besides Apple Hardware so you won't be able to hook it up to an actual gaming rig like you can with the Meta Quest 2. While this isn't a big deal for a lot of people, Apple is taking a huge risk releasing a very premium product like this without supporting the largest established VR market (gamers).


> Apple is taking a huge risk releasing a very premium product like this without supporting the largest established VR market (gamers).

This reads like "Apple is taking a huge risk releasing a new smartphone without supporting the largest established market (BlackBerry device users)."

The VR gaming market is microscopic compared to what Apple is likely aiming for here. They do not give a single flying fuck about this "established market", nor have they for any other market they've entered. The entire Apple ethos is to completely change the narrative for whatever product category they enter. They did this for phones, for bluetooth audio, for watches, and—whether or not they're ultimately successful—you can bet your ass this is their intent for wearable headsets.

What's the eventual end goal for these devices? I'm not sure yet, but I'm certain it will become clearer in the coming years. My expectation is they anticipate this will come to replace fixed displays for a huge number of office workers. Maybe not with this first revision, but by gen 3 that's my bet for the market of this device. If you assume it gets lighter and more comfortable, higher res, and better battery life over the next few iterations, it's clearly something that could just be your work machine with a paired Bluetooth keyboard.


VR headsets are very personal from a cleanliness perspective. I would never share one. There's a reason why the padding around the visor is removable and washable.


I keep wondering how the demo units at Apple Stores are going to be kept non-vile.


Most people don't share monitors either. And very very few office workers share laptops, which is what they're suggesting


To chime in on the last part, I imagine that it could be beneficial for Apple’s offices alone; every employee is able to create their preferred workspace while using less physical space; only really needing a desk, keyboard, mouse, power & internet source and a seat


The only reason I sit in a fancy ergonomic chair is to be able to view my monitor(s) properly.

If the monitors could be virtual using an AR headset, I could just sit in a la-z-boy with a cupholder and a massaging seat :D


> nor have they for any other market they've entered

They don't care about iOS games? Apple Arcade?


> The VR gaming market is microscopic compared to what Apple is likely aiming for here. They do not give a single flying fuck about this "established market", nor have they for any other market they've entered. The entire Apple ethos is to completely change the narrative for whatever product category they enter. They did this for phones, for bluetooth audio, for watches, and—whether or not they're ultimately successful—you can bet your ass this is their intent for wearable headsets.

Apple is also the company which released https://en.wikipedia.org/wiki/Apple_Newton back in the day… They turned out to be right in the end but still had to re-enter the market entirely from scratch after 10 years. So far Apple has been great at “perfecting” products that already exist by doing the right thing at the right time.

They weren’t the first or the second to release a smartphone, smart watch, tablet, BT earphones etc. All of those had established markets and somewhat clear use cases; Apple “just” streamlined them and turned them into something that normal people would actually want to use. It seems a bit too early to do that for VR yet. So in a certain way they are in somewhat uncharted territory.


Whether or not they're successful is irrelevant to the question of what their intent is. But I find it telling that your initial reaction is to reach for a device that failed thirty years ago as if it has any relationship to modern Apple.

They didn't "just" streamline the smartphone. They destroyed virtually overnight the existing dominant players in the smartphone market and within a few years essentially ended the existence of non-smartphones as a market category entirely. They didn't "just" streamline the watch. Again, within five years of entering the market they overtook (in units) shipments of the entire traditional watch industry. Both of these examples are significantly larger and more entrenched than the existing VR gaming market.

Of course not every product of theirs is successful in doing this. But without question, this is their aim a majority of the time.


> find it telling that your initial reaction

Telling what? My point was that the Newton was a brilliant idea, yet the hardware wasn’t there yet and it didn’t have clear use cases. Both concerns apply to the Vision Pro, so at this point it’s still closer to the Newton than the iPad.

> They didn't "just" streamline the smartphone

They did exactly that, which is why it was so brilliant. You could do everything an iPhone could do with other devices before it came out. It’s just that the experience was quite poor, and all the other devices were underdeveloped and had serious flaws in comparison (to be fair, the first-gen iPhone was a pretty lackluster device too).

You could browse the web, watch video content, send messages/emails, listen to music, play games, make video calls. Did Apple invent any of that? The iPhone was just a device which could do it all with much nicer UX than anything on the market.

VR is very different in that regard.


> Apple is taking a huge risk

Let's contextualise this... they have so much money in the bank there is literally no way to spend it. This could completely flop and have zero impact on them. There's no risk here for Apple. Perhaps the question is why they aren't being more adventurous, or pushing this harder by subsidising the gen 1 device to get it off the ground.


The risk is brand dilution. Apple has a reputation for not launching products that flop.


Yeah, it's clear their focus isn't games. There's no way the GPU can push those pixels with the graphical fidelity expected by gamers. But I'm sure it will have no problem pushing the raw pixels as long as you stick to mostly compositing-level graphics, like all the productivity/lifestyle stuff they were showing in the demo.


Roughly double the amount of pixels = "slightly higher resolution"?


Sqrt(2) ≈ 1.41, so doubling the pixel count only gives about 41% more pixels per inch. It’s not a different order of magnitude.


The Vive Pro 2 has ~12M pixels. This has 23M. That's nearly double. We don't know the FoV, so we have no idea what the pixels-per-degree density is.


Double the pixels still means only ~41% more pixels per inch, per mm, or per degree.
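Since the percentages are flying around: a quick back-of-the-envelope in Python, using the rough pixel counts quoted in this thread (not official spec-sheet numbers), shows why doubling the pixels only buys ~40% in linear sharpness:

    import math

    vive_pro_2 = 12_000_000   # ~total pixels, as quoted above
    vision_pro = 23_000_000   # Apple's "23 million pixels" figure

    ratio = vision_pro / vive_pro_2   # ~1.92x the pixels
    linear = math.sqrt(ratio)         # ~1.38x the linear resolution
    print(f"{ratio:.2f}x pixels -> {linear:.2f}x pixels per inch/degree")
    # prints: 1.92x pixels -> 1.38x pixels per inch/degree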


Performance wise, in the Platform State of the Union, they mentioned that they will use eye tracking to choose which parts of the "screen" to render at high resolution. That should help a bit.
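For anyone who hasn't seen the technique before, here is a minimal sketch of what eye-tracked ("dynamic") foveated rendering boils down to: render at full resolution only around the gaze point and progressively cheaper elsewhere. The thresholds and falloff are invented for illustration; nothing here reflects Apple's actual implementation.

    import math

    def render_scale(tile_center, gaze, inner_deg=10, outer_deg=30):
        """Pick a resolution scale for a screen tile based on its angular
        distance from where the eye tracker says the user is looking."""
        dist = math.dist(tile_center, gaze)  # angular distance in degrees
        if dist <= inner_deg:
            return 1.0    # full resolution in the fovea
        if dist >= outer_deg:
            return 0.25   # quarter resolution in the periphery
        # linear falloff in between
        t = (dist - inner_deg) / (outer_deg - inner_deg)
        return 1.0 - 0.75 * t

    # e.g. a tile 20 degrees off-gaze gets rendered at reduced resolution
    print(render_scale((20.0, 0.0), (0.0, 0.0)))  # -> 0.625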


First off, mobile chips are actually quite good at high resolutions (but usually lack bandwidth). But this is an M2. That's not a mobile chip.


100 games at launch isn’t aiming for gamers? That’s at least decent compared to the quest.


100 games on Apple Arcade*

How many of these will be windowed iOS apps? I assume most of them.


It doesn't matter what games it has if it doesn't have _my_ games.

That's really what sets casual gaming devices (Apple TV, iPhone, iPad, etc) apart from actual gaming devices.


Yeah, although that part just showed someone using it to extend the native monitor. I'd be curious how deep that integration goes... if it's more than a large virtual monitor — if you can spawn multiple/infinite windows of any of the Mac apps in it — that'd be killer!


Probably a v2 feature that isn’t ready yet. But I’d be surprised if they weren’t working on it after the widget stuff on Mac desktop.


> did they talk about using it as a display for a Mac

Yeah, you look at the screen through the headset and then pinch to move it around and grow/shrink it.


Too early to tell. There's no point predicting who will say what; people will say everything on the scale eventually.

People have different needs and use cases, and are affected by how things end up being implemented. The details of usability are impossible to judge right now.

One thing probably is easy to tell: chatting with a helmet on while moving around the room is not going to work. : ) That's just stupid marketing crap.

I am looking forward to its feasibility for external virtual screens of a Mac - or even a PC! - with a physical keyboard and mouse; that sounds attractive. But patience: let's first see how it works for the masses in the long run, and whether it gets to a more realistic price tag sometime.


> if I heard correctly, the display on the front is 3d and gives different perspectives based upon the viewers

This effect probably relies on a lenticular lens overlaid on an OLED screen. It's similar to the approach the Nintendo 3DS used to create a stereoscopic image without glasses.

https://en.wikipedia.org/wiki/Lenticular_printing
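For the curious, the basic driving scheme for a lenticular multi-view panel is column interleaving: render one image per viewing angle and slice them together so the lens steers each column toward a different viewer. A toy sketch of that general technique (illustrative only — we don't actually know how Apple's front display is driven):

    import numpy as np

    def interleave_views(views):
        """Interleave N rendered views column-by-column so the lenticular
        lens in front of the panel sends each column to a different angle."""
        n = len(views)
        h, w, c = views[0].shape
        panel = np.empty_like(views[0])
        for col in range(w):
            panel[:, col] = views[col % n][:, col]
        return panel

    # e.g. three viewpoints rendered for viewers standing at different angles
    views = [np.full((4, 6, 3), i, dtype=np.uint8) for i in range(3)]
    print(interleave_views(views)[0, :, 0])  # -> [0 1 2 0 1 2]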


Any word on how many interlocutors standing opposite you this can support?

I'm reminded of the Hallway Projection Scene in Mission: Impossible - Ghost Protocol, which works beautifully until more than one person looks at it.

https://www.youtube.com/watch?v=qtA0JS1lBaY


The guy on the commercial was using a real keyboard so I imagine this can be used relatively standalone, with the caveat that it uses iPad apps.


I'm taking it as a positive sign, because consumer expectations are now high, and if they don't deliver what they promised here they're going to be in huge trouble. As consumers, it would be nice if we could communicate our expectations and show our interest, and in that way motivate them to build a better product.


> did they talk about using it as a display for a Mac?

Yes, in the keynote at 1:32:02. It shows how looking at your computer turns the Vision Pro into its display.

https://www.youtube.com/live/GYkq9Rgoj8E?feature=share


Yes, they showcased that you just have to take a look at a Mac screen and the glasses become the display.


They did share that it can be used as a display for your Mac. It sounded like you're limited to one screen (I'm guessing because of bandwidth limitations; also guessing that upgraded MacBooks may have the necessary hardware to stream more pixels).


> did they talk about using it as a display for a Mac?

Actually they did:

> bring the powerful capabilities of their Mac into Vision Pro wirelessly, creating an enormous, private, and portable 4K display with incredibly crisp text

I wonder what the latency would be like though.


They mentioned the device can detect when you are using your Mac and show the desktop as an app in your headset. So yes that will be possible as well as just using normal bluetooth mice and keyboards.


They explicitly said in the keynote that you can bring up the screen of your Mac as a virtual display. So it looks like you can use this to work with your Mac.


23M for both eyes doesn't seem that far off from Meta Quest 2 Pro at 9.3M (2,160 x 2,160 x 2).

And Meta Quest 2 Pro is one year old at $999.


> 23M for both eyes doesn't seem that far off from [...] 9.3M

It's almost 2.5x the pixels [edit: was ~~resolution~~ which is incorrect]. How is that "not far off"? It's more pixels per eye than the MQ2P has for both!


2.5x the pixels is more like 1.5x the resolution in terms of the smallest features that can be seen - remember that displays are two-dimensional, and in order to halve the width of the smallest discernible detail, like say a line, you need to double the pixels in both directions, for a total of four times as many pixels. On the other hand, it is going to be close to 2.5x the rendering cost.


Ta, edited my post to correct it to pixels instead of resolution.


The MQ2P has super blurry text though. It's hard to think a 50-60% bump in resolution will be enough.


You can (oversimplified, tech people yell at me or whatever) display your MacBook screen inside Apple Vision as a screen/monitor/window, whatever it's called, the same way you would an app.


The device seems amazing, it's just... not really Apple, that's all.


Hmmm, I don't get that. Apple builds personal computers. That has been their mission from day one. This is easily the most personal computer they have ever created. I don't see how it could be more Apple.


It looks neat.


> did they talk about using it as a display for a Mac?

Yes! In 4K.


It would very obviously be useful for work if you can actually get high res, effectively unlimited monitor space. Maybe not for everyone, but people already spend $3500+ on monitor setups somewhat regularly (and employers definitely do this). Apple themselves sell a single monitor that costs $2300 when fully spec'd out (5k, but the point is that they know what people spend on monitors). I can't figure out why that wasn't the highlight of the demo, since that's just very clearly the easiest way to sell a $3500 device with this specific set of features.

The bit about recording video of a kid's birthday was one of the most ridiculous things I've ever seen. I'd maybe record my kid with something like this every once in a while, but I certainly wouldn't be wearing ski goggles while he blows out candles.


This "unlimited monitor space" is a complete non-selling point for me.

Being a wealthy software engineer, my monitor space is not bottlenecked by my budget or desk space, but by my literal neck. Constantly rotating my head back and forth from one monitor to another is, quite literally, a pain.

For me the sweet spot is a single curved monitor right in front of me. If I need more "desktop space" I add another Space with Mission Control. And with keyboard shortcuts I can move between Spaces nearly as fast as I can rotate my head around.

So what am I going to do with a VR headset if I ever got one? Put the active app straight in front of me just like I do with my normal monitor. I'm not going to put my terminal at some odd angle 25° above my head and crane my head back when I want to run a command in it. I won't put the Weather app 90° to my right, obscuring what is currently a nice picture window looking out on my yard.

For me, VR needs that "killer app" to justify the high pricing and inconvenience of use, and I just don't see one yet. I don't expect one any time soon either; if VR was going to get a killer app, it would have shown up by now.


You sound like someone who has a very stable and spacious office. Have you considered that "having more desk space than there is space in the room" is the killer app for many (wealthy!) people who either 1. travel a lot, or 2. live in countries like Hong Kong where space is at a premium?


The travel point is a legitimate one. This is less a device to look at code, and more a device to look at people and presentations. Practically every Fortune 500 executive will have one of these because they'll be able to immerse themselves while jetting around the world - neither limited to a laptop screen, nor to a cartoon environment where people don't have legs, but in a truly effective war room that interleaves live video conversations, presentations, dashboards/visualizations, and their physical travel companions.

Or, at least, they'll want the ability to brag to their peers that they can do these things! It's the Apple playbook, and it will create a tremendous amount of envy. If it's at a price that's profitable, it can sustainably anchor their reputation even if it never goes mainstream.


> Fortune 500 executive

> they'll be able to immerse themselves while jetting around the world

> a truly effective war room that interleaves live video conversations, presentations, dashboards/visualizations, and their physical travel companions

This is the world we make, and it's for them!


It's yet another way for executives to torture themselves. Don't envy them.


F500 executives tend to have people who will show these presentations on big screens, in rooms they can just stroll into (and out of). And they don't want to strap anything to their face, particularly something that might (horror!) upset their carefully-placed hair.


Yeah, that comment is desperately out of touch with reality. I presume the person has never actually met or dealt with these folks; for them it would be humiliating to wear it and to be seen wearing it, Apple badge or not. At those levels - people carrying >$100k watches - having plastic ski goggles on your head? Forget it, anywhere others can see them. Maybe this mindset changes in a decade or two, but not earlier.

Generally on the topic, it's a rather underwhelming release of a device that is searching for its market (while the usual Apple echo chamber here on HN sees it as the second coming of Jesus). No wonder they scrapped the release a few times in the past; it must have been properly underwhelming compared to the competition. And a pathetic 2h battery life at best? That makes it useless for any longer flight (I'm sure you can plug in a power bank and continue, but it will look pretty bad and be annoying as hell).

I'm sure Apple will tune the software to perfection, but I can't see it being enough; the market is tiny considering the investment, well saturated, and from what I've heard rather shrinking. But I hope they push the market in some good direction long term with their creative approach, so we can all benefit eventually.


You could have described BlackBerry in similar terms pre 2008


The difference is that Blackberry let you do something you couldn't before.

Which is this entire thread -- what can you do with AR that you couldn't before?


> what can you do with AR that you couldn't before?

It's my belief we are about to find out in the next 3-5 years.


The only compelling answer I can think of is "everything we already do now, only untethered by physical location."

Which is less about polish and more about deployment volume and/or standards interoperability.


Immersion.


My dude, that's what they have when they actually arrive at their destination. We're talking about what they do on the plane, or in their hotel room.

Or, perhaps easier to picture, when they're on vacation on a beach in Tahiti. They could be chauffeured 20 minutes back into town to a "secure workspace" in order to have a five-minute call where someone back at their HQ [where it's the middle of the night] briefs them on a screen... or they could go into their cabana, strap this thing on, have the five-minute meeting right then and there, and then go back to sipping Mai Tais.

Executives already make this choice, this way, right now. This choice is the reason that the iPad Pro has traditionally had better "stuff" for teleconferencing than the MBP does: the iPad Pro is — or was — the thing Apple most clearly marketed to executives. Right now, executives take out the iPad Pro to take that quick cabana video-call.

For this use-case, the Apple Vision is just a one-up to everything the iPad Pro is already allowing them to do. It's more secure (nobody can watch the presentation over their shoulder); it gives the presenter back at HQ more visual field to work with to make their point; it's more discreet in how it presents them in video calls (i.e. if they're calling in while laying naked on a massage table, that won't be reflected in their 3D-model recreation); etc.

---

More realistically, though, ignore the F500 CEOs. I have a feeling that I know exactly who this was built for — and it's not them. Apple engineers aren't any more in love with the idea of serving the needs of executives than anyone else is. They throw them a bone now and then, but they have other things in mind when building the core of each product.

Now picture this: you're an Apple hardware engineer who wants to work remotely, but you were forced to work-from-office due to not just the secrecy around the Apple Vision project you're on, but also the collaboration benefits. (It's currently basically impossible to review 3D models for "feel" on a laptop; you need either a big bulky 3D TV, or some other company's big bulky HMD setup. Neither of which travels well.) But your dream? Your dream is that you can figure out a way to do everything you're currently "doing better" by being in the office — reviewing and collaborating on 3D models of the new hardware, for one important thing — while on vacation in Thailand, sitting in your rented condo, on the couch. No need to also be paying for time at a coworking space (or to even be in a town large enough to have those); the HMD is the coworking space. As long as you have wi-fi, you can do everything the engineers back at Apple HQ can do.


This sounds a lot like the use cases stated for the office metaverse thing FB was pushing that failed to materialize.

The last thing executives want is a "more immersive" PowerPoint or Zoom call. It's either Zoom or in-person with all the trimmings, e.g. nice dinner, round of golf.


>This sounds a lot like the use cases stated for the office metaverse thing FB was pushing that failed to materialize.

Apple might be a company that is better at implementing hardware and platforms than other companies, especially Facebook.


The problem is a lack of real use cases and input methods. I see none of those solved by Apple.


The "look at the search bar and speak" demo was pretty cool, even if it's simple. Eye tracking isn't available on most other VR headsets yet.


> Practically every Fortune 500 executive will have one of these

Even if that's true, that's only like ~50k people lol.


I mean, if every single one of them buys only one, that's $175 million right there. Totally not worth it for Apple to bother even trying.


Apple's first-year sales of their watch were a failure, with 10 million units sold instead of the projected 40 million. Apple now has 34% of global market share. Now remember Steve Ballmer laughing at it.

It is usually not the 1st generation of their products that succeeds, but the follow-ons.

I'll wait to see what the first months of hands-on reviews say, and perhaps get a personal demo. How heavy is that headset, and how long is the battery life (I thought I saw 2 hours)?

Time will tell.


Good example. When the 1st gen watch came out, I knew I wanted one, but I also kind of knew I wouldn't want the first generation. Lucky for me, because I had quite some GAS at that time, the 1st and 2nd gen watches were never really easily available where I'm located. Then I conveniently forgot about the desire to own one. For years. I now have my first watch, 7th gen, and love it. Well, it's more like with a cute pet: you love it, and you learn to love its quirks. So even after 7 generations the software is still not flawless, nor are the sensors. This is the first thing I would be worried about if I had any inclination to use a headset: how distracting are the bugs they will definitely have? Since I totally stopped installing anything below iOS #.2, I wonder how "fun" it is going to be to use this product once it comes out :-) I have no trust left in their QA; shipment date is more important than user experience... :-(


Apple only truly started competing against Garmin recently. Improved running metrics, low power mode, better battery (Ultra) etc only showed up recently while Garmin and others had them for years. Even GPS wasn't on the first iteration.


I am not looking for a fitness tracker, so Garmin is not even close to being competition for an Apple Watch for me. Why? I use VoiceOver. Garmin does not have any speech output at all, so they can't even be compared for me. I do a lot of FaceTime Audio from my watch, another use case where Garmin doesn't even come to mind. Don't forget that products these days have a pretty diverse feature set. Assuming everyone is looking for a fitness tracker just because this is the new hype is rather, erm, unimaginative.


I’m still unsure that they’re any sort of competition for Garmin and co yet.


They are not (yet), but the target group doesn't care about raw stats or price/performance ratios. But I love them, because they will push Garmin to make even better watches, so everybody wins.


Yeah it's a win/win for users I think. I just upgraded to the Fenix 7 Pro range and it's very nice.


Not sure how you can say they are not competing. Anecdata, but I considered a Garmin vs an Apple Watch. The biggest driver was cellular, to call either my wife or 911 when kitesurfing alone (yeah, I know I just shouldn't do it), so I chose the Apple Watch 3 when it came out. Now I have an Ultra, and that's really starting to catch up with some of the other features I wanted. I've seen several people in the kiting community pick Apple vs Garmin and vice versa for a myriad of reasons.


The Apple Watch has truly succeeded in the smartwatch space, but is the smartwatch space even worth a damn yet? Or is it perpetually waiting for the opportunity to monetize users’ health data and other tracked biometrics, for it to really be profitable.


Maybe this “space” thinking is wrong. Don’t worry about the “smart watch space”. Worry about making a product that will make a bucketload of cash. Does it matter if the sector is worth much overall when you rake in a buttload of money for yourself?


That’s what I’m getting at. Is the smartwatch market in general really worth all that much money?


https://www.statista.com/outlook/dmo/digital-health/digital-...

Revenue in the Smartwatches segment is projected to reach US$44.91bn in 2023.

Revenue is expected to show an annual growth rate (CAGR 2023-2027) of 8.26%, resulting in a projected market volume of US$61.69bn by 2027.


By that measure, the iPhone is a total failure, together with the smartphone market it created. It pales into insignificance compared to the market for food! And don't even think of looking at the market for shelter; then it's hardly even a joke, why bother. Or maybe that's not really a meaningful way of looking at markets?

What exactly is the "all that money" you talk about anyway? If Apple's watch division were a separate entity on the stock market with an inexplicably high valuation, I might enthusiastically agree with you, but it's not.


Anecdotally, the Apple Watch is very popular in the bay area. I'd be very suspicious of any claim that Apple didn't make boatloads of money selling it


Neither butts nor boats are all that large though. Even if Apple has made both boat loads and butt loads of money, we would need to be talking about gigabutts or kiloboats to get anywhere meaningful.


Maybe it's a butt full of prepaid debit cards?


It is while people still have too much money to spend.


I'll agree that smartwatches seem niche and not particularly useful. (I've never had a smart watch other than Fitbits, but I really don't see much value beyond tracking steps and heart rate. The notifications on my wrist aren't useful; maybe controlling music would be, but I'd rather just do that on my headphones.)

That said, it's probably a lot easier to switch to Android if you have an iPhone vs. if you have an iPhone + Airpods + Smartwatch + iPad + Apple laptop. The smartwatch as one additional small tether could make it worthwhile for Apple all by itself.


But the watch has a real use case and is in a price category that people can actually afford.

But you are right, time will tell.


All of these depend on the individual. I haven't had a wristwatch since I finished college (I used one for timekeeping in exams). Mostly because I never needed it: mobile phones were out by then, and you had a watch and much more in them. That use case just died out for me. I'm also into swimming and other exercise (kettlebells), but the fitness features don't seem attractive to me either.

I didn't find the step-tracker-type wearables attractive either. It felt like most people wearing them were more interested in measuring and reporting things than in doing the actual workout.

But I just looked up now and the Wikipedia page for Apple watch says they sold more than 100 million units so far. And now have a fairly large portion of market for watches world wide.

Different people have different use cases, likes and dislikes. And there's also the additional public mood factor which is very hard to measure and understand. Based on that this product could be a huge success.


Agree 100%. Most folks I know have Apple Watches to appear sporty, because it's such a cool crowd to be in currently. The people actually doing proper training almost never have them, including me. There is also a category of pros/semi-pros/hardcore amateurs where it actually makes sense to use some form of it, measuring small deviations, progress, etc. (but I never saw pros training in, e.g., Chamonix wearing the Apple brand for that, and those folks all have chest straps).

For me, it actually distracts me from workouts and activities. I used my wife's Fenix 6 Pro twice for running, to get an idea of how long my usual trail run in the forest is and how much elevation I gain/lose. What I had estimated by feel was 95% correct anyway (although I don't think watches measure small variations of natural terrain very precisely). But it was distracting: looking at your heart rate, you subconsciously want to push/keep yourself in some performance bracket (i.e. just below or above the anaerobic threshold, for me). Plus a vibration after each km (which probably can be turned off).

After that measurement, running again without them was so liberating, and I had this nice feeling of extra freedom in nature, just me and the trail. I can feel very well when I cross the anaerobic threshold, perform above it or close to it; I don't need a gizmo to tell me so.


Of course all of this depends on the individual. But Apple is a for-profit company that spent a tremendous amount of money on the R&D of this device, and I don't see a good return on investment here, as not many people need it, let alone can afford it.


Have you considered this is a beta product for the cheaper mass market versions coming in 1-2 years?


I don’t think F500 execs spend as much time looking at monitors and slides as you may think. Also people travel to see them, so face to face is unlikely to be a benefit to them.

Also, it’s a huge expensive gadget in a time of austerity. If your 100+ execs get one of these, it won’t look good to shareholders IMO.


US$3.5k per executive is less than what is spent on their secretaries per month; it's absolutely doable, even more so as it becomes tax-deductible opex.

US$350,000 is nothing if your company has 100+ executives, let's be realistic.


How well does that work on planes? People who tested the Quest on planes found that the motion of the plane interfered and made it unusable.


So... a whopping low thousands of devices will be sold as per this business plan?


They will sell much, much more than this. All the wannabe startups and bigwig CEOs will line up to buy this, even if they can't afford it. All that matters is the image.


But I'm genuinely curious, why would the bigwig CEOs buy this if they didn't buy the Quest 2 or other previous headsets that could do the same things? You could do the cinema and virtual desktop and zoom calls with the Quest. Why is the market much larger for the Apple headset compared to the others? Except for the initial hype of "I need this new apple device" I mean.

The other headset manufacturers have been searching for the killer apps for years, both in gaming and pro usage, both with AR and VR. I didn't see anything in the Apple presentation that was new. It seemed contrived, like the woman who happened to have the big headset on her head while she was packing a bag and therefore could take a call that hovers in the air. I just don't buy that (and neither do the various YT influencers I've seen reviewing the Vision Pro).


Existing VR headsets are too low-res to work on text-based content. This new product is a 4K screen in each eye.


Existing VR headsets have 4k in each eye? It's considered the minimum iirc


What? I have an Oculus Quest. It definitely does NOT have 4K per eye. I've actually tried to use it for a multi-monitor VR setup, and the resolution was too low and the latency too high to be workable.


More than 4K actually - the square root of 23M pixels is about 4795, i.e. 4795x4795.


Less than 4K actually. You need to divide that 23M by 2.
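Both replies are partly right; a quick calculation shows why (assuming roughly square per-eye panels, which Apple hasn't confirmed):

    total_pixels = 23_000_000
    per_eye = total_pixels / 2            # ~11.5M pixels per eye
    side = per_eye ** 0.5                 # ~3391 px per side if square
    uhd_4k = 3840 * 2160                  # ~8.3M pixels in a 4K TV

    print(f"{per_eye/1e6:.1f}M px/eye, ~{side:.0f} px wide if square")
    print(f"{per_eye/uhd_4k:.2f}x the pixels of a 4K TV per eye")
    # ~11.5M px/eye: fewer than 3840 across, but ~1.4x the total pixels of 4K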


> why would the bigwig CEOs buy this

Because the Apple device looks like a desirable item instead of just a functional toy. It's the wealth signalling and image that count.


Because Quest 2 doesn’t “do the same things.” You’re acting like Vision Pro is just another version of Quest. It’s not even in the same time zone. It’s like saying “why does anyone need iPhone when a Palm Pilot is perfectly fine?”


What are the things Vision Pro can do that Quest cannot though? Genuinely curious as I don't know much about the Quest - and others above are saying it already supports floating virtual desktops/windows and video conferencing.

Quest doesn't broadcast your eyeballs onto a front screen obviously, but is that the only major feature difference? If not what other things are new capabilities?


> What are the things Vision Pro can do that Quest cannot though?

Quest's resolution and optics are not good enough to make text legible unless it's blown up to billboard (ok, maybe just poster) sizes. The iGlasses may be the first headset with adequate resolution to make text comfortable to read, making it possible to use for work.


Any business that has a CEO can afford this.


And all the diehard mac fans, YouTubers and such that will be talking it up for the next 2-3 years, building up the hype train, until Apple drops a $400 version for consumers.


Which would still make it a huge loss for apple.


I guess then sales of 5,000 of these are guaranteed. Somehow that’s a bit lower than I would guess apple hopes for.


Nah, it'll mess up their hair.


> This is less a device to look at code

why though?


Like the OP, I found I was more efficient/comfortable on a single screen compared to the 3 or 4 I have had at one point. Now in my 40s, I find myself more comfortable on a 13" laptop compared to a 34" screen. It's just easier to concentrate.

IMHO ideal computer use is to move things in front of your eyes instead of moving your eyes/head. Your area of focus is quite small with almost no value to filling your peripheral vision.


39 here, but I really cannot imagine ever leaving my triple-screen [tie-fighter](https://i.imgur.com/DkqkER7.jpeg)-style setup, unless it was for an unlimited number of unlimited-resolution screens.

If I could have one screen per application and surround myself in a galaxy of windows, I definitely would.

Would I look at them all on a regular basis? Of course not. 80% of them I would only look at once every hour or so.


39 here too, and not turning my neck all the time to look at multiple monitors anymore has helped save me a lot of pain.


I'm a fan of two monitors, my main horizontal (though I got one with much more vertical resolution than most 4:3), and one in portrait somewhat to the side.

So many big wins. I can do a zoom screen share on my main window and have notes, private stuff on the side window, I can read documents that often are vertically formatted on the side window.

I do a fair bit of comparison-type work where I need a reference index doc on the side; then I go through the individual docs for tieback on the main.

It's game-changing to have multiple monitors, and particularly to have one landscape and one portrait.


I hear you, I'm 38. I've been using a 14-in screen for the last ten years. Clients will ask why I don't use more monitors, but I can really only focus on one thing at a time, and my field of vision isn't that big. If I need to look at another screen, I just three-finger swipe.


Maybe your eyes are better than mine, but I have a real hard time working on a 13" screen. Trying to do Excel work on a tiny screen drives me up the wall. Either I'm sitting too close squinting at tiny text, or have to enlarge everything and fall into scrolling hell. With my 27" monitor I can enlarge the text and still have lots of screen real estate to do my work.


Well, it works for me right now... but that will surely change. I'll just make smaller and smaller functions until I need to get a bigger screen, haha.


Random insert point, but all this 1:1 comparison to the existing extra-monitor concept of operation is emblematic of resistance to XR in general. I see it as trying to shoehorn today's use cases into a template for something that is literally a phase change of capability -- much like how the first automobiles were framed through the lens of horseless carriages.

3D in 3D is different. And when you put 2D screens into a 3D digital space viewed as embodied in 3D XR you still get affordances you didn't have before. Sure you need to reimagine and rewrite from the ground up these long established and stable 2D apps, but there are places where real gains are there to harvest.


Exactly. Seeing people talk about "unlimited number of monitors in VR" is kind of frustrating. Monitors are containers for apps, portals into your digital desktop. You don't need monitors in VR. The monitor is a skeuomorphism! Just put app windows wherever, unbounded by monitors.


The problem is wide monitors. Nobody really needs wider monitors for work; mostly you want more vertical space. At work I have a 32" monitor, at home even a 43" one. The cool thing is the vertical space. 16:9 is BS for work. A large 4:3 would be a much better choice today.


That's so different from here. I'm 35 and when we finally got a large size TV last year I never went back to the small screen. Well except when I have to.


You watch a TV from quite far away. Even a huge TV might not appear much bigger than a normal laptop in your lap, let alone a single big monitor.


My wife and I literally live out of 4 suitcases. We “nomad” 7 months out of the year and when we are “home” for five months, we still can’t accumulate anything that we can’t take with us since our condotel [1] unit that we own gets rented out when we aren’t there.

But I still have plenty of screen real estate that I can set out at my desk at home or in a hotel room between my 16 inch MacBook, my 17 inch USB powered/USB video portable external display and my iPad as a third monitor.

[1] https://en.m.wikipedia.org/wiki/Condo_hotel


The resolution might be sufficient, but all of my attempts across quite a few VR headsets have been sad when it comes to text. The crispness you really need is possible on static glasses (i.e. Nreal Air), but all of the anti-aliasing on projected textures has often made long-term work in VR hard for me.

But the displays are pretty high res. Guess we'll see.


Crisp text is also Apple's bread and butter. They've been typography nerds since the 80s; I've long assumed that their headset is this late to the game because they needed display technology to catch up to text rendering in VR.


Yeah, getting a more flexible work environment seems like the only non-gimmicky selling point here. But there are much cheaper and lighter devices for that. Like NReal Air. (Haven't tried it but reviewers seem fairly happy)


I feel like a 13" MacBook Air is the ultimate in flexible work environments. Incredibly light, powerful, goes anywhere, long-lasting battery. Perhaps I'm just a philistine and haven't yet gotten a taste of the new world...


Plenty of software and workflows chew up a lot of screen real estate. 13" isn't enough for how a lot of people like to work.


I make it a point to do all my work on a laptop like this. That way, I’m 100% productive anywhere like in a hotel for example. I never miss giant external monitors because I don’t have any.


Alternatively, you're 50% productive everywhere.


I have a 13" Macbook Air, try to travel as much as possible. At home I have a single 27" 4k screen. Both at home and remote I work with just one screen so I'm able to keep my workflow exactly the same. Honestly, I think my productivity on my 13" does drop somewhat, but nowhere near the 50%. I would say I lose 10% of my productivity. For me that 10% is totally worth it to be able to work remotely and travel more.


Might not.

I have gone both ways several times.

Being able to group apps and then bring them to focus on the single display works fantastically!

I took the time to get seriously productive in either case. The difference was not a big deal.

Chances are the OP rocks it as hard as they can. I was able to.

And being mobile these days, being able to work on an Air is a real plus!


The Nreal Air is good for resolution but bad for field of view. It's not suitable as a monitor alternative.


Thank you. I was hoping for some testimonial on this use case, since the price and features are pretty attractive for the air.

I will now wait for a future revision.


For working, it's not as good as an actual monitor but much easier to travel with. Really shines for games/movies though.


That's a bummer. I'd really like to not be dependent on additional displays or bending my neck all the time.


If you can afford a USD$3500 headset and live in HK, you are already wealthy and have a large apartment. Avg income here is around $2000/mo.


Hongkonger here. Lot of people can spend USD$3500 for a watch, gadget, or computer, AND still live in 200-300sqft apartments in HK. Doesn't make them "wealthy".


I don't know any world where spending that much in gadgets isn't for the wealthy. Yes, there are richer people out there. That is still a lot of money.


It's really not. Growing up, I had plenty of classmates who spent more than that on superficial car modifications while working a minimum wage job and whose family was on food stamps.


In one shot? That feels wrong. And is a poverty trap. :(


Lots of people on modest incomes buy gadgets on credit.


Having a 3k+ credit limit isn't that common, is it? And I don't know any consideration of the topic that doesn't treat credit cards as a problem/bad idea. Especially at that level.


Maybe it's a cultural difference, but your logic sounds odd to me.

Surely with a $2000/mo income (which you describe as average for HK) one can afford an occasional one-time purchase of $3500, after some saving (or, although I wouldn't personally do this, with a loan).

Or even more than that: my country has a similar average income, and average people spend 20K on a car without a second thought. And no, it's not that the car is needed as opposed to the headset, because the need of going from A to B can be satisfied by a 5K second-hand car, no one actually needs a new one.


It’s likely the price of one of these will drop faster than land in Hong Kong though.


That seems like such a narrow subset.


How about everyone taking a long flight or just staying at a hotel etc?

That IMO is where VR glasses are actually a pretty good fit. Carry a lightweight laptop through the airport and still get to use a 32” monitor on the go. Granted, the current hardware is not exactly ideal, but it’s close enough to be a reasonable option.


Don't underestimate the unwieldy shape of these headsets; they aren't very bag-friendly. The Apple design seems to make some compromises to decrease bulk, but it still won't slip nicely between other stuff. Portable displays, on the other hand, are wildly underappreciated, because so many people still have no idea that product category exists. They offer a very favorable bulk/utility trade-off and allow day-to-day scaling between the extremes of the smallest laptop you can find and what could be considered a mobile workstation.


These devices are currently bulky, but you can easily put them and a bunch of other stuff into an under-seat airline bag. The weight and volume are annoying but not a dealbreaker.

Also, I think we can all agree the form factor is likely to improve over time. Portable displays meanwhile have inherent limitations in use, e.g. in an airline seat.


Not only swiveling your head around, but doing it with a couple pounds strapped to it. People's necks are going to be swole.

That being said, I've always wanted a wearable monitor so I can lay in bed (or stand, or lay in my hammock, or just have some variety). The chair is bad, and I've spent way too many years (literally) in it. I need options.

I'm a terminal nerd, though, so I don't care too much about all the 4k etc.


The ops folks at a company I used to work for tried a VR workspace to put all of their graphs and terminals in a big sphere around you. With 2k screens, the text got too pixelated to read very quickly. 4k should improve that somewhat, but I'm not sure it will be enough for a great text-based workflow.


Even at 4K per eye, if you imagine a screen at a typical viewing distance, the effective pixel density of that virtual display is going to be massively lower than a good-quality high-end monitor sitting on your desk.

We've been waiting like 10 years for that to change since Oculus Dev kit days, and its still not solved today. Advances in pixel density in this space have been incredibly slow.

I think it could be a very long time before a headset can simulate a really great display well enough for me, but others' mileage may vary.

Even with "foveated rendering", the peak dot pitch (the highest pixel density it can accomplish) simply isn't going to be good enough for me - it can't be any sharper than the dot pitch of the panel in front of the eye.

A 5k iMac has 14.7 million pixels - the pixel density needed to do this as well as a "real" display in VR could be pretty massive.
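To put very rough numbers on it: what matters for a virtual monitor is pixels per degree. The viewing distance and the headset's field of view below are assumptions for illustration (Apple hasn't published the FOV), but the gap is large either way:

    import math

    def monitor_ppd(width_px, width_in, distance_in):
        """Pixels per degree of a flat monitor viewed from a given distance."""
        fov = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
        return width_px / fov

    # 27" 5K display (5120 px wide, ~23.5" wide) at an assumed ~24" distance
    print(f"5K iMac: ~{monitor_ppd(5120, 23.5, 24):.0f} ppd")   # ~98 ppd

    # Headset guess: ~3400 px horizontally per eye over an assumed ~100 deg FOV
    print(f"Headset (assumed): ~{3400 / 100:.0f} ppd")           # ~34 ppd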


I agree completely. A few months ago, I purchased a Meta Quest Pro. Relative to the Quest 2, the Pro’s resolution blew me away. And it’s still not even close to usable for real work on virtual monitors.


This, totally. I’m interested to see how this compares with the Varjo offerings wrt foveated rendering.

Reading text in VR is generally a horrible experience, and “4K per eye” does not equal even a single 4K screen.

That said I would be happy with 8 1080p screens.


It's not 4K, though. They're not giving a lot of information, but "23M pixels" for two eyes is 11.5M pixels per eye. 4K is 8.2M, so this is 40% more pixels than 4K.


11.5M per eye is still far short of what would be needed to approximate the pixel pitch of many of Apple's "retina displays" at a typical desk viewing distance, FWIW. This is a really hard problem with the tech we have today.

Whether its 8m or 11m or even 15m pixels isn't the point with regards to using it to replace desktop monitors - the point is the necessary density to compete with excellent real life physical displays is really high.

Your VR monitor only ever really uses a subset of the total pixel count - it still has to spend many of those pixels to render the room around the display(s) too.


The display system boasts an impressive resolution, with 23 million pixels spread across two panels, surpassing the pixel count of a typical 4K TV for each eye.


That's still an enormously lower pixel density than a good 4K/5K/6K monitor in meatspace/real life today - remember, a virtual monitor only ever uses a subset of the total pixels in a VR headset, which is why the pixel count has to be sky-high to compete with real life.


Yeah, with VR headsets you generally only get to count the pixels for each eye since parallax vision means that you only have that many degrees of freedom to produce a color.


Was this before the advent of VR headsets that do eye-tracking + foveated rendering? With the tech as it is these days, you're not looking at a rectangle of equally spaced little dots; almost all of "the pixels" are right in front of your pupil, showing you in detail whatever your pupil is trying to focus on.


For what it's worth, this was with an HTC Vive of some kind. However, the screen pixel densities don't change when you do foveated rendering, it's more of a performance trick - the GPU focuses most of its compute power on what you are looking at.


> the screen pixel densities don't change when you do foveated rendering

That's the limited kind of foveated rendering, yes.

Apple has a system of lenses on a gimbal inside this thing. Which is precisely what's required to do the (so-far hypothetical) "full" kind of foveated rendering — where you bend the light coming in from a regular-grid-of-pixels panel, to "pull in" 90% of the panel's pixels to where your pupil is, while "stretching out" the last 10% to fill your peripheral vision. Which gives you, perceptually, an irregular grid of pixels, where pixels close to the edge of the screen are very large, while pixels in the center of the screen are very small.

The downside to this technique is that, given the mechanical nature of "lenses on a gimbal", they would take a moment to respond to eye-tracking, so you wouldn't be able to immediately resolve full textual detail right away after quickly moving your eyes. Everything would first re-paint just with "virtual" foveated rendering from the eye-tracking update; then gradually re-paint a few hundred more frames in the time it takes the gimbal to get the center of the lens to where your pupil now is.

(Alternately, given that they mentioned that the pixels here are 1/8th the size in each dimension, they could have actually created a panel that is dense with tiny pixels in the center, and then sparse with fatter pixels around the edges. They did mention that the panel is "custom Apple silicon", after all. If they did this, they wouldn't have to move the lens, nor even the panel; they could just use a DLP mirror-array to re-orient the light of the chip to your eye, where the system-of-lenses exists to correct for the spherical aberration due to the reflected rays not coming in parallel to one-another.)

I'm not sure whether Apple have actually done this, mind you. I'm guessing they actually haven't, since if they had, they'd totally have bragged about it.


I'm guessing from this comment that you may not know much about optics or building hardware. Both of the solutions you have proposed here are incredibly bulky today, and would not fit in that form-factor.

> The custom micro‑OLED display system features 23 million pixels, delivering stunning resolution and colors. And a specially designed three‑element lens creates the feeling of a display that’s everywhere you look

They have advertised that there are 3 lenses per eye, which is about enough to magnify the screens and make them have a circular profile while correcting most distortion. That's it - no mention of gimbals or anything optically crazy.


>Apple has a system of lenses on a gimbal inside this thing.

Do you have a source for this?


I'm thinking there is confusion with the system used to set the PD (distance between eyes). Of course there are not many details, but it does look like there's a motorized system to move the optics and screens outwards to match the PD of the user.


I think the key to that would be a design of interface which is a step beyond "a sphere of virtual monitors" where zooming was not just magnifying but rather a nuanced and responsive reallocation of both visual space and contextual information relevant to the specific domain.


Therein lies another problem with workspace VR: you still need a keyboard if you are doing any meaningful typing. So you still need a desk, or some kind of ergonomic platform for a lounge chair.

It is a great alternative for gaming in that sense however. Being able to game and be standing up and moving is great.


With screens detached from the input device, it should be perfectly possible to make a good keyboard + trackpad combo for use on your lap, on just about any chair/bed/beach.


4k is awesome for a terminal nerd.

The first time I used a 50 inch 4K screen in full screen tmux/vim, I realized this is the correct way to program.


With such a big terminal screen you might even recreate what a 720p screen can show, with 256 colors!

I never really understood why we like to hack character arrays into pixels, when.. we can just manipulate the pixels themselves? I mean, I like and actually prefer the cli interface of many programs, but can’t ever imagine replacing a good IDE with vim.


vim is a good IDE, so I'm not sure what you mean.

I'm not mad about your IDE or anything. I've used some that I could like okay, with vim keystrokes. But vim lives where I live, in the terminal. I can't run your IDE in my environment. I can run vim anywhere.


I use a 32" QHD for a more limited but similar effect. At 32" 4K the text was too small, so the extra resolution just complicated everything, but 32" QHD and a tiling window manager is awesome. I don't use a second monitor anymore after years of doing so.


That's only because UI scaling sucks on Windows and Linux. On macOS, a 4K monitor works great.


With so many apps on Windows, you might get the latest font rendering stack or you might get the old one, even in Windows' own settings UI.


I am probably an edge case, as I use a tiling WM on Linux, so there is little UI to be scaled. The only metric I worry about is maximum text at my personally readable size. I could change the font sizes on a 4K monitor, but websites are the only non-text UI I interact with and they don't care about your OS settings. Zooming is hit or miss on whether it breaks the layout or not. I don't doubt macOS would be better in general, but for me a 32" QHD is plug and play; most websites work well, and there's no settings faff or zooming.


Wayland implements the exact same supersampling-based scaling that macOS has, and Wayland scaling even performs better than macOS'.


It doesn't work great: elements are comically too big on 32" 4K, or just too big on 27" 4K; you need to scale it to 1080p, but then it's too small. macOS is made for 5K 27" monitors at high-DPI (Retina) resolutions, or non-high-DPI 27" 2560x1440. The only high-DPI 4K screen that works great OOB is the 21.5" 4K Apple display.

* https://bjango.com/articles/macexternaldisplays2/

* https://www.youtube.com/watch?v=kpX561_XM20


on macOS there is SwitchResX and BetterDisplay where you can choose custom scaling options.


Too bad MacOS looks like dog shit on a lot of regular-ass monitors.


Not sure what I'd do at 32" but with a 27" 4K I run it scaled as 1080. Everything is sized how I would expect but text is just much crisper.


32" 4K feels like the sweet spot now, 32" 8K would be a good future upgrade, but we need DisplayPort and HDMI to catch up. 120Hz is very nice for desktop usage, as is HDR. Now that my main rig is a 55" 4K 120Hz HDR OLED, most other monitors look bad. 14" is still the best size MBP, as sitting closer with the high-PPI screen works well to have text take up about the same amount of my FOV. 27" feels small, esp at 16:9. 16:10 was awesome and I'm glad that it and 4:3 are coming back. 16:9 was made for watching movies. 16:10 allows 16:9 content to fit with a menu bar + browser top bar + bottom bar, or just gives extra vertical space. Those ultrawide monitors, especially the curved ones, are just gimmicky. Just give me a gigantic 16:10 or 4:3 rectangle, tyvm.


Aren't you a case in point then?

> the sweet spot is a single curved monitor right in front of me

So you can have that. Exactly the right monitor size, curvature, location - in every room of the house, on the train, at work, in the cafe etc. People with ergonomic challenges are, I would have thought, a perfect market for this.


Yup, this is the reason why I bought an Oculus Quest 2, to use Immersed[0]. The idea to have a huge multi-monitor setup that I could use on the go - carrying it in my backpack - felt really appealing[1].

With the pandemic I didn't really end up needing it that much, plus I had some lag issues which I never bothered solving (by buying a separate wifi dongle) so my usage never really took off, but the idea was solid.

The Oculus headset is a bit heavy/sweaty. Not a dealbreaker per se but with something lighter I could definitely see myself giving it another go.

[0] https://immersed.com/

[1] I work on a single 13" laptop, for portability. I like the setup but I do see the benefit of having large screens. It's just that I can't really move them from one place to another so I'd feel crippled on the road.


Yes I use Immersed regularly, commonly for a couple of hours each day with a Quest Pro. It's pretty good and quite usable. Definitely resolution is one area where improvement would be huge. It's ok currently but I need the monitors very large which creates its own issues (you get to the point where you need to turn your head to read across the screen and realise it's an ergonomic nightmare).

I enjoy it for an hour or two as a nice change, but I couldn't work there all day.


I think the problem is that the headset still seems too inconvenient to use in all of those locations.

I think this stuff will make more sense when these are the same form factor as a normal pair of glasses.


yeah, the friction is key. This is a step forward, I'm sure it'll be amazing that you can just literally put it on and look at your laptop and it pops up as a big screen in front of you. But I think the strap is a barrier. Like you say, glasses form factor is so much better than "strapping" it onto your face. It's rumored Apple has that in its sights for a future model.


No one will create the killer app because there won't be enough people to buy it. They aren't going to sell 100 million of these things; they will sell 1 million to prosumers. But you can't make a killer high-end game on a completely new system with completely new features for such a limited market - you'd need to sell it to everyone to make money. That's the real problem with AR/VR: you need critical mass in the number of users to justify people building games and apps with mass-market appeal. The goggles need to not have a cord, be 1/3 as heavy, and 1/4 the price, and then we will get mass adoption. My gut says we are 3 generations away. But it will happen.


Yes, they are going to sell 1 million. In this generation. The next generation will have a non-Pro model. You can sell tens of millions of that. It is not going to kill phones, but it will absolutely slaughter laptops. This generation is basically just devkits.


I don't think it's hit people (including me) that this is not just a headset. It's a full-blown computer.

You can take just the device and a keyboard with you to work anywhere.


Yep. This is huge for those who travel. It’s huge for those who do cad work. And the power available in such a small form factor really opens the door to previously impossible tasks


It seems awfully convenient that the laptop folds down nice and flat. It takes up very little space. Headset like this is still kinda big to carry around with you... Maybe just a preference on my part, but I quit carrying my big can headphones around with me because they were too bulky. I'd never carry around a headset like this. Plus you look like a dick wearing one.


Which is much bigger than a macbook air in a bag, and can do 2 hours at most.


You won’t need a keyboard.


You do. I’m 100% sure that simply flicking your fingers in the air (besides making you look like an absolute moron) doesn’t provide enough information to type accurately. Also, your arms will fatigue immediately.


If you can position things in AR, you can put keyboard keys down on wood grain and the device can tell where your fingers land.

If you can escape the skeuomorphic trap, many things are possible. A mechanical keyboard is certainly not the universally optimal means of character entry.

Maybe not in this rev, probably not at launch based on the video, but keyboards as we know them are due for an overhaul.
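A toy sketch of what the parent describes - resolving a tracked fingertip landing on a flat surface to a virtual key. The layout, key pitch, and coordinates are all invented for illustration; this isn't based on any real visionOS hand-tracking API:

    # Hypothetical: map a tracked fingertip contact point to a virtual key.
    KEY_SIZE = 0.019  # assumed 19mm key pitch, in meters

    LAYOUT = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

    def key_at(x, y):
        """x, y: fingertip contact point in meters, relative to the
        top-left corner of the projected keyboard."""
        row = int(y // KEY_SIZE)
        col = int((x - row * KEY_SIZE / 2) // KEY_SIZE)  # rows are staggered
        if 0 <= row < len(LAYOUT) and 0 <= col < len(LAYOUT[row]):
            return LAYOUT[row][col]
        return None

    print(key_at(0.045, 0.025))  # a tap ~4.5cm right, ~2.5cm down -> 's'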


> A mechanical keyboard is certainly not the universally optimal means of character entry.

Funnily enough, I think that this is basically the ultimate limit of touch based systems — humans rely very much on touch, and touch screens’ smooth surface removes every physical hint from the system. Just remember back to how we could compose a whole message blindly in our pockets with feature phones, yet I can’t write a sentence correctly nowadays without constantly looking at the screen.

Now you would even take that away? Don’t get me wrong, I don’t think the keyboard layout or anything is the optimum, but it is sure as hell closer to it than randomly hitting the table. The mechanical part is, funnily enough, the key part.


It does seem to me there are strong parallels with the iPhone/Blackberry keyboard conversation. Some people will hang on bitterly until the end.

I can thumb touch-type on a cap touch screen with the help of autocorrect. With continued improvement of predictive text and new input methods I think all kinds of things are possible.

Maybe with another technology iteration of haptics you would get positional feedback?


> If you can position things in AR, you can put keyboard keys down on wood grain and the device can tell where your fingers land.

You'll get carpal tunnel syndrome faster than the battery drains if you're actually doing that. One of the main points of keyboards is actually the fact that they absorb some of the shock of typing.

It's actually extremely plausible that the keyboard is the best possible text input method - at least until we find a way to read brain signals non-invasively and decode those into text directly.


My (unchecked) understanding is that carpal tunnel comes about because of the angle of the wrist and the repetitive pushing itself.

If there are no keys to press wouldn’t you have no reason to exert force, and no need to angle your wrist or brace your hand?


Eventually, maybe. In the keynote, there was a vague outline of a virtual keyboard, but (unless I missed it) we never saw that virtual keyboard in use. Instead, the demo pivoted to using a paired Magic Keyboard.


How exactly would you replace a keyboard with anything even slightly as productive?


Just lie down in bed and put a physical keyboard with a touchpad on your legs. I often work from an Airbnb or hotel that doesn't have proper chairs or a desk, or from a coworking hot desk when travelling.


The GP was claiming a keyboard won't be necessary at all.


I probably won’t, but someone probably will. Productive might look quite different.


If we agree then why are we arguing? I said it would take 3 more generations to hit 100 million, and I said it would happen. My point is that it won't attract big-time developers until then because it will not be economical for them. But I think Apple can grind it out, making it just good enough to attract just enough value to grow just enough to hit big numbers in 5-7ish years.


It will attract "big time developers" in version 1 because being first to market on a new platform is an enormous advantage, even if that platform won't have significant market penetration for years.

Angry Birds, Fruit Ninja, etc. were not particularly revolutionary apps and would never top the charts if invented today. But because they were some of the first games on iOS they became multi-billion dollar franchises.


It will "attract" a few. As in, Apple will pay people to develop apps for it, and will basically buy teams to develop apps. I have heard of these deals happening. But you won't be able to make a bunch of money off it for many years, so how much developer talent can they actually attract? The iPhone was waaaay different. There was instant utility for the phone that attracted millions, it wasn't insanely expensive like the Vision Pro, and the apps you could develop were simple and useful. Yes, there will be a bunch of AR apps from iOS you can use instantly on Vision Pro (I assume, not actually sure), but developing a full-featured app that takes advantage of the interface will be quite hard, and thus quite expensive.


No one bought an iPhone to play Fruit Ninja, though. They bought it to get access to the internet on the go. Essentially the browser was the killer app for the phone.


Those games were also like $3.


To be honest, I see far more financial constraints ahead in 5-7 years for average (even Western-only) people considering spending anywhere close to this amount on a luxury device, and with the amount of hardware needed, even with generational advancement, I don't see that changing.


A killer app sells systems, not the other way around.


That’s exactly what I said in a different way. No one will make the killer app because it isn’t economical to do so.


If there was one to make, Apple would make or subsidise it, guaranteed.

Even after reading loads of comments no one can really think of one.


Yet.


Apple isn’t the only one with an XR device. Devs can still hone their ideas now that they have UX direction. The Apple AR SDK has been out for years now too.

The first iPhone also only had 1.4 million in sales. I'm not even sure the App Store was out until the 2nd gen.


The original iPhone sold some 6 million units from what I can google.

Steve Jobs himself said 200 days after the launch of the first iPhone that they sold 4 million units.

Source: https://www.macworld.com/article/188823/liveupdate-17.html


You’re right.


The killer app imo is AR instruction. That is:

- you’re looking at some kind of physical thing in the real world you’re “working on” (whatever it may be)
- your goggles are pointing out important aspects, telling you what to do next, etc etc.

I always thought something like this for auto repair would be really cool. Of course we need the software to catch up in this regard, since it would have to recognize and overlay fairly complex visual spaces.


Sports referees could also benefit: instant replay. Once there are cheaper, lighter versions you'll see mums and dads running onto the soccer/hockey fields with these.


I just think you are thinking of the monitors in an overly literal way.

Imagine a calendar on the wall, but with your meetings and everything dynamic instead of just a static calendar. And it adjusts to show your next meeting extra large as it approaches. Now you see useful information in your periphery.

Or perhaps you have application monitoring dashboards on another wall. You don't look at them all the time, but a dedicated space wouldn't be a bad thing.

I see a lot of potential here in the future.


A digital calendar on the wall and a dedicated screen for monitoring are both possible with tech from 10 years ago.

The problem isn’t “we couldn’t do this before AR and now we can”, it’s “my computer already does calendars and monitoring well enough”.


My Windows phone could already do everything an iPhone could do at launch, and on 3G no less. But there is something to be said about putting it all together well and having it all just work seamlessly.


Until it is superseded. Ask Blackberry.


Maybe, but every single photo is a person, alone, in a room.

While this is the case for a period of life, it's certainly not the case for most of it, or an end goal.


This is first-and-foremost a tool for doing work. They show people using it in their living rooms, but I get the impression that the key use-case is to use it in a home office (where you'd already be intentionally isolating yourself to get work done) — or in some other room (e.g. a bedroom) to turn it into a home-office-alike space.


Fair enough, though when I am home, I have half an ear for what's going on in the house, whether it's stuff outside, someone at the door, the cat doing cat things, the kid running around, etc.

It's rare even at work that I would want to be so fully immersed. It kind of makes me feel vulnerable. Doesn't it for you?


That niche is killed by their own watches.


Real estate costs more than this headset. I am a VR skeptic. But if someone truly solves the problems, a virtual desktop has obvious advantages even for the rich. I could literally clear out one room and shrink the remaining desk to fit a closed laptop, keyboard, and coffee mug. And now my entire workstation is portable and exactly the way I want it wherever I go.


My immediate thought was working on a flight. This guy looks like he's got some big curved monitor on his flight. No he doesn't; he's hunched over a laptop screen.

If I could work on a flight on a big screen I'd be thrilled. I really don't like the ergonomics of hunching over a laptop screen.


When I worked at Intel in 1997 we bought one of the first 42" plasma screens on the market to put in our game lab - and I put it on my desk and attempted to play Quake and Descent and other games on it, and I couldn't handle it so close to me - it had ghosting and bad lag and poor angular visibility, and it was $14,999.00.

We turned it into a wall piece that rarely got used.

In 2016 I got a 4K ~34" monitor for one of my ops guys, and that was still too big to sit in front of - my ops guy gave it to me, I hated it and gave it to an eng, and he loved it.

Big screens are for certain people. I have a 70" screen in the living room that I never turn on, my brother uses it exclusively, and I use a 15" laptop as my personal screen.


But it's very handy if you're a wealthy nomadic software engineer. I don't want to take monitors with me and I'd like to travel more while working. I'd like to do that with my 12" MacBook Air.


Also being a wealthy software engineer, there still isn’t a better multi-monitor mobile solution than this at any price point. If you’re only working from home sure, but I like to cowork with friends in a variety of places.


I use 4 monitors arranged on arms to form a shape roughly like a curved 15360x4320 display.

I also don't see how VR will come close to replicating the productivity I have in my home office, on any foreseeable timeline.

But when I go somewhere and just use my laptop screen, it's almost laughable how inefficient and annoying it is. The screen is tiny, I am constantly switching apps / virtual desktops, and there is no way to even see my debugger, documentation, and my app running at the same time.

To me, that's what I want VR to fix. The portable workspace. For us spoiled rich engineers sitting in our spacious home offices, the constraints that make VR (theoretically) appealing just don't exist.

(I'm skeptical there are enough people who want this badly enough to pay $3500 for it to fund an entire product category, though... I expected them to come out talking about fitness and health.)


The first question that pops into my head is why you'd work on a curved monitor (of which there still isn't a high-resolution model) as a software engineer. Do you find the workspace on a single curved display sufficient?

My primary concern with the Apple headset is the relatively low resolution of 23M pixels. Our eyes can perceive so much more detail, and I'm afraid the low resolution will reintroduce pixellation, as is commonly seen on low-end and curved displays.


To me, curved monitor makes complete sense. Edges just become too far with flat displays up close.


It's not just that the edges of the screen are too far, it's that they're at an oblique viewing angle instead of perpendicular to the eye.


If it is 23M pixels per lens, that is still more resolution than a smartphone's screen. Each lens is smaller than a smartphone's screen and the resolution is per eye. I wouldn't be surprised if this actually exceeds the eye's ability to perceive pixels.

That's the difference between a monitor and the lens of a headset. If you look closely at a two-inch-radius region of a 4K monitor, you are not seeing 4K in that region. 4K of pixels applies to the whole monitor, not to the eye's field of view as it does with a headset.

If you were using the headset as a monitor, you could zoom in on text and the text can effectively have infinite resolution as it scales up into view.


> if it is 23M pixels per lens, that is still more resolution than a smartphone's screen.

But you don't use your smartphone 1-2" from your eye.


> of which there still doesn’t exist a high resolution model

QHD 32" works great, it's not quite two monitors but if you are using a tiling window manager or spend all your time in editor windows it's perfectly practical.


But the pixels are visible, and text on those displays is so much less legible than on a 200+ ppi display. I simply don’t get how some developers find those monitors to be acceptable and at the same time disregard the Apple headset. Perhaps it’s just lack of vision.


Maybe you have really great eyesight, or sit a bit too close? I can't see them. I have used retina displays as well and while it's clear there is a difference, it's not a practical difference for me. Retina feels nicer but it's the same amount of UI and text on a screen.

4K in VR is very different though: it's 4K per eye, not 4K in dots per inch. 4K in VR will feel like a massive downgrade if you enjoy high-DPI screens, but I think it should be usable. The state of the art is 12K I think, and for people who like working in VR I see the 8K Pimax as the most common recommendation for good text rendering.


Agreed about the non-selling point. I've only ever been able to get my eyes to focus on one thing at a time. So I prefer one monitor. CMD/Alt+tab works for me. If I need to have things side-by-side for some reason I use a window manager and some key combos to quickly rearrange windows. There are very few times that I wish I had another monitor.


Even beyond my neck, the limitation for me is my ability to keep track of the spatial location of that many things, and need to have them all displayed simultaneously. I've really just found the sweet spot to be two displays (with the cost sweet spot for me currently being 1440p, but I imagine 2x4k would be an improvement). Even a third monitor really doesn't improve my ability to do things, so I can't imagine "infinite" impressing either.

For me, the main appeal of VR is its potential for gaming, with a distant second place being more broadly "interacting with things in 3d" (such as 3d sculpting/modeling, or something like VR chat).


Don't forget 3D reverse engineering too.

Being able to spatially interact with disassembly inside IDA Pro is going to be a game changer for those who like to take a more topological approach to the art.


You can already spatially interact with 3D content on a regular screen. Thousands of CAD people do it all day for a living, they even have specialised peripherals for 3D navigation like the 3DConnexion stuff.


You think Hex Rays is going to support Vision Pro? They barely support two dimensions lol


:(

maybe a nice Ghidra plugin, then?


I don't have high hopes for Swing either.


I honestly don't really see that working, especially since Apple didn't innovate on the input space, and that is fundamentally 2D.


This. I used to be a multi monitor type of person but when desktop switching became good (I first experienced this in Linux) I started using a single larger monitor and never looked back.


Turning your head causes you pain? You need to go to the gym, get in shape, or figure out what the hell is causing a natural motion to induce pain and discomfort.


Sitting is a natural motion and hundreds of millions of people have spine problems from that alone.


Sitting is natural.

Sitting on a chair, at a desk, staring at a screen, for 8 hours a day, 5 days a week, and then sitting in your car, and then sitting on your couch and never actually walking anywhere, isn't.


Most developers don't have mobility issues. They have 2 / 3 large monitors (or laptop + monitor).

And so in this case they have the ability to access them anywhere, anytime.


Not wanting to turn your head 90 degrees to see your 13th monitor is not a "mobility issue".


Or you could not be ridiculous and just use 2 or 3 monitors like everyone does today.

At least you have the option to put monitors above and below as well.

And completely swap configurations for different use cases e.g. coding versus gaming.


"Everyone" does not use 2 or 3 monitors. Certainly among the software engineers I interact with regularly (at top US tech companies), having multiple monitors is the minority, not the majority.

I agree with the parent that any setup that requires me to turn my head to see all of my screen space is a downgrade, not an upgrade. Even a monitor that's too big (above 30 inches or so at normal desk viewing distance) is bad.

If you like it, go for it, but don't act like it's the only or even most common way to work, even for developers.


I've worked at two out of the FAANG companies and many others. Never seen a workspace in the last decade that didn't either have a laptop and external monitor or multiple monitors.

And there has been quite a bit of research [1] on them with 98% of users preferring dual monitors.

[1] https://www.ie-uk.com/blog/how-multiple-monitors-affects-pro...


> Never seen a workspace in the last decade that didn't either have a laptop and external monitor or multiple monitors.

I've never seen anyone using "a laptop and an external monitor" who actually uses the laptop screen. (Where by "use" I mean "looks at it." They might have it on, but it's usually just idle at the desktop.)

Personally, I plug my laptop into a monitor and then put it, closed, onto a little stand for ventilation. (One of these things: https://www.apple.com/ca/shop/product/HP9X2ZM/A/twelve-south...).


Really? I found this so shocking I just got up and checked and around here 7 out of 11 people have their laptop screen in use.

I use it as a screen for my slack/discord/email and have my two main screens above it. It's true I use my two main screens more, but if I didn't have my laptop I'd want a small third screen to replace it.


I'm with derefr: once I connect to an external monitor, or two, or three, I close my laptop and put it in a stand. I never use it as an extra monitor either.


Have you considered the ergonomics of doing this? I do know a few people who put their laptop up on a pedestal mount so it's in line with their external monitors, which is fine. (These people generally got the largest display-size laptop they could afford, so it makes sense for them.)

But if you have your laptop sitting directly on the desk — presumably because you use its keyboard to type? — then any time you look at its screen, you're straining your neck. There's a reason monitors are on stands that hold them up 8+ inches above the desk — it's so it doesn't hurt to stare at them all day.


I'm not a software engineer or anything like that and I still have three screens, including a laptop screen, at my desk. Almost everyone at the small NPO I work for has at least 2 monitors, including the likes of finance, customer service officers, etc. When I visit other offices it's not unusual to see 2 or even 3 monitor setups. This is common even at government agencies. This may be specific to New Zealand and not the same elsewhere in the world, but I'm sure Australia is in the same boat going by what I've heard from my Australian friends. YMMV. Will be watching this Apple innovation with interest.


Interesting - at the Google office I work at, the vast majority of developers use at least 2 monitors, sometimes 2 monitors + a laptop screen.


I'm a digital nomad. I miss having a spacious multi-monitor setup. I tried making it work with an Oculus Quest and Immersed VR, but the results were disappointing. If they can make it seamless and match the resolution so my eyes don't hurt after a minute of actually reading code, it's going to be an immediate shut-up-and-take-my-money moment.


Why wouldn't you use gestures to move the right monitor to be directly in front of you, maintaining some concept of what's on the adjacent ones from UI hints?

Really the whole concept of "monitors" feels skeuomorphic here. Shouldn't it just be a sphere where you're looking at a concave part with your current app, and can rotate as needed to pull other apps into view?


I can see it being nice if it's like Minority Report, where you can swipe small screens away, etc. Talk and it types. Glance to the left to see how the builds are going, etc. It could also be a nice virtual whiteboard. Usually it's hard to know how nice hardware can be without the apps. And you don't have to be in your office.


>"Constantly rotating my head back and forth from one monitor to another is, quite literally, a pain."

60+ year-old fart here. Same problem as well. After dicking around with a 3x32" 4K monitor setup a good while ago, I am now down to a single monitor. It is still 32" 4K at 100% scale and feels comfy enough.


As someone who used to have a cheap-ish 3x27" monitor setup, I can confirm neck strain on big triple-monitor setups is most definitely a thing. Imagine combining this with carrying the weight of a pair of technogoggles like these, and I think it could get tiresome really quickly.


What if the mapping between your neck angle and screen angle wasn’t 1-1?
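
This idea has a name in VR research: rotation gain, where virtual rotation is amplified relative to physical head rotation. A one-line sketch of the idea (the gain value here is arbitrary, just for illustration):

    def virtual_yaw(head_yaw_deg, gain=1.5):
        # amplify head rotation so a small neck turn pans the workspace further
        return head_yaw_deg * gain

    print(virtual_yaw(30))  # turning your head 30 degrees pans the view 45 degrees

The known trade-off is exactly the comfort issue raised in the replies below.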


Then you would likely become dizzy and puke.


Perhaps someone will invent a way to virtually move around within a virtual space. Seems far fetched, I know. But we can still dream.


Decoupling virtual from physical movement is the fastest way to get people puking and giving them headaches.


True. Scrolling windows triggers my nausea.


You don’t need to turn your neck tho, you can turn the environment. And nothing goes off screen, just out of foveal focus.


It's called Beat Saber. :-)


You, or someone in a situation like yours, might at times find it valuable to have something like a giant whiteboard in front of you, one you can walk back and forth in front of and on which you can spatially arrange a bunch of details.


putting it in 3D is also an opportunity to fix window management


How so? There have been plenty of 3D window managers, and IMO they were all just gimmicks, not really contributing to any increase in workflow.

Edit: typo


Scale and Expose both definitely improved workflow



Most VR goggles eventually cause pretty significant eyestrain due to vergence-accommodation conflict [1] and other issues, all of which get significantly worse the closer the virtual objects are to the user.

Apple's display is, I guess, best-in-class, but they have no special sauce at all on this, and no physical IPD adjustment at all, so this device as previewed is basically only useful for media consumption and maybe something like a telepresence meeting, albeit not a long one. Without controllers it's unlikely to even work well for most games.

Basically this is the best of a huge crowd of not very good VR helmets with probably industry-leading AR camera-based passthrough.

[1] https://en.wikipedia.org/wiki/Vergence-accommodation_conflic...


The Vision Pro demo clearly showed physical IPD adjustment multiple times.


IPD isn't that; they're talking about focus. All current VR headsets are focused at a fixed distance, typically around 2 meters away, though earlier headsets focused to infinity. Anything outside that 2m distance can look incorrect, and it causes visual weirdness, like things that should have depth-of-field blur instead looking uncannily sharp.

Not fixable without varifocal lenses, which adjust focus depending on what your eyes are looking at.


No I'm talking about IPD. Maybe someone else is talking about focus?

In the demo they showed a break out of the device which showed adjustable IPD width, with the displays sliding on little rods similar to many other HMDs.


My mistake, missed their reference to IPD.

I think it's automated? I guess with the eye tracking there's data for it to be able to center them.


Where? The lenses are clearly in a fixed location in all of the photos shown.


You mean like screens already cause issues to our eyes?


We don't have screens strapped two inches away from our eyes.


> It would very obviously be useful for work if you can actually get high res

Is it even that high res for detailed monitor work? 4K per eye yes, but for your entire field of view. Does that meet Apple’s definition of a “retina display”?

I currently sit a few feet away from a 5K display, that’s way more pixels per degree of FoV.

Same goes for movie and TV watching. I sit maybe 8 feet away from a 4K 55” TV and I can absolutely tell the difference between 1080p and 4K. Surely the equivalent “projected” display on this thing is gonna be 1080 or lower?

Of course, as one of those 30% of people with myopia they referenced earlier in the video, I dread to think how much extra it would cost to be able to see anything at all through this thing.
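
Rough back-of-the-envelope numbers for pixels per degree, since that's the figure that matters here. Apple hasn't published the field of view or the per-eye panel dimensions, so the ~100° FoV and roughly square panel below are assumptions, and the results are only ballpark:

    import math

    def pixels_per_degree(h_px, v_px, diag_in, dist_in):
        # approximate angular resolution at the centre of a flat screen viewed head-on
        ppi = math.hypot(h_px, v_px) / diag_in
        inches_per_degree = 2 * dist_in * math.tan(math.radians(0.5))
        return ppi * inches_per_degree

    print(round(pixels_per_degree(5120, 2880, 27, 24)))  # 27" 5K at ~2 ft: ~91 ppd
    print(round(pixels_per_degree(3840, 2160, 55, 96)))  # 55" 4K TV at ~8 ft: ~134 ppd

    # headset: ~11.5M pixels per eye (23M / 2); assume a roughly square panel
    per_eye_width = math.sqrt(23e6 / 2)                  # ~3391 pixels across
    print(round(per_eye_width / 100))                    # over an assumed ~100 deg FoV: ~34 ppd

So even on generous assumptions the headset sits well below a desktop 5K display, let alone a TV viewed from across the room, in angular resolution.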


Keep in mind that the display processor utilizes a foveated rendering pipeline, which appears to concentrate the highest resolution rendering where your eyes are focusing.

That doesn't speak to the overall resolution of the per-eye screens, however.


This doesn't increase pixel density at the point it is rendering, since that is a limit of the physical pixels. Instead it decreases the rendering resolution of peripheral vision, but even that still has the same physical pixel density.

I am pretty certain 4K per eye still isn't enough for monitor-like text rendering, but it is pretty good.
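
A toy illustration of what foveated rendering typically means: the render resolution drops with angular distance from the gaze point while the physical pixel grid stays the same. The bands and scale factors here are invented for illustration, not Apple's actual pipeline:

    def render_scale(angle_from_gaze_deg):
        # fraction of full render resolution used at a given eccentricity
        if angle_from_gaze_deg < 5:    # foveal region: full detail
            return 1.0
        if angle_from_gaze_deg < 20:   # near periphery: half resolution
            return 0.5
        return 0.25                    # far periphery: quarter resolution

    for angle in (0, 10, 30):
        print(angle, render_scale(angle))

Eye tracking just moves that high-detail band around the panel; it doesn't add pixels.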


>4K per eye yes

I think what's missed here, in the absence of any better specs, is that they're saying "better than 4K per eye!" without mentioning that 4K refers to 3840x2160, and that it's the vertical dimension that they've exceeded. So > 2160x2160 per eye. Pretty good, but not even close to good enough for a floating screen of text.


They actually said 23 million pixels over both panels. So if taking that as 11.5 million each then at an equivalent aspect ratio that would be something like 4550 * 2560 per eye. Right?


ahh yes, you're quite right
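
A quick check of that arithmetic, assuming a 16:9 panel (which may well not be what the actual displays use):

    import math
    per_eye = 23_000_000 / 2             # ~11.5M pixels per eye
    width = math.sqrt(per_eye * 16 / 9)  # ~4522
    height = width * 9 / 16              # ~2543
    print(round(width), round(height))   # -> 4522 2543, close to the 4550x2560 guess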


The tech on this thing is so cool and so useless! People will buy them, try them out, and then a month later realize they didn't actually use it at all and return them to the store.


Most people buy products because they are usable, not useful. There is a crazy amount of tech available today, and people strive to own most of it. The decision to purchase is usually based on whether we can afford it and whether we will find some use for it.

The mere fact that the goggles will enable users to communicate and consume media just as they can with devices they already own will be the key argument to purchase this expensive headset.

But this incremental improvement we get after the purchase of each ever-more-polished device finances future inventions and innovations, and after some number of iterations we get something that is truly useful. At least that's the trend I've noticed with every tech breakthrough and hype cycle this century.


I am OP on the Pro Display XDR owner's thread over in the MacRumors Forums.

Last week I asked XDR owners about their thoughts on possibly replacing their high-end XDR monitor(s) with virtual displays in the Apple Vision Pro (I called it Apple Reality).

The question and replies cover some of the considerations around this replacement and there are ongoing replies now that some of the specs are known:

https://forums.macrumors.com/threads/pro-display-xdr-owners-...


> but people already spend $3500+ on monitor setups somewhat regularly

I don't know a single person who has such an expensive monitor (or a set of monitors). And none of my employers, current or past, would ever agree to spend that much on a monitor setup.

You can buy a 4K OLED monitor for a fraction of that.


Pro Display XDR costs around 7000 euros in Germany


And how many people do you know that own it?

This is a very niche device for photo or video editing.


2007: ‘There’s no way I’d watch my kid’s entire school play through a 4” screen just so that I could get a recording of it.’

2023: ‘I certainly wouldn't be wearing ski goggles while he blows out candles.’

We — perhaps not you, but humans — have shown a remarkable preference for watching the live event through a tiny screen so that we can have a recording of it for later.


When I record something that I also want live I don't watch it through the screen, I just glance at the screen occasionally to make sure it's still pointed correctly.


Yeah the kids playing scene felt like a clip from Black Mirror. I would never want to relive memories like that when I can just go hug my kid, and if they weren’t alive it would destroy my mental health to see them in that high fidelity without being able to hug them.

What a strange demo.


It felt like strong "I am recently divorced, and also a parent of young children" vibes throughout. There was the value comparison to a large TV and surround sound system, which is only valid if both your home theater and AR device have the same audience size (to divide the cost by) of one person.


The vast majority of families are multi-city, and many are multinational. I don't see anything wrong with trying to improve on FaceTime, since millions of people already use that every day.


Then they should focus their marketing on that, not make boneheaded choices like this:

https://twitter.com/jenskstyve/status/1666027793971396608


I love this! Thanks for putting my words into image.


The catch is, they might have an impressive display of unlimited size, but it is still likely to be tied to locked down iOS (or whatever they call it on Vision devices), so the selection of productivity apps and their capabilities will probably be very limited.


Except when they find a way of mirroring the Mac's desktop in this environment in a nice way. A straight-up 'external display' mode (more like external 3D space), but with fewer constraints on how to position the windows and the taskbar ... and consequently more trouble navigating among them. Desktop icons might be a burden too; I also wonder what to do with fullscreen mode.

Anyway, if one app on its OS could mirror the Mac environment in a nice way, that could be enough for me.


I feel that this feature looks more like Continuity than real desktop/app mirroring. You'll likely have to have a corresponding app installed on the device to run it this way.


> It would very obviously be useful for work if you can actually get high res, effectively unlimited monitor space.

Very big "if".

I already notice visual artifacting in REPLs on 1080p displays at 60FPS. That's nothing compared to the aliasing issues facing stereoscopic virtual displays. I can't imagine wanting to do hours of focused work staring at objects in an aliased virtual world.

Could still be useful for travel.


I am not a fan of even the look of this thing, but I am not sure why people are only talking about rooms and work. I think people will just buy it and watch films while lying down on their beds or sideways, when on the commode, or when commuting in a bus seat (assuming power delivery is sorted - and yeah, if everything needed for it is portable enough).


They'll do it on a $400 tablet, not something that costs 2 months' rent.


Yeah that was silly, but aren't all of the new iPhone cameras 3D cameras? People take photos/videos all of the time. Now you can immerse yourself in them. I think it's pretty cool


Presumably the current and next iPhone Pros can capture 3D video.

I don’t know why this wouldn’t have been ridiculous, because it really is ridiculous to suggest this would be worn by a parent during a young child’s happy birthday singing and blowing out the candles.

This idea seemed like way too much of a stretch for this intro. They had to know this, so I am very curious what the reasoning was for why they included it.


> I don’t know why this wouldn’t have been ridiculous, because it really is ridiculous to suggest this would be worn by a parent during a young child’s happy birthday singing and blowing out the candles.

Do you not remember the 1970s-1980s, when "filming home movies" meant resting a 50lbs camcorder on your shoulder and looking through the eyepiece in a way that blocks anyone from seeing 75% of your head?


This was also my thought. My grandparents had a Panasonic VHS camcorder in the 80s. Everyone in the family took turns sharing it. I can see people using Vision Pro in a similar way to film short segments of family events in 3D.


Too bad we now live in the 2020s, where no one wants to do that or look stupid doing it. A camera strapped to your face, with you having to move your own face/body to zoom into something, is way more ridiculous than the camcorders of those eras.


People generally didn't want to look stupid back then, either.

But dads finding ways to combine "being excited about their kids" with "nerding out about new technology" have eternally been the exception to the "people don't want to look stupid" rule.


They can do that already with a GoPro if they wanted to strap a camera to their head.


That was literally the only way to capture video back then. Everyone has infinitely better cameras in their pocket, notice how few people buy and use video cameras outside of professional or hobbyist creators.


Current iPhone Pros? How would they? Their cameras are super close together and have different focal lengths (or whatever the correct term is for "they're 1x, 3x and 0.5x").

I share your immediate skepticism; wearing one of these during any moments you'd like to relive later seems preposterous. You may as well just be DVRing the "moments" with your goggles while watching a movie on the inside, because that's how present you would seem. Unless the entire family all had their goggles on ("Apple Vision Pro Family, starting at $9,999!") and you are all actually experiencing a remote moment virtually!


iPhones Pro also have a LiDAR scanner on the back.


> because it really is ridiculous to suggest this would be worn by a parent during a young child’s happy birthday singing and blowing out the candles.

I see people keep repeating this, but why is that? Most people take videos / photos on their phone, and because of that their eyes don't actually see the event happening, they are just looking at it through the screen. With this you'd actually be able to record while also not focusing on your screen but looking at them.


I don’t mean to make this personal, but have you raised a kid?

If someone is holding up an iPhone taking a video, especially up close it is a distraction.

Depending on how much they are aware of it and the person’s self consciousness, it can really take away from or alter a moment to have it so obviously recorded.

Kids can be extremely perceptive and sensitive.

Our kid is not even two and there is a subtle change when a phone is obviously out, pointed at them and capturing them.

I know it’s always better to interact without a phone in sight.

I still capture a lot of great stuff but sometimes something is so special I can’t bring myself to disrupt it by trying to record. My wife and I will look at each other and know something truly amazing is happening and both just live the moment.

Looking at the Apple Vision as it is at launch, it looks disruptive to both the subject and the wearer in the circumstances I've described above.

Perhaps in time they will become so ubiquitous a headset like this will be noticed as little as a smartphone.

But at the start, especially with the price and production volume expected, this is very likely to be an unusual thing to see out in the world.

Yet in the example Apple showed it appeared to be taken very, very close to the action.

I’d guess if someone tries to do this it will cause all the other kids to be looking at you, not your kid during their special moment.


Yes, I agree with this. I'll only be buying one of these if it means I can replace my work displays with it–I'd happily pay the exorbitant price if it meant being able to have the equivalent of an unlimited high-res display anywhere, at any time. The lack of sub-pixel rendering on macOS means that I'm already forced to buy an expensive 5K display for every place that I plan to do work; a headset like this is a bargain in comparison. Obviously this means that the headset will have to be comfortable enough to use for long periods of time and have high enough resolution to compete with a hiDPI display. I doubt that this device will be able to deliver on both of those fronts.


I can see how that would be a popular application. I imagine a software engineer that was forcefully returned back to the office would love the ability to have unlimited computer screens hidden from the prying eyes of busybodies around the office.


I feel the same. Like Glass, it has some real work value (or value in peculiar situations), where augmenting the information at hand for important tasks would be massive leverage. Electrical work, repair, medicine..


That would be a valid use case, but it's hard to imagine that this headset is miles ahead of all existing ones in resolution and clarity. In all present-day headsets, even simple tasks involving text are a challenge: they just can't legibly render more than ~10 lines of text in your field of view. To compete with monitors for day-to-day tasks, the perceived resolution has to improve by an order of magnitude.


A 4K monitor at 27 inches costs less than 400 dollars, unless it's 10-bit with a 1ms response time for gaming. Who is spending $3500 on monitor setups besides gamers?


The same people who spend $50,000 on home theater gear, or buy a house based on the size of the "man cave."

I've met sports gamblers who have a dozen or more flat screens on a wall so they can fully indulge in their addiction^w hobby.


The second I want to show any of my colleagues what I'm working on to get a question answered, I need a headset for each of them. That kind of kills the whole idea of VR goggles in general.


My 27" $500 monitor smokes anything Apple has ever produced in the monitor space.

