The display may be 3,386 PPI, but wow my laptop screen has never looked sharper after taking the AVP off my head. Screen sharing from my Mac looks distractingly bad, which is a huge bummer. (AVP apps, by contrast, look fantastic.) No matter how I scale or where I place the space, it just never looks very good.
I'm really hoping this improves because it's nice to turn my 14" MacBook Pro into a theater screen of sorts alongside native apps. For the moment, though, AVP doesn't seem very useful apart from watching movies (which is amazing).
It's not surprising. The AVP has a PPD of 34. A 14" MBP screen with a resolution of 3024x1964 already has a PPD of 66 at a close 14" viewing distance and it gets higher as you move further away [1]. Retina is considered 60+ at 20/20 vision. Of course a real display has no resampling or micro-OLED smearing to overcome either.
To achieve the same effective pixel count as a MBP, a virtual screen at the AVP's PPD would have to be resized to 90° horizontal FOV, which is just too big to be comfortable, equivalent to sitting 6" from a 14" MBP screen.
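(If you want to sanity-check these numbers, the geometry is simple enough to script. A rough sketch in Python; the panel width and viewing distance are my own approximations, which is why I land at ~64 rather than exactly 66:)

```python
import math

def avg_ppd(h_pixels, width_in, distance_in):
    """Average horizontal pixels per degree for a flat screen viewed head-on."""
    fov = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
    return h_pixels / fov, fov

# 14" MBP: 3024 px across a ~12.3"-wide panel, viewed from 14".
ppd, fov = avg_ppd(3024, 12.3, 14)
print(f"MBP: {ppd:.0f} PPD over a {fov:.0f} deg FOV")  # ~64 PPD

# Showing all 3024 px at the AVP's ~34 PPD takes 3024/34 ~ 89 deg of FOV;
# solve for the distance at which a 12.3"-wide screen spans that angle.
fov_needed = 3024 / 34
d = (12.3 / 2) / math.tan(math.radians(fov_needed / 2))
print(f"{fov_needed:.0f} deg FOV = a 14\" MBP viewed from {d:.1f} inches")  # ~6"
```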
The technology just isn't there yet. We need at least another doubling of PPD. See you in 10 years.
That's the great irony of Apple trying to make AR work as a monitor replacement: their existing users will be some of the hardest to please in terms of resolution, since they've been spoiled by Retina displays as standard on every Mac for the last decade.
yea i truly do not understand. not to mention craning your neck for more than 2 seconds a workday is going to make this a nonstarter once most people get over the cost justifications
My current setup is 3x monitors + a laptop screen to the side at the office and 3x monitor + a wall-mounted television in my home office. That doesn't even begin to scratch the surface of what people are willing to do for screen real estate.
Many people are accustomed to working with 3x or more display devices, where at least one of them is out of their field of view no matter where they are looking, so I imagine the primary adaptation to using a Vision Pro would be the weight.
I am a color accuracy and resolution snob so I only use professional monitors: BenQ at home and Eizo at the office.
If one day the vision pro can match the performance of color-critical monitors for less than $5,100 it will be a definite "oui-starter".
If one day the vision pro can BEAT the performance of color-critical monitors for less than $5,100 I will hire a personal trainer for the sole purpose of getting my neck buff enough to use one without strain.
> That doesn't even begin to scratch the surface of what people are willing to do for screen real estate.
Very, very few people. I would be surprised if it's more than a fraction of a percent of the home computing market.
In fact, most people seem quite content to downgrade. They have laptops or desktops that are collecting dust while they're browsing the web on their phones while at home. My entire family does that, I'm the only outlier.
I'm sure that cost and convenience are factors and there would be a larger market for zero-hassle "high-FOV" computing, but a bulky headset isn't it. As it is, the market for Apple Vision Pro is almost nonexistent - not without some killer use case that appeals to more than just hardcore geeks.
> My current setup is 3x monitors + a laptop screen to the side at the office and 3x monitor + a wall-mounted television in my home office.
Well, that only leaves one question: Why so few monitors?
More seriously, I'd be really interested to hear your system for arranging things across all that real estate and how you use it optimally.
I used to have 2.5 monitors (two widescreens + a laptop off to the side) --- but after several moves, the inexorable passage of time, and my growing laziness, I now just spend my workdays hunched over a single laptop.
You didn’t ask me, but I’ll chime in here. I run 6 monitors — four landscape in a 2x2 config and a portrait on either side. I use the bottom right landscape for most web browsing, email, and sharing my screen. When I’m sharing I do those things on bottom left landscape. Top monitors are for reference type info as they’re too high to stare at for long periods. Top left is downloads folder, CPU/RAM/etc monitors, sticky note, and to-do list. Top right is calendar and security cameras. Right portrait is slack (portrait is great for this, lots of history without scrolling). Left portrait is windows explorer. It’s super fast to just move the mouse and turn my head to what I need without switching windows as much.
My setup is similar to your multi-monitor arrangement, except instead of moving my head I switch between respective window sets in Stage Manager (macOS).
One downside is that my neck does not exercise much because I do not look in different directions while working. Another downside is that at certain times context can involve multiple window sets, but that is rare, and it comes with an upside: it is easier to focus on one task without windows on other monitors distracting me.
Right: Browser and as many terminal windows as I can stand before I start to go insane.
Laptop/TV: personal stuff
I design satellites. Specifically antenna tiles and onboard processors that do RF things. I also assist in the testing of new RF signal processing methods.
I spent a couple months sporadically training with a neck harness to build up for lifting, and it made a big difference when I got back to gaming long sessions in VR. I feel like people will probably take headset fatigue/injuries as seriously as any other office overuse injury. Which is to say, probably not much, but maybe awareness will get there one day.
I think we need to withhold analysis based on the 34 PPD number. In the article you can see that the rendering projection is quite distorted, with the apparent effect of using more pixels per degree for the center part. They don't account for this when calculating the 34 number. Of course a "fisheye" lens would also be "distorted" and allocate an equal number of pixels to each angle, so it's hard to tell by just eyeballing things. I wouldn't be surprised if the actual PPD number in the center, where it matters, is higher.
I think the article merely superimposed manually distorted screenshots from the AVP onto a photo of the sensor, rather than taking a photo of the sensor while turned on, so it's not representative of the distortion, but yes, I agree, you make a good point - the pancake lens optics might be nonlinear, with higher PPD in the center.
He concludes, by comparing with the Meta Quest (which has similar pancake optics to the AVP) and scaling up the PPD, that the AVP could be as high as 39 PPD in the center of the image.
I think most of the conclusions derived from the analysis here will still hold true if the central PPD does turn out to be as high as 39, rather than 34. It's still a very long way from 20/20 retina at 60, from foveal cone density at 110, and pales in comparison to the PPD achieved by even entry level 14" laptops or 27" 4K displays.
That's my kind of monitor. :) It's not wasted either. Foveal cone angular density is actually around 12,000 cones/deg^2, or 110 PPD, so anything up to that is achievable with perfectly corrected vision. That's a lot higher than the 60 PPD typically claimed for retina PPD, which is at an imperfect 20/20. 110 PPD is more than 3x the PPD of the AVP...
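(For anyone checking that conversion: areal cone density turns into linear resolution by a square root.)

```latex
\sqrt{12{,}000\ \mathrm{cones/deg^2}} \approx 109.5\ \mathrm{cones/deg} \approx 110\ \mathrm{PPD}
```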
Has anyone done studies showing how much people notice the difference between 60 and 110 PPD photos for example? There are many people who have a hard time articulating 30 vs 60 PPD as it is with computer monitors. How much of a diminishing return is it to go beyond 60 PPD?
In summary, with an 88" screen at a distance of 5 feet, some viewers rated 8K content (118 PPD) higher on average than equivalent 4K content (59 PPD), particularly those with better than 20/20 vision (~60 PPD). Most would not notice. It wasn't directly tested but I'd guess most people do notice the difference between 30 and 60 PPD, but only those with above average eyesight will see improvements beyond that.
I don't know about studies, but I'm pretty sure there is a large difference between very high contrast edges (worst case: text rendered without anti-aliasing) and photos or even videos, where there usually aren't any high-contrast high-frequency edges (black pixel next to white pixel). Since anti-aliasing is standard in font rendering and pretty common in computer games, I don't think very high PPD content would often be noticeable.
That makes sense. Given that the smallest legible fonts are around 5 pixels tall, that's 3-4x less PPD than your monitor offers at 4 feet, so you'd need ~46-62 PPD eyesight to be able to read the text. 20/20 at 60 would be more than sufficient.
Good to hear! I wonder how many people kinda need glasses and don't get them, dragging the popular estimate of what the retina is capable of much too low.
I use a 27" 5K display as well, only with contacts. I'm shortsighted at -6.75 in each eye. Why would I be unable to wear contacts when using it? If anything it's an improved experience, since my eyeglass lenses significantly shrink whatever I'm looking at, which contacts do far less.
At -6.75, the impact of the shrinkage caused by eyeglasses would very likely exceed any slight visual imperfections caused by contacts (glare, halos, instability, rotation misalignment for astigmatism, etc.) I'm at -4 and prefer contacts for computer use, but it's right on the point of almost indifference for me.
Maybe I had bad contacts, but I found that my vision was just worse with them. Couldn't really explain it or describe it. It wasn't like the fuzziness of nearsightedness.
At -2.5, you probably don't get much shrinkage or aberration with your glasses, so they work better for you. For me, at -4 plus strong astigmatism, wearing glasses makes everything look tiny, and circles look like ovals... contacts are a better experience. That said, I have tried over a dozen types of contacts over the years, and it took many, many attempts to find both a product and a set of lens parameters that worked well enough for my eyes to make me want to switch from specs. Now that I've found that, I don't even own spectacles anymore.
I hate that at -6.75 that my glasses shrink things so much that a 27" monitor almost looks more like a 24" monitor and so on. I'd wear contacts more but I can't wear them often or for long due to dry eye issues caused by something else.
What brand/models of contacts ended up working well with your astigmatism? I find that one of my eyes has issues with the contact losing alignment too frequently, and so I’m always looking for other options. Thanks!
I've been wearing lenses since the 90s. These days I wear 1-Day Acuvue Moist for Astigmatism for the most part, released in 2006, and they continue to be my overall favorite.
I keep trying newer lens technologies as they come out, but mostly it's been a bust. However I do also like Precision1 for Astigmatism water gradient lenses, which came out in 2021. They are thicker and more stable overall, with slightly higher vision quality, but prone to drying up if I don't blink.
Occasionally for a change I'll wear Acuvue Oasys 1-Day for Astigmatism, which came out in 2017.
All three work well for me but the Acuvue Moist are the ones I tend to reach for the most often. It depends a lot on the individual and the fit, I think.
i don't know why it took a week, but i finally have you to confirm my suspicions, after the nebulous "it's a bit out of focus, i don't know" crap from every review I came across.
Agreed. I find my AVP actually quite bad for using my Mac. I use an MBP instead of an MBA because 120Hz makes a big difference. The MBP screen within AVP is probably something like 30Hz (likely because it's using AirPlay, and it doesn't have enough bandwidth). And I can't change the resolution either! It's kind of like working on a TV — I don't see why it's any better than the 14" MBP screen.
Overall, the AVP is a disappointment for me. Most of the new UX patterns I find far worse than keyboard shortcuts. (For example, a window manager is _much_ easier to use than having to pinch to drag windows around, and Vimium is much easier than looking at elements and pinching at things.) I don't consume much content (e.g. TV), but of the content that I do consume, it feels lower resolution to me. The demos they have (e.g. the Alicia Keys video) feel nothing like real life to me. As a parallel to what you said — real life has never looked better (after taking off the AVP), haha.
You can change the resolution though? If you go to Displays on your mac you can change the resolution of the AVP mirroring. I find it works much better at "1080p" than the default since it makes small text much more readable.
You can combine paradigms for the best of both worlds (e.g. physical keyboard + your apps). I imagine that IDE experiences will only improve as time goes on.
> Screen sharing from my Mac looks distractingly bad, which is a huge bummer. (AVP apps, by contrast, look fantastic.) No matter how I scale or where I place the space, it just never looks very good.
I think a big part of the difference is that your feed is a rotated and skewed raster graphic. That might work fine for photographs and movies, but UIs and text will quickly suffer from washed-out edges when you apply transformations to otherwise "perfect" pixels.
The resolution of the pre-transformed image would have to be vastly higher to counteract the fuzziness.
An even better solution would be if the UI could be rendered directly to the AVP rather than being a screen capture.
There are so many layers of compression and scaling and transforming going on that it's never going to look great.
macOS renders to a 5120x2880 'display', which is downscaled to 4K, which is then streamed to the Vision Pro with whatever compression and bitrate, where it's scaled to whatever size you display it at.
The pixels of the AVP display are wrapped around your eyes, so pixel perfect rendering would result in an out of shape screen (although it would likely be much sharper). Also - from my experience - it doesn't feel good to have a large object unnaturally fixed to the center of your view in VR.
I for one don't notice any quality/resolution difference between native apps and a mirrored Mac screen.
I agree it feels "worse" than a physical screen - it is certainly less crisp. But I ran an experiment, and created a doc with font sizes from 1-15 pt, then viewed it on my physical 24" 4k monitor, and on a mirrored Mac display in the AVP set to the same (virtual) dimensions and distance away.
Interestingly, text became illegible (without leaning closer in either case) for me at around the 6pt mark for both of them. Objectively speaking, the size of font I can read is pretty equivalent, although I find myself preferring fonts about 2pts larger on the AVP than on my physical monitor.
I suspect this is because the physical display is able to benefit from precise pixel alignment and subpixel rendering, which the AVP can't. But the "average density" feels pretty equivalent... it somehow feels more an issue of optical sharpness than of "pixel size" (even if I know the two are related).
It's hard to talk about something so based in individual perception: it's true, my eyes could literally be different.
However, my strong suspicion is that the visionOS native apps have a larger font size and bigger, smoother UX elements. In this case, the mind doesn't notice a little blurriness around the edges.
Whereas with a mirrored display, there's tons of tiny thin lines where it's easier to notice that visual acuity is not 100%.
Some people immediately notice 90/120Hz screens vs 60. Some will see pixels on a 3K 14" screen while it will be "good enough" for most.
Aside from sheer eye sight, I think there's also attention and sensibility to these specific things. The same way web designers recognize fonts from 4 letters when generic users barely pay attention to whether it's serif.
For comparison purposes can you share the size of your monitor and how far away you sit from it? Trying to decide if it's worth getting one of these (which would be a bit of a pain since I'm in Canada).
Of every major issue with the AVP, the fact that they not only avoided but made impossible a wired connection to anything (even though they laughably made the battery pack external) is really the most stupid thing ever.
As a display replacement for more powerful hardware, we will never overcome the bandwidth problem (heck, it's already hard enough to do 5k at high fps with a wire) no matter how good the compression is.
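(Back-of-the-envelope on why the wire matters. The Wi-Fi figure below is my own ballpark assumption, not a measured number:)

```python
# Uncompressed bitrate of a 5K/60 stream vs. a rough wireless budget.
w, h, fps, bpp = 5120, 2880, 60, 24  # 24 bits per pixel (8-bit RGB)
raw_gbps = w * h * fps * bpp / 1e9
print(f"uncompressed 5K@60: {raw_gbps:.1f} Gbit/s")  # ~21.2 Gbit/s

wifi_gbps = 1.0  # assumed sustained real-world Wi-Fi throughput
print(f"compression needed: ~{raw_gbps / wifi_gbps:.0f}:1")
# ...and sharp text/UI content is exactly what video codecs handle worst.
```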
It makes no sense that they built and advertised this "feature" when it could have been so much better with a Thunderbolt/USB-C input on the battery pack...
It doesn't bode well for the future of the device, because it seems it will be largely locked down and limited like the iPad, but with even less enticing use cases...
Given that the screen inside the AVP is basically 4k, your screen would then have to fill your entire vision to have the same resolution. So almost like having your nose touching the screen.
Matching the 60° FOV of a 27" monitor at close distance, it's equivalent to 1080p or less. It gets even worse at FOVs equivalent to more reasonable distances.
Whatever the strengths and weaknesses of the Apple Vision Pro as a product, this is a marvel of engineering. The AVP seems to include the most advanced version of the output of several industries (semiconductors, displays, materials, assembly, etc.).
Also textiles. The solo knit band is a technical marvel. A single 3D knit with embedded mechanical parts and hidden cables to adjust tension. It’s not my wheelhouse but I’d love to read a textiles expert’s take on it.
It's a shame the solo knit band seems to be form-over-function, it looks cool and a ton of work clearly went into developing it, but nearly everyone seems to find the much simpler dual loop band to be more practical.
e: I stand corrected, the reviewers I've watched mostly preferred the dual loop but there's plenty of solo knit fans here. I guess that YMMV factor is why they include both.
>The dual loop seems designed for people with more elongated heads.
As the owner of a long head, I definitely struggled with feeling like I couldn't get the VP to sit right for my eyes with the solo loop. The dual resolved that easily.
Then you get to have the single loop as a new fidget toy.
I agree. As much as tech communities tend to dump on achievements, we really should applaud this work. I can't wait to see breakdowns of versions 2, 3, 4. As they master their own creation and manufacturing process, it will be neat to see how much simpler it becomes.
I do wonder about the actual material constraints. My thought is, will it even be possible to make something so slight that it becomes an everyday piece of tech like the iPhone, or is it destined for entertainment and casual work?
Batteries are heavy, the optical aspect is bulky, heat dissipation is challenging. I haven't seen any tech on the near horizon that can answer those challenges. I wonder what Apple has in their R&D plans.
Similar, but all the components in it are a notch better — higher resolution, better pass-through video, more precise hand tracking, and much faster CPU.
However, my impression from iFixit's teardown is that there's a ton of tiny boards, screws, and ribbon cables. I bet in v2 they'll figure out how to package it even more densely.
That's kinda ignoring that there are more advanced headsets as well on the spectrum.
I genuinely enjoy the positivity around the product and the effect it could have on the field (even as an anchor for what any new headset aims to be from here).
But that supposes the AVP is put in context, and I think we miss a lot of it.
I kinda feel we're back again in the PowerPC macs vs the rest of the industry days, with Apple set in its own impermeable bubble.
Early VR of this wave had some remarkable displays, not in the sense of "wow, they're so good," but "wow, HMD displays were pure trash before." Quest 1, while a very silly device, was a pretty good inside-out HMD. Q2 was a lot better.
All of those devices still suffered from the same issue: they can only use whatever Qualcomm can make for them. The XR2 Gen 2 is slower than the Snapdragon from the same generation, due to how "inefficient" Qualcomm SoCs are compared to what Apple makes for itself. (Note: the M2 is meant for desktops, and the XR2 Gen 2 is based on an SoC used in mobile phones, so they were created with different power and thermal budgets in mind.)
I find it interesting how Apple went about the "number crunching" part of the AVP by making a separate coprocessor just for that. Yeah, the XR2 Gen 2 has similar capabilities on the SoC, but I want to see benchmarks.
I am not sure that I agree. Engineering is a world of tradeoffs. Juicero said "we are going to overbuild every component even if it means the company goes out of business". Welp.
Most impressive to me is when every unnecessary component has been removed, and every remaining component is as simple as possible. (The example that sticks in my mind is the humble $5 drip coffee pot. No moving parts except the switch to turn it on!)
I'll jump in here and suggest a Technivorm Moccamaster. As simple as possible but no simpler. It has an "ON" switch. Excellent drip brewer. Widely available. Handmade in EU. Five year warranty. All parts available and serviceable. Design has not changed materially for decades. Certified by European Coffee Brewing Center and the Specialty Coffee Association.
Where I'm coming from: I'm picky about coffee making and am not interested in the espresso investment for home. So I made delicious Chemex and Hario pourovers for myself, family, and friends for 15 years because I never thought I'd enjoy the compromises of a drip coffee maker. I was wrong. The Moccamaster makes an excellent half or full carafe, consistently, every time. And I don't suffer Pourover Interruptus on busy mornings. The thermal carafe version was my choice. No regrets.
I'll second this. I've had a Moccamaster for 15 years and I suspect it'll last another 15. The version with the thermal carafe is simple, durable, and makes excellent coffee.
It depends on how much you care about coffee. If you care enough to freshly grind everyday, weigh the coffee, weigh the water, etc. then you probably just want a simple V60 pourover (or Chemex if you want something fancier). Aeropress is also good and my personal favorite brewing method.
If you want something to produce a vat of coffee in the morning on a timer, which is totally understandable, there are some good options. Buy whatever James likes here: https://www.youtube.com/watch?v=t8eYs2vxT-8 (I watched the video 2 years ago and forget ;)
The problem I have is keeping coffee fresh for how little I drink. I ended up switching over to a subscription service that just sends me frozen single-cup pods of coffee. It's pretty good and very easy to make; just add to hot water. It's like $70 for 32 pods, though, so not particularly economical if you drink a lot of coffee. But it is easy and the coffee is fine.
I do weigh the coffee, but only because I’m making a custom mix of pre-ground decaf and caf coffee in the morning (I can’t mix too much caffeine with my Concerta).
My favourite methods to date are the Moka pot and French press. So I reckon the vat-of-coffee approach is probably the way. I reckon my favourite from the Hoff video is the Technivorm.
I'm not a coffee connoisseur (and honestly could only spell that because of autocarrot), but for me beans are hands down more important than any other part of the brewing process until you get to absurdly expensive.
So we got a Krups grind-and-brew unit. Though the exact model we have isn't available anymore, it's been going strong for maybe 7 years without any issues. It has a hopper you can just fill with beans, and it grinds an approximation of the right amount for how much coffee you want (and you can put water in the night before and set it to have coffee brewed when you get up). My only complaint for our particular model is that the burr is really loud, but I don't recall if it's always been that loud or it's just beginning to get old.
[edit: this is an addendum because I saw u/emptybits suggestion and just wanted to acknowledge that it's a nicer brewer but I'm a pleb and here's the reasoning I have for preferring my lazy-omatic brewer (maybe I should trademark the name?) :D]
It's not as expensive (or as nice looking) as the Technivorm that u/emptybits recommends, and I would absolutely expect better extraction from the Technivorm as well (at least vs our model, which has the very old-school single drip point), but the convenience of a single appliance is honestly more important to me than the price difference. I think there's also an element of how much the ritual of coffee making matters to you. I don't care at all, so it's just an appliance in the corner; at the other extreme, people have millions of gadgets and/or siphon brewers (which I admit do look super fun, but they also apparently don't even produce good coffee?).
One thing I do wish is that there were a thermal carafe option for our unit. Even though it keeps the element going to keep the coffee warm, I think even I can taste some harshness after it's been sitting on the warmer for a while, and when the warmer does disengage it becomes cold very quickly.
Thanks, I like the idea of a grinder. To be honest I’m not even bothered by using ground coffee. Right now I mostly use a Moka pot or french press, with Illy/Lavazza preground coffee.
> Same as juicero, many people bash the product but the engineering is world class.
Absolutely not. Juicero was absolutely terrible engineering. Any mechanical engineering intern can whip up a big chonking hunk of a juicer or any other kind of product in SolidWorks.
Filleting every edge does not mean "engineering".
The vast majority of the work and difficulty in engineering is NOT getting something to work, but doing so while also working within constraints. There were no constraints in the Juicero. It was impressively manufactured. But terribly engineered. Which is why the entire thing went down despite an absurd level of fundraising.
There is a great tear-down video of a juicero on YouTube where the guy makes comments the whole way through about how every part is ridiculously over-engineered: https://youtu.be/_Cp-BGQfpHQ
Have to remember that this was a "give away the razor, sell the blades" scheme. If you're the proprietor of such a scheme, you really, really don't want the razor to break.
The ballsy thing about Juicero was that they didn't give away the razor. It was priced at something like $400. They were awfully high on their own supply.
Eh, sometimes the Nuremberg defense is valid. The engineers were just doing what they were paid to do. As I recall from the Bolt teardown [1], if it had been a useful and desirable thing to build in the first place, it would have worked well.
Even if the engineering was "world class", it was completely pointless seeing as people could get most of the juice out of the packs by just squeezing them with their bare hands.
Though I get the impression they put most of their real engineering effort into their obnoxious juice drm scam, and everything else was just overbuilt to make it feel premium.
Disappointing that more analysis isn't put into the lens effect on PPD.
SimulaVR (a competitor) has an extremely good blogpost about this [1] where they argue that the PPD in the central 30 degrees is far more important than the average PPD over the whole headset (because you very rarely turn your eyes more than 15 degrees off center, and your eyes have the highest resolution right in front of them), justifying the same sort of lenses that Apple appears to be using. They claim that this allows them to reallocate pixels for ~45% higher "foveal PPD" than would otherwise be expected.
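(To make the reallocation idea concrete, here's a toy lens-mapping model. Every number in it is hypothetical: the pixel count, the FOV, and the cubic mapping are made up, with the distortion strength tuned so the central gain comes out near Simula's ~45% figure.)

```python
import numpy as np

N_half = 1800     # hypothetical: half the horizontal pixels of one eye's panel
theta_max = 50.0  # hypothetical: half the horizontal FOV, in degrees
a = 0.31          # distortion strength, tuned for ~45% central gain

u = np.linspace(0, 1, 10001)                  # normalized pixel coord, center -> edge
theta = theta_max * ((1 - a) * u + a * u**3)  # lens maps pixels to angles nonlinearly

ppd = np.gradient(u * N_half, theta)          # local pixels per degree across the field
print(f"uniform-lens PPD: {N_half / theta_max:.1f}")  # 36.0, the flat average
print(f"central PPD:      {ppd[0]:.1f}")              # ~52, about 45% higher
print(f"edge PPD:         {ppd[-1]:.1f}")             # ~22; the pixels came from somewhere
```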
> friend of iFixit Karl Guttag has a blog post explaining why using the Vision Pro as a replacement for a good monitor is "ridiculous."
> …
> This, says Karl, makes for a virtual Mac display with low enough PPD to see individual pixels—not quite the standard desktop display.
So rather than just trying it for themselves and realizing Karl is wrong, they quote this uncritically?
For what it’s worth, there’s nothing wrong with Karl’s math. He’s just not taking into account the fact that the Vision Pro isn’t mounted in a fixed position and that your brain integrates information over time.
In practice it certainly is as clear as my 4K display when I use it for mirroring.
It’s definitely not as good as a 5K studio display, but then Apple never makes that claim.
Karl Guttag has a history of getting a lot of excruciatingly difficult details right, and missing the literal big picture right in front of him. I think he makes great technical contributions to the field (at least from my perspective as a layman), but every one of his analyses that I have read ends with something to the effect of "and this is why it is physically impossible for AR/VR to ever work", which is far beyond skepticism. It makes it hard to take the rest of his analysis seriously because it shows an irrational bias against the technology.
Yeah. It’s weird. Even the analysis of passthrough is so off. If I look out of my window with AVP on, it’s obvious that I’m looking at a comparatively low res screen, but if I look at digital content, passthrough seems perfect because it’s not in my fovea, and given that the point of the device is to interact with digital content, passthrough is definitely at the level of being good enough.
One of my friends who got a Vision Pro is disappointed in its use as a monitor, but only because he has multiple large monitors on his Mac, yet you can only set up one 4k display inside the Vision Pro, so it ends up being effectively much smaller and lower resolution for him.
Slightly different problem, but related. I'm sure it'll get there over time.
This was going to be my complaint with the headset as well. I use 2x 49" super ultrawide displays, as well as 2x 27" 4k displays. Lots of real estate.
But then I started to think harder about what I was consuming display space with, and started to peel those things out into their Vision or iPad app equivalents. I'll not get into the obvious trade-offs of using iPad apps, but it largely suits the need.
Once I had Slack, Outlook, a Terminal, and TablePlus floating separately from the Mac display, I started to see a path toward potentially not caring if I even had the Mac display.
He didn’t buy it for that specifically, he bought it because he wanted to see what it was like. This is like one sentence of many discussions we’ve had about it. Lots of pros, lots of cons.
Eh... it's extremely good and more than I expected, but it's not quite 4k quality IMO. Not because I can see the pixels (I can't), but just because of very slight inconsistency across the field of vision, and motion blur when I move my head.
saying 4k and 5k displays while talking about PPD, without mentioning display size, is sort of useless. a 22" 4k display is going to look different than an 85" 4k display.
Karl Guttag: "As per last time, since I don’t have an AVP."
For a device like this, where there is so much more than just raw hardware impacting user experience, anyone who doesn't own one should have their opinion immediately discounted.
> The Vision Pro has another small problem for spectacles wearers. Contrary to some reports, Apple says that corrective lenses are available for most conditions, including astigmatism (which we weren’t sure about in part one), and they also offer bifocals, and progressives. But if you have a prism value included in your prescription, you’re out of luck. Prism correction is used to correct for diplopia, or double vision. The easiest way to see if your vision prescription is supported is to use ZEISS’s online tool.
As someone who has prism in their glasses, that sucks. I do have custom lenses with prism for my Quest 2. Not sure why Apple can't offer it as well, especially since the lenses come from ZEISS.
I think it's because of eye tracking issues. If you need a prism, then you probably have other issues with how your eyes move around, and they haven't developed for that yet. They are thinking about it, though, as shown by the accessibility options. I think they will figure it out eventually.
I suspect I might need prism lenses, but there's already an eye tracking option that tells it to only use my left eye. Used that option during the in-store demo, and it worked pretty well. If the image in the right eye were adjusted to match my condition then that would've been even better, but my brain already mostly ignores the image from the right eye so it didn't matter that it wasn't.
One thing the article doesn't get quite right is that you can't compare the PPD of a virtual display directly to an IRL screen. The virtual pixels are not 1:1 with device pixels. Your perspective is constantly shifting slightly, and the virtual pixels being sampled from are constantly changing. This gives your eyes and brain a lot more data to sample from, and means the perceived resolution can be higher than the actual resolution.
I've noticed this as well. I kind of think of it like when you look through a chain-link fence, there's a lot of visual data that gets blocked.
But if you're in a car driving past, the chain link fence gets blurred out and you can see everything again.
Something similar happens with the constant resampling happening in a VR headset. The hardware pixels are constantly shifting over the image pixels, as your head constantly moves just from breathing and whatnot. So not only does the signal-over-time counteract any blurriness from resampling in a single frame, but I find myself wondering if it enhances the resolution slightly.
(After all, you can buy 1080p projectors that "jiggle" the image by half a pixel diagonally, several times a frame, to effectively double the resolution and claim it's halfway to 4K. And what is the constant movement of pixel resampling if not a similar jiggling?)
I'm curious whether there's an image processing term for this effect: the effective signal resolution from a constantly shifting pixel grid over time, and how to calculate it mathematically.
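(My best guess at the terminology: on the capture side this is multi-frame or sub-pixel super-resolution; projector people call the deliberate version pixel-shifting or "wobulation," and renderers do it as temporal anti-aliasing. Here's a toy NumPy version of the chain-link-fence idea from upthread: a fixed sampling grid hides half the scene, but two sub-pixel-shifted looks recover all of it.)

```python
import numpy as np

rng = np.random.default_rng(1)
scene = rng.random((64, 64))   # the "world" behind the fence

mask = np.zeros(64, dtype=bool)
mask[::2] = True               # fixed sampling grid: only even columns visible

def look(shift):
    """View the scene through the fixed mask from a slightly shifted position."""
    seen = np.full_like(scene, np.nan)
    shifted = np.roll(scene, shift, axis=1)
    seen[:, mask] = shifted[:, mask]
    return np.roll(seen, -shift, axis=1)  # re-align to the scene's frame

one = look(0)
combined = np.nanmean([look(0), look(1)], axis=0)  # two shifted looks, accumulated

print("pixels missing, single look:", np.isnan(one).sum())       # half the scene
print("pixels missing, combined:   ", np.isnan(combined).sum())  # none
print("exact recovery:", np.allclose(combined, scene))           # True
```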
This is an interesting perspective. Genuine question: would it not be the other way - seeming lower than actual resolution when slightly moving? Why?
Could this be used to simulate a higher pixel density? Maybe doing something crazy like vibrating the display in some kind of circular motion, or something silly like that?
The issue is that the original Mac pixels get 3D-projected, and thus interpolated, onto the VP panel's pixel grid. This invariably loses some sharpness and detail. Even just 2D-rotating a Mac screenshot by some non-orthogonal angle will make it blurrier.
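(You can watch this happen with a worst-case test pattern, assuming Pillow and NumPy; mean adjacent-pixel difference is a crude stand-in for sharpness.)

```python
import numpy as np
from PIL import Image

# Worst-case content: alternating 1-px black/white columns (think tiny text).
src = np.zeros((256, 256), dtype=np.uint8)
src[:, ::2] = 255

# Rotate by a small non-orthogonal angle, like a slightly tilted virtual screen.
rotated = Image.fromarray(src).rotate(3, resample=Image.Resampling.BILINEAR)
out = np.asarray(rotated, dtype=float)

def sharpness(a):
    """Mean horizontal adjacent-pixel difference over the central region."""
    inner = a[64:192, 64:192]
    return np.abs(np.diff(inner, axis=1)).mean()

print(f"before: {sharpness(src.astype(float)):.0f}")  # 255, full contrast
print(f"after:  {sharpness(out):.0f}")                # substantially lower
```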
I am not convinced. Your eyes are literally magnifying the pixels compared to being at a distance from a modern 2k display. You may have dense PPD in AVP but being that close to them doesn't do it any favours.
I mean, if you can see a pixel you can see a pixel, there is no getting around that. It's like once you notice a blemish you can't not notice it anymore. From what I hear from users is you can absolutely see the pixels.
I think the key detail here is that each eye is getting its own set of pixels for looking at the same virtual content. This can lead to more detail, because the pixels aren't 100% redundant (they're both looking at the content from a slightly different angle.)
If I'm looking through the AVP at a "square inch" of my virtual Mac display, it may be that each eye is getting say, a 100x100 grid of pixels in this angular area. But since each eyepiece is giving me a slightly different projection of that same inch of space, the pixels themselves are going to be of subtly different values, essentially contributing more information to what I'm looking at. This is a lot different from a "real" display, when both of my eyes are staring at the same physical pixels. I think the idea is that my brain will be combining this information into a perceptual image in a way that will appear slightly more detailed than the equivalent-sized pixels in meatspace.
> It's like once you notice a blemish you can't not notice it anymore
Interesting that you say this... because when you move your head towards a virtual object to inspect any pixels you saw, the image literally gets clearer (because you're getting "closer" to it) and you don't see the pixels any more. IME this goes a long way towards tricking my brain into thinking the pixels aren't real and therefore aren't there. (I've been using my AVP for real work all day today and I've been mostly very happy with it. The resolution is absolutely the least of my concerns, and it looks subjectively phenomenal to me.)
Living with the Vision Pro is much easier than I expected. I thought it would feel bulky to manage with the external battery — it hasn't. I put the battery in my pocket and pop into visionOS without any hassle. I found the max time is about 1.5 hours before I feel slight eye fatigue and physical pressure from the headset.
The potential of the display is what convinces me that Vision will be a breakthrough device.
If Apple has a roadmap to keep scaling up the resolution, while only incurring a fixed manufacturing cost of producing a small 2cm display, there will soon be no other device category that will be able to offer such image quality at so low a cost, not laptop screens or desktop monitors or TVs or movie theatres.
Long term I don't think they _want_ Vision to be a breakthrough device. It's only supposed to be a stepping stone towards a true AR/MR device. The ultimate goal, likely not possible for decades, is a contact-lens-based solution. The main goal is a normal-looking pair of glasses capable of true AR/MR (not Google Glass or Nreal-style attempts with hardware that's not really capable of a truly usable experience).
Long-term wide adoption can't really happen with a VR device; the pitfalls (weight, long-term pain, hair being messed up, red marks on your face, poor battery life, etc.) are too high a barrier.
You're confused. Vision Pro is a true AR/MR device. It uses AR passthrough. A lot of people for a long time have assumed there's some inherent reason people would never accept passthrough AR devices. A few people like myself have been consistent saying that's not the case, that passthrough AR is actually the right solution. Vision Pro is just the first version of what I expect to be a long line of passthrough AR devices from Apple.
That's been their plan in the past, but I wonder if they'll reconsider. You can't do stuff like virtual environments in AR glasses. Apple may never care about games, but there are other VR features which users might really love.
I wouldn't be surprised if the headset continues even in the future where amazing perfect AR glasses are possible.
I don't think the physics will ever be there in terms of battery/energy consumption for such future devices, meaning I don't think Apple (or any other company) will ever be able to build a processing device that doesn't have to depend on a big enough battery.
You could try to miniaturise it all (up to a point) and include it as part of the device itself, similar to Google Glass, but who would want such an energy "bucket" so close to one's eyes and face? That's just a disaster (and a big lawsuit) waiting to happen.
I wonder if another aspect of the external battery is that it presumably heats up, and so attaching that as a counterbalance on the headstrap would get uncomfortable?
The iPhone is quite a big (and at times quite heated) device, so I'm not sure the philosophy behind it will help Apple in this case. The Apple Watch, while physically small, honestly can't do all that much (like displaying a movie in HD-like conditions on a reasonably big screen), so its energy requirements are not that big to begin with.
> The main goal is a normal looking pair of glasses capable of true AR/MR
You can't consume media (e.g. watch sports or movies) or do any real work with those glasses, which is a huge part of why Apple built the Vision Pro in the first place.
And the battery technology doesn't exist for them to last anything longer than a few minutes in such a form factor unless you have it tethered.
I agree, but how will this device ever get beyond home entertainment or light office work if it's always going to be a big screen strapped to your face? Not to mention the social aspect: I dread living in a world where people walk around self-absorbed, prodding and poking at things no one else can see. It just seems so selfish and antisocial.
And that's why none of this will go anywhere of consequence for a good long while. Regular people won't wear a dorkbox on their faces, and the glasses form factor can't today do what's needed.
I thought Sony makes the display (and cameras for that matter). It's odd that there are two critical pieces of tech that need a lot of R&D for the Vision Pro to keep progressing, and Apple has nothing to do with them. The screens need to get brighter and have higher resolution. The cameras need to work faster in low light and capture more accurate colors.
If I understand your point, I don’t think it makes as much of a difference as it may seem. Display panels are typically made from large “mother glass” panels out of which smaller display panels are cut. So you can make many different sized panels from a single type of mother glass (all with the same PPI of course). And the TV/monitor industry has delivered remarkable improvements in quality and cost very quickly, so it’s not as if our current methods aren’t working.
It’s an interesting thought though, let’s see how it develops.
The process used to make Micro-OLED displays is very different from the processes used for large OLEDs or LCDs; it has more in common with silicon manufacturing than with conventional display manufacturing. AIUI that means the panel size will always be limited by the maximum reticle size of the silicon process, which is usually at most ~800mm².
That lines up with the 27.5x24mm size of the panels in the Vision Pro - that's 660mm², already close to the largest silicon die you can make.
After using it since launch, the displays are still startling every time I step back to really look at them. They look so good. There are a few things that could be improved upon — let me look at my phone, or project a digital display over my phone screen — but the UI and virtual displays you see in Vision Pro feel just like reality. It's startling, and bodes well for future Vision Pro devices (and other VR headsets, too).
... will never be manifested, because Apple won't let you have it. Use your imagination to tease you, but you won't ever get it. Lock'n'lol. iPad paperweight. Maybe next time, tiger.
How so? The light coming into your eyes is taking the exact same physical path through your lens, and inside, as if the virtual display actually existed*. It's a physical requirement that the optics achieve this, for you to be able to see the image.
* Assuming the virtual display is virtually positioned at the focal plane of the lens, which is usually around 3m.
The displays on the AVP are truly amazing, the best I've ever seen. In fact I noticed zero issues relating to the displays themselves. Where this device stumbles is the passthrough (cameras) and the Mac Virtual Display (AirPlay). If/when those get better I'll buy another one but for now I'm planning on returning mine before the 14 days are up.
This article does a fantastic job of explaining how good the screen is in clear, easy to understand language. Good job!
Is it a good display? It's incredibly good, far beyond the DPI of your phone or monitor. Does it look as good as your phone or your computer monitor when used? No, at the distance you'd look at them from, it looks much worse.
So the PPD means we only need pixels a factor of 2 denser, and we can throw all other 4K screens away. Wait for a factor of 3 higher PPD (~100?) and we may get rid of all of our single-person-use monitors. I guess I'll wait for Vision Pro 4 to get the "retina display."
Perceptible resolution is what's missing from all these VR goggles. The density of the screen doesn't matter much if the angular resolution isn't great. I was hoping Apple would be the first to rival our actual retina.
Can someone explain why they think the AVP version in 2034 will be light years ahead of v1? I see amazing engineering but not many places to shrink everything into a lightweight pair of glasses.
You should ignore anyone claiming unquantifiables like "light years ahead".
In 10 years, my hopeful list is:
* More custom/specialized silicon for XR. There aren't many players here currently.
* Retina/~60 PPD.
* Micro LED (not OLED), making varifocal and HDR possible.
* Varifocal lenses, which are just as important for comfort as high res.
* Cameras optimized for passthrough, rather than borrowed from cellphones.
* Lighter weight. They're almost tolerable, rather than acceptable, as is. This almost certainly involves offloading some of the compute.
But, a powerful untethered XR system and a lightweight set of untethered glasses will necessarily be very different, in capabilities. I imagine I'll end up with a tethered device, as a display replacement, in the next 10 years.
Thanks for this list. Doesn't seem like the pair of glasses that people are envisioning; more like the iPhone X: lots of incremental improvements but not a major difference from v1 to v10 or even v15.
> doesn’t seem like the pair of glasses that people are envisioning
Do you have an example of these people, or what they're envisioning? Are they expecting ready player one type glasses in the next 10 years? I think there will be different classes of devices: everyday wearable AR more than XR, and screen replacement/heavy compute XR, with vastly different experiences.
> lots of incremental improvements
As should be expected. This is the only way tech has ever progressed, without exception. The only perceived exceptions involved shifting existing tech into a space that it wasn't present before.
> from v1 to v10 or even v15.
I think it's a little more than that. I think most would agree that it's not usable, in its current state, for a screen replacement, or as a primary computing device. Even Apple suggested so in their announcement, saying something like "short trips". I suspect that V15 will be the primary computing device, for most people. If that ends up being true, that's significant.
> Can someone explain why they think the AVP version in 2034 will be light years ahead of v1?
Think back to when the iPhone first came out - could you have predicted how it would improve over the next 10, or 15 years?
Sure some things are obvious like faster networking, higher quality screen, better camera and just faster processor. But in all honesty I'd say some of the biggest improvements are apps that now define how we use it and what we use it for. It's obviously so much more than a phone.
I'm not sure anyone quite knows what the future of VR/AR truly is, but I'm betting it will be apps, changing how we use it and what we use it for, that make it "light years ahead" of where we are now.
Has anyone done any studies on what carpal-tunnel-like conditions will affect the eyes of people who start using this device quite a lot?
Meaning that our eyes (and the associated "mechanics" behind them) surely weren't meant to fiddle with computer windows on a "screen", the same way our hands weren't meant to type on a keyboard and click a mouse for hours on end. That's why I'm curious about the potential side effects.
I've never used this thing, but as far as I understand you have to "move" your eyes in order to drag a window, click on something, or do anything of the sort, whereas when I'm looking at a screen I generally read stuff (which doesn't involve that "much" eye movement) or watch a video (which involves almost no eye movement at all).
Wow, really? I don't see how that makes any sense. Bifocals are for things that vary in distance, but the displays are at a constant distance from your eyes. Are people really so used to looking through bifocals that they need them in a headset just to feel comfortable even though they aren't necessary to focus?
I'm disappointed there is zero detail about the eye tracking cameras. Where they're located, what kind they are, how they see around/through the lenses, where the infrared illumination is. Has anyone else been investigating this?
I don't see how bifocals could possibly work in this; the display is always a constant distance away. What I bet they do is interpolate between your distance and close-up prescription to guess what prescription you need to view the constant-distance displays.
You can ask your optometrist for "computer glasses" if you want to see this in action.
That brings up a question for me: Are there people with wildly different near and far prescriptions whose vision would be improved by looking at the world through this device which reprojects it at a fixed optical distance such that a single prescription can be used?
Certainly wearing this device would remove the need to refocus your eyes, so people with presbyopia (everyone, yes everyone, gets it eventually) would see a benefit. But there are so many other downsides (FOV, latency, resolution, motion blur, color gamut, brightness, reprojection errors, not to mention battery life and comfort) that it would be hard to imagine that anyone's vision would be improved on the whole.
Yeah. It would be nice to stare off into infinity as you're working. I don't think a computer needs to be involved, though, you need glasses that have an autofocus system. That's what progressive bifocals try to be; when you look at something at the bottom of your vision, it's expected that it's close, so you're looking through a lens that compensates for your eye's inability to focus on something that close. It's imperfect, though; computers are close but they often fill your whole field of vision. That's the idea behind "computer glasses", though I personally could never get used to them because I'm not always looking at computer screens from the exact same distance. That's why you'd want something to measure the distance and adjust the focus, like your camera does.
I think it's too complicated and what people do to work around it is just make the font gigantic or whatever. I have bifocals but honestly have enough control over focus to not really get tired looking at my computer screen through my distance prescription. But if I'm tired it does get harder, and I just lean back a little bit and look through my close-up prescription. (I have progressives so infinite adjustment between the two. It is nice when you get it perfect but only a tiny area of your vision is in perfect focus.)
Also something that annoys me is that glasses lenses are super primitive compared to camera lenses. I have a pretty heavy prescription and always notice the chromatic aberration. Camera lenses use multiple elements to minimize this problem.
I'd spend $3600 on a pair of glasses that could focus the entire field of view on one plane automatically. Prescription changes would just be firmware updates, so you wouldn't even be buying them every year. But, I don't think the tech is anywhere close to that yet. The vision pro isn't suitable, it's low resolution and has low battery life. You need 16+ hour battery life for glasses. But if someone invented my thing I'd buy it.
The tracking system is inboard of the lenses, so it doesn't need to look through them. There are two cameras per eye, below and on the inside (near the nose), and infrared illuminators (not sure if they're just LEDs or something more complex) surrounding the lenses on all sides.
Thanks for that image, I can see the locations in the teardown now as well. But it couldn't be more clear that the cameras do need to look through the lenses. And furthermore, the IR LEDs also go through the lenses. They are all on the display side of the lens. It must have been tricky to avoid reflections from the illuminators going through those very fancy lenses.
It's a contextual communication choice: the intended reader is someone who is concerned about their bifocal prescription. I would assume that whoever makes the lenses will simply choose the correct focal measurement corresponding to the Vision Pro's fixed 6ft focal distance, and manufacture a lens for that specific focus. What needs to be communicated to the intended reader is that a bifocal prescription will be properly accommodated.
I'm pretty sure you're right. I'm not the intended reader. I'm wishing for a much more technical teardown while iFixit is going for a general pop sci audience.
I’ve found a lot of recent teardowns from them are light on interesting details. They seem to be making short YouTube videos now that don’t really cover everything.
The screen analysis was interesting but they rushed through the rest, and didn’t even look at any of the cameras???
Not mentioned, but Apple also has help pages for what to order if you only wear/need reader eyeglasses. I'm equally confused why this would be necessary.
Anyone know how the Vision Pro handles focus distance? I use multifocal contact lenses, but even with the contacts, I can't focus on anything closer than 25-30 cm. I assume VR headsets like the AVP — where the screen sits much nearer the eyes — use lenses that move the focus distance, but does it work well even for people like me?
I'm waiting for a "monitor-replacement glasses" - show me a device which simulates an equivalent of my current 24"/4K/60 Hz monitor (same amount of pixels, same angular size in my FOV). Since I can touch type, I don't need AR/motion sensing, just the display.
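(For reference, that request works out to roughly 80 PPD over a ~47 degree window; the 24-inch viewing distance is my assumption:)

```python
import math

# A 24" 16:9 4K panel viewed from an assumed 24 inches.
width = 24 * 16 / math.hypot(16, 9)                # ~20.9" wide
fov = 2 * math.degrees(math.atan(width / 2 / 24))  # ~47 deg
print(f"{fov:.0f} deg FOV -> {3840 / fov:.0f} PPD needed")  # ~82, >2x the AVP's ~34
```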
The actual distance doesn't matter at all, only the apparent distance after taking into account the optics, your eyes can only see the latter. I don't know the number for the Vision Pro, but it's probably similar to most VR headsets at ~2 meters away, far better than most computer screens that people work at all day.
I've said this before, but until we have gimballed screens that can reposition a very high-density portion of the display to always be on the eye's fovea, we will not have a good productivity tool headset.
AVP v1 does the eye tracking quite well. I bet real $ that v2 will have motors internally to reposition the screens.
Issue is that you can look around with your eyes at any part of the display, so the whole display needs to be able to show very high PPI/PPD, if you look at it. They can save on compute by detecting where your eyes are looking and reducing render resolution on the edges of your eye's FOV though.
Vision Pro is not capable of delivering the level of low-strain fidelity needed to work long hours on anything related to editing text. I think it is fair to say a 4k display or near equivalent is the minimum generally needed for this now, particularly for programming. [1]
Spatial Computing is not about arranging windows and using interfaces designed for flat surface devices.
It is about addressing problems using a three-dimensional space. Instead of dashboard panels showing graphs in a window floating in the air, there are zones laid out in front of you showing data illustrated using 3D infographics.
Apple has had to provide many examples and artful ways to work with windows because it is the only way people understand multi-tasking applications today.
"This device has to be good at window management or it isn't a useful device" is the reasonable thinking of a person who will continue to spend most of their time in flat-land, relating to others who also live in flat-land computing.
I think Apple knows very well that traditional application presentation is not actually where visionOS / spatial computing aspires to be. You can almost derive this from the types of games they put in the Apple Arcade of the app store. These do more than anything else to show where productivity will go.
Even for programming, spatial computing will shine by providing intuitive and interactive visualizations of software systems and infrastructure. Not because it renders an IDE in a window really well. That's what monitors are for!
In some way, how we solve problems has to evolve to make use of what spatial computing can provide.
Focusing on how the Vision Pro falls short of presenting the old interaction model served via hardware designed for that model is missing the true intent of the product and where it is all headed.
Apple is in a tough spot with this, because without handling some of the old way, the Vision Pro doesn't solve any "real world" problems. However, spatial computing solutions to old problems do not really exist yet. Most VP apps are just rebuilt iPad stuff in windows.
I think this is why the home screen apps are locked and Apple is so careful to subjugate apps not designed for VP. Also, why the App Store doesn't just show you a list of all apps published with VP entitlements.
I suspect they already have internal disagreement between product and marketing on what the low bar should be to qualify for an app getting to be listed as a Vision Pro-ready at all.
[1] I use a Pro Display XDR at my home office and a Dell U2723QE at my coworking space, and the 4k is cringe-worthy in comparison. The VP comes nowhere near the experience of the Dell.
> Looking like a bulked-up first-gen iPhone, the case is milled out of a single chunk of aluminum, and the lid snaps into place with firm perimeter clips, leaving little to no seam for us to pry at. We needed a hammer and chisel to open it up! Adhesive also lines the lid, just to make sure you get the message: this pack is not designed to be opened.
> Currently, iFixit engineering intern Chayton Ritter has Apple’s proprietary battery cable splayed out on a breadboard, and is trying to determine what sort of electronic handshakes are required to make the Vision Pro accept battery packs.
Apple really dropped the ball on this one. It's high time the EU forced companies to design their hardware in a way that allows the user (or a recycling company!) to replace the batteries inside without destroying it in the process. And for fuck's sake, what happened to the USB-C requirement for electronic devices?
The brick/battery charges via USB-C from what I have seen. Casey Neistat ran that into a battery pack for extended use. I think that makes them plenty compliant.
I assume the custom connector on the head unit is a better fit for purpose (comfort, not getting snagged) than USB-C would be.
Or, to put it even more simply -- the battery pack is the replaceable battery. Which is fine; it's little different from the similarly sealed battery packs on electric power tools, for instance.
The battery charges via USB-C. The battery connects to the headset with a weird side-snap connector that's rather flat. As designed, there's not anywhere obvious that a USB-C port could really fit on the headset and assuming they hope to make it smaller in future years, the problem only amplifies. I think other headsets have cables coming out of them when they need to connect to other things, rather than relying on a port. It's not particularly aesthetic.
> It's high time the EU forces companies to design their hardware in a way that allows the user (or a recycling company!) to replace the batteries inside without destroying it in the process
Why? The battery pack is external. I have not ever seen a requirement that battery packs themselves need to be repairable. Not many devices come with external batteries these days, but do you really want to open them up to replace the cells? Where are you going to source reputable prismatic cells to use with such a device? They are also pretty delicate, you don't want most people handling them. If it used standard 18650 cells I could kind of see the point, but it would be much larger.
A recycling company can destructively open it up to recover the material.
A better 'mandate' would be to require it to accept third party power supplies.
> for fucks sake what happened to the USB-C requirement for electronic devices
USB-C would suck for this application. It already does for headsets like Meta's Quest, where you can use a link cable to a PC. You need to secure any USB-C cables pretty well if you don't want to risk damaging the port.
> I have not ever seen a requirement that battery packs themselves need to be repairable.
Indeed, there isn't one, but it would make recyclers' lives way easier. Having to use a hammer and pin to open it risks puncturing the pouches.
> Not many devices come with external batteries these days, but do you really want to open them up to replace the cells?
Uhhh... yes? That was the state of the art for decades before removable batteries were sacrificed for optics. You can even do watertight devices that way: Samsung's Galaxy Tab Active 3 is watertight and supports swapping batteries.
> Where are you going to source reputable prismatic cells to use with such a device?
The aftermarket supply chain for devices has gotten pretty good at serving that market, although I'd also love some standardization effort in that space.
You already can replace the batteries. Apple will gladly sell you another battery: https://www.apple.com/shop/product/MW283LL/A/apple-vision-pr... But there's no use case for a consumer to disassemble the battery or to replace parts inside the battery.