
If you're suggesting that Apple implemented Meta's research starting when it was published in 8/2021, then that timeline absolutely does not work.


Could you say more? I agree an Apple VR headset has been in the works for longer than 2 years, but is it that crazy that they were working on multiple approaches and didn't settle on a final design until after 8/2021, which included using some non-trivial ideas from that paper?


Can't really give out any non-public info unfortunately.

I'm not saying that everything was set in stone by 8/2021, but any big hardware feature like the front-facing display would take longer than that to develop start-to-finish, so I'm just refuting the possibility that Apple could have started development of a front-facing display on the headset and had it ready on the final product in <2 years.

It's not necessarily that a display itself (or any other individual component, really) takes >2 years to develop, but that a tightly integrated cutting-edge system can't have significant hardware features added less than 2 years before the final product is demoed to the public.


I'm gonna go ahead and play: 'I work near hardware in FAANG, this is totally possible to pull off in 2 years'

...but if you're asserting 'I work at Apple, impossible', I'll give it to you.

Generally people believe way too strongly that phones / other hardware / etc. are set 3 years in advance. Note it's well-reported that Vision Pro just got to DVT in the last 4-6 weeks.


Yeah, I definitely agree with you that a FAANG could add a substantial feature to a headset in 2 years, but I think a combination of circumstances makes it pretty much impossible in this case.

In my mind it's some combination of:

- New product line (this would be easier in a well-established product line like iPhone, Mac, etc., or from Quest 2 -> Quest 3). A lot of decisions have to be made much further in advance because you're starting from a clean slate, yet the final product has to match the quality of iPhones that have been iterated upon for 16 years.

- The Vision Pro is much more constrained in a few key areas that are hugely impacted by the addition of another display: size, energy draw, compute, and weight. Much more so than something like a Mac would be. If an additional display wasn't in the budget 2 years ago, then you really aren't gonna just find the room for it in all those key areas.

- Custom silicon: Any feature that requires a decent amount of compute and/or IO bandwidth has to be accounted for when designing the chips. Meta's headsets use off-the-shelf Qualcomm chips, so they could in theory bump up a tier later in the design process if they need more hardware (not as easily as I described, but still possible). Apple simply doesn't have that option.

- By virtue of this being Apple and not Meta/Google/Microsoft/Amazon. I'm not knocking those other companies, but they are positioned differently in the market and are OK with releasing more varied and less polished products just to see what sticks. Apple enters later in the game with a product that has had more time to be refined. Google Glass, Oculus, HoloLens, etc. all paved the way for the Vision Pro, and it wouldn't really work the other way around in my opinion.


Really insightful, ty (love the point re: thermal budget / numerous hardware constraints)


Happy to give insight and answer any more questions you might have! I'm no longer at Apple, but I did work on the Vision Pro and am super excited to be able to talk about it now. Of course I can't spill all the secrets, and I'm being careful to only say stuff that's public knowledge, but it's a great feeling to finally talk about it after spending the past 3 years hush-hush.

