
This "unlimited monitor space" is a complete non-selling point for me.

As a wealthy software engineer, I find my monitor space is bottlenecked not by my budget or desk space, but by my literal neck. Constantly rotating my head back and forth from one monitor to another is, quite literally, a pain.

For me the sweet spot is a single curved monitor right in front of me. If I need more "desktop space" I add another Space with Mission Control. And with keyboard shortcuts I can move between Spaces nearly as fast as I can rotate my head around.

So what would I do with a VR headset if I ever got one? Put the active app straight in front of me, just like I do with my normal monitor. I'm not going to put my terminal at some odd angle 25° above my head and crane my head back when I want to run a command in it. I won't put the Weather app 90° to my right, obscuring what is currently a nice picture window looking out on my yard.

For me, VR needs that "killer app" to justify the high pricing and inconvenience of use, and I just don't see one yet. I don't expect one any time soon either; if VR was going to get a killer app, it would have shown up by now.




You sound like someone who has a very stable and spacious office. Have you considered that "having more desk space than there is space in the room" is the killer app for many (wealthy!) people who either 1. travel a lot, or 2. live in countries like Hong Kong where space is at a premium?


The travel point is a legitimate one. This is less a device to look at code, and more a device to look at people and presentations. Practically every Fortune 500 executive will have one of these because they'll be able to immerse themselves while jetting around the world - limited neither to a laptop screen nor to a cartoon environment where people don't have legs, but working in a truly effective war room that interleaves live video conversations, presentations, dashboards/visualizations, and their physical travel companions.

Or, at least, they'll want the ability to brag to their peers that they can do these things! It's the Apple playbook, and it will create a tremendous amount of envy. If it's at a price that's profitable, it can sustainably anchor their reputation even if it never goes mainstream.


> Fortune 500 executive

> they'll be able to immerse themselves while jetting around the world

> a truly effective war room that interleaves live video conversations, presentations, dashboards/visualizations, and their physical travel companions

This is the world we make, and it's for them!


It's yet another way for executives to torture themselves. Don't envy them.


F500 executives tend to have people who will show these presentations on big screens, in rooms they can just stroll into (and out of). And they don't want to strap anything to their face, particularly something that might (horror!) upset their carefully-placed hair.


Yeah, that comment is desperately out of touch with reality. I presume the person has never actually met or dealt with these folks; for them it would be humiliating to wear it and to be seen wearing it, Apple badge or not. At those levels, carrying >100k watches and wearing plastic ski goggles on your head? Forget it, anywhere others can see them. Maybe this mindset changes in a decade or two, but not earlier.

Generally on the topic, it's a rather underwhelming release of a device that is searching for its market (while the usual Apple echo chamber here on HN sees it as the second coming of Jesus). No wonder they scrapped the release a few times in the past; it must have been properly underwhelming compared to the competition. And a pathetic 2h battery life at best? That makes it useless for any longer flight (I am sure you can plug in a power bank and continue, but it will look pretty bad and be annoying as hell).

I am sure Apple will tune the software to perfection, but I can't see it being enough; the market is tiny considering the investment, well saturated, and from what I've heard rather shrinking. But I hope they will push the market in some good direction long term with their creative approach, so we can all benefit eventually.


You could have described BlackBerry in similar terms pre 2008


The difference is that Blackberry let you do something you couldn't before.

Which is this entire thread -- what can you do with AR that you couldn't before?


> what can you do with AR that you couldn't before?

It's my belief we are about to find out in the next 3-5 years.


The only compelling answer I can think of is "everything we already do now, only untethered by physical location."

Which is less about polish and more about deployment volume and/or standards interoperability.


Immersion.


My dude, that's what they have when they actually arrive at their destination. We're talking about what they do on the plane, or in their hotel room.

Or, perhaps easier to picture, when they're on vacation on a beach in Tahiti. They could be chauffeured 20 minutes back into town to a "secure workspace" in order to have a five-minute call where someone back at their HQ [where it's the middle of the night] briefs them on a screen... or they could go into their cabana, strap this thing on, have the five-minute meeting right then and there, and then go back to sipping Mai-Tais.

Executives already make this choice, this way, right now. This choice is the reason that the iPad Pro has traditionally had better "stuff" for teleconferencing than the MBP does: the iPad Pro is — or was — the thing Apple most clearly marketed to executives. Right now, executives take out the iPad Pro to take that quick cabana video-call.

For this use-case, the Apple Vision is just a one-up to everything the iPad Pro is already allowing them to do. It's more secure (nobody can watch the presentation over their shoulder); it gives the presenter back at HQ more visual field to work with to make their point; it's more discreet in how it presents them in video calls (i.e. if they're calling in while lying naked on a massage table, that won't be reflected in their 3D-model recreation); etc.

---

More realistically, though, ignore the F500 CEOs. I have a feeling that I know exactly who this was built for — and it's not them. Apple engineers aren't any more in love with the idea of serving the needs of executives than anyone else is. They throw them a bone now and then, but they have other things in mind when building the core of each product.

Now picture this: you're an Apple hardware engineer who wants to work remotely, but you were forced to work-from-office due to not just the secrecy around the Apple Vision project you're on, but also the collaboration benefits. (It's currently basically impossible to review 3D models for "feel" on a laptop; you need either a big bulky 3D TV, or some other company's big bulky HMD setup. Neither of which travels well.) But your dream? Your dream is that you can figure out a way to do everything you're currently "doing better" by being in the office — reviewing and collaborating on 3D models of the new hardware, for one important thing — while on vacation in Thailand, sitting in your rented condo, on the couch. No need to also be paying for time at a coworking space (or to even be in a town large enough to have those); the HMD is the coworking space. As long as you have wi-fi, you can do everything the engineers back at Apple HQ can do.


This sounds a lot like the use cases stated for the office metaverse thing FB was pushing that failed to materialize.

The last thing executives want is a "more immersive" PowerPoint or Zoom call. It's either Zoom or in-person with all the trimmings, e.g. nice dinner, round of golf.


>This sounds a lot like the use cases stated for the office metaverse thing FB was pushing that failed to materialize.

Apple might be a company that is better at implementing hardware and platforms than other companies, especially Facebook.


The problem is a lack of real use cases and input methods. I see neither of those solved by Apple.


The "look at the search bar and speak" was pretty cool even if it's simple. Eyetracking is not available on most other VR headsets yet


> Practically every Fortune 500 executive will have one of these

Even if that's true, that's only like ~50k people lol.


I mean, if every single one of them buys just one, that's $175 million right there. Totally not worth it for Apple to even bother trying


Apple's first-year sales of their watch were a failure, with 10 million units sold instead of the projected 40 million. Apple now has 34% of the global market share. Now remember Steve Ballmer laughing at it.

It is not the 1st generation of most of their products that succeeds, but the follow-ons.

I'll wait to see what the first months of hands-on reviews bring, and perhaps get a personal demo. How heavy is that headset, and how long is the battery life (I thought I saw 2 hours)?

Time will tell.


Good example. When the 1st gen watch came out, I knew I wanted one, but I also kind of knew I wouldn't want the first generation. Lucky me: I had quite some GAS at that time, but the 1st and 2nd gen watches were never really easily available where I am located. Then I conveniently forgot about the desire to own one. For years. I now have my first watch, 7th gen, and love it. Well, it is more like with a cute pet: you love it, and you learn to love its quirks. So even after 7 generations, the software is still not flawless, nor are the sensors. This is the first thing I would be worried about if I had any inclination to use a headset: how distracting are the bugs they definitely will have? Since I totally stopped installing anything below iOS #.2, I wonder how "fun" it is going to be to use this product once it comes out :-) I have no trust left in their QA; shipment date is more important than user experience... :-(


Apple only truly started competing against Garmin recently. Improved running metrics, low power mode, better battery (Ultra) etc only showed up recently while Garmin and others had them for years. Even GPS wasn't on the first iteration.


I am not looking for a fitness tracker, so Garmin is not even close to competition for an Apple Watch to me. Why? I use VoiceOver. Garmin does not have any speech output at all, so they cannot even be compared for me. I do a lot of FaceTime Audio from my watch, another use case where Garmin doesn't even come to mind. Don't forget that products these days have a pretty diverse feature set. Assuming everyone is looking for a fitness tracker just because this is the new hype is rather, erm, unimaginative.


I’m still unsure that they’re any sort of competition for Garmin and co yet.


They are not (yet), but the target group doesn't care about raw stats or price/performance ratios. But I love them, because they will push Garmin into making even better watches, so everybody wins.


Yeah it's a win/win for users I think. I just upgraded to the Fenix 7 Pro range and it's very nice.


Not sure how you can say they are not competing. Anecdata, but I considered a Garmin vs an Apple Watch. The biggest driver was cellular, to call either my wife or 911 when kitesurfing alone (yeah, I know I just shouldn't do it), so I chose the Apple Watch 3 when it came out. Now I have an Ultra, and that's really starting to catch up with some of the other features I wanted. I've seen several people in the kiting community pick Apple vs Garmin and vice versa for a myriad of reasons.


The Apple Watch has truly succeeded in the smartwatch space, but is the smartwatch space even worth a damn yet? Or is it perpetually waiting for the opportunity to monetize users' health data and other tracked biometrics before it's really profitable?


Maybe this "space" thinking is wrong. Don't worry about the "smart watch space". Worry about making a product that will make a bucket load of cash. Does it matter if the sector is worth much overall when you rake in a butt load of money for yourself?


That’s what I’m getting at. Is the smartwatch market in general really worth all that much money?


https://www.statista.com/outlook/dmo/digital-health/digital-...

Revenue in the Smartwatches segment is projected to reach US$44.91bn in 2023.

Revenue is expected to show an annual growth rate (CAGR 2023-2027) of 8.26%, resulting in a projected market volume of US$61.69bn by 2027.


By that measure, the iPhone is a total failure, together with the smartphone market it created. It pales into insignificance compared to the market for food! And don't even think of looking at the market for shelter; then it's hardly even a joke, why bother. Or maybe that's not really a meaningful angle for looking at markets?

What exactly is the "all that money" you talk about anyway? If Apple's watch division were a separate entity on the stock market and had an inexplicably high valuation, I might enthusiastically agree with you, but it's not.


Anecdotally, the Apple Watch is very popular in the bay area. I'd be very suspicious of any claim that Apple didn't make boatloads of money selling it


Neither butts nor boats are all that large though. Even if Apple has made both boat loads and butt loads of money, we would need to be talking about gigabutts or kiloboats to get anywhere meaningful.


Maybe it's a butt full of prepaid debit cards?


It is while people still have too much money to spend.


I'll agree that smartwatches seem niche and not particularly useful. (I've never had a smart watch other than Fitbits, but I really don't see much value beyond tracking steps and heart rate. The notifications on my wrist aren't useful; maybe controlling music would be, but I'd rather just do that on my headphones.)

That said, it's probably a lot easier to switch to Android if you have an iPhone vs. if you have an iPhone + Airpods + Smartwatch + iPad + Apple laptop. The smartwatch as one additional small tether could make it worthwhile for Apple all by itself.


But the watch has a real use case and is in a price category that people can actually afford.

But you are right, time will tell.


All of these depend on the individual. I haven't had a wrist watch since I finished college (I used one for timekeeping in exams), mostly because I never needed it. Mobile phones were out by then, and you had a watch and much more in them; that use case just died out for me. I'm also into swimming and other exercise (kettlebells), but the fitness features don't seem attractive to me either.

I didn't find step trackers and other such wearables attractive either. It felt like most people wearing them were more interested in measuring and reporting things than in doing the actual workout.

But I just looked it up, and the Wikipedia page for the Apple Watch says they have sold more than 100 million units so far and now hold a fairly large portion of the worldwide watch market.

Different people have different use cases, likes and dislikes. And there's also the additional public mood factor which is very hard to measure and understand. Based on that this product could be a huge success.


Agree 100%. Most folks I know have Apple Watches to appear sporty, because it's such a cool crowd to be in currently. The guys actually doing proper training almost never have them, including me. There is also a category of pros/semi-pros/hardcore amateurs where it actually makes sense to use some form of it, measuring small deviations, progress, etc. (but I never saw pros training, e.g. in Chamonix, wearing the Apple brand for that, and those folks all have chest straps).

For me, it actually distracts me from workouts and activities. I used my wife's Fenix 6 Pro twice for running, to get an idea of how long my usual trail run in the forest is and how much elevation I gain/lose. What I had estimated by feel was 95% correct anyway (although I don't think watches measure small variations of natural terrain very precisely). But it was distracting: watching your heart rate, you subconsciously want to push/keep yourself in some performance bracket (e.g. just below or above the anaerobic threshold for me). And there's a vibration after each km (which probably can be turned off).

After that measurement, running again without the watch was so liberating, and I had this nice feeling of extra freedom in nature, just me and the trail. I can feel very well when I cross the anaerobic threshold, perform above it, or am close to it; I don't need a gizmo to tell me so.


Of course all of this depends on the individual. But Apple is a for-profit company that spent a tremendous amount of money on the R&D of this device, and I don't see a good return on investment here, as not many people need it, let alone can afford it.


Have you considered this is a beta product for the cheaper mass market versions coming in 1-2 years?


I don’t think F500 execs spend as much time looking at monitors and slides as you may think. Also people travel to see them, so face to face is unlikely to be a benefit to them.

Also, it’s a huge expensive gadget in a time of austerity. If your 100+ execs get one of these, it won’t look good to shareholders IMO.


US$3.5k per executive is less than what is spent on their secretaries per month; it's absolutely doable, even more so as it becomes tax-deductible opex.

US$350,000 is nothing if your company has 100+ executives, let's be realistic.


How well does that work on planes? People who tested the Quest on planes found that the motion of the plane interfered and made it unusable.


So... a whopping low thousands of devices will be sold as per this business plan?


They will sell much, much more than this. All the wannabe startups and bigwig CEOs will line up to buy this, even if they can't afford it. All that matters is the image.


But I'm genuinely curious, why would the bigwig CEOs buy this if they didn't buy the Quest 2 or other previous headsets that could do the same things? You could do the cinema and virtual desktop and zoom calls with the Quest. Why is the market much larger for the Apple headset compared to the others? Except for the initial hype of "I need this new apple device" I mean.

The other headset manufacturers have been searching for the killer apps for years, both in gaming and pro usage, both with AR and VR. I didn't see anything in the Apple presentation that was new. It seemed contrived, like this woman who accidentally had the big headset on her head while she was packing a bag and therefore could take a call that hovers in the air. I just don't buy that (and neither do the various YT influencers I've seen reviewing the Vision Pro).


Existing VR headsets are too low-res to work on text-based content. This new product has a 4k screen for each eye.


Existing VR headsets have 4k in each eye? It's considered the minimum iirc


What? I have an Oculus Quest. It definitely does NOT have 4k per eye. I've actually tried to use it for multi-monitor VR, and the resolution was too low and the latency too high to be workable.


More than 4k, actually; 23m pixels arranged in a square is 4795x4795.


Less than 4k, actually. You need to divide that 23m by 2.


> why would the bigwig CEOs buy this

Because the Apple device looks like a desirable item instead of just a functional toy. It's the wealth signalling and image that count.


Because Quest 2 doesn’t “do the same things.” You’re acting like Vision Pro is just another version of Quest. It’s not even in the same time zone. It’s like saying “why does anyone need iPhone when a Palm Pilot is perfectly fine?”


What are the things Vision Pro can do that Quest cannot though? Genuinely curious as I don't know much about the Quest - and others above are saying it already supports floating virtual desktops/windows and video conferencing.

Quest doesn't broadcast your eyeballs onto a front screen obviously, but is that the only major feature difference? If not what other things are new capabilities?


> What are the things Vision Pro can do that Quest cannot though?

Quest's resolution and optics are not good enough to make text legible unless it's blown up to billboard (OK, maybe just poster) sizes. The iGlasses may be the first headset with adequate resolution to make text comfortable to read, making it possible to use for work.


Any business that has a CEO can afford this.


And all the diehard mac fans, YouTubers and such that will be talking it up for the next 2-3 years, building up the hype train, until Apple drops a $400 version for consumers.


Which would still make it a huge loss for Apple.


I guess then sales of 5,000 of these are guaranteed. Somehow that's a bit lower than I would guess Apple hopes for.


Nah, it'll mess up their hair.


> This is less a device to look at code

why though?


Like the OP, I found I was more efficient/comfortable on a single screen compared to the 3 or 4 I had at one point. Now in my 40s, I find myself more comfortable on a 13" laptop compared to a 34" screen. It's just easier to concentrate.

IMHO ideal computer use is to move things in front of your eyes instead of moving your eyes/head. Your area of focus is quite small with almost no value to filling your peripheral vision.


39 here, but I really cannot imagine ever leaving my triple-screen [tie-fighter](https://i.imgur.com/DkqkER7.jpeg)-style setup, unless it was for an unlimited number of unlimited-resolution screens.

If I could have one screen per application and surround myself in a galaxy of windows, I definitely would.

Would I look at them all on a regular basis? Of course not. 80% of them I would only look at once every hour or so.


39 here too, and not turning my neck all the time to look at multiple monitors anymore has helped save me a lot of pain.


I'm a fan of two monitors, my main horizontal (though I got one with much more vertical resolution than most 4:3), and one in portrait somewhat to the side.

So many big wins. I can do a zoom screen share on my main window and have notes, private stuff on the side window, I can read documents that often are vertically formatted on the side window.

I do a fair bit of comparison-type work where I need a reference index doc on the side; then I go through the individual docs for tieback on the main.

It's game-changing to have multiple monitors, and particularly to have one landscape and one portrait.


I hear you, I'm 38. I've been using a 14-in screen for the last ten years. Clients will ask why I don't use more monitors, but I can really only focus on one thing at a time, and my field of vision isn't that big. If I need to look at another screen, I just three-finger swipe.


Maybe your eyes are better than mine, but I have a real hard time working on a 13" screen. Trying to do Excel work on a tiny screen drives me up the wall. Either I'm sitting too close squinting at tiny text, or have to enlarge everything and fall into scrolling hell. With my 27" monitor I can enlarge the text and still have lots of screen real estate to do my work.


Well, it works for me right now... but that will surely change. I'll just make smaller and smaller functions until I need to get a bigger screen, haha.


Random insertion point, but all this 1:1 comparison to the existing extra-monitor concept of operation is emblematic of resistance to XR in general. I see it as trying to shoehorn today's use cases into a template for something that is literally a phase change of capability -- much like how the first automobiles were framed through the lens of horseless carriages.

3D in 3D is different. And when you put 2D screens into a 3D digital space viewed as embodied in 3D XR you still get affordances you didn't have before. Sure you need to reimagine and rewrite from the ground up these long established and stable 2D apps, but there are places where real gains are there to harvest.


Exactly. Seeing people talk about "unlimited number of monitors in VR" is kind of frustrating. Monitors are containers for apps, portals into your digital desktop. You don't need monitors in VR. The monitor is a skeuomorphism! Just put app windows wherever, unbounded by monitors.


The problem is wide monitors. Nobody really needs wider monitors for work; mostly you want more vertical space. At work I have a 32" monitor, at home even a 43" monitor. The cool thing is the vertical space. 16:9 is bs for work. A large 4:3 would be a much better choice today.


That's so different from my experience. I'm 35, and when we finally got a large TV last year I never went back to the small screen. Well, except when I have to.


You watch a TV from quite far away. At that distance, even a huge TV might not appear much bigger than a normal laptop in your lap, let alone a single big monitor.


My wife and I literally live out of 4 suitcases. We “nomad” 7 months out of the year and when we are “home” for five months, we still can’t accumulate anything that we can’t take with us since our condotel [1] unit that we own gets rented out when we aren’t there.

But I still have plenty of screen real estate that I can set out at my desk at home or in a hotel room between my 16 inch MacBook, my 17 inch USB powered/USB video portable external display and my iPad as a third monitor.

[1] https://en.m.wikipedia.org/wiki/Condo_hotel


The resolution might be sufficient, but all of my attempts across quite a few VR headsets have been sad when it comes to text. The crispness you really need is possible on static glasses (e.g. Nreal Air), but all of the anti-aliasing on projected textures has often made long-term work in VR hard for me.

But the displays are pretty high res. Guess we'll see.


Crisp text is also Apple's bread-and-butter. They've been typography nerds since the 80s; I've long assumed that their headset is this late to the game because they needed display technology to catch up to text rendering in VR.


Yeah, getting a more flexible work environment seems like the only non-gimmicky selling point here. But there are much cheaper and lighter devices for that. Like NReal Air. (Haven't tried it but reviewers seem fairly happy)


I feel like a 13" MacBook Air is the ultimate in flexible work environments. Incredibly light, powerful, goes anywhere, long-lasting battery. Perhaps I'm just a philistine and haven't yet gotten a taste of the new world...


Plenty of software and workflows chew up a lot of screen real estate. 13" isn't enough for how a lot of people like to work.


I make it a point to do all my work on a laptop like this. That way, I’m 100% productive anywhere like in a hotel for example. I never miss giant external monitors because I don’t have any.


Alternatively, you're 50% productive everywhere.


I have a 13" Macbook Air, try to travel as much as possible. At home I have a single 27" 4k screen. Both at home and remote I work with just one screen so I'm able to keep my workflow exactly the same. Honestly, I think my productivity on my 13" does drop somewhat, but nowhere near the 50%. I would say I lose 10% of my productivity. For me that 10% is totally worth it to be able to work remotely and travel more.


Might not.

I have gone both ways several times.

Being able to group apps and then bring them into focus on the single display works fantastically!

I took the time to get seriously productive in either case. The difference was not a big deal.

Chances are the OP rocks it as hard as they can. I was able to.

And being mobile these days, being able to work on an Air is a real plus!


Nreal Air is good for resolution but bad for viewing angle. It's not suited to monitor-replacement use.


Thank you. I was hoping for some testimonial on this use case, since the price and features are pretty attractive for the air.

I will now wait for a future revision.


For working, it's not as good as an actual monitor but much easier to travel with. Really shines for games/movies though.


That's a bummer. I'd really like to not be dependent on additional displays or bending my neck all the time.


If you can afford a USD$3500 headset and live in HK, you are already wealthy and have a large apartment. Avg income here is around $2000/mo.


Hongkonger here. Lots of people can spend USD$3500 on a watch, gadget, or computer AND still live in 200-300 sqft apartments in HK. That doesn't make them "wealthy".


I don't know any world where spending that much in gadgets isn't for the wealthy. Yes, there are richer people out there. That is still a lot of money.


It's really not. Growing up, I had plenty of classmates who spent more than that on superficial car modifications while working a minimum wage job and whose family was on food stamps.


In one shot? That feels wrong. And is a poverty trap. :(


Lots of people on modest incomes buy gadgets on credit.


Having a 3k+ credit limit isn't that common, is it? And I don't know any consideration of the topic that doesn't treat credit cards as a problem/bad idea. Especially at that level.


Maybe it's a cultural difference, but your logic sounds odd to me.

Surely with a $2000/mo income (which you describe as average for HK) one can afford an occasional one-time purchase of $3500, after some saving (or, although I wouldn't personally do this, with a loan).

Or even more than that: my country has a similar average income, and average people spend 20K on a car without a second thought. And no, it's not that the car is needed as opposed to the headset, because the need of going from A to B can be satisfied by a 5K second-hand car, no one actually needs a new one.


It’s likely the price of one of these will drop faster than land in Hong Kong though.


That seems like such a narrow subset.


How about everyone taking a long flight or just staying at a hotel etc?

That IMO is where VR glasses are actually a pretty good fit. Carry a lightweight laptop through the airport and still get to use a 32" monitor on the go. Granted, the current hardware is not exactly ideal, but it's close enough to be a reasonable option.


Don't underestimate the unwieldy shape of these headsets; they aren't very bag-friendly. The Apple design seems to make some compromises to decrease bulk, but it still won't nicely slip in between other stuff. Portable displays, on the other hand, are wildly underappreciated because so many people still haven't the slightest idea that the product category exists. They offer a very favorable bulk/utility trade-off and allow day-on-day scaling between the extremes of the smallest laptop you can find and what could be considered a mobile workstation.


These devices are currently bulky, but you can easily put them and a bunch of other stuff into an under-seat airline bag. The weight and volume are annoying but not a dealbreaker.

Also, I think we can all agree the form factor is likely to improve over time. Portable displays, meanwhile, have inherent limitations in use, e.g. in an airline seat.


Not only swiveling your head around, but doing it with a couple pounds strapped to it. People's necks are going to be swole.

That being said, I've always wanted a wearable monitor so I can lay in bed (or stand, or lay in my hammock, or just have some variety). The chair is bad, and I've spent way too many years (literally) in it. I need options.

I'm a terminal nerd, though, so I don't care too much about all the 4k etc.


The ops folks at a company I used to work for tried a VR workspace to put all of their graphs and terminals in a big sphere around you. With 2k screens, the text got too pixelated to read very quickly. 4k should improve that somewhat, but I'm not sure it will be enough for a great text-based workflow.


Even at 4k per eye, if you imagine a screen at a typical viewing distance, the effective "dot pitch" of that virtual display is going to be massively worse than that of a good quality high-end monitor sitting on your desk.

We've been waiting like 10 years for that to change, since Oculus dev kit days, and it's still not solved today. Advances in pixel density in this space have been incredibly slow.

I think it could be a very long time before a headset can simulate a really great display well enough for me, but others' mileage may vary.

Even with "foveated rendering", the peak dot pitch (the highest pixel density it can accomplish) simply isn't going to be good enough for me - it can't be any sharper than the dot pitch of the panel in front of the eye.

A 5k iMac has 14.7 million pixels - the pixel density needed to do this as well as a "real" display in VR could be pretty massive.
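To put very rough numbers on it, a back-of-the-envelope sketch in pixels per degree. The headset figures below are assumptions, not published specs: 23M pixels split evenly across two roughly square panels, and a guessed ~100° horizontal field of view.

    import math

    # 27" 5K iMac: 5120 x 2880 at ~218 PPI, viewed from roughly 24 inches (an assumed desk distance)
    screen_width_in = 5120 / 218                                            # ~23.5 in wide
    view_dist_in = 24.0
    h_fov_deg = 2 * math.degrees(math.atan((screen_width_in / 2) / view_dist_in))
    imac_ppd = 5120 / h_fov_deg                                             # ~98 pixels per degree

    # Headset, assumed: 23M pixels split across two square-ish panels,
    # spread over an assumed ~100 degree horizontal field of view
    headset_px_across = math.sqrt(23e6 / 2)                                 # ~3391 px
    headset_ppd = headset_px_across / 100.0                                 # ~34 pixels per degree

    print(round(imac_ppd), round(headset_ppd))                              # roughly 98 vs 34

Even if those guesses are off by a fair bit, the gap in pixels per degree is large, and a virtual monitor only gets the slice of that budget it happens to cover.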


I agree completely. A few months ago, I purchased a Meta Quest Pro. Relative to the Quest 2, the Pro’s resolution blew me away. And it’s still not even close to usable for real work on virtual monitors.


This, totally. I’m interested to see how this compares with the Varjo offerings wrt foveated rendering.

Reading text in VR is generally a horrible experience, and “4K per eye” does not equal even a single 4K screen.

That said I would be happy with 8 1080p screens.


It's not 4K, though. They're not giving a lot of information, but "23M pixels" for two eyes is 11.5M pixels per eye. 4K is 8.2M, so this is 40% more pixels than 4K.
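Spelling the arithmetic out (assuming the 23M is split evenly between the two panels, which Apple hasn't broken down):

    per_eye = 23_000_000 / 2      # 11.5M pixels per eye, assuming an even split
    uhd_4k = 3840 * 2160          # 8,294,400 pixels ("4K" UHD)
    print(per_eye / uhd_4k)       # ~1.39, i.e. roughly 40% more pixels than 4K
    print(per_eye ** 0.5)         # ~3391 px on a side if square, so narrower than a 3840-wide 4K panel

So each panel would have more total pixels than a 4K frame, but it wouldn't be "4795 pixels square" as suggested upthread.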


11.5m per eye is still far short of what would be needed to approximate the pixel pitch of many of Apple's "retina displays" at a typical desk viewing distance, FWIW. This is a really hard problem with the tech we have today.

Whether it's 8m or 11m or even 15m pixels isn't the point with regard to using it to replace desktop monitors - the point is that the density necessary to compete with excellent real-life physical displays is really high.

Your VR monitor only ever really uses a subset of the total pixel count - it still has to spend many of those pixels to render the room around the display(s) too.


The display system boasts an impressive resolution, with 23 million pixels spread across two panels, surpassing the pixel count of a typical 4K TV for each eye.


That's still an enormously worse dot pitch than a good 4/5/6k monitor has in meatspace/real life today - remember, a virtual monitor only ever uses a subset of the total pixels in a VR headset, which is why the pixel count has to be sky high to compete with real life.


Yeah, with VR headsets you generally only get to count the pixels for each eye since parallax vision means that you only have that many degrees of freedom to produce a color.


Was this before the advent of VR headsets that do eye-tracking + foveated rendering? With the tech as it is these days, you're not looking at a rectangle of equally spaced little dots; almost all of "the pixels" are right in front of your pupil, showing you in detail whatever your pupil is trying to focus on.


For what it's worth, this was with an HTC Vive of some kind. However, the screen pixel densities don't change when you do foveated rendering, it's more of a performance trick - the GPU focuses most of its compute power on what you are looking at.


> the screen pixel densities don't change when you do foveated rendering

That's the limited kind of foveated rendering, yes.

Apple has a system of lenses on a gimbal inside this thing. Which is precisely what's required to do the (so-far hypothetical) "full" kind of foveated rendering — where you bend the light coming in from a regular-grid-of-pixels panel, to "pull in" 90% of the panel's pixels to where your pupil is, while "stretching out" the last 10% to fill your peripheral vision. Which gives you, perceptually, an irregular grid of pixels, where pixels close to the edge of the screen are very large, while pixels in the center of the screen are very small.

The downside to this technique is that, given the mechanical nature of "lenses on a gimbal", they would take a moment to respond to eye-tracking, so you wouldn't be able to immediately resolve full textual detail right away after quickly moving your eyes. Everything would first re-paint just with "virtual" foveated rendering from the eye-tracking update; then gradually re-paint a few hundred more frames in the time it takes the gimbal to get the center of the lens to where your pupil now is.

(Alternately, given that they mentioned that the pixels here are 1/8th the size in each dimension, they could have actually created a panel that is dense with tiny pixels in the center, and then sparse with fatter pixels around the edges. They did mention that the panel is "custom Apple silicon", after all. If they did this, they wouldn't have to move the lens, nor even the panel; they could just use a DLP mirror-array to re-orient the light of the chip to your eye, where the system-of-lenses exists to correct for the spherical aberration due to the reflected rays not coming in parallel to one-another.)

I'm not sure whether Apple have actually done this, mind you. I'm guessing they actually haven't, since if they had, they'd totally have bragged about it.


I'm guessing from this comment that you may not know much about optics or building hardware. Both of the solutions you have proposed here are incredibly bulky today, and would not fit in that form-factor.

> The custom micro‑OLED display system features 23 million pixels, delivering stunning resolution and colors. And a specially designed three‑element lens creates the feeling of a display that’s everywhere you look

They have advertised that there are 3 lenses per eye, which is about enough to magnify the screens and make them have a circular profile while correcting most distortion. That's it - no mention of gimbals or anything optically crazy.


>Apple has a system of lenses on a gimbal inside this thing.

Do you have a source for this?


I'm thinking there is confusion with the system used to set the PD (distance between eyes). Of course there are not many details, but it does look like there's a motorized system to move the optics and screens outwards to match the PD of the user.


I think the key to that would be a design of interface which is a step beyond "a sphere of virtual monitors" where zooming was not just magnifying but rather a nuanced and responsive reallocation of both visual space and contextual information relevant to the specific domain.


Therein lies another problem with workspace VR: you still need a keyboard if you are doing any meaningful typing. So you still need a desk, or some kind of ergonomic platform for a lounge chair.

It is a great alternative for gaming in that sense, however. Being able to game while standing up and moving is great.


With screens detached from the input device, it should be perfectly possible to make a good keyboard + trackpad combo for use on your lap, on just about any chair/bed/beach.


4k is awesome for a terminal nerd.

The first time I used a 50 inch 4K screen in full screen tmux/vim, I realized this is the correct way to program.


With such a big terminal screen you might even recreate what a 720p screen can, with 256 colors!

I never really understood why we like to hack character arrays into pixels when... we can just manipulate the pixels themselves? I mean, I like and actually prefer the CLI interface of many programs, but I can't ever imagine replacing a good IDE with vim.


vim is a good IDE, so I'm not sure what you mean.

I'm not mad about your IDE or anything. I've used some that I could like okay, with vim keystrokes. But vim lives where I live, in the terminal. I can't run your IDE in my environment. I can run vim anywhere.


I use a 32" QHD for a more limited but similar effect. On a 32" 4k the text was too small, and thus the extra resolution just complicated everything, but 32" QHD and a tiling window manager is awesome; I don't use a second monitor anymore after years of doing so.


That's only because UI scaling sucks on Windows and Linux. On macOS, a 4k monitor works great.


With so many apps on Windows, you might get the latest font rendering stack or you might get the old one, even in Windows' own settings UI.


I am probably an edge case as I use a tiling WM on linux, there is little UI to be scaled. The only metric I am worried about is max text at my personally readable size. I could change the font sizes on a 4k monitor, but websites are the only non-text UI I interact with and they don't care about your OS settings. Zooming is hit or miss on if it breaks the layout or not. I don't doubt MacOS would be better in general, but for me a QHD 32" is plug and play, most websites work well and no settings faff or zooming.


Wayland implements the exact same supersampling-based scaling that macOS has; Wayland scaling even performs better than macOS'.


It doesn't work great: elements are comically too big on 32" 4K, or just too big on 27" 4K; you need to scale it to 1080p, but then it's too small. macOS is made for high-DPI (Retina) resolutions on 5K 27" monitors, or for non-high-DPI 27" 2560x1440. The only high-DPI 4K screen that works great OOB is the 21.5" 4K Apple display.

* https://bjango.com/articles/macexternaldisplays2/

* https://www.youtube.com/watch?v=kpX561_XM20


On macOS there are SwitchResX and BetterDisplay, which let you choose custom scaling options.


Too bad MacOS looks like dog shit on a lot of regular-ass monitors.


Not sure what I'd do at 32" but with a 27" 4K I run it scaled as 1080. Everything is sized how I would expect but text is just much crisper.


32" 4K feels like the sweet spot now; 32" 8K would be a good future upgrade, but we need DisplayPort and HDMI to catch up. 120hz is very nice for desktop usage, as is HDR. Now that my main rig is a 55" 4K 120hz HDR OLED, most other monitors look bad. 14" is still the best size MBP, as sitting closer with the high-PPI screen works well to have text take up about the same amount of my FOV. 27" feels small, esp at 16:9. 16:10 was awesome and I'm glad that it and 4:3 are coming back. 16:9 was made for watching movies. 16:10 allows 16:9 content to fit with a menu bar + browser top bar + bottom bar, or just gives extra vertical space. Those ultrawide monitors, especially the curved ones, are just gimmicky. Just give me a gigantic 16:10 or 4:3 rectangle, tyvm.


Aren't you a case in point then?

> the sweet spot is a single curved monitor right in front of me

So you can have that. Exactly the right monitor size, curvature, location - in every room of the house, on the train, at work, in the cafe etc. People with ergonomic challenges are, I would have thought, a perfect market for this.


Yup, this is the reason why I bought an Oculus Quest 2, to use Immersed[0]. The idea to have a huge multi-monitor setup that I could use on the go - carrying it in my backpack - felt really appealing[1].

With the pandemic I didn't really end up needing it that much, plus I had some lag issues which I never bothered solving (by buying a separate wifi dongle) so my usage never really took off, but the idea was solid.

The Oculus headset is a bit heavy/sweaty. Not a dealbreaker per se but with something lighter I could definitely see myself giving it another go.

[0] https://immersed.com/

[1] I work on a single 13" laptop, for portability. I like the setup but I do see the benefit of having large screens. It's just that I can't really move them from one place to another so I'd feel crippled on the road.


Yes I use Immersed regularly, commonly for a couple of hours each day with a Quest Pro. It's pretty good and quite usable. Definitely resolution is one area where improvement would be huge. It's ok currently but I need the monitors very large which creates its own issues (you get to the point where you need to turn your head to read across the screen and realise it's an ergonomic nightmare).

I enjoy it for an hour or two as a nice change, but I couldn't work there all day.


I think the problem is that the headset still seems too inconvenient to use in all of those locations.

I think this stuff will make more sense when these are the same form factor as a normal pair of glasses.


yeah, the friction is key. This is a step forward, I'm sure it'll be amazing that you can just literally put it on and look at your laptop and it pops up as a big screen in front of you. But I think the strap is a barrier. Like you say, glasses form factor is so much better than "strapping" it onto your face. It's rumored Apple has that in its sights for a future model.


No one will create the killer app because there won't be enough people to buy it. They aren't going to sell 100 million of these things; they will sell 1 million to prosumers. But you can't make a killer high-end game on a completely new system with completely new features for such a limited market; they would need to sell it to everyone to make money. That's the real problem with AR/VR: you need critical mass in the number of users to justify people building mass-market games and apps. The goggles need to not have a cord, be 1/3 as heavy, and 1/4 the price, and then we will get mass adoption. My gut says we are 3 generations away. But it will happen.


Yes, they are going to sell 1 million. In this generation. The next generation will have a non-Pro model; you can sell ten million of that. It is not going to kill phones, but it will absolutely slaughter laptops. This generation is basically just devkits.


I don't think it's hit people (including me) that this is not just a headset. It's a full-blown computer.

You can take just the device and a keyboard with you to work anywhere.


Yep. This is huge for those who travel. It's huge for those who do CAD work. And the power available in such a small form factor really opens the door to previously impossible tasks.


It seems awfully convenient that the laptop folds down nice and flat; it takes up very little space. A headset like this is still kinda big to carry around with you... Maybe it's just a preference on my part, but I quit carrying my big can headphones around with me because they were too bulky. I'd never carry around a headset like this. Plus you look like a dick wearing one.


Which is much bigger than a MacBook Air in a bag, and can do 2 hours at most.


You won’t need a keyboard.


You do. I'm 100% sure that flicking your fingers in the air simply doesn't provide enough information to type accurately (besides making you look like an absolute moron). Also, your arms will fatigue immediately.


If you can position things in AR, you can put keyboard keys down on wood grain and the device can tell where your fingers land.

If you can escape the skeuomorphic trap, many things are possible. A mechanical keyboard is certainly not the universally optimal means of character entry.

Maybe not in this rev, probably not at launch based on the video, but keyboards as we know them are due for an overhaul.


> A mechanical keyboard is certainly not the universally optimal means of character entry.

Funnily enough, I think that this is basically the ultimate limit of touch based systems — humans rely very much on touch, and touch screens’ smooth surface removes every physical hint from the system. Just remember back to how we could compose a whole message blindly in our pockets with feature phones, yet I can’t write a sentence correctly nowadays without constantly looking at the screen.

Now you would even take that away? Don't get me wrong, I don't think that the keyboard layout or anything is the optimum, but it is sure as hell closer to it than randomly hitting the table. Funnily enough, the mechanical part is the key part.


It does seem to me there are strong parallels with the iPhone/Blackberry keyboard conversation. Some people will hang on bitterly until the end.

I can thumb touch-type on a cap touch screen with the help of autocorrect. With continued improvement of predictive text and new input methods I think all kinds of things are possible.

Maybe with another technology iteration of haptics you would get positional feedback?


> If you can position things in AR, you can put keyboard keys down on wood grain and the device can tell where your fingers land.

You'll get carpal tunnel syndrome faster than the battery drains if you're actually doing that. One of the main points of keyboards is actually the fact that they absorb some of the shock of typing.

It's actually extremely plausible that the keyboard is the best possible text input method - at least until we find a way to read brain signals non-invasively and decode those into text directly.


My (unchecked) understanding is that carpal tunnel comes about because of the angle of the wrist and the repetitive pushing itself.

If there are no keys to press wouldn’t you have no reason to exert force, and no need to angle your wrist or brace your hand?


Eventually, maybe. In the keynote, there was a vague outline of a virtual keyboard, but (unless I missed it) we never saw that virtual keyboard in use. Instead, the demo pivoted to using a paired Magic Keyboard.


How exactly would you replace a keyboard with anything even slightly as productive?


Just lie down in bed and put a physical keyboard with a touchpad on your legs. Many times I work from an Airbnb or hotel that doesn't have proper chairs or a work desk, or from a coworking hot desk when travelling.


The GP was claiming a keyboard won't be necessary at all.


I probably won’t, but someone probably will. Productive might look quite different.


If we agree, then why are we arguing? I said it would take 3 more generations to hit 100 million, and I said it would happen. My point is that it won't attract big-time developers until then, because it will not be economical for them. But I think Apple can grind it out and make it just good enough to attract just enough value to grow just enough to hit big numbers in 5-7ish years.


It will attract "big time developers" in version 1 because being first to market on a new platform is an enormous advantage, even if that platform won't have significant market penetration for years.

Angry Birds, Fruit Ninja, etc. were not particularly revolutionary apps and would never top the charts if invented today. But because they were some of the first games on iOS they became multi-billion dollar franchises.


It will "attract" a few. As in, Apple will pay people to develop apps for it, and will basically buy teams to develop apps; I have heard of these deals happening. But you won't be able to make a bunch of money off it for many years, so how much developer talent can they actually attract? The iPhone was waaaay different: there was instant utility for the phone that attracted millions, it wasn't insanely expensive like the Vision Pro, and the apps you could develop were simple and useful. Yes, there will be a bunch of AR apps from iOS you can use instantly on the Vision Pro (I assume, not actually sure), but developing a full-featured app that takes advantage of the interface will be quite hard, and thus quite expensive.


No one bought an iPhone to play Fruit Ninja, though. They bought it to get access to the internet on the go. Essentially the browser was the killer for a phone.


Those games were also like $3.


To be honest, I see many more financial constraints ahead in 5-7 years for average (even Western-only) people to think about spending anywhere close to this amount on a luxury device, and with the amount of hardware needed, even with generational advancement, I don't see that changing.


A killer app sells systems, not the other way around.


That’s exactly what I said in a different way. No one will make the killer app because it isn’t economical to do so.


If there was one to make, Apple would make or subsidise it, guaranteed.

Even after reading loads of comments no one can really think of one.


Yet.


Apple isn’t the only one with an XR device. Devs can still hone their ideas now that they have UX direction. The Apple AR SDK has been out for years now too.

The first iPhone also only had 1.4 million in sales. I'm not even sure the App Store was out until the 2nd gen.


The original iPhone sold some 6 million units from what I can google.

Steve Jobs himself said 200 days after the launch of the first iPhone that they sold 4 million units.

Source: https://www.macworld.com/article/188823/liveupdate-17.html


You’re right.


The killer app imo is AR instruction. That is:

- you're looking at some kind of physical thing in the real world that you're "working on" (whatever it may be)
- your goggles are pointing out important aspects, telling you what to do next, etc.

I always thought something like this for auto repair would be really cool. Of course we need the software to catch up in this regard, since it would have to recognize and overlay fairly complex visual spaces.


Sports referees could also benefit: instant replay. Once there are cheaper, lighter versions, you'll see mums and dads running onto the soccer/hockey fields with these.


I just think you are thinking of the monitors in an overly literal way.

Imagine a calendar on the wall, but with your meetings and everything dynamic instead of just a static calendar. And it adjusts to show your next meeting extra large as it approaches. Now you see useful information in your periphery.

Or perhaps you have application monitoring dashboards on another wall. You don't look at them all the time, but a dedicated space wouldn't be a bad thing.

I see a lot of potential here in the future.


A digital calendar on the wall and a dedicated screen for monitoring are both possible with tech from 10 years ago.

The problem isn’t “we couldn’t do this before AR and now we can”, it’s “my computer already does calendars and monitoring well enough”.


My Windows phone could already do everything an iPhone could do at launch, and on 3G no less. But there is something to be said for putting it all together well and having it all just work seamlessly.


Until it is superseded. Ask Blackberry.


Maybe, but every single photo is of a person, alone, in a room.

While this is the case for a period of life, it's certainly not the case for most of it, nor is it an end goal.


This is first-and-foremost a tool for doing work. They show people using it in their living rooms, but I get the impression that the key use-case is to use it in a home office (where you'd already be intentionally isolating yourself to get work done) — or in some other room (e.g. a bedroom) to turn it into a home-office-alike space.


Fair enough, though when I am home I have half an ear for what's going on in the house, whether it's stuff outside, someone at the door, the cat doing cat things, the kid running around, etc.

It's rare even at work that I would want to be so fully immersed. Kind of makes me feel vulnerable, not you?


That niche is killed by their own watches.


Real estate costs more than this headset. I am a VR skeptic, but if someone truly solves the problems, a virtual desktop has obvious advantages even for the rich. I could literally clear out one room and shrink the remaining desk to fit a closed laptop, keyboard, and coffee mug. And then my entire workstation is portable and exactly the way I want it wherever I go.


My immediate thought was working on a flight. This guy talks like he's got some big curved monitor on his flight. No, he doesn't; he's hunched over a laptop screen.

If I could work on a flight on a big screen I'd be thrilled. I really don't like the ergonomics of hunching over a laptop screen.


When I worked at Intel in 1997 we bought one of the first 42" plasma screens on the market to put in our game lab - and I put it on my desk and attempted to play Quake and Descent and other games on it, and I couldn't handle it so close to me - it had ghosting and bad lag and poor angular visibility, and it was $14,999.00.

We turned it into a wall piece that rarely got used.

In 2016 I got a monitor for one of my ops guys that was 4k and ~34", and that was still too big to sit in front of - my ops guy gave it to me, I hated it and gave it to an eng, and he loved it.

Big screens are for certain people. I have a 70" screen in the living room that I never turn on, my brother uses it exclusively, and I use a 15" laptop as my personal screen.


But it's very handy if you're a wealthy nomadic software engineer. I don't want to take monitors with me and I'd like to travel more while working. I'd like to do that with my 12" MacBook Air.


Also being a wealthy software engineer, there still isn’t a better multi-monitor mobile solution than this at any price point. If you’re only working from home sure, but I like to cowork with friends in a variety of places.


I use 4 monitors arranged on arms to form a shape roughly like a curved 15360x4320 display.

I also don't see how VR will come close to replicating the productivity I have in my home office, on any foreseeable timeline.

But when I go somewhere and just use my laptop screen, it's almost laughable how inefficient and annoying it is. The screen is tiny, I am constantly switching apps / virtual desktops, and there is no way to even see my debugger, documentation, and my app running at the same time.

To me, that's what I want VR to fix. The portable workspace. For us spoiled rich engineers sitting in our spacious home offices, the constraints that make VR (theoretically) appealing just don't exist.

(I'm skeptical there are enough people who want this badly enough to pay $3500 for it to fund an entire product category, though... I expected them to come out talking about fitness and health.)


The first question that pops into my head is why you’d work on a curved monitor (of which there still doesn’t exist a high resolution model) as a software engineer. Do you find the workspace on a single curved display sufficient?

My primary concern with the Apple headset is the relatively low resolution of 23M pixels. Our eyes can perceive so much more detail, and I’m afraid the low resolution will reintroduce pixellation as is commonly seen on low end and curved displays.


To me, a curved monitor makes complete sense. The edges just become too far away with flat displays up close.


It's not just that the edges of the screen are too far, it's that they're at an oblique viewing angle instead of perpendicular to the eye.


If it is 23M pixels per lens, that is still more resolution than a smartphone's screen. Each lens is smaller than a smartphone's screen, and the resolution is per eye. I wouldn't be surprised if this actually exceeds the eye's ability to perceive pixels.

That's the difference between a monitor and the lens of a headset. If you look closely at a region of a 4K monitor two inches in radius, you are not seeing 4K in that region. The 4K of pixels applies to the whole monitor, not to the eye's field of view as it does with a headset.

If you were using the headset as a monitor, you could zoom in on text, and the text can effectively have infinite resolution as it scales up into view.


> if it is 23M pixels per lens, that is still more resolution than a smartphone's screen.

But you don't use your smartphone 1-2" from your eye.


> of which there still doesn’t exist a high resolution model

QHD 32" works great, it's not quite two monitors but if you are using a tiling window manager or spend all your time in editor windows it's perfectly practical.


But the pixels are visible, and text on those displays is so much less legible than on a 200+ ppi display. I simply don’t get how some developers find those monitors to be acceptable and at the same time disregard the Apple headset. Perhaps it’s just lack of vision.


Maybe you have really great eyesight, or sit a bit too close? I can't see them. I have used retina displays as well, and while it's clear there is a difference, it's not a practical difference for me. Retina feels nicer but it's the same amount of UI and text on a screen.

4K in VR is very different though: it's 4K per eye, not 4K in dots per inch. 4K in VR will feel like a massive downgrade if you enjoy high-DPI screens, but I think it should be usable. The state of the art is 12K I think, and for people who like working in VR I see 8K on the Pimax as the most common recommendation for good text rendering.
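
To make "per eye vs. per inch" concrete, here's a rough back-of-the-envelope pixels-per-degree comparison. The headset numbers (pixels per eye, field of view) and the viewing distance are illustrative assumptions, not spec-sheet values:

    import math

    def monitor_ppd(width_px, width_in, distance_in):
        # approximate pixels per degree at the center of a flat monitor
        ppi = width_px / width_in
        deg_per_inch = math.degrees(2 * math.atan(0.5 / distance_in))
        return ppi / deg_per_inch

    def headset_ppd(px_per_eye, fov_deg):
        # very rough average pixels per degree across the lens field of view
        return px_per_eye / fov_deg

    # 32" QHD monitor (2560 px wide, ~27.9" panel width) viewed from 28"
    print(round(monitor_ppd(2560, 27.9, 28)))  # ~45 ppd
    # assumed headset: ~3660 horizontal px per eye over an assumed ~100 deg FOV
    print(round(headset_ppd(3660, 100)))       # ~37 ppd

So even a nominally "4K-ish" panel per eye spreads its pixels over a much wider angle than a desk monitor does, which is why per-eye resolution and monitor DPI aren't directly comparable numbers.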


Agreed about the non-selling point. I've only ever been able to get my eyes to focus on one thing at a time. So I prefer one monitor. CMD/Alt+tab works for me. If I need to have things side-by-side for some reason I use a window manager and some key combos to quickly rearrange windows. There are very few times that I wish I had another monitor.


Even beyond my neck, the limitation for me is my ability to keep track of the spatial location of that many things, and need to have them all displayed simultaneously. I've really just found the sweet spot to be two displays (with the cost sweet spot for me currently being 1440p, but I imagine 2x4k would be an improvement). Even a third monitor really doesn't improve my ability to do things, so I can't imagine "infinite" impressing either.

For me, the main appeal of VR is its potential for gaming, with a distant second place being more broadly "interacting with things in 3d" (such as 3d sculpting/modeling, or something like VR chat).


Don't forget 3D reverse engineering too.

Being able to spatially interact with disasm code inside IDA Pro is going to be a game changer for those who like to take a more topological approach to the art.


You can already spatially interact with 3D content on a regular screen. Thousands of CAD people do it all day for a living, they even have specialised peripherals for 3D navigation like the 3DConnexion stuff.


You think Hex Rays is going to support Vision Pro? They barely support two dimensions lol


:(

maybe a nice Ghidra plugin, then?


I don't have high hopes for Swing either.


I honestly don't really see that working, especially since Apple didn't innovate on the input space, which is still fundamentally 2D.


This. I used to be a multi monitor type of person but when desktop switching became good (I first experienced this in Linux) I started using a single larger monitor and never looked back.


Turning your head causes you pain? You need to go to the gym, get in shape, or figure out what the hell is causing a natural motion to induce pain and discomfort.


Sitting is a natural motion and hundreds of millions of people have spine problems from that alone.


Sitting is natural.

Sitting on a chair, at a desk, staring at a screen for 8 hours a day, 5 days a week, and then sitting in your car, and then sitting on your couch and never actually walking anywhere, isn't.


Most developers don't have mobility issues. They have 2 / 3 large monitors (or laptop + monitor).

And so in this case they have the ability to access them anywhere, anytime.


Not wanting to turn your head 90 degrees to see your 13th monitor is not a "mobility issue".


Or you could not be ridiculous and just use 2 or 3 monitors like everyone does today.

At least you have the option to put monitors above and below as well.

And completely swap configurations for different use cases e.g. coding versus gaming.


"Everyone" does not use 2 or 3 monitors. Certainly among the software engineers I interact with regularly (at top US tech companies), having multiple monitors is the minority, not the majority.

I agree with the parent that any setup that requires me to turn my head to see all of my screen space is a downgrade, not an upgrade. Even a monitor that's too big (above 30 inches or so at normal desk viewing distance) is bad.

If you like it, go for it, but don't act like it's the only or even most common way to work, even for developers.


I've worked at two out of the FAANG companies and many others. Never seen a workspace in the last decade that didn't either have a laptop and external monitor or multiple monitors.

And there has been quite a bit of research [1] on them with 98% of users preferring dual monitors.

[1] https://www.ie-uk.com/blog/how-multiple-monitors-affects-pro...


> Never seen a workspace in the last decade that didn't either have a laptop and external monitor or multiple monitors.

I've never seen anyone using "a laptop and an external monitor" who actually uses the laptop screen. (Where by "use" I mean "looks at it." They might have it on, but it's usually just idle at the desktop.)

Personally, I plug my laptop into a monitor and then put it, closed, onto a little stand for ventilation. (One of these things: https://www.apple.com/ca/shop/product/HP9X2ZM/A/twelve-south...).


Really? I found this so shocking I just got up and checked and around here 7 out of 11 people have their laptop screen in use.

I use it as a screen for my slack/discord/email and have my two main screens above it. It's true I use my two main screens more, but if I didn't have my laptop I'd want a small third screen to replace it.


I'm with derefr: once I connect up to an external monitor, or two, or three, I close my laptop and put it in a stand. I never use it as an extra monitor either.


Have you considered the ergonomics of doing this? I do know a few people who put their laptop up on a pedestal mount so it's in line with their external monitors, which is fine. (These people generally got the largest display-size laptop they could afford, so it makes sense for them.)

But if you have your laptop sitting directly on the desk — presumably because you use its keyboard to type? — then any time you look at its screen, you're straining your neck. There's a reason monitors are on stands that hold them up 8+ inches above the desk — it's so it doesn't hurt to stare at them all day.


I'm not a software engineer or anything like that, and I still have three screens including a laptop screen at my desk. Almost everyone at the small NPO I work for has at least 2 monitors, including the likes of finance, customer service officers, etc. When I visit other offices it's not unusual to see 2 or even 3 monitor setups. This is common even at government agencies. This may be specific to New Zealand and not the same elsewhere in the world, but I'm sure Australia is in the same boat, going by what I've heard from my Australian friends. YMMV. Will be watching this Apple innovation with interest.


Interesting - at the Google office I work at, the vast majority of developers use at least 2 monitors, sometimes 2 monitors + a laptop screen.


I'm a digital nomad. I miss having a spacious multi-monitor setup. I tried making it work with an Oculus Quest and Immersed VR, but the results were disappointing. If they can make it seamless and match the resolution so my eyes don't hurt after a minute of actually reading code, it's going to be an immediate shut-up-and-take-my-money moment.


Why wouldn't you use gestures to move the right monitor to be directly in front of you, maintaining some concept of what's on the adjacent ones from UI hints?

Really, the whole concept of "monitors" feels skeuomorphic here. Shouldn't it just be a sphere where you're looking at a concave part with your current app, and can rotate as needed to pull other apps into view?


I can see it being nice if it's like Minority Report, where you can swipe small screens away, etc. Talk and it types. Glance to the left to see how the builds are going, etc. It could also be a nice virtual whiteboard. Usually it's hard to know how nice hardware can be without the apps. And you don't have to be in your office.


>"Constantly rotating my head back and forth from one monitor to another is, quite literally, a pain."

60+yo fart here. Same problem as well. After dicking around with a 3x 32" 4K monitor setup a good while ago, I am now down to a single monitor. It is still 32" 4K at 100% scale and feels comfy enough.


As someone who used to have a cheap-ish 3x27" monitor setup, I can confirm neck strain on big triple-monitor setups is most definitely a thing. Imagine combining this with carrying the weight of a pair of technogoggles like these, and I think it could get tiresome really quickly.


What if the mapping between your neck angle and screen angle wasn’t 1-1?
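
(In VR research this idea usually goes by the name "rotation gain": head rotation is amplified so the virtual view sweeps further than the neck does. A minimal sketch, with a purely illustrative gain value:)

    # minimal sketch of a non-1:1 head-to-view mapping ("rotation gain");
    # the gain value here is an illustrative assumption
    GAIN = 1.5  # 1.0 would be the usual 1:1 mapping

    def virtual_yaw(physical_yaw_deg):
        # a 60-degree neck turn sweeps 90 degrees of virtual workspace
        return GAIN * physical_yaw_deg

    print(virtual_yaw(60.0))  # 90.0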


Then you would likely become dizzy and puke.


Perhaps someone will invent a way to virtually move around within a virtual space. Seems far fetched, I know. But we can still dream.


Decoupling virtual from physical movement is the fastest way to get people puking and giving them headaches.


True. Scrolling windows triggers my nausea.


You don't need to turn your neck though; you can turn the environment. And nothing goes off screen, just out of foveal focus.


It's called Beat Saber. :-)


You, or someone in a situation like yours, might at times find it valuable to have something like a giant whiteboard in front of you that you can walk around in front of, and on which you could spatially arrange a bunch of details.


Putting it in 3D is also an opportunity to fix window management.


How so? There have been plenty of 3D window managers, and IMO they were all just gimmicks, not really contributing to any improvement in workflow.

Edit: typo


Scale and Expose both definitely improved workflow




