visionOS (developer.apple.com)
102 points by soheilpro on June 6, 2023 | 89 comments



SwiftUI, RealityKit, ARKit, Xcode, Reality Composer Pro, Unity ... give me a break :)

These device-specific environments and technologies are cumbersome to learn, exclude everybody with a different device, usually change rapidly (which makes code maintenance a nightmare), and at some point simply die, like Flash.

I will start developing for 3D devices when we have a standard similar to HTML or when 3D elements are part of the HTML standard.

A text-based way to describe how elements are positioned in space and what attributes they have, with JavaScript support for interaction.

Looks like there is "X3D":

https://en.wikipedia.org/wiki/X3D

At first glance, it looks a bit like a 3D version of SVG? That could be a good starting point.
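For example, positioning a box in space appears to be plain declarative markup. A minimal sketch using standard X3D nodes (untested, pieced together from the examples there):

    <X3D version="3.3">
      <Scene>
        <!-- a 1x1x1 box placed at x,y,z = 1, 2, -3 -->
        <Transform translation="1 2 -3">
          <Shape>
            <Appearance><Material diffuseColor="0.8 0.2 0.2"/></Appearance>
            <Box size="1 1 1"/>
          </Shape>
        </Transform>
      </Scene>
    </X3D>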


Maybe you're talking about Universal Scene Description? It's a format Pixar put together and opened up. There's a WWDC talk on Friday about it titled "Explore the USD ecosystem"; I'm hoping this means Reality Composer Pro/visionOS is using an existing open standard for this stuff.

https://openusd.org

edit: it looks like RealityKit and the Apple ecosystem already support this

https://developer.apple.com/documentation/realitykit/creatin...



I'll welcome VRML back anytime.


For a tiny while that was so futuristic: the Web, in 3D. To ~10-year-old me it was like seeing the prototype of a flying car. Of course everyone would use this in the future!


> when 3D elements are part of the HTML standard.

They already are. CSS Transforms can already be configured to happen in a full, proper 3D space.

Every polygon of this demo is an HTML element, and yet it is fully and completely three-dimensional: https://keithclark.co.uk/labs/css-fps/

One would just need a browser in visionOS that supports displaying these elements in actual 3D rather than projected onto the 2D surface of the webpage...
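The underlying mechanism is just `perspective` plus `transform-style: preserve-3d`. A minimal sketch (class names made up):

    <div class="scene">
      <div class="wall"></div>
    </div>
    <style>
      /* establish a 3D viewing context that children participate in */
      .scene { perspective: 600px; transform-style: preserve-3d; }
      /* an ordinary HTML element, positioned and rotated in that space */
      .wall { width: 200px; height: 100px; background: #888;
              transform: translate3d(0, 0, -200px) rotateY(45deg); }
    </style>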


They must have gone with only Unity because Apple has an axe to grind with Epic Games, the creator of Unreal Engine.


An open standard that is very capable (though I can't find whether it can do streaming over HTTP) is glTF: https://github.com/KhronosGroup/glTF

It still lacks some important things in the areas of shading and animation for a full transfer of everything you can author in a modern 3D creation app like Blender.
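At its core a glTF asset is plain JSON describing a scene graph, which fits the "text-based way to describe how elements are positioned in space" ask quite well. A minimal hand-written example (illustrative only; a real asset would attach a mesh to the node):

    {
      "asset": { "version": "2.0" },
      "scene": 0,
      "scenes": [{ "nodes": [0] }],
      "nodes": [{ "name": "box", "translation": [1.0, 2.0, -3.0] }]
    }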


Kind of ironic that parent asks for a global standard like HTML and gets 5 different suggestions in the replies ;)


That's the best thing about standards: you have so many to choose from.


Sure, when POSIX gets updated for a proper full-stack development experience and modern hardware devices.



That seems to be a way to define 3D objects and render them in 2D HTML.

This approach would be useful if 3D devices let you render a view for each eye and reported head and hand movements as events. One would then build a whole 3D engine on top of the current browser APIs.

The approach I was thinking of was to tell the 3D device "This object is located at x,y,z position 123,40,8" and let the device do the rendering. That's probably much faster, as the device likely has hardware optimized for 3D rendering. Let alone things like AR, where you have to analyze a given video input, figure out that there is a real table at a certain position in space where you can put 3D objects, calculate the physical interactions and shadows, etc.
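Something like this, as purely hypothetical markup to illustrate the idea (the element and attribute names are made up, though WebKit has floated a `<model>` element in roughly this spirit):

    <model src="chair.usdz" position="123 40 8"></model>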

Not sure which approach is better. Time will tell.


The first approach doesn't work because it'll give the user motion sickness. The device must be able to interpolate between "frames" faster than you can update them, and providing the info needed for you to do that is a privacy issue.


WebXR is the way to do this, and a reasonable, better-supported equivalent to react-vr is probably @react-three/xr, where the stack is React --> three.js --> WebXR --> WebGL.

https://github.com/pmndrs/react-xr

Apple announced support for WebXR on visionOS as well.

I think it will get a lot more interesting once WebGPU lands too, as it will be closer to native-level GPU programming but portable between both native and web contexts.
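For anyone who hasn't looked at it, the WebXR entry point is pretty small. A rough sketch of the standard API in TypeScript (needs the WebXR type definitions; error handling and the XRWebGLLayer setup via session.updateRenderState are elided):

    // request an immersive session, then pull a viewer pose every frame
    async function startXR(): Promise<void> {
      if (!navigator.xr || !(await navigator.xr.isSessionSupported("immersive-vr"))) {
        throw new Error("immersive-vr not supported");
      }
      const session = await navigator.xr.requestSession("immersive-vr");
      const refSpace = await session.requestReferenceSpace("local");
      session.requestAnimationFrame(function onFrame(_time, frame) {
        const pose = frame.getViewerPose(refSpace);
        for (const view of pose?.views ?? []) {
          // one view per eye: feed view.projectionMatrix and view.transform
          // to your WebGL/three.js camera for that eye
        }
        session.requestAnimationFrame(onFrame);
      });
    }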


I actually see this sort of commentary a lot, and I suppose it's apt for HN and other tech fora where people want open standards, but for the life of me I don't understand why we shouldn't develop with technology designed _with_ the hardware. This is Apple's forte. I've done a lot of generic development in the past, as well as multi-cloud computing, and my only takeaway is that these generic approaches always provide inferior experiences for the sake of some nebulous goal.

The point of Apple Vision Pro is to run Apple hardware with Apple software, and people _want_ that.


It is mostly from people who see UNIX everywhere and are stuck in a CLI-applications mindset, or who ship Chrome disguised as an application framework.


Because the alternative is a walled garden, for better and worse. Why don't most games work on Macs? One big reason is closed APIs, especially as hardware improves.

Why is HTML becoming a ubiquitous API? It's free and open.


Yeah, no wonder the world is chock-full of slow frameworks built on top of unnecessary cruft (à la Chrome), and no one seems to understand the underlying tech anymore.


I've been thinking since 2015 about how to explore information as a 3D structure: plurid. For now, the stable library is only for React.


glTF does a pretty good job. It definitely doesn't cover everything, but it's a good scene descriptor.


I doubt Apple will get on board, but this seems like the correct answer to me.

https://github.khronos.org/glTF-Sample-Viewer-Release/


Have you found anything that lets you make DOM elements 3D and give them a material?


What's Flutter's VR story?


OpenXR seems to be a good platform.


I wonder if all of this will turn out to be another watchOS. It’s comprehensive, and frankly cool. But will that be enough to tip the scales in favor of a killer app?

Every new platform needs a killer app. Xbox had Halo. iPhone had Maps. watchOS didn’t really have one. Hopefully visionOS will.


The killer app is your current workstation. This thing is worth it just by virtue of being able to expand my workstation to an indefinite number of virtual monitors wherever I am, alongside my laptop.

This is possible on current VR headsets, but the resolution isn't there for any of them, including the Meta Quest Pro. This is the first one with enough resolution to push it over the threshold.


> being able to expand my workstation to an indefinite number of virtual monitors

Can it do that, though? In the presentation, they only showed one virtual monitor replacing your laptop's screen - essentially just mirroring the screen once.

Everything else was what looked like native apps, so likely as restrictive as iPad apps, but in (virtual) space.


I remain unconvinced that a lot of people will want to spend most of the time they are awake with VR/AR goggles strapped to their face.


If someone had told me as a kid that I'd spend more than half of my waking hours with a screen in front of me, I'd have called bullshit on them as well.

In principle, if this device were better than our current laptop screens, I think it would be a no-brainer. But with the tech still so early, the device reportedly heavy (why did they think metal was a good idea?), and the resolution still limited, it's hard to justify as a work expense.


> I remain unconvinced that a lot of people will want to spend most of the time they are awake with VR/AR goggles strapped to their face.

Nobody suggested that it should be used all day for most people.

However, when I watched the keynote, I immediately thought how much nicer it would have been at the peak of the COVID pandemic to be using an Apple Vision Pro for all of my Zoom calls on the living room couch (or the porch) instead of being stuck at my desk with the desktop.


Your workstation already has access to an indefinite number of virtual monitors.


Not simultaneously


...I think this is how they plan to stuff everyone back into open-office floor plans.


You cannot look at multiple places simultaneously anyway.


I mean, watchOS has a killer app already: Health and Fitness.


Apple’s killer app has always been software finesse and hardware craftsmanship. It feels nice to wear and use an Apple Watch.

But cutting-edge tech ages like milk. I still prefer a Garmin with longer battery life.


I mean, watchOS has a killer app already: Telling the time (without picking up your phone).


Kinda, but Fitbit has a range of fitness trackers that do those things with a longer battery life while costing half as much.

I'm reminded of "less space than a Nomad", and it's that same magic sauce that might make or break it.


And yet it seems most people choose to pay twice as much to not use Fitbit.


Hence the second paragraph.


> watchOS didn’t really have one.

What? It's had a few over the course of the product's life. One of its amazing feature sets is the workout system, general health monitoring, etc.


> I wonder if all of this will turn out to be another WatchOS.

The killer app for watchOS is health + fitness: exercise, sleep monitoring, etc.

Also, the Apple Watch has 60% of the worldwide smartwatch market's revenue [1] at around 50 million units.

[1]: https://appleinsider.com/articles/23/02/22/apple-watch-domin...


This is exactly what will happen. Apple will throw the tech at it. Developers will build things. Then Apple will lean into whatever sticks.


I use my Apple Watch Ultra every day to track sleep, while integrating with the other Apple products that I own. For me that justified the price point given that I am doing my _hardest_ to improve my sleep (and am not that affected by anxiety about sleep data).

So far, it has more or less panned out as I hoped. One thing I'm not happy about is the temperature tracking: I want to know when my temperature minimum occurs, but Apple doesn't give me that.

Other features are nice to have. One that's really nice is Apple Pay on your watch: with public transport (in Amsterdam), going on a ride becomes seamless.


> Every new platform needs a killer app.

Maybe not. This is a popular argument, but I think it's too simplistic. There must be different killer apps for different people. For AR, there are meaningful applications, especially for industrial use. It's also meaningful for entertainment and games. It's an extension of existing platforms with a new interface. I think there will be multiple use cases for this platform.


> Every new platform needs a killer app. […] watchOS didn’t really have one.

So that’s why the Apple Watch failed, isn’t the number one smartwatch, isn’t the number one watch and hasn’t sold over 100 million units! /s


An AR city builder would be amazing with hand tracking, where you could pick up buildings and place them, or just interact with the mini world.


It just needs an addictive chat app people spend 10 hours a day in.


Perhaps it will be what finally pushes me to try out Furcadia 20 years after learning about it…

https://en.wikipedia.org/wiki/Furcadia


What are the health implications of watching a screen this close for many hours every day? Is there scientific consensus on this or are we exploring uncharted waters?


AR/VR doesn't work the same way as a 2D screen because of the lenses. Your eyes are essentially focusing at infinity.

It can lead to completely different problems, mostly in children. Adults instead get terrible motion sickness if you're not very careful.


No idea about the eyesight, but a personal anecdote about eye tracking for the cursor/selection: it's an existing assistive technology, and a few years ago I tried it for fun for a day. It can get tiring, and you really exercise the muscles moving your eyeballs. I suspect it's something you train for and get used to in the end, but it's definitely an unexpected strain on the body. I suspect it's going to become a new thing people experience, like texting-thumb RSI.


The focal distance of most VR headsets is around 1.5 meters, thanks to the optics.


There is no scientific consensus, especially long-term. But as with all things, if it's not what nature designed us for, it's probably not great, if not terrible, for us.

Then add the developmental effects of a child having this strapped on for most of their puberty (you know it's going to happen), and I am glad I'm too old to get sucked into this fad.

I feel alienated by how positive the reception for this thing is.


Since everyone responding seems to be missing the point: are there electromagnets in this, or anything else that could impact brainwaves, cell behavior, etc.?


Just wrap your head with tinfoil before putting it on.


You won't be able to wear this for hours every day since the battery won't last more than 2 hours.

Luckily, we're being constrained by battery tech.


It can be plugged in for long-term use.


It seems a lot of people have missed this detail. Additionally, it looks like the battery pack has a USB-C/Lightning port on it (possibly for charging while you keep using it).

It's visible in the following image: https://media.wired.com/photos/647e8a46e416fd283a85c2da/mast...


MKBHD mentioned you could connect the battery pack to another battery pack.

Making their own battery pack essentially a dongle instead of just having a USB-C adapter for their cable is so Apple it hurts.


It looks like a battery pack, though; surely you could just buy a few or keep it connected to a power supply.


The battery pack is currently priced in the thousands.


Not really excited about Apple trying to shove SwiftUI into everything. I really hope you'll have the option to use UIKit here.

SwiftUI is great if you're building simple things, but a huge pain in the ass for anything remotely complicated / with a lot of mutable state, which is what I'd expect VR experiences to look like code-wise.


visionOS supports existing apps for iOS and iPadOS, so UIKit is supported.



If you want to develop for visionOS, you're probably going to need to buy your own Apple Vision Pro unit, right?

Looks like there's already a hurdle to developing for Apple Vision. The first visionOS developers will probably come from established companies rather than indie devs (who helped grow the iOS ecosystem).


There's a pretty robust simulator they're shipping at the end of the month. There will also be dev centers in London, Munich, Shanghai, Singapore, Tokyo, and Cupertino. Finally, you can submit your app to Apple and they'll record a session with it; you can ask them to focus on certain issues. I suspect once they start in-store demos, devs will be able to test their apps in person.


Why shouldn't there be? I'd be highly skeptical of any "developer" trying to build the cutting edge on brand-new prototype hardware who didn't have $4k in their runway to get a single dev unit. This isn't "libre FOSS runs on a toaster oven", so, y'know, why not?


The SDK is coming out next month, well before the device launches, so I suspect there will be some simulation options. I'll be playing around with it for sure - I want to see what gaming HUDs I can bring to life!


For everyone wondering why Vision Pro will succeed where the others have failed, I think this is your answer. Apple will provide a nice set of UI primitives that make building apps easy enough, but more importantly will give a consistent user experience and ease people into this brave new world.


Also the weight of a $2.5 trillion company, arguably the most influential and trendy ever, throwing its money and influence around.


IF it succeeds, this may be the reason; that much we agree on. But will it succeed? I find it doubtful.


“Inside a Full Space, an app can use windows and volumes, create unbounded 3D content, open a portal to a different world …”

How far do these portals reach? How many lightyears can we expect?


9.521e-4 ly, assuming positional errors not exceeding 1 mm and a standard float64, or 1.773e-12 ly (16.78 km) if it uses float32.
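The arithmetic, for anyone checking: the largest distance at which adjacent floats are still at most 1 mm apart is 2^(mantissa bits) millimetres. A quick sketch:

    const LY_IN_MM = 9.4607e18; // one lightyear in millimetres
    const reach = (mantissaBits: number) => 2 ** mantissaBits / LY_IN_MM;
    console.log(reach(53)); // float64 (53-bit mantissa): ~9.52e-4 ly
    console.log(reach(24)); // float32 (24-bit mantissa): ~1.77e-12 ly, i.e. ~16.8 km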


What do you think the chances are of any old schmuck getting their hands on a development kit?

I'm a web dev, and have absolutely no experience with Apple or VR development. But I would absolutely kill to get one of these dev kits for my own use.


I guess I can understand why, but having its own entirely new OS could be (emphasis on could, not saying it will happen) what sinks this. Not even the iPad started with its own OS: it originally ran iOS and then forked off into its own thing, which is still very similar to iOS even to this day.

Though maybe that's what they wanted to avoid. iPadOS sits in an awkward position between macOS and iOS; maybe Apple wanted to avoid a repeat of that.


Errr, well, it is basically iPadOS at the app layer. The "new OS" is the RTOS that handles passthrough, lighting, etc.


My guess is that this actually is macOS or iOS but with a new UI.


It's an iOS derivative.


So, does it allow sideloading like macOS, or is it more like iOS in this regard? I hope it's the former since it's literally a Mac SoC inside.


I think it will be more like iOS for three reasons:

1. Apple doesn't want to be associated with smutty content, which the headset will inevitably be used for with sideloading (people will find an App Store-approved way anyway)

2. The headset is far more intimate than a phone; better controls are needed to protect privacy and security

3. Although the device has an M2, it is not just a head-mounted Mac, and visionOS is supposedly a realtime operating system


My iPad Pro also has an M2 inside of it but it’s still on the iPhone model of software distribution (which sucks).

That said, I’m hoping for something closer to the Mac in terms of software. Basically, I want Emacs on the headset and no policies or DRM that violate the GPL telling me otherwise.


Any clue yet if Safari will support WebXR or not? A huge amount of development choice for this platform may hang in the balance there.


There has been experimental WebXR support in Safari since 2022, but looking at the WebKit bug tracker [1] it doesn't seem finished yet. I guess the next few days will tell us if this is a priority for them.

[1] https://bugs.webkit.org/buglist.cgi?bug_status=__open__&comp...


I wonder if Safari will provide feedback about the current environment's transparency, to adjust contrast, etc.


So was xrOS a red herring?


"xrOS" is visible on the simulator title bar in the platforms state of the union. Just didn't get picked by marketing ultimately. I'm glad; "XR" as a term sounds too overly techy. The marketing is bringing this futuristic thing down to earth, making it relatable.


xrOS was likely one contender for the project until Marketing and execs came together on the final names. They chose a few and trademarked a number of them: https://9to5mac.com/2023/05/23/reality-pro-os/#:~:text=As%20...



