It’s compatible with any Bluetooth controller, so I see no reason it would be a nonstarter.
Remember it has full hand tracking. If you want to hold something, e.g. a lightsaber or whatever, there is no reason not to. An inert plastic prop should work just fine.
I'm not a huge gamer, aside from fitness games like Thrill of the Fight, but I've tried a bunch of VR games just to see.
And there is just no way that a lightsaber game could be good without haptic feedback, in a world where haptic feedback exists. And I think the same goes for many other kinds of games.
YouTube guy MKBHD even called out the lack of haptics in his initial impressions video, not even for a game: the butterfly flew over to him in the Apple demo, and he held out his finger, and when the butterfly landed on it... nothing. And that was kind of jarring, he said. (And it would be.)
And yeah, the Oculus controllers wouldn't nail the butterfly-on-finger demo, but if they had controllers, the demo would be a hawk landing on your forearm. (And that would work, even though it doesn't quite make sense that your palm would vibrate when a bird lands on your forearm... but haptic feedback is weird.)
By haptics you mean a buzzer? That doesn’t replicate any kind of real-world experience.
But again, there is no reason gamers can’t have a controller; it’s just silly to use a game controller to interact with a computing environment when you can use your hands.
I am not sure exactly how it works, but I mean what PlayStation calls "rumble". In the lightsaber game, you can feel it when your lightsaber hits your opponent, or your lightsabers clash, and it adds to the experience immensely. I think almost all players of those types of games would prefer to have that feedback, barring some kind of disability or something.
I don't think you need the haptic vibration function for interacting with floating menus and the OS, although again it helps for button presses, which is why nearly all smartphones now feature haptic feedback.
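For what it's worth, on the smartphone side that button-press feedback is basically one API call. A minimal UIKit sketch, just to illustrate the kind of confirmation tap I mean (nothing here is specific to the headset):

    import UIKit

    // A minimal sketch: a light Taptic Engine "tap" fired when a button
    // is pressed -- the kind of confirmation feedback described above.
    let tap = UIImpactFeedbackGenerator(style: .light)

    func buttonPressed() {
        tap.impactOccurred()   // brief physical click confirming the press
    }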
But the other reason to use a controller in general-purpose OS use scenarios is precision. If you can directly touch something, then by all means that is the best. But if the menu to be interacted with is too far away, say 8 meters away, all current systems I have seen make you shoot a beam out of your hand to the button or object, then do some gesture to click.
A controller is way more accurate for this, kind of like how a mouse is more precise for most people than a trackpad. But even more so.
So on all of Meta's systems so far, the controller can more precisely highlight and click things at distance. And I think this holds true for all other currently-available systems as well.
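To put rough numbers on the precision point: a pointing ray amplifies any angular jitter by the distance to the target, so positional error at the button is roughly distance × tan(jitter). The jitter figures below are made-up illustrative values, not measurements of any headset:

    import Foundation

    // Positional error at the target grows with distance:
    // error ≈ distance × tan(angular jitter).
    // The jitter values below are illustrative assumptions, not measurements.
    func targetError(distanceMeters: Double, jitterDegrees: Double) -> Double {
        distanceMeters * tan(jitterDegrees * .pi / 180)
    }

    let distance = 8.0  // the "menu 8 meters away" case above
    print(targetError(distanceMeters: distance, jitterDegrees: 2.0))  // ~0.28 m of wobble
    print(targetError(distanceMeters: distance, jitterDegrees: 0.5))  // ~0.07 m of wobble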
What Apple Vision Pro is bringing that is new, though, is the eye-tracking. Supposedly, it is as good as, or perhaps even better than, a controller at selecting an object at a distance. If so, then yeah, controllers wouldn't really provide a significant advantage for most non-game activity.
You mean... so we could like, battle our small children in our kitchens with lightsabers?
My kids indeed do have sword toys that vibrate and make sounds, so I have done this. And I'm sorry to have to report that it is... substantially less compelling than fighting Darth Vader in VR. (Perhaps not for them, though.)
No one wants to use VR with a 2D game controller (a la Xbox) - that completely destroys the point. You want spatially tracked controllers that fit each hand and are meant for VR. Even if there are third-party controllers that get around the tracking problems, developers won’t have a controller standard to work against.
The rest of the article puts it in context. They have had bad experiences with the photos app prompting them with inappropriate ‘memories’, and this is reminding them of that.
I know. Many of us have had the photos app show us things we didn’t really want to see: dead pets, dead relatives, exes, etc. I think most rational adults can handle that and turn it off instead of getting super dramatic about it. Are these people similarly traumatized when they open a desk drawer or photo album and happen to see an old photo that evokes sad memories? Or do they only seek the dramatic trauma points when writing about tech on the internet?
> Are these people similarly traumatized when they open a desk drawer or photo album and happen to see an old photo that evokes sad memories?
This sounds like a plot point from a 1970s detective movie. People use their phones to take photos these days, but before that they used digital cameras.
Some kinds of data are always end-to-end encrypted, even with the default standard data protection:
Passwords and Keychain
Health data
Home data
Messages in iCloud (only when iCloud backup is disabled)
Payment information
Apple Card transactions
Maps
QuickType Keyboard learned vocabulary
Safari
Screen Time
Siri information (excluding Siri Shortcuts)
Wi-Fi passwords
W1 and H1 Bluetooth keys
Memoji
Yeah, all Apple users know that "this software needs to be updated" means "the software itself is fine, the developer just needs to pay us money". It's obvious, not misleading or an attempt to create FUD at all.
Why doesn't Apple say what you said? "The software itself might or might not be fine, the developer just needs to pay us money"? Because then their extortion racket would be laid bare to consumers.
Open source and not-for-profit software has been put at a disadvantage here, which I think is very bad for several reasons.
I think you know that downloading an unsigned binary from the internet and executing it on your personal machine is utter stupidity from a security point of view.
However, there is a space of potential solutions to this problem, many of which don't involve giving Apple money.
Somehow Apple chose a solution which would involve developers giving Apple what is for many people and open source projects a significant sum of money.
Then, Apple decided to not directly tell Apple users that the thing standing between them and the software they downloaded is that Apple believes the developer needs to give Apple money.
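For context on what the gate actually checks at a technical level (the signature itself, which is separate from the pay-for-notarization question), here is a rough sketch against the public Security framework. The app path is hypothetical, and this only verifies basic signature validity, not notarization:

    import Foundation
    import Security

    // Hypothetical path to a downloaded app; this only checks that the
    // code signature is intact and valid, not whether it is notarized.
    let appURL = URL(fileURLWithPath: "/Applications/SomeDownloadedApp.app")

    var staticCode: SecStaticCode?
    let created = SecStaticCodeCreateWithPath(appURL as CFURL,
                                              SecCSFlags(rawValue: 0),
                                              &staticCode)
    guard created == errSecSuccess, let code = staticCode else {
        fatalError("Could not read code object at \(appURL.path)")
    }

    let status = SecStaticCodeCheckValidity(code, SecCSFlags(rawValue: 0), nil)
    print(status == errSecSuccess
          ? "Signature is valid"
          : "Signature check failed with OSStatus \(status)")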
> Somehow Apple chose a solution which would involve developers giving Apple what is for many people and open source projects a significant sum of money.
Perhaps if the open source community had provided a solution that actually served the needs of end users in this regard, Apple could have adopted it.
> Then, Apple decided to not directly tell Apple users that the thing standing between them and the software they downloaded is that Apple believes the developer needs to give Apple money.
You’ve admitted that they are solving a real problem, so this is not an honest representation of what is going on.
I mean, lots and lots of people DO object to that pricing; ever since it was released, every single mention of the Studio Display has called it overpriced. But it also sells well, so the Vision Pro will too.
An MBP has a keyboard, several hours of battery life, and a couple decades of precedent that has allowed a large catalog of applications to form. The iGooglyEyePro isn't there yet.
From the keynote you can see that it supports Bluetooth keyboards and a giant set of apps and media out of the gate, and its OS is a continuation of that couple of decades of legacy.
All you have to do is try one in an Apple store when they come out, and then you’ll be able to get a better understanding.
It’s easy to just claim it’s worthless without firsthand experience. It’s not so easy to discount the giant investment and expertise that have been brought to bear on developing it.
Maybe I’m old, but I recall a time in my life when I never thought a GPS, cell phone, Palm Pilot, etc. would be very useful, even after seeing them in person and seeing other people using them. Many of them I bought because I just liked tech, and they collected dust. At a certain point they did become useful - usually around iteration #3. But I’ve personally found that v1 is not usually for me on these types of things. I’m more like the average consumer, as I just don’t value the novelty of tech as much as initial adopters do. I’ve invested too much in failed products and categories at this point.
The investment in building a product is not even remotely considered by the market when a product comes out. If there’s no worthwhile utility, it’s going to be DOA, and they’ll have to keep plowing money in, hoping they can get to that iteration #3 where the value proposition becomes apparent. It’s probably why they want to release it and let third-party devs create the value while they refine the tech and likely improve the consumer investment (e.g. a non-Pro version coming next, I assume).
Value as a toy/trinket is obvious. But it's not exactly priced as such, so it will not sell. Comfort aside, I'm not even sure why I would want to wear this for long periods of time. It needs some sort of killer app to really help me connect the dots.
We obviously just have different takes. I would not be surprised at all if you are more correct than me. But I also don't view "Apple of all companies" as infallible; this could very well flop. They are more capable than most companies in their ability to absorb the losses for an extended period of time to see if the market ever materializes.
Edit: so I did some external reading and basically agree with this take, while respecting your bullishness:
> What matters is can we find the killer app – the unstoppable use case. ..... The success of Vision Pro and Apple’s push into “spatial computing” rests with developers coming up with amazing ideas. Otherwise, it’s bloody cool, but isn’t compelling enough.
It’s hard to accept that you can only see this as a toy or trinket. No serious observer is characterizing it that way even if they are uncertain about its future.
I’m not saying Apple is infallible. I’m saying it’s silly for people who have so little experience by comparison to be confidently dismissive. It’s obvious that Apple is less fallible than random commentators. I’m not saying they can’t fail, only that if you don’t see the value, that should make you curious about why you don’t see it and why Apple does.
Wow, man, you really are hell-bent on this and how I should feel about it. IMO pretty much all consumer tech starts as a toy until some higher utility/use case is found. They’ve not initially presented one that I feel is compelling, so it’s still in the toy category. Obviously Apple sees value and wants things to mature (likely via third-party dev efforts).
It reminds me of when the first iPhone came out. I was so excited. Then I learned it didn’t have GPS. I felt like the screen and touch stuff was cool, but that location was what was needed to make it really shine. It didn’t get it until v3, and I waited. In that case, I at least knew specifically what I was waiting for. This is more of a wait-and-see type thing.
I think Apple will probably have more success with this than anyone else would. But I just don’t see it as a “want,” and certainly not as a “need,” at this point. Like I’ve said, when the price comes down or a killer app surfaces, that’s another game and I may change my opinion. I’m talking about it initially taking off as a product/category, which I see as not likely. It will be a slow, hard battle, similar to others in this space.
If you're a business with a consumer-facing app, you don't want to exclude 5% or 10% of your users. Most of the client projects I've worked on had a requirement to support iOS(-4), which is super painful from a development standpoint, and usually the number of users on a device running an iOS version four releases old is in the range of 2%. But I get that it's tough for a financial institution or a streaming media company or a telco to exclude potentially 2% of users.
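To illustrate what that pain looks like in practice: supporting old OS versions means availability checks and hand-rolled fallbacks sprinkled through the codebase. The APIs and version number below are just an illustrative example, not from any particular project:

    import SwiftUI

    // Supporting several old iOS versions usually means branching like
    // this all over the codebase; the version number is illustrative.
    struct ContentView: View {
        var body: some View {
            if #available(iOS 16.0, *) {
                // Newer layout APIs only exist on recent OS releases...
                Grid {
                    GridRow {
                        Text("New")
                        Text("Layout")
                    }
                }
            } else {
                // ...so every use needs a fallback for older devices.
                HStack {
                    Text("New")
                    Text("Layout")
                }
            }
        }
    }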
That chart doesn’t say you lose 10% of your market share.
It only says that 10% of your users can’t upgrade to the latest version of your app, which is a totally different proposition.
You say for some companies, this is unacceptable, which is obviously true.
It’s not obviously true for a lot of companies, so it’s worth questioning whether this is just a dogma that is handicapping developers and slowing progress in many cases.
It’s obviously not black and white. I’ve worked on bigger, more mature products where not giving all users access to the latest features was unacceptable. And I’ve worked on products where dropping an old version early only impacted like 2 users, and it was acceptable. And big companies where it was okay to let old versions lag because the model and API layer were super stable. Ultimately it’s a product decision, not some unquestionable dogma, and there’s no reason engineering can’t contribute to the discussion. It just turns out that the last two major versions is a good rule of thumb for most scenarios. It works very well in practice.
I agree that truthfully we're not talking about excluding those users, but only preventing those users from upgrading to the latest version of the app. But for lots of businesses those two things are viewed as effectively the same, because they believe they are adding value to the new version of the app that will result in greater user stickiness and direct or indirect revenue opportunities.
Honestly I'm thrilled anytime I'm working for a client that has a iOS(-2) rule versus an iOS(-4) rule.