
The platform is AirPods, Apple Watch, and the yet-to-be-announced AR product.

Just like with the M1, if you squint, Apple is testing and iterating in the open. Spatial audio is a good example, and so is the Watch’s auto-detected hand-washing countdown.

There are public sprinkles of this coming platform elsewhere, such as the Rings widget in the Apple Fitness workout HUD. Proximity-based handoff is another.

The author is correct that Siri is not a good platform but for reasons they do not identify.

A voice-based interaction model is weak from a UX perspective. For Apple it is even weaker, because the company is unable to use any of the unique advantages it holds over its competitors.

For example: Apple’s array of services, control over the technology stack, reliable and secure intra-device communication, the App Store, and the iPhone as a unified configurator, access point, update manager, and biometric authenticator.

You can’t pull that stuff out of a hat.

Siri sucks. I have a few HomePods, use plenty of HomeKit, and try to get the most out of it. But it is bad at almost everything it sets out to do.

Siri is clearly not a focus for the company; if anything, it sent competitors scrambling to own a space Apple doesn’t even want.

The interaction model includes physical hardware, like the digital crown on the new AirPods Max, but I suspect it will be based largely on eye movement. Something not too twitchy.

The enormous amount of sensor data from the Watch and AirPods is like the GPS and gyroscope on the iPhone: apps can require it or not.
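Today’s iPhone equivalent of that opt-in model is the `UIRequiredDeviceCapabilities` key in an app’s Info.plist, which declares the sensors an app cannot run without; leaving the key out makes them optional. A minimal sketch (the capability strings `gps` and `gyroscope` are real values from Apple’s documentation; the comment describes standard App Store behavior):

```xml
<!-- Info.plist fragment: the App Store hides the app from devices
     that lack these capabilities. Omit the key entirely and instead
     probe at runtime to treat the sensors as optional. -->
<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>gps</string>
    <string>gyroscope</string>
</array>
```

One could imagine a future Watch/AirPods platform exposing its sensor streams behind the same kind of declared-capability gate.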

So I think the author is right that AirPods are important but they are not the center. They are a component of the next platform.

In my view, that’s a mistake. Apple’s disinvestment in AI is why I typically pick the flagship Google phone over the iPhone nowadays.
