Apple announces new accessibility features, including eye tracking (apple.com)
500 points by dmd 21 days ago | 286 comments



> Vehicle Motion Cues is a new experience for iPhone and iPad that can help reduce motion sickness for passengers in moving vehicles.

This excites me so, so much! I can't really use my phone as a passenger in a car without getting motion sick after 1-2 minutes. This seems like it might be a promising thing to try.


Vaguely related anecdote.

I used to get bad nausea from aggressive physics-y VR games. But I heard people claim it can be grinded through. So I did that, and they were right. I can do VR games without needing to vomit, although it’s still uncomfortable.

However… I am now much more sensitive to non VR motion sickness. :|


I played games my whole life and was shocked I had near-instant VR motion sickness in sim racing. Can confirm it can be grinded through: recognize the feelings and stop immediately.


Very similar experience. My instinct would be to fight the sickness and push through, but in reality you need to stop immediately and try again in a few hours. Your tolerance tends to build exponentially!


I have had good luck with just closing one eye. But that is very tiring to do for long periods.


A similar app that's on Android: https://play.google.com/store/apps/details?id=com.urbandroid...

Unsure if it actually works though, my personal test results are mixed.


Have you noticed any correlation between how hungry you are and how fast motion sickness kicks in?


Yes, sort of. I don’t necessarily have to feel hungry but if I’m on an empty stomach or just haven’t eaten in a while, the odds I get motion sickness are much higher.

If I’m riding somewhere to go get dinner, I have to sit in the front passenger seat. After dinner with a full belly? Throw me in the back and I’ll be fine.


I'm not sure why, but I feel like I only get motion sickness in the back of Priuses. It must be something about their braking curve.

I don't sit in enough EVs to tell if they're the same.


Some people never really learn how to use one-pedal driving, so they end up just going back and forth between accelerating and decelerating. That'll make me motion sick in a hurry, and I bet that is fairly universal (among people prone to motion sickness in cars, that is). So in that sense, any EV or hybrid is potentially a problem, depending on the driver.


Ah yes, I never get motion sick, except for when I'm in the car with just such a driver: A person with sine-foot.


This is giving me flashbacks to the time I took an Uber home from the airport in Dallas. 25 minutes of an older gentleman (65ish) just modulating between gas pedal and brake pedal the entire time. It was awful, and I wish I hadn't been such a coward at the time and had told him after he dropped me off.


Tangential and very late to the thread, but,

In the racing/track world, any time spent off the accelerator or brake is literally leaving time on the table.

The idea being: you are going at 10/10 until the exact point where you know you need to be at 4/10 for an apex, and apply full braking. Then getting right back up to 10/10.

Similarly, I notice that I do a lot of this "sine wave" driving (i.e., gas, brake, gas, brake) when I'm heavily loaded with passengers or on a curvy road where I don't quite know the ratios. So I'm often not expecting to need to brake so heavily but, for safety of course, need to.


Yea, I get it for driving on track, but not on the highway just managing space between you and the car in front.

Re: your curvy road, I would say at that point you might just be driving too fast for the load in your vehicle. When passengers are onboard, a smooth ride will beat everything 99% of the time. If I have to suddenly brake for a sharp corner, then I apologize and slow down so it doesn't happen again.


Teslas are especially bad for me. I think it’s the rough suspension and fast acceleration/deceleration


The instant-on power and braking takes some getting used to. For the folks who have trouble mastering it, my recommendation is chill mode. It has a much softer acceleration profile, mostly eliminating the harsh starts you might be experiencing.


Chill mode is all upside in my book. There's still a ton of power when you need it, it's easier on the tires (and thus your wallet), and you get jerked around less.


I suspect most people's interactions with Priuses are Uber rides. Maybe Uber drivers just pick up bad habits from the platform incentives (drive fast = get more rides).


Well, I definitely sit in the back mostly in Uber rides and that's supposed to matter somehow.


Toyota's hybrids are the worst. I never get motion sick, except as a passenger in a Toyota hybrid.


It’s really interesting you say this. Is this a known correlation? I feel like now that you mention it, it’s incredibly fast if I’m hungry.


I went on a cruise, and had significant (for me) motion sickness that only got better once I ate --- of course, I was avoiding eating because I didn't feel well, so that seems like the wrong choice.


It is a known correlation.


I regularly drive two family members around—one gets motion sick much faster and more frequently when hungry, while the other gets motion sick the same either way.

Does make me wonder what the difference is there.


I have not. For me, it does not matter. The ride begins - the motion sickness kicks in


I have motion sickness... It's so hard for me to move around, and I am still not able to find what works best for me.


I wonder what new voices will be added to VoiceOver? We blind people never, ever thought Eloquence, an old TTS engine from 20 years ago now, would ever come to iOS. And yet, here it is in iOS 17. I wouldn't be surprised to see DecTalk, or more Siri voices. More Braille features is amazing, and they even mentioned the Mac! VoiceOver for Mac is notoriously never given as much love as VoiceOver for iOS is, so most blind people still use Windows, even though they have iPhones.

I was expecting to see much better image descriptions, but they've already announced a ton of new stuff for plenty of other disabilities. Having haptic music will be awesome even for me, adding another sense to the music. There's just so much new accessibility stuff, and I can't wait to see what all is really new in VoiceOver, since there are always new things not talked about at WWDC or in release notes. I'm hoping that, one day, we get a tutorial for VoiceOver, like TalkBack on Android has, since there are so many commands, gestures, and settings that a new user never learns unless they go looking for them.


>I'm hoping that, one day, we get a tutorial for VoiceOver

Maybe it's not feasible for you but if you're ever near an Apple Store, you could definitely call them and ask whether they have an accessibility expert you could talk to. In the Barcelona Apple Store for example, there is a blind employee who is an expert at using Apple's accessibility features on all their devices. He loves explaining his tips and tricks to anyone who needs to.


My friends that use synthetic voices prefer the cleanliness of the older, familiar voices. One friend listens at about 900 WPM in skim mode, and none of the more realistic voices work well at those rates.


Every once in a while I'll hear a blind person's phone audio while I'm out and about and it sounds like an unintelligible stream of noises, but they're interacting with it and using it and getting information from it. It's amazing, a whole other world of interaction from what I experience with my phone. I kind of want to learn how they're interacting with it.


Here's a great demo by a blind software engineer that explains some of it: https://youtu.be/wKISPePFrIs?si=YOLcW9b2uyLXLn59


Back in the 90s we reverse engineered / patched the classic MacInTalk to speed up playback. Our testers had it cranked so fast as they navigated the UI that to me it sounded more like musical notes than text.


The image description stuff is already surprisingly good - I noticed when I got a photo in a text while driving, and it described it well enough for me to know what it was.


Same, a family member sent a photo while I was driving, and over CarPlay it was described fairly accurately.


It’s sometimes awesome, and often extremely basic. “Contact sent you a picture of a group of people at an airport”. Amazing. “Contact sent you a screenshot of a social media post”. Useless. We know iOS can select text in pictures, so Siri can clearly read it. It knows it’s SoMe, so why not give me the headline?


> It knows it’s SoMe, so why not give me the headline?

There's certainly a non-zero number of times this could go horribly wrong (I'm guessing CarPlay doesn't have any sense of who is in the car with you) and defaulting to not reading them out is the safest option (but they could definitely add a toggle for this, yeah.)


The same is true with reading out text messages. I’ve disabled it for CarPlay now after receiving a mildly raunchy text with a car full of colleagues. It’s still useful on the headphones though.


Only if you have apple headphones though. If you've got some other thing, for some reason, it doesn't know how to tell you anything.


I’m hoping this shows up in iOS 18.


"Wife sent you a photo of a nude woman in the shower"


This is a good time to remind everyone that tomorrow, May 16th, is Global Accessibility Awareness Day (GAAD) (https://accessibility.day), and that there are over 176 events worldwide going on to celebrate the progress we are all making at improving accessibility in our products -- and plenty of learning opportunities for beginners and experts.


Accessibility settings are really a gold mine on iOS for device customization (yes, I agree, they shouldn’t be limited to accessibility).

I’m particularly interested in the motion cues and the color filters for CarPlay - I have color filters set up to enable every night as kind of a Turbo-night shift mode (deep orange-red color shift), would love to do the same for CarPlay.

I also completely forgot iOS had a magnifier built in!


Accessibility features tend to be superpowers though, and I'm glad Apple gates them behind permissions and opt-ins. We all know of applications that try to trick the user into granting them inappropriate access to the device through the Accessibility APIs. I think Dropbox still begs you to grant it Accessibility access so its tendrils can do who-knows-what to your system.

With great power comes great responsibility.


Guaranteed that marketers are salivating at the idea of eye tracking in apps and on websites. It's an amazing feature that absolutely needs to be gatekept.


I wonder if it'll use the same architecture as visionOS, where the vision tracking events and UI affordances are processed and composited out-of-process, with the app never seeing them.


That's probably how it'll go because it's the path of least resistance. A button will already have a listener for tap, so the OS translates the vision tracking into a "tap" and triggers the relevant code. There's no point telling the app about vision tracking because apps wouldn't already have a handler for that event. And for privacy reasons, there's no need to start now.
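As a sketch of that point (my illustration, not Apple's actual implementation): the app only ever registers a normal tap handler, and a synthesized tap from gaze dwell would fire it unchanged.

    import UIKit

    // Sketch only: an ordinary tap handler. If the OS synthesizes a
    // tap when the user dwells on the button via eye tracking, this
    // exact code runs, and the app never sees any gaze data.
    let button = UIButton(type: .system)
    button.setTitle("Play", for: .normal)
    button.addAction(UIAction { _ in
        // Fires identically for finger, pointer, or gaze-dwell taps.
        print("button activated")
    }, for: .touchUpInside)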


iPadOS has hover states and pointer events; those could arguably be triggered by eye tracking.
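For reference, the hover path that already exists in SwiftUI on iPadOS looks like this; whether gaze events would ever drive it is speculation on my part:

    import SwiftUI

    // The pointer-hover API that already fires for trackpads on
    // iPadOS; a gaze-driven pointer could plausibly reuse it.
    struct HoverHighlightButton: View {
        @State private var hovering = false

        var body: some View {
            Button("Open") { print("activated") }
                .padding()
                .background(hovering ? Color.gray.opacity(0.3) : Color.clear)
                .onHover { hovering = $0 } // pointer entered/exited the view
        }
    }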


It varies. Things like keyboard control or that kind of thing, absolutely, but mostly I've used it for stuff like "don't make an animated transition every time I change pages like an overcaffeinated George Lucas" or "actually make night shift shift enough to be useful at night". I also use the background sounds to augment noise cancellation while taking a nap. All of those are just useful things or personal settings, not necessarily attack vectors.


My favorite is the "allow ANC with just one AirPod in". I have no idea why this would be an accessibility feature. If I turn on ANC, then I don't want it to be disabled just because I'm listening with one ear!


Well, they aren't really limited to accessibility, but they are hidden there. It's sort of like a convenient excuse to get UI designers off your back if you want to ship customization.


FYI you can also make a turbo night shift by scheduling toggling of the white point setting (Reduce White Point), yep, in accessibility settings


I love accessibility features because they might be the last features developed solely with the benefit of the user in mind. So many other app/os features are designed to steal your attention or gradually nerf usefulness.


I often use them to get around bad UI/UX (like using Reduce Motion), or to make devices more useful (Color Filters (red) for using at night).

Even outside of this, even able-bodied folks can be disabled due to illness, surgery, injury, etc. So it's great to see Apple continuing to support accessibility.


The red color filter for outside when trying to preserve night vision is a great tip. Some apps have this built-in but much better to have the OS change it everywhere.

Recommend creating a Shortcut to toggle this setting.


Not just for outside, but also at public gatherings like concerts! I went to a concert last month and used a red color filter to record a couple short videos without being a big distraction to the audience behind me.

Dim backlight + Red color filter can make the screen almost invisible to those around you.


I made a Shortcut that drops the white point, turns on the red filter, zeroes the audio and turns down the brightness. On wake-up it reverses these.

It’s attached to an automation for the sleep and wake focus, and it works really well. I added an explicit run requirement for wake so that I could sleep in without getting blasted in the face with white. There’s a notification with a run button which I can wait to hit until I’m truly up.

https://www.icloud.com/shortcuts/4bcfd8fc02074316aaae3503d07...


Thanks for the reminder. Wish I had remembered this feature during the aurora photography I was doing. I set the phone brightness to the lowest setting but the red filter would have helped even more.


The issue I've seen when the app itself offers a red filter is that OS-native widgets the app calls, like the keyboard, do not get filtered. The system-level accessibility feature does filter the OS widgets. I would almost rather the app's setting just enable the OS filter, but I can understand why that might not be possible.


You can also add it to the accessibility shortcut, available anywhere by triple-clicking the power button.


I think you can just add it to Control Centre, no need for shortcuts. I made an app for reading in the dark, minimising the amount of light hitting your eyeballs, but I'm still using the red color filter every night.

The app is overall darker and more "strict" with how it handles content (esp. images) though: https://untested.sonnet.io/Heart+of+Dorkness and midnight.sonnet.io

Overall, reducing the number of photons is a fun metric to play with when building something/messing with prototypes.


> Recommend creating a Shortcut to toggle this setting.

Huh?

It's not built-in?

Android (or at least Moto) has had it for years, auto-enabled on a schedule or at sunrise/sunset.


There's a built in "night shift", but that just changes the colour temperature, it doesn't make everything monochrome red.


No, that's (part of) what Color Filters is for.


iOS has also had “Night Shift” for several years. The parent is talking about a full on red colour filter, like astronomers might use.


Kinda feels like you could’ve done more of a cursory glance to see what functionality was actually being talked about before going for the “Android already does this!!” comment.


Kinda feels like you could've been more like [0] instead of being an ass, while totally missing what I was surprised about: the lack of automagic, not the feature itself.

https://news.ycombinator.com/item?id=40371449


I don't love that solid UX gets pushed under the accessibility rug, as an option you might never find.

I don't care how cynical it sounds, user experience became user exploitation a long time ago. Big Tech have been running that gimmick at too-big-to-fail scale for the last decade or so.


Let's say you're a developer at a big software company (not necessarily Apple, this happens everywhere) and you want to add a new optional setting.

The bar is pretty high. There are already hundreds of settings. Each one adds cost and complexity. So even if it's a good idea, the leadership might say "no".

Now let's say this same setting makes a big difference in usability for some people with disabilities. You just want to put it in accessibility settings. It won't clutter the rest of the UI.

You just turned your "no" into a "yes".


The iPhone has a hidden accessibility setting where you can map a double and/or triple tap of the back of your phone to a handful of actions. I use this to trigger Reachability (the feature that brings the entire UI halfway down the screen so you can reach buttons at the top) because phone screens are so damn big that I can't reach the opposite top corner with my thumb even on my 13 mini without hand gymnastics. And the normal Reachability gesture has been super unreliable to trigger ever since they got rid of the front Touch ID home button.


Double tap is reachability for me and triple tap is to make the display very dim so that at night at the lowest brightness setting, I can get it even lower. It resets after a while so even if I forget to switch it off my screen won’t stay dim for the next few days while I wonder why it’s so damn dark.


>The iPhone has a hidden accessibility setting

This isn’t “hidden”. It was even called out and demonstrated in the iOS 14 keynote.


Perhaps I could rephrase that it's hidden within Accessibility settings, not that it's an accessibility setting that is furthermore hidden.

Most people don't go into that menu to look around for things they might want to use, because features that almost everyone could benefit from sit alongside settings for people with visual and hearing impairments.


Unironically calling this feature "hidden" is why things are the way they are now. It's not hidden! You can find it if you go through the Settings app! But because it isn't in your face all day, every now and then people will talk about this "super secret feature", and then a PM somewhere has to make a nag feature to advertise it.


Accessibility benefits everyone, but in the basics you’re right. Too many simple straightforward options are now strictly inside accessibility. At least on the Apple side.

And don’t get me started on hidden command line settings.


You're getting a lot of agreement from other HN users, but I'm not sure it's fair to criticize Apple for putting these kinds of features under Accessibility.

There's nothing that inherently "locks out" people who don't have a recognized disability from exploring these features. Furthermore, most of Apple's "accessibility" features are related to Vision/Hearing/etc (and categorized as such), so I think it's reasonable to consider them accessibility features.

Clearly based on other comments here, plenty of people discover these features and find them useful.


> Too many simple straightforward options are now strictly inside accessibility

From outside, it feels like these are the only people with the freedom to improve the user experience at all. So they have to hide their work in the Accessibility preferences.


The thrust of my point, which I should have explained better, is that 'accessibility' is very tightly overlapping 'power user'.

I'm glad the configuration is there, but it's hard to discover, so in and of itself it isn't accessible to find.


> Too many simple straightforward options are now strictly inside accessibility.

<cough> Reduce Motion. Is it an accessibility feature or does it just get rid of an annoyance and is good for everyone?


> Reduce Motion. Is it an accessibility feature

Yes--some people can become ill with certain types of motion on a screen [1].

[1]: https://www.a11yproject.com/posts/understanding-vestibular-d...


Yes they can, but I don't and I still hate needless animations and turn them off. The point is, why is it in "accessibility" when it should be more visible?


It's plenty visible right where it is to people who don't have odd hangups about "accessibility".


So would people with perfect eyesight and no motion sickness discover it?

Personally I forgot about it on my latest phone because I had a fresh prescription for my glasses and didn't need to enlarge the font at the moment :)


Saves battery too along with cross fade transitions :)


I'm here to intentionally get you started on hidden CLI settings. Learn me somethin'!



Not OP, but macOS has a ton of options available with arcane commands. My favorites are the auto-hide Dock speed setting, the hidden third window-minimise animation, and the hidden system accent colours usually locked to the colourful iMacs.
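For anyone who wants to try those, these are the commonly reported incantations. They're undocumented, so the key names (especially the accent-colour ones) may not be exact and may drift between releases; use at your own risk:

    # Speed up the auto-hiding Dock animation
    defaults write com.apple.dock autohide-delay -float 0
    defaults write com.apple.dock autohide-time-modifier -float 0.4
    killall Dock   # restarts the Dock so the prefs take effect

    # The hidden third minimise animation
    defaults write com.apple.dock mineffect -string suck
    killall Dock

    # Reportedly the colourful-iMac accent colours (as reported for Big Sur)
    defaults write -g NSColorSimulateHardwareAccent -bool YES
    defaults write -g NSColorSimulatedHardwareEnclosureNumber -int 3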


> developed solely with the benefit of the user in mind

Hopefully accessibility features are never artificially segmented to higher priced devices.


Some of them are, at least on Apple's side, but it's always for a good technical reason. Screen recognition is only available on devices that have a neural chip, things that require lidar don't work on devices that don't have lidar and so on.

Google is worse at this; TalkBack multi-finger gestures used to be Pixel and Samsung exclusives for a while, even though there was no technical reason for it.

Apple has a different problem: many accessibility features aren't internationalized properly. Screen recognition still has issues on non-English systems, and so do image descriptions. VoiceOver (especially on Mac) didn't include voices for some of the less popular languages until very recently, even though Vocalizer, its underlying speech engine, has supported them for years. Siri has the same problem.


At least in the US, they kind of can't be. The disability community is pretty up front about lawsuits.


iOS 17 audio image descriptions for blind people via Image Magnifier should work on all iPhones, but do not work on iPhone SE3 and iPhone 11 Pro. Audio image descriptions do work in iPhone 12 Pro. Lidar in 12 Pro increases accuracy, but should not be mandatory. Hopefully this is a bug that can be fixed, since text descriptions were still functional on the lower-end devices.

Source: purchased devices until finding one that worked, since Apple docs indicated the feature should work on all iPhones that can run iOS 17.

Edit: audio descriptions in Magnifier are non-functional on iPad Air, working on M2 iPad Pro.


That's because the ADA has no enforcement mechanism other than lawsuits, isn't it? Our whole legal disability rights infrastructure is designed to be driven by lawsuits, and sits inert if nobody sues.


That's actually a good thing, especially if there are people specializing in bringing these lawsuits en-masse.

Over here in Europe, where there's no lawsuit culture, some laws (not necessarily accessibility-related, our legislation in that area is far weaker) are violated almost without repercussions. When the government is the only party that can bring charges and the government is complacent / ineffective, nobody actually gets charged and nothing gets done.

There's also the problem of incentives, if you can get a lot of money from a lawsuit, you have a far better incentive to sue and find a competent lawyer. You may even deliberately look for and sue violators as a moneymaking scheme, some companies in the US do this. This puts pressure on businesses to comply with the law. Even if you're piss-poor and never sue anybody, you still benefit.

If this lawsuit culture doesn't exist, the best you can do is write a report to the government and hope they actually act on it. Many people don't know how to write these, and since there's no money in it, getting a lawyer to do it for you is an expense that nobody is going to help you recoup.

The people handling these reports don't help either, they're usually 9-to-5 salaried employees who aren't judged too hard on performance, so they have far less of an incentive to actually pursue cases.


I wonder if that would be legal, at least in the US. That feels like it'd be a violation of the ADA?


It would only be a violation if it's purely software locked.

If it requires a chip that supports specific operations, and entry tier devices have an older chip, that wouldn't be a violation.


Every attention thief is absolutely thrilled at the idea of tracking your eyes. Let’s all imagine the day where the YouTube free tier pauses ads when you’re not actively looking at them.

Shit. I’m turning into one of those negative downers. I’m sorry. I’ve had too much internet today.


If this is at all like the eye tracking in Vision Pro, it is only available to the OS and apps are not given access to the data.


Until people complain that Apple is being anti-competitive by not making vision tracking open, or allowing third-party eye-tracking controls, etc. etc.


The system one is, but advertisers could always roll their own and see if they can get away with "you can only view this content if you give us permission to use your camera".


At least on iOS, I can't imagine that happening - apps are not allowed to demand you grant permissions unrelated to the actual functionality. From App Review Guidelines (5.1.1) Data Collection and Storage, (ii) Access:

> Apps must respect the user’s permission settings and not attempt to manipulate, trick, or force people to consent to unnecessary data access. For example, apps that include the ability to post photos to a social network must not also require microphone access before allowing the user to upload photos.

Lots of iOS apps today really want to mine your address book, and constantly spam you with dialogs to enable it, but they don't go as far as disabling other features until you grant them access, because they'd get rejected once someone noticed.


Fortunately apps in the EU don’t have to pass though Apple’s anticompetitive review process, so developers are free to ignore that rule if they simply distribute the app via an alternative store.

Unfortunately, poor Americans cannot taste the freedom that Europeans have to be abused by developers.


Fortunately, the Americans who create these products extract more value from the EU than they invest, or they wouldn’t bother trading with you <3


I was thinking similar (only without the snark); but then I realised this is almost certainly not compatible with GDPR.


It's not that hard to come up with ways to circumvent system restrictions; after all, advertisers are a fierce adversary and have shown many clever ways of invading users' privacy in web browsers and mobile apps. In the case of eye tracking, I could see a situation where the system feeds the "malicious" app in question a hint of which widget the user is currently gazing at. You could then just build a giant grid of invisible widgets covering the whole app window and use that to reconstruct all the eye tracking happening inside your app.


The system doesn't supply data like widget highlight states to apps for exactly that reason.


This is not about technical restrictions and finding weird legal loopholes, Apple's guidelines don't work that way. It's the spirit that matters, not the letter.


This would not pass the App Review process


Until the EU comes in and forces them to ‘open up’ the functionality in the name of ‘fairness’…


If you look at the current things the EU has forced them to open up (app distribution and NFC payments), both of them are things that Apple was already actively monetizing.

To compare this to a non-monetized accessibility feature is a bit disingenuous.

The whole notion of ‘fairness’ is that gatekeepers should allow others to compete on even footing. That can be fulfilled by granting EITHER everybody OR nobody access to the market (but not: only yourself).


for now - it is naive to think this is safe


Eye tracking has been on iOS for many years (as the Face ID option called Attention).


Avoidable via the iPhone SE3 and iPad Air, which both use Touch ID.


Wait for human attention detection to become mandatory to view DRMed content on the telescreen.


Watch the Black Mirror episode “Fifteen Million Merits”, to see how this might end up.


Fifteen Million Merits is maybe the Black Mirror episode least about the future. It reads best as entirely a comment on the society we already have.

A big clue is that the world in it doesn’t make a ton of internal sense and the episode makes absolutely no effort to smooth that over. The questions it raises and leaves open without even attempting an answer are on purpose. You’re supposed to go “god, bicycling to make little pellets? Just to buy useless trash? God, why? It’s so pointless,” or, “they truly look down on and are mean to the people who get demoted, even though they’re basically guaranteed to end up that way someday, when their health or good fortune run out? Just cruelty for no reason that’s highly likely to hit them some day, too? That’s insane!”

… because actually it’s about now (or, the year it came out), and the point is to get you to connect the dots and realize we’re doing the same crap and just don’t notice. It’s only “sci fi” as some sleight of hand to give you an alien’s-eye view of our own actual society as a way to prompt reflection. The core story of the corruption of a “revolutionary” to serve the purposes of the system is some bog-standard media studies stuff pertaining to today, not sci fi. The reveal that they could just fucking stop all this and go outside (it’s not some alien world or a post-apocalypse after all) is yet another thing that’s supposed to make you go “what’s wrong with them? Oh. Right. It’s us.”[1]

Long winded way to say: we can’t “end up” at Fifteen Million Merits future-world, because it’s just our own world we’re already in.

[1] Some read the final scene as depicting yet another viewscreen, but this is such a wildly weaker reading as far as how effective the scene is that I can’t believe it’s intended that way.


Wouldn't a lot of the companies that build in accessibility do it from a viewpoint of gaining an even wider reach and/or a better public image?

I don't see optimizing for that as bad. If they think we'll love the product more by making it better for a given audience, especially if I'm in that audience, I'm happy. Does that mean this company now gets richer? Perhaps, and that's fine by me


Funny, I was just thinking it was so that they can get more attention-economy eyeballs for ads.


This will happen. These features are always ushered in as ways to make someone's life easier, and often that is exactly what it does, for a time, before some product manager figures out how they can maximize profit with it.

Growth at all costs, I guess.


Don’t say “I guess” as if you aren’t the one making the rather baseless accusation. What other accessibility features have been abused?


Apple doesn't have product managers. (More importantly, the hardware has been technically capable of eye tracking since Face ID was added.)


Can you name a single accessibility feature where this has happened ever? Kinda seems like you just made up some fake reality.


Accessibility features stand out as user-centric developments, love that


That’s a profound and surprising insight. You’re absolutely correct.


Accessibility features can be used to steal attention too


This is why Apple is the best. They’ll make features like this because it’s the right thing to do, even if it won’t really make any difference to 99.99% of their customers.

They really earn consumer loyalty.


This was the right thing to do, but I doubt “the right thing to do” was the primary motivator for it. This is smart marketing that helps position them as the more caring, inclusive, and capable tech brand. It also expands their reach more than a fraction of a percent. 13% of Americans have some form of disability.


I feel like it also helps them get an edge in the healthcare sector, which has buckets of money in the mix.


75+% of Americans become disabled (mostly temporarily) in their lifetime. So it will affect most people someday. And everyone dies in the end.


Tim Cook in response to a shareholder proposing scrapping accessibility to improve ROI:

“When we work on making our devices accessible by the blind, I don’t consider the bloody ROI”.


That's just more "cause marketing." The first ever $3T company definitely considers the bloody ROI in every decision -- whether the CEO is willing to admit it publicly or not.


The accessibility features are also useful for automated UI testing.

>They’ll make features like this because it’s the right thing to do

While the positivity is great to see -- I'd temper the expectation that they're simply doing the right thing; they're definitely acting in their perceived interest. People can and often do the right thing -- large companies rarely do.


> People can and often do the right thing -- large companies rarely do.

There are companies which try (and succeed) to be honest "citizens". Berkshire and Costco have quite good reputations.


I read this very skeptically.

When I hear eye tracking I immediately think of advertisers targeting me based on what I look at, NOT quadriplegics using Apple devices.

Maybe I'm a cynic


I'm hoping they'll add this to macOS too. Eye tracking would be great for end-user UX research.

I'd also like to see what sort of weird stuff people come up with to use eye tracking. Games could use that in interesting ways.


Do you realize how bad Apple's accessibility issues were before? This is just marketing and I doubt many people are going to drop thousands of bucks to ditch years of tooling they're already intimately familiar with (aka, Windows), but this is an effort to try and entice some people. That's all it is. Marketing.


I have better than 20/20 vision (yes, really) and now mobility problems, but there are some macOS accessibility features that I love.

One is Zoom: I hold down two modifier keys and scroll, and can instantly zoom in to any part of the screen. It is extremely smooth, the more you scroll the higher you zoom, instantly and with high frame rate. Great when you want to show someone something, or just zoom into a detail of something if the app doesn’t support it or is too cumbersome. Or “pseudo-fullscreen” something.

The other one is three finger drag on the trackpad. Why that isn’t default is beyond me. It means that if you use three fingers, any dragging on the trackpad behaves as if you held the button down. It’s so convenient.


The default way to drag on a trackpad is something I’ve never gotten good with so I always enable drag lock the second I get a new laptop. Ideally I would switch to the three finger gesture but after 15 years of drag lock I just can’t get my brain to switch over.


It's better to have an easy way to hold a mouse button via keyboard with your left hand and continue to use one finger on the touchpad rather than do the whole three fingers to drag


Not for me, no.

But sounds like something accessibility options may also provide, and I can see how it may be better for a lot of people.


Which one have you tried?


Honestly? Only three finger drag. But just earlier I was holding coffee in my left hand while using the trackpad with my right, and was glad I had three finger drag. I like doing "basic" mouse operations with only one hand (including right click: two fingers), and three finger drag works extremely well for me. I barely even think about it; it's automatic and an easy gesture for me.

In turn, I'm interested in why you think your way is (apparently universally) better?


Yeah, that's what I thought - you haven't used it before confidently rejecting it.

The other way is better mostly because it doesn't interrupt the drag when you move fingers up (e.g., when you've started the drag right at the right touchpad edge and don't have enough space), though there is an invisible timer hack that tries to address that. And since 3 fingers take more space on the touchpad, you're losing range.

> coffee with the left hand

Still possible with the keyboard, though loses on convenience, likely better to dedicate some area of a touchpad for a touch+move to drag

> with only one hand (including right click: two fingers),

This doesn't make sense, literal right click is 1 finger of 1 hand, same as bottom-right corner tap

And it's not universally better, don't make it up, just better than what I've explicitly mentioned


> > coffee with the left hand

> Still possible with the keyboard

While drinking it? When using the mouse, I’m often relaxed, sometimes for example having my forearm lying in front of the keyboard.

> > with only one hand (including right click: two fingers),

> This doesn't make sense, literal right click is 1 finger of 1 hand, same as bottom-right corner tap

What doesn’t make sense? Right click is two finger tap for me, which is super common. I don’t see the downside either.

Your way has advantages, my way has advantages, so far I don’t see a strict ordering that applies to everyone or even every situation.


> sometimes for example having my forearm lying in front of the keyboard

But you can still reach space or the right command key with your middle/ring finger? That would allow you to use 1 finger to drag, not worry about lifting your dragging finger while dragging, and get more range

> What doesn’t make sense?

You listing that gesture as some kind of alternative while the default is already 1-handed

> I don’t see the downside either.

The downside is that scrolling with two fingers might misfire when you change your mind and cancel scrolling right away, so place 2 fingers and lift them up, thus triggering a right click

> so far I don’t see a strict ordering that applies to everyone or even every situation.

And you'll never see it with your eyes closed. You need to actually try this stuff to be able to judge it: to see whether you drink coffee often enough (as a % of your computer time) to make the worse gesture better because it frees up your hand and the other one-hand alternative is not as convenient, or whether the mistakes of breaking a drag gesture on a bunch of files/objects in a graphics app, leaving them in the wrong place, are bad enough.


three finger dragger for life here. whenever I see colleagues struggling to drag items on a Mac, I show them three finger dragging and it blows them away. total game-changer!


> and now mobility problems

Meant to write “no mobility problems”, but too late to edit now.


If accurate enough it seems like eye tracking could be useful beyond accessibility to eg control iPads without dragging one’s greasy fingers all over the screen. iPad already supports pointers.


I was thinking the same, it would make having an iPad under my monitor much less cumbersome


macOS has had a version of eye tracking for a while, it's really fun to try out.

System preferences -> Accessibility -> Pointer Control

Then turn on the "Head pointer" option.


Cool. I'm a bit unsettled that my camera's green dot didn't turn on for it though.


visionOS has this too. I mapped it to a triple click of the button for when eye tracking becomes inaccurate.


Cool! That works surprisingly well. But how do you click while using this? Clicking on the trackpad doesn't work, when it's tracking my head.


That's a separate option. The option above head tracking in those settings allows for adding your facial expressions (smiling, blinking etc) as shortcuts for clicks.


Eye tracking coupled with the show grid feature would seem like using a computer the way that people do in movies https://www.youtube.com/watch?v=UxigSW9MbY8


It's basically how the Apple Vision Pro mainly works.


I definitely get a good amount of motion sickness when using my phone in a car, so I'm super interested in the motion sickness cues and whether they'll work. The dots look like they may get in the way a bit, but I'm willing to take that tradeoff. My current car motion sickness mitigation system is these glasses that have liquid in them that supposedly help your ears feel the motion of the car better (and make you look like Harry Potter).


I wonder if Vision Pro has enough microphones to do the acoustic camera thing? If so you could plausibly get "speech bubbles over peoples heads", accurately identifying who said what.

I imagine that could be pretty awesome for deaf people.


Deaf people I know certainly wouldn't want to be wearing a headset, say, in a restaurant. But a phone or tablet app that can do live speech-to-text would be good enough in many cases if it can separate voices into streams. Anything like this available?


I don't want to say it's certainly not available, but I doubt it. Maybe someone has done something trying to recognize different voices by differences in pitch/volume/...

The Vision Pro has six mics, which likely enables it to locate the sources of sound in space by the amount of time it takes a sound to reach each mic (i.e. acting as an "acoustic camera"). Tablets and phones aren't going to have that hardware, unfortunately.
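Back-of-the-envelope for the curious, with all the numbers below being illustrative assumptions rather than Vision Pro specifics: with just two mics at a known spacing, the arrival-time difference alone gives you a bearing.

    import Foundation

    // Far-field bearing from a time difference of arrival (TDOA).
    // Spacing and delay values are assumptions for illustration only.
    let speedOfSound = 343.0 // m/s in air
    let micSpacing = 0.15    // metres between the two mics (assumed)

    // delay > 0 means the sound reached the first mic earlier.
    func bearing(fromDelay delay: Double) -> Double? {
        let ratio = delay * speedOfSound / micSpacing
        guard abs(ratio) <= 1 else { return nil } // physically impossible delay
        return asin(ratio) * 180 / .pi // degrees off the array's centreline
    }

    if let angle = bearing(fromDelay: 0.0002) {
        print("roughly \(angle) degrees off-centre") // about 27 degrees
    }

With six mics you get many such pairs, which is what would let you pin a speaker down in 2D/3D rather than just to an angle.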

And yeah, obviously wearing a headset is a pretty big compromise that's not always (or maybe even often) going to be worth it. This was more of a thought of "the hardware can probably do this cool thing for people already using it" than "people should use the hardware so it can do the cool thing".


Google yesterday also open-sourced their accessibility feature for Android and Windows that controls the cursor using head movements and facial gestures:

https://github.com/google/project-gameface


I'm a believer in accessibility features. The difficulty is often in testing.

I use SimDaltonism to test for color-blindness accessibility, and, in the last app I wrote, I added a "long press help" feature that responds to long-presses on items by opening a popover containing the label and hint. Makes testing much easier, and doubles as user help.
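In case it's useful to anyone, the core of that pattern is tiny. A minimal UIKit sketch (names hypothetical, and an alert standing in for my popover):

    import UIKit

    // Minimal sketch: reuse the accessibilityLabel/Hint already written
    // for VoiceOver as on-demand help, shown on a long press.
    extension UIViewController {
        func installLongPressHelp(on view: UIView) {
            view.isUserInteractionEnabled = true
            let press = UILongPressGestureRecognizer(
                target: self, action: #selector(showAccessibilityHelp(_:)))
            view.addGestureRecognizer(press)
        }

        @objc func showAccessibilityHelp(_ gesture: UILongPressGestureRecognizer) {
            guard gesture.state == .began, let view = gesture.view else { return }
            let alert = UIAlertController(
                title: view.accessibilityLabel ?? "No label",
                message: view.accessibilityHint ?? "",
                preferredStyle: .alert) // an alert standing in for the popover
            alert.addAction(UIAlertAction(title: "OK", style: .default))
            present(alert, animated: true)
        }
    }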


Accessibility is for everyone, including you, if you live long enough. And the alternative is worse. So your choice is death or you are going to use accessibility features. – Siracusa


I aimed for the upvote button but they’re so tiny that my fat finger hit the downvote button by accident and then I had to retry the action. This is what people mean by accessibility is for everyone all of the time.


I have a tremor and I run into this issue on HN all the time. I need to zoom in a lot to be sure I'll hit it.


I don’t (yet) have accessibility challenges beyond glasses but hitting the tiny arrows is incredibly difficult. How come HN doesn’t update to be more accessible? It’s been a long time… I’m surprised it hasn’t been talked about by the team there.


As someone who has carried out accessibility audits, I can unfortunately attest to this topic being a blindspot in tech circles. I remember hanging out with fairly senior frontend devs from a FAANG company who didn't know what purpose skip links served on websites. It can also be an uphill battle to advocate for remediation once design and development work is already baked in.


Yep. And I think there's an interesting implicit bias where younger / more junior developers often get tasked with things like that, so they see no problem.


Do you think Paul Graham or Garry Tan give a shit about accessibility?


No.


Try https://www.modernhn.com if you haven't already. UI elements have more spacing around them, especially if zoomed in.


Hadn’t heard of this before, it looks great! Need it on mobile though, and would be happy to pay a reasonable fee for it.


You can install the addon on mobile firefox.

Ironically because of addons I now use firefox exclusively on mobile.


Same here with essential tremor. Rarely do people think of us.


I've hidden and reported so many posts on the front page.

I can only hope that the algorithms take into account that there's a decent chance someone trying to hit a link or access the comments will accidentally report a post instead.


Every once in a while I wonder where a post disappeared to. Eventually I worked out I was accidentally hiding them, and found dozens of hidden posts I was interested in when I looked at my profile.

I still do it all the time. It's a problem on both desktop and mobile. I've sent mail about it to dang before and did get a response; it's definitely a known issue.


This may be an unpopular opinion but I'm just happy zoom works as expected on this site and don't mind if once in a while I have to correct a misvote.

I appreciate the information density, and that HN hasn't adopted some "contemporary" UI that's all whitespace and animations.

(And yes I agree the buttons are awkward)


Zoom the website in; the browser has accessibility built in, and Hacker News zooms in fairly well.

Edit: I seem to be missing something, as this is getting downvoted. I genuinely cannot use HN under 150% zoom, so I thought this was a basic comment to make.


Accessibility isn't just about possibility, it's about ergonomics.

You could integrate a differential equation by lining rocks up in a large desert as a computer, but you wouldn't say that solution is "accessible" to a human in the same way it would be with a CPU, a monitor, and functioning sight.


Ah, so people are annoyed that the zoom isn't appropriate by default, I get it. Thanks I was getting extremely confused. That being said I use different zoom levels for different websites, and the browser remembers them for me, I like how the feature works right now, and I have loads of myopia.

If people made it "big enough" to be inclusive by default in some websites I'd have to zoom out as well. So my point is that to me is more important to zoom in correctly (many websites don't), than be "big enough" to start with.


I mean, even at 175% zoom, the vote arrows are pretty close together on a touchscreen. And I’m barely 30.

I always end up double-checking whether the ‘unvote’ link is present. If it says ‘undown’, I know I’ve made a mistake (or vice versa).


I agree about zoom levels on different websites when on desktop. Zoom works differently on mobile vs desktop which is its own challenge as far as ergonomic use. On mobile, to get the up/downvote buttons to the size I want them, I'd have to scroll side-to-side to read comments.


> I seem to be missing something as this is getting downvoted.

I wouldn't downvote you, but I think those who did have read your comment as saying that it (hn design) is fine and they disagree.

(I'm not saying that's what you've said)



By any chance, do you know where he said the above quote? I tried searching online and could not find it.


By any chance I know, because I clipped it in Overcast.

It was on Accidental Tech Podcast #415, 26:19. The topic starts at 25:40

https://atp.fm/415


Music haptics can be a cool way to teach someone how to dance and “feel the beat”


I'm severely hearing impaired and enjoy going to dance classes - swing, salsa, etc. If I'm standing still, I can easily tune into the beat. But once I start moving, I quickly lose it on many songs; dance studios aren't known for having large sound systems with substantial bass. I don't know that this specific setup would fix anything -- it would need some way of syncing to the instructor's iPhone that is connected via bluetooth to the studio's little portable speaker. But it's a step in the right direction.

While on the topic, I can hear music but almost never understand lyrics; at best I might catch the key chorus phrase (example: the words "born in the USA" are literally the only words I understand in that song).

A few months ago I discovered the "karaoke" feature on Apple Music, in which it displays the lyrics in time with the music. This has been game changing for me. I'm catching up on decades' worth of music where I never had any idea what the lyrics were (filthy, that's what they are. So many songs about sex!). It has made exercising on the treadmill, elliptical, etc actually enjoyable.


> A few months ago I discovered the "karaoke" feature on Apple Music, in which it displays the lyrics in time with the music.

JSYK Spotify has had this for years too; it's just under the "View Lyrics" button, but it does highlight the sentence/word karaoke style. It used to be a "Spotify app" back when the service had that genre of add-on.


That doesn't surprise me too much. I never really used Spotify or any music app much in the past. This discovery was a convergence of things - switching from Android to iPhone, a renewed exercise routine, and most importantly a new iPhone and hearing aid combo that allowed the iPhone to stream directly to my hearing aid, removing the need for bulky headphones. Phone calls are also streamed directly to hearing aid. Game changing!


I had a Deaf friend who had dual cochlear implants about 10 years ago. I was blown away to learn they had Bluetooth at the time and she could beam music directly into her brain.


There's a lot of interesting things we can do with haptics since they're relatively cheap to put in stuff. Hopefully accessibility gets the software and applications further along soon


Using haptics in music to enhance rhythm perception and dance skills. Sounds really cool!


I am excited for Vocal Cues - my main frustration with Siri is how poorly it comprehends even explicit instructions.

One thing I wish Apple would implement is some kind of gesture control. The camera can detect fine details of face movement, it would be nice if that were leveraged to track hands that aren't touching the screen.

For example, if I have my iPhone or iPad on the desk in front of me, and a push notification that I don't need obstructs the content on the screen, I would love to be able to swipe my hand up towards the phone to dismiss it.


iOS 17 Image Descriptions are quite good, but audio descriptions don't seem to work on non-Pro devices, even though text descriptions are being shown on the screen and the audio menu is present and activated. Is that a bug?

Even on Pro devices, audio image descriptions stop working after a few cycles of switching between Image Magnifier and apps/home. This can be fixed by restarting the app and disabling/enabling audio image descriptions, but that breaks the use case when an iPhone is dedicated to running only Image Magnifier + audio descriptions, via remote MDM with no way for the local blind user to restart the app.

On-device iOS image descriptions could be improved if the user could help train the local image recognition by annotating photos or videos with text descriptions. For a blind person, this would enable locally-specific audio descriptions like "bedroom door", "kitchen fridge" or specific food dishes.

Are there other iOS or Android AR apps which offer audio descriptions of live video from the camera?


All these features look amazing! That car motion sickness feature especially. Can’t wait to try it!


At least I used to put the phone away while I was in the car. Not the case now. Thank you, Apple, but I'd rather be sick while looking at your phone in a car.


This is one major advantage to Apple sharing a foundation across all their devices. Vision Pro introduced eye tracking to their systems as a new input modality, and now it trickles down to their other platforms.

I am surprised CarPlay didn’t have voice control before this though.


Is this really likely to be downstream of the Vision Pro implementation? I would think that eye-tracking with specialized hardware at a fixed position very close to the eye is very different to doing it at a distance with a general purpose front facing camera.


Typically eye-trackers work by illuminating the eyes with near-infrared light and using infrared cameras. This creates a higher-contrast image of the pupils, etc. I assume Apple is doing this in the Vision Pro. Eye-tracking can also be done with just visible light, though. Apple has the benefit of knowing where all the user interface elements are on screen, so eye-tracking on the iPhone or iPad doesn't need to be high precision. Knowledge of the position of the items can help reduce the uncertainty of what is being fixated on.
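The "snap to what you know is on screen" trick is simple enough to sketch. This is pure speculation about Apple's pipeline; the names and error radius below are made up for illustration:

    import CoreGraphics

    // Speculative sketch: snap a noisy gaze estimate to the nearest
    // interactive element, using knowledge of on-screen targets to
    // compensate for limited tracker precision.
    struct GazeTarget {
        let id: String
        let frame: CGRect
    }

    func snapGaze(_ gaze: CGPoint, to targets: [GazeTarget],
                  maxError: CGFloat = 60) -> GazeTarget? {
        targets
            .map { target -> (GazeTarget, CGFloat) in
                let dx = gaze.x - target.frame.midX
                let dy = gaze.y - target.frame.midY
                return (target, (dx * dx + dy * dy).squareRoot())
            }
            .filter { $0.1 <= maxError } // too far from anything: no fixation
            .min { $0.1 < $1.1 }?.0      // otherwise, the closest target wins
    }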


So there isn’t much more to it than getting a good resolution image of the eye from any distance, every millisecond.

Precision is the issue: we are mostly moving our eyes in about 8 directions, and there's no fine precision because we don't know how to measure the focusing of our eye lens with a camera yet (unless that too is just a matter of getting a picture).

Squinting would be the closest thing to physically expressing focusing. So the camera needs to know I’m looking left with my eye, followed by a squint to achieve precision. Seems stressful though.

Gonna need AI just to do noise cancelling of the involuntary things your eyes do, like pupil dilation and blinking.


Not the hardware side, but the software side. Implementing all the various behaviours and security models for the eye tracking and baking them into SwiftUI means that it translates over more easily once they figure out the hardware aspect. But the iPads and iPhones with Face ID have had eye tracking capabilities for a while, just not used in the UI.


The 'tracking eyes' part is different, but once you have eye position data, the 'how the eyes interact with the interface' could be very similar.


It would be amazing if it gets carried over to the Mac.


CarPlay devices aren't really powerful.


CarPlay devices (car components) are essentially playing a streaming video of a hidden display generated by the phone. CarPlay also lets those devices send back touch events to trigger buttons and other interactions. Very little processing is done on the vehicle.

BTW if you are plugged in to CarPlay and take a screen shot, it will include the hidden CarPlay screen.


CarPlay is rendered by the phone itself, so it's not strictly a function of how powerful the car infotainment is. You've been able to talk to Siri since the beginning of CarPlay so additional voice control is really just an accessibility thing


Some cars already have a voice control button on the wheel for their existing system which, if done correctly, is overridden by Siri+CarPlay. Which is really nice when it works.


I’m able to control carplay by just saying “Hey Siri.” Siri’s abilities tend to fluctuate based on what Apple is doing on the server, the phase of the moon and whether I remembered to sacrifice a chicken that morning, but otherwise, it seems to work fine.


My wife is a hospice nurse and from time to time she'll have a patient without any ability to communicate except their eyes (think ALS) - for these folks in their final days/weeks of life this will be a godsend. There are specialized eye-tracking devices, but they're expensive and good luck getting them approved by insurance in time for the folks in need near the end of their lives.


Eye Gaze devices (tablet with camera + software) cost around $20K; even if it offers 1/4 of the features, this is good news for those who can't afford one.


Don't be ridiculous. Solid hardware and software combos for windows cost a small fraction of that. The convenient and decent Tobii PCEye costs like $1,250 and a very nice TMS5 mini is under $2,000. Your bullshit was off by at least an order of magnitude.


Let's be fair and compare similar products. Do you have any examples of $2000 mobile devices that support eye tracking on the OS level? The products you mention look like they're extra hardware you strap to a Windows PC. Certainly useful, but not quite as potentially life-changing as having that built into your phone.


I wonder if this announcement had anything to do with the bombshells OpenAI and Google dropped this week. Couldn’t this have been part of WWDC next month?


Tomorrow (today in some timezones) is Global Accessibility Awareness Day: https://en.m.wikipedia.org/wiki/Global_Accessibility_Awarene...


As Terramex pointed out this is tied to a particular, relevant event.

It’s also pretty common for Apple to preannounce some smaller features that are too specialized to be featured in the WWDC announcements. This gives them some attention when they would otherwise be lost and buried in WWDC footnotes.

It is also probably an indication that WWDC will be full of new features and only the most impactful will be part of the keynote.


They do it every year at the same time. Also, it’s a small announcement, not a keynote or the kind of fanfare we have at WWDC or the September events. This does not seem calibrated to be effective in an advertising war with another company. All this to say, probably not.


I think it's more a matter of "clearing the decks" for stuff that didn't make the cut for WWDC. I assume WWDC is going to be all about AI and they couldn't find a good spot for this announcement. "Clearing the decks" isn't a very kind way to describe this accessibility tech, since Apple has long been better than almost everyone else at accessibility. I don't read this as "we don't care, just announce it early" so much as "we can't fit this in, so let's announce it early".


As noted elsewhere, Apple always does their accessibility announcements in advance of WWDC.


Can't wait for my son to try this. He has general coordination issues due to a brainstem injury, but his eyes are probably the part of his body he can best control. I'm not a fan of Apple's software and didn't have a great experience with the Vision Pro, but I am excited to try this out.


Haptics in the Music app will be great, but it’s not exactly “new”, considering my haptic music app has been out for several years now: https://apps.apple.com/us/app/phazr/id1472113449
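
A minimal Core Haptics sketch of the general technique (a transient "thump" per beat); this is not necessarily how Apple's Music-app feature or any particular app actually implements it:

    import CoreHaptics

    // Minimal sketch: schedule a transient haptic "thump" on each beat.
    func playBeats(bpm: Double, beats: Int) throws {
        let engine = try CHHapticEngine()
        try engine.start()

        let interval = 60.0 / bpm
        let events = (0..<beats).map { i in
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.4),
                ],
                relativeTime: Double(i) * interval
            )
        }
        let pattern = try CHHapticPattern(events: events, parameters: [])
        let player = try engine.makePlayer(with: pattern)
        try player.start(atTime: CHHapticTimeImmediate)
    }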


One of the most important accessibility features they could bring back is a physical home button, on at least one iPad.

I am completely serious. I work with a lot of older adults, many of whom have cognitive challenges, and the lack of a physical button to press as an escape hatch when they get confused is the #1 stumbling block by a mile.


When there were physical buttons, it was very popular in Asia to enable the accessibility option that puts a virtual one on screen instead, because they were afraid the physical one would break. So it ended up being kind of useless anyway.


> because they were afraid the physical one would break

On early models it was actually quite common for the button to stop working after a year or two. The durability has since improved, but habits die hard.


I have a pet theory that certain older folks are cautious with technology because they grew up with equipment that would break, expensively, if you pressed the buttons in the wrong order.

Then when they see younger people just randomly explore-clicking to get things working, because their experience is that it's a safe tactic, it can get misinterpreted as expertise, leading to: "Wow, kids these days just know technology."


It actually improved by no longer being a real button: instead it became a fake indent with a pressure sensor that simulated the "click" haptically. But it still took up a lot of space on the front.


It’s not physical, but if the problem is that they need something visible rather than a gesture, you can put an always-on menu button on the screen (AssistiveTouch) that includes a home button.

https://support.apple.com/en-sg/111794


I agree. I have a last gen base iPad with Touch ID and a home button. I am pretty tech savvy but actually prefer this form factor.


Touch ID is so drastically superior to Face ID in so many common “iPad scenarios”, e.g. lying in bed with your face partially obscured.

I don’t understand Apple’s intense internal focus on Face ID only. Face ID with Touch ID as a fallback, for when the device is flat on a table or your face is obscured, would be so much nicer.


I actually can’t use Touch ID, at least year-round.

I have naturally very dry skin, and no matter how much moisturizer I used in colder months, I had to constantly have it relearn my fingerprint, to the point where I would just give up and use a passcode instead.

Face ID, on the other hand, has been flawless since my first phone with it in 2018 (minus some mask-related challenges during the pandemic). It and ProMotion are basically the two reasons I bought an iPad Pro over an iPad Air.


My mother had the same issue. TouchID would stop recognizing her finger less than a day after setting it up.


> I don’t understand Apple’s intense internal focus on Face ID only.

It isn't. The new iPad Air, announced just a few days ago, has Touch ID.

Only the iPad Pro uses Face ID. For iPad Pro users who use it for work and unlock it hundreds of times a day, in an office or during a commute, Face ID is vastly superior.


Face ID builds profitable user-habit loops: more real estate on the screen to show things, and it's easier to thoughtlessly purchase things when your password is a glance at the camera.

I don't think this is a user-focused decision; I believe it's a profit-focused one.


You can't buy things just by looking at the camera; there are forced button presses, and the App Store tends to make you enter your password anyway.


While it's a significant step forward for accessibility, it also invites us to consider how such technologies could become everyday features for all users. That could improve ease of use and efficiency, but it also demands careful privacy safeguards.


This is the beginning of the end for the mouse: not just on phones, but on desktops, everywhere. I highly recommend that everyone at least schedule a free Apple Vision Pro demo at their local Apple Store.


This is awesome and I love the UX, although I can't help but feel a bit sad that we always need to rely on Apple and Microsoft for consumer accessibility.

It would be so great if more resources could be allocated for such things within the Linux ecosystem.


Bummer the eye tracking is iOS only. I’ve been wanting focus-follows-eyes for decades.


I share the sentiment. I've often found myself with grids of terminal windows where focus-follows-eyes would be so much faster and lower-friction than hotkeys or focus-follows-mouse.
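
A rough sketch of the idea, assuming some gaze source existed (there is no public eye-tracking API on macOS today); the dwell timer keeps a quick glance across the screen from thrashing focus:

    import Foundation
    import CoreGraphics

    struct Window { let id: Int; let frame: CGRect }

    // Only switch focus once gaze has rested on a window for a short dwell.
    final class DwellFocus {
        private var candidate: Int?
        private var since = Date()
        let dwell: TimeInterval = 0.25

        func update(gaze: CGPoint, windows: [Window], focus: (Int) -> Void) {
            guard let w = windows.first(where: { $0.frame.contains(gaze) }) else { return }
            if w.id != candidate {
                candidate = w.id   // gaze moved to a new window; restart the timer
                since = Date()
                return
            }
            if Date().timeIntervalSince(since) >= dwell {
                focus(w.id)        // dwell satisfied: focus this window
            }
        }
    }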


My first thought upon seeing the Haptic Music feature is to wonder how long until they make compatible headphones and I can relive my high school years, walking around listening to nu-metal and hip-hop with a Panasonic Shockwave walkman.


Surely this could be a headphone feature that works without explicit support from the player.


It's funny how the post is about enhanced surveillance technology and the text sentiment of the comment thread is overwhelmingly positive.

Surveillance technology: >:(

Surveillance technology with the word "accessibility" in the title: :)


of course it’s surveillance technology.

an iphone constantly broadcasts your location to third parties, can deduce where you work and live, understands the unique shape of your face, has a built-in microphone and multiple kinds of cameras, stores all of your conversations, stores all of your photos, stores your health information, can figure out when you go to bed.. all on a completely closed and proprietary operating system.

it’s like asking “why hasn’t anyone mentioned that we’re all using a website right now”


Very curious to see how well eye tracking works behind a motorcycle visor. Thick leather gloves, bike noise, and a touch screen and audio interface are not much fun.


Gotta make sure they got all the eyeballs, then make sure they don't get to date anybody and become homeless.


I hope this works for people with one eye. There are dozens of us—dozens!


Didn’t Mark Rober say he worked on some motion sickness stuff at Apple?


Nice to see features from Vision Pro make it onto other Apple products


This will have just enough capability limits and annoyances to convince you to buy a Vision Pro. It also makes eye tracking more widely accepted by putting it on everyone's existing devices.


while we're on the topic of accessibility, I'd like to point out the following lovely facts:

* on Android, in the built-in popup menu for copypasta, the COPY button is wedged in tightly (by mere millimeters) between the CUT and PASTE buttons on either side of it: a harmless action packed between two destructive, lossy, and usually irreversible ones.

* the size of an adult human fingertip is no secret (and in fact 99.99% of humans have at least one!)

* Google's staff consists of humans

* Google supposedly hires ONLY "smart" people, and their interview process involves IQ-like tests

* Android has had this obvious anti-pattern for many years now, perhaps 10+.


Reminder: eye tracking means an always-on camera.

I'm not fully against it, but it's good to keep the implications in mind.


They keep announcing these bombastic accessibility features while simple things like tabbing remain frustratingly broken. The macOS “allow this app to access that” dialog supports shift+tab, but not tab.


https://support.apple.com/guide/mac-help/use-your-keyboard-l... - Keyboard navigation

https://support.apple.com/guide/mac-help/navigate-your-mac-u... - Full Keyboard Access (accessibility feature, goes beyond just tabbing between elements)

It's annoying that tabbing between UI elements is off by default on macOS. It's one of the first things I turn on with a new Mac.

