Blind Apple engineer is transforming the tech world at only 22 (mashable.com)
397 points by davidbarker on Sept 4, 2016 | 111 comments

There's a lot to gripe about with Apple, but not when it comes to accessibility. I am quadriplegic, and usually that means putting up with some terrible subset of functions for any given application, tablet, phone or whatever, but not with Apple. They've made it so that anything an able-bodied person can do with an iPhone, I can do with their Switch Control interface, which I use connected to the chin controller on my wheelchair. It's all done via a little Bluetooth bridge called a Tecla Shield and it's absolutely marvellous.

There really isn't another company in the world, certainly not another phone manufacturer, that has thought about people with motor skill problems the way Apple has. When people think of accessibility they think of the blind and the deaf; people with motor function problems are definitely the red-headed stepchildren of the accessibility software world.

Apple, pay your fng taxes but thank you very very very much for the accessibility software, my life would be much much poorer intellectually without it.

Microsoft also deserves applause in the accessibility stakes.

My cousin's child is quadriplegic with cerebral palsy, and a lot of his eye-controlled tech runs on Windows.

I'm going to have to disagree with this quite a lot: your cousin's child is using eye control tech running on Windows, not built into Windows.

That eye control tech exists for every platform, but OS X is the only OS that you can take out of the box as a quadriplegic and, with minimal help from a Designated Pair of Hands™, be up and running and independently messing around on HN.

Believe me, if I could buy a cheap laptop running Windows or, preferably, Linux and get the same user experience I get from OS X, I would do it in a heartbeat. But in the past decade of trying, nothing has come close to OS X, and even then OS X has only been this accessible for the past two or three editions.

All Microsoft did in the instance you're talking about is make sure that the drivers worked, and eye trackers are essentially just mice as far as the computer is concerned; all of the clever stuff is done inside the eye tracker itself.

Just my two pence worth. But you know, It's 2 pence that I've earned through a decade of being a quadriplegic geek getting very annoyed at not being able to use my computer like everybody else.

I long for the day that I could use voice dictation software on Linux, any Linux distribution would be fine, then I could switch to some nice open source software and I would be a very happy pussycat. But none exists as far as I'm aware.

Edited to add: for clarity and excuse the odd grammatical mistake, Dragon doesn't make spelling mistakes but it does put the wrong word in the wrong place sometimes.

Microsoft has good built-in automation capabilities. Almost any UI element can be accessed via the Win32 API (SetWindowsHookEx, https://msdn.microsoft.com/en-us/library/ms644990(VS.85).asp... etc.)

IMHO this is much better for overall scriptability than AppleScript.

And StickyKeys etc. are valiant attempts at a complex area with so many different user types.

You and I have different definitions of 'built in'. MS has chosen to have third parties deal with much of the accessibility functionality, which is a reasonable choice for them, but it's not directly a Windows feature. Want a screen reader? Use JAWS, etc.

OS X has the same APIs (Accessibility). AppleScript doesn't use them, but VoiceOver and Xcode's UI testing do.

AppleScript can and does use the accessibility APIs to drive UIs, it's just not very well documented (like much of AppleScript in general :( ).

Pretty much anything used to drive a UI on OS X is going through the accessibility APIs to do so.

AppleScript has been the absolute saviour for me on multiple occasions, the amount of scripts I have that trigger at the click of one button is getting to be a bit outrageous.

So, for instance, I have a button on my Switch Control on-screen keyboard, and when it's pressed it sends a request to If This Then That, which communicates with my LIFX lightbulbs and flashes all of them in the house blue, notifying my carer that I need assistance. That's just one of many. I love AppleScript.
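For anyone curious how a trigger like that hangs together, here's a minimal Python sketch, assuming the standard IFTTT Webhooks (Maker) URL format; the event name and key below are hypothetical placeholders, not anything from the commenter's setup.

```python
# Build the IFTTT Webhooks URL that fires an applet (e.g. flash the LIFX
# bulbs blue). The key and event name here are made-up placeholders.
from urllib.parse import quote

IFTTT_KEY = "my-secret-key"  # hypothetical; your real key comes from ifttt.com

def ifttt_trigger_url(event, key=IFTTT_KEY):
    """Return the Webhooks URL that triggers the named IFTTT event."""
    return f"https://maker.ifttt.com/trigger/{quote(event)}/with/key/{quote(key)}"

# An on-screen switch-control button (or an AppleScript via `do shell script`
# and curl) would simply GET or POST this URL to fire the applet:
print(ifttt_trigger_url("carer_assist"))
# https://maker.ifttt.com/trigger/carer_assist/with/key/my-secret-key
```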

To be clear, I control my computer with my right index finger, which gives me left mouse click; a small reflective dot on my glasses, which gives me cursor movement; and voice dictation software, which enables me to leave comments on Hacker News.

Thanks for sharing your setup. It's been a pleasure to discuss the issue with you. You've inspired me to check out AppleScript a bit more in detail.

On Windows, AutoHotkey is my go-to for automation.

We just hired a 100% visually impaired student and he said that most MS apps provide enough metadata to play cleanly with screen reader apps like JAWS, and that the real bane of his existence was GTK-based applications on Windows.

I think GTK+ apps on Windows are the bane of everybody's existence. What a crappy windowing framework.

> We just hired a 100% visually impaired student and he said that most MS apps provide enough metadata to play cleanly with screen reader apps like JAWS, and that the real bane of his existence was GTK-based applications on Windows.

Not just GTK. Qt, Java, Electron-based... on Windows at least, the only way to create a truly screen reader accessible app is to use native controls. And I mean native controls, via the Windows API, not just ones which look native. Qt has some academic accessibility support but it's practically unusable.

Ironically, given the article I'm commenting on, the situation is worse on OS X.

I guess we can thank the ADA. Both Apple and MS don't want to be shut out of those markets because of it.

"The ADA? What on earth could the ADA have to do with a PC making a beep? Well it turns out that at some point in the intervening 25 years, the Win32 Beep() was used for assistive technologies – in particular the sounds made when you enable the assistive technologies like StickyKeys were generated using the Beep() API. There are about 6 different assistive technology (AT) sounds built into windows, their implementation is plumbed fairly deep inside the win32k.sys driver.

But why does that matter? Well it turns out that many enterprises (both governments and corporations) have requirements that prevent them from purchasing equipment that lacks accessible technologies and that meant that you couldn’t sell computers that didn’t have beep hardware to those enterprises."

Source: https://blogs.msdn.microsoft.com/larryosterman/2010/01/04/wh... (search for ADA)

I read that text too, and that's what I was thinking when I made that comment.

When I was working as a contractor for the federal government, I got to learn how to do accessibility testing on both desktop and mobile devices.

To echo your sentiment, Apple devices were far-and-away the easiest to use from an accessibility standpoint, both as a tester, and as a developer.

A friend of mine is blind and I've heard him echo this sentiment. I tried TalkBack for Android a few months ago just to see how he uses his phone and it was a buggy piece of crap.

Recently I was using a Mac without a mouse, and tried to figure out how to turn on "Mouse Keys", which lets the keyboard emulate the mouse. The problem is, the only way to do that is to tick a checkbox in System Preferences, and I couldn't figure out a way to check it with only a keyboard. Tab, arrow keys, nothing would move the cursor to the box so I could check it.

Anyone else want to give it a try?

Just did it. Cmd+Space to bring up Spotlight. Type "Syste" and hit [enter]. The search box in the top-right corner should be active. [Tab] to it if it's not. Type in "Mouse" and arrow down to "Make mouse and trackpad easier to use". Hit [enter]. Hit [Tab] twice to move focus to "Enable Mouse Keys". Spacebar.

I have a feeling that I got lucky. There's another setting under "Keyboard" > "Accessibility", at the bottom, called "Full Keyboard Access". Mine is set to "All controls". I have a feeling yours is set to "Text boxes and lists only", which may be the default.

So I am kinda proving your point here :)

Indeed -- if it's set to the default (Text boxes and lists only), there seems to be no discoverable way to change the setting, and thus no discoverable way to turn on Mouse Keys.

(The non-discoverable way is to press Ctrl-Fn-F7; it's non-discoverable because the only place these instructions are shown is the Keyboard -> Shortcuts tab, but you can't get to that tab without a mouse.)

Keyboard access is definitely something Windows does better than Mac, what with all the `[Alt],` shortcuts you can quickly chain together. My favorite: [Alt],e,s,v in Excel, for pasting values.

You could probably be told that instruction over the phone... but knowing you needed to call, or whom to call, is another matter.

Very close, Apple. Just a little more polish for your next update.

I've long thought that box should be checked by default (but I need it "on" so I would say that ;-) )

Anyone with mild RSI and a functioning TAB key would likely agree.

Try doing this: http://osxdaily.com/2010/02/26/use-the-tab-key-to-switch-bet...

Then when I go to the panel and tab, it only highlights the left-side pane. Confused me at first, but after mashing some keys I found that you need to press up/down in the left pane to select the "Mouse & Trackpad" section, THEN tab again and it will go "into" the pane with the Mouse Keys checkbox. Here's a gif: http://imgur.com/a/SedKz.

That's so weird -- everything's the same for me up until the Tab part. When I press Tab, nothing happens. "Mouse & Trackpad" just stays highlighted.

Edit: See nearby replies -- the reason you were able to do this is that you had previously changed a default setting which has no obvious way of being changed without a mouse.

Yeah, I've had this enabled for years.

In System Preferences » Keyboard » Shortcuts you need to have enabled Full Keyboard Access. The shortcut to toggle it is (fn-)Control-F7. After that you can tab through all controls and navigate to the Mouse Keys checkbox and check that.

Thanks -- this works! Too bad it's totally non-discoverable.

I'm disappointed that we haven't made more progress with voice interfaces. Many people are happy with the keyboard/mouse as their primary input devices, but combined with subtle gestures like you might get with Google's Soli, we could get a better user interface for everyone.


Good trackpads/touchpads with gestures aren't that far away from Soli, once you accept that people are lazy and don't want to fight gravity with their hands.

How's that going to work with VR?

What an awesome thread. I am blown away, inspired.

> Apple, pay your fng taxes

Tim Cook says Apple is the U.S.'s largest taxpayer.


Does that mean they should only have to pay enough to be considered the largest? Or perhaps they should pay what the legislation dictates, regardless of how much that is.

They probably are paying what the legislation dictates, and no more. If they aren't, it's the IRS' job to enforce that and apply the appropriate penalties, not commenters on HN.

>If they aren't, it's the IRS' job to enforce that and apply the appropriate penalties, not commenters on HN.

If the IRS doesn't do its job because, e.g., legislators leave loopholes or are in bed with big corps, then it's up to "commenters on HN" to call them both out. That's what being a citizen is about.

No one outside of the US cares; they should pay taxes in each country in which they deign to do business.

They did. Then the EU retroactively decided it wasn't enough. Even Ireland (the country the money is supposedly owed to) says it's ridiculous.

0.005% is what they paid in 2014, according to the EU Commission, giving them an unfair competitive advantage over other companies that cannot afford to set up complex tax-avoidance schemes like theirs. An EU country can decide its own tax rate, but not favor a particular company over all the others.


> Even Ireland (the country the money is supposedly owed to) says it's ridiculous.

Ireland is accused (and convicted) of giving illegal tax benefits to Apple. Of course they don't like the verdict.

The people of Ireland should be the ones responsible for how they tax their constituents.

It's crazy people support the EU telling a sovereign country how to tax their citizens.

Of course, that's supposedly what they signed up for by joining the EU. It's no wonder countries want to leave it, though. Who voluntarily decides to give up their sovereignty?

> who voluntarily decides to give up their sovereignty?

Everyone who chooses not to live in anarchy; it's a very normal and healthy thing to do.

Groups of people cooperate to agree on rules that achieve situations far superior to anarchy, where might makes right and little progress is made. My various levels of localities "give up" sovereignty to national government, which "gives up" sovereignty to international institutions. My person "gives up" sovereignty to all of the above. That way we can coordinate things on a local, national, and international scale as needed, from taxes to trade to health care to nuclear disarmament.

In this case, Europeans felt the need to agree on rules to prevent a race to the bottom in tax benefits to corporations. Everyone benefits by following these rules.

FWIW, he says Apple is the world's largest taxpayer.

But perhaps they should still be paying more; that's a discussion for another time.

I'm pretty sure they are paying their taxes. To pay the minimum allowed by law is their fiduciary duty.

It's a common misapprehension but it's just not correct. There is no such thing as a fiduciary duty to minimize taxes. See https://en.wikipedia.org/wiki/Fiduciary#Fiduciary_duties_und....

Best interest of the company usually includes producing the most income for shareholders, which involves minimizing expenses. Taxes are an expense. It may not be written down in stone but it follows logically from other principles.

If what you're saying is true then it's easy to provide evidence: just find a case where a company has been successfully sued for not minimizing their taxes sufficiently. I've never heard of such a case and I can't imagine how it could exist and not be mentioned every time a company receives criticism for evading taxes.

What you're saying is ridiculous. No company would be sued for overpaying their taxes but their accountants would certainly be fired.

Assuming you're from the US, every year you have the choice to take the standard deduction or itemize. Do you ever feel guilty for choosing the option that lessens your tax burden? Why would you hold a company to a higher standard?

If you think that companies should pay more taxes, then push your government to close the loopholes they're using and/or raise taxes. If you're counting on people to 'do the right thing' when they're directly incentivized not to, you're going to get nothing but disappointment.

The original comment said it was their fiduciary duty to minimize taxes. It's not - exploiting loopholes is a choice, not a legal obligation.

You're talking about incentives which is a different discussion. There are incentives that go the other way too: evading taxes looks bad, it's bad for business if society crumbles for lack of revenue, morally it's wrong to benefit from infrastructure, education, etc and not contribute back. But that's not the point I was addressing.

> just find a case where a company has been successfully sued for not minimizing their taxes sufficiently

If I owned a company overpaying taxes I'd seek to have them acquired (priced using a tax-efficient income assumption) or management replaced. Suing would be unlikely.

That said, I agree with your general point that no fiduciary obligation is likely violated. It's just bad business.

If you believe in natural selection, then those companies that do better at minimizing taxes are more likely to survive than those that don't, meaning the tax minimizing companies we see now are simply the result of a natural progression.

I know that you are saying something that sounds true to you, so I am going to walk through the logic so that you can re-evaluate how much attention you are paying to critical thinking on these matters.

What you're proposing is referred to as 'Social Darwinism'. In this application it is used in a pseudo-logical fashion to beg the question entirely and falsely imply the justification of a lemma.

Your argument is as follows. "If you believe in (1) [natural selection applied to companies], then (2) [causal hypothesis: '1' leads to 'type'], meaning that (3) the [companies of type] we see now are the result of [natural selection applied to companies]."

I hope it is apparent when you look carefully that the 'companies of type' we are seeing now would be simply the result of natural selection in any case, that there is no support for the causal hypothesis presented, and that the causal hypothesis is not in fact a logically necessary implication of the original lemma.

The reason that this might feel meaningful, rather than be at first glance a tangle of errors, is that it involves the juncture between a hypothesis for which you feel you have evidence (and do!) with a core belief. Your hypothesis: "companies that do better at minimizing taxes are more likely to survive than those that don't." Your belief: "markets are efficient and the best economy arises when companies are allowed to compete without interference."

The difficulty here is in the confusion of the two intellectual contexts. Natural selection is a descriptive theory about how what is came to be. Market libertarianism is a prescriptive theory about how what should be can come to be. They are not interrelated. Under the conditions of a Soviet market, natural selection will still exist. Even if regulatory capture helps a company survive instead of perish, libertarian theory still advocates against it.

If you believe in natural selection applied to companies, then what we have here is the natural and almost predictable extinction event of those companies that over-optimized on minimizing taxes.

However, the belief in "natural selection applied to companies" is essentially meaningless once you decouple it from action or advocacy; coupled to advocacy, it is incoherent. It inherits absolutely no scientific justification or intellectual rigour from biology.

I think what you are saying is that minimizing taxes as a financial optimization strategy has nothing to do with natural selection.

The popular term is 'not even wrong'. There is a premise that companies existing and developing strategies in a competitive economic environment can best be thought about in terms of 'natural selection'. This premise is innately flawed, and has a long history of being misapplied for ideological reasons. It is a completely pointless exercise when the ideological cachet of 'Nature' is removed from consideration. Companies do not evolve on the basis of heredity. Natural selection teaches us nothing about them.

Minimizing taxes as a financial optimization strategy has everything to do with competitive advantage. If we were genuinely looking to take lessons from biological history, it would teach us that subsequently perishing as a consequence of that adaptation is entirely natural and normal. 'Natural selection' is whatever happens. It is often conflated with various notions of progress, merit, laissez-faire, et cetera, but seldom honestly.

No scientific claims were made in my comment. In the social sciences, including economics, the actors are much more difficult to study in controlled settings, so statements tend to be much more qualitative.

Selection bias is a real thing, however. For example, "All politicians are corrupt" doesn't mean they all are; but given corruption as a competitive advantage, the ones that weren't corrupt are not successful enough to be known. You don't have to be corrupt to be a politician, but you might have to be to be a successful one. Businesses are much like that, given hyper-competitive environments.

One thing that frustrates me significantly about accessibility and assistive technology (specifically for the web, at least) is the lack of consistency in implementing standards. It's much more insidious than most other standards inconsistencies, because rather than your page looking slightly off in one browser, you can have your site reading perfectly on two of the major screen readers while the third refuses to read it.

It's a massive waste of resources to find the right combination of tech that ends up magically working for all three. The running joke at my workplace is "fix JAWS, break VoiceOver; fix VoiceOver, break NVDA" etc. And yet it's not the kind of thing you want to let go because not many people will have the chance to switch to another device/reader to see if the site works there. Nor should they have to.

Holy tamale, you're not even remotely kidding about this.

I spent several weeks working on a project that had to pass an accessibility audit. Came in partway through, spent weeks getting things to pass (and there were false positives with their scanning tool). Finally we were 'done', but we then learned that just meant the next step of the audit, where they'd use JAWS to pick random app screens and 'just see how things work'.

They worked horribly. But... I didn't have JAWS to test against. I used the company's one license of JAWS on a remote desktop (everything had to be remote - we were not allowed to pull the code local at all). JAWS installed, then died. Wouldn't really work on remote desktop properly. Used NVDA - that worked (awkward, but it worked). Spent days trying to 'patch stuff', and it made everything worse under JAWS.

The 'solution' was "make changes, then email someone at company X to run it through JAWS, and they'll email back what it said, then you can fix things based on that". That was really their solution. Project was put on hold shortly after that, and it was sunset a few months ago (after ... literally millions of dollars poured in over several years).

One of the big lessons from that - build accessibility in from the start - you can't arbitrarily graft this on years later and expect anything to actually 'work'.

This is an absolutely ridiculous way to run an audit. No wonder accessibility on the web is a mess.

To be clear, less of this was a problem with the audit company, and more with the company I was contracting for (who were submitting their system for audit).

However, the audit company only gave us half of what was needed, in a sense. The first half, we'd get back a list of "problems" (which, again, were problems under JAWS, although some were problematic under NVDA as well). The results had links to "best practices", so we could head off some other things ahead of time.

After that was passed, though, they'd just use the system randomly, and determine what was 'good' or 'acceptable' or not. It felt like we had the goalposts constantly moved. I kept getting asked "when will this be 'done'?" and I kept saying "I don't know what done is, and we only get to learn part of what the definition of 'done' is every few weeks, so... effectively, there is no answer. It's done when it's done". Which, of course, no one wanted to hear.

EDIT: and to be clear on your point, it was an absolutely ridiculous way to develop web software. Build a PHP app, but you're not allowed to use composer, or bring in any external/thirdparty code whatsoever (the network was completely blocked from giving us access to even pull down code). They'd gone out of their way to make sure it was not possible to use any external code, so things like security hashing and whatnot were all hand-rolled.

In my experience, TalkBack works great. It is especially nice that, in concert with Chromium's ARIA implementation, it also works quite well. The menace has been VoiceOver, which is inconsistent. In web views, VoiceOver segments incorrectly on elements and will regularly ignore ARIA labels. Not sure how well JAWS works, NVDA seems to work well enough, and I've admittedly not tried Orca with a web browser.

Though honestly, I think that we should give up on trying to present exactly the same interface structure to people who are completely blind. An application design which works well for sighted people can easily seem incredibly convoluted when described by your screen reader. I've not had time to build out a framework for this, but I think it would not be hard to start delivering something better. The idea being that you would provide a structural description of interactions and content, and the framework would present this through whatever means is available on the platform. This way, in combination with platform detection, it will always present something functioning.

I wrote up some ideas about a scriptable cross platform accessibility integration system called aQuery, like jQuery for Accessibility.


Morgan Dixon did some wonderful work at the University of Washington called Prefab. Some of the links from my page to his papers are broken, but here's his web site and a demo:



Prefab: The Pixel-Based Reverse Engineering Toolkit

Prefab is a system for reverse engineering the interface structure of graphical interfaces from their pixels. In other words, Prefab looks at the pixels of an existing interface and returns a tree structure, like a web-page's Document Object Model, that you can then use to modify the original interface in some way. Prefab works from example images of widgets; it decomposes those widgets into small parts, and exactly matches those parts in screenshots of an interface. Prefab does this many times per second to help you modify interfaces in real time. Imagine if you could modify any graphical interface? With Prefab, you can explore this question!


Imagine if every interface was open source. Any of us could modify the software we use every day. Unfortunately, we don't have the source.

Prefab realizes this vision using only the pixels of everyday interfaces. This video shows how we advanced the capabilities of Prefab to understand interface content and hierarchy. We use Prefab to add new functionality to Microsoft Word, Skype, and Google Chrome. These demonstrations show how Prefab can be used to translate the language of interfaces, add tutorials to interfaces, and add or remove content from interfaces solely from their pixels. Prefab represents a new approach to deploying HCI research in everyday software, and is also the first step toward a future where anybody can modify any interface.
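To make the "exactly matches those parts in screenshots" idea concrete, here's a toy Python sketch (my own illustration, nothing from Prefab itself): it finds every verbatim occurrence of a small widget patch inside a larger screenshot, both represented as 2D lists of pixel values.

```python
# Toy exact patch matching: scan every position in `screen` and record
# where the rows of `patch` match verbatim. Real systems like Prefab do
# far more (hierarchy recovery, many matches per second), but the core
# matching idea can be sketched like this.
def find_patch(screen, patch):
    """Return (row, col) of every exact occurrence of `patch` in `screen`."""
    ph, pw = len(patch), len(patch[0])
    hits = []
    for r in range(len(screen) - ph + 1):
        for c in range(len(screen[0]) - pw + 1):
            if all(screen[r + i][c:c + pw] == patch[i] for i in range(ph)):
                hits.append((r, c))
    return hits

# A 5x5 "screenshot" containing two copies of a 2x2 "widget part":
screen = [
    [0, 0, 0, 0, 0],
    [0, 1, 2, 0, 0],
    [0, 3, 4, 0, 0],
    [0, 0, 0, 1, 2],
    [0, 0, 0, 3, 4],
]
patch = [[1, 2],
         [3, 4]]
print(find_patch(screen, patch))  # [(1, 1), (3, 3)]
```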

That is very cool, thank you for sharing. I'll see if I can put some weekends into my part of the solution. Prefab seems like it may be the only hope for making existing proprietary software accessible; I want to make it easy for businesses with accessibility requirements to do the right thing well.

Maybe in the future we won't need computer vision to reverse-render display elements, one can dream.

I think it's great to use JavaScript to control and combine the best of both worlds: accessibility APIs plus screen scraping / selective screencasting / pattern recognition / computer vision.

For example, you could use the accessibility APIs to find the screen position of the video window in the Skype application, perform facial recognition and tracking, and screencast the video onto a texture of a VR chat application.

I have worked with some blind clients, being paid to fix computer problems in general. I find the Microsoft Windows users prefer Window-Eyes over JAWS. I have a blind friend and have invested time with Knoppix ADRIANE, and most recently with Sonar GNU/Linux: http://sonargnulinux.com/ (GNOME with Orca, based on Arch). Orca is pretty decent with Firefox. We recently added Kodi with the screen reader addon and it's a whole new world.

Orca I think does not get enough attention.

A lot of software I use is difficult to screen-read because (lazy) programmers just don't label things. Nothing is worse than moving around a piece of software and all you hear is "push button, push button", not knowing what any of them do. When you take the time to leave feedback, it's like talking to a wall; no one does anything.

Inspired by this section of comments, I am going to make sure I label things and am more descriptive on my own web pages.
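The "push button, push button" problem is easy to check for mechanically. Here's a toy Python lint (illustrative only, not a real accessibility tool) that flags `<button>` elements a screen reader would announce with no name because they have neither text content nor an aria-label:

```python
# Toy accessibility lint: flag <button> elements with no accessible name
# (no text content and no aria-label), which many screen readers would
# announce as just "button". Uses only the stdlib HTML parser.
from html.parser import HTMLParser

class UnlabeledButtonFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.unlabeled = []        # (line, col) of each unlabeled <button>
        self._open_button = None   # position of the <button> being scanned
        self._button_text = ""

    def handle_starttag(self, tag, attrs):
        if tag == "button":
            if dict(attrs).get("aria-label"):
                self._open_button = None   # labeled via ARIA; fine
            else:
                self._open_button = self.getpos()
                self._button_text = ""

    def handle_data(self, data):
        if self._open_button is not None:
            self._button_text += data

    def handle_endtag(self, tag):
        if tag == "button" and self._open_button is not None:
            if not self._button_text.strip():
                self.unlabeled.append(self._open_button)
            self._open_button = None

html = """
<button aria-label="Close dialog"></button>
<button><img src="save.png"></button>
<button>Submit</button>
"""
finder = UnlabeledButtonFinder()
finder.feed(html)
print(len(finder.unlabeled))  # 1: the icon-only button with no label
```

A real audit also has to consider `aria-labelledby`, `title`, and image `alt` text, so treat this as a sketch of the idea rather than a complete accessible-name computation.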

Assistive tech is still really in the niche of cottage industry. Here you are relying on the individual competencies of well intentioned specialists.

I remember a poignant article by Melanie Reid of The Times (UK), who broke her back horse riding. Melanie vividly described an assistive-device trade show with every type of small manufacturer featuring their kit.

> Castor told Apple reps how amazed she was by the iPad she received as a gift for her 17th birthday just a few years earlier. It raised her passion for tech to another level — mainly due to the iPad's immediate accessibility.

Just goes to show how different perception and reality can be. I've worried that the move away from keyboard-based inputs would increase the marginalization of disabled computer users. Now that I think (more) about it, the physical form of a tablet can be as interpretable as a keyboard, with the added bonus that app designers don't have the choice to build (and prioritize) a mouse-based interface -- I'm assuming that mouse-driven interfaces are especially difficult for the visually impaired [0].

The uniformity of interface that iOS imposes is probably especially useful for the visually-impaired, provided that they have employees (like the one featured in the OP) who are on the engineering and design teams.

[0] http://webaim.org/articles/visual/blind

A blind guy I know loves Android significantly more than any other platform. He can just run his finger over the screen to "feel" what's there by listening to a voice that is WAY too fast for me to even really understand.

I've tried it myself (mostly to test out how some of our web apps work through that system) and it's actually pretty damn intuitive after you spend like 15 minutes getting the basics down.

Makes me wonder what could be done with a pressure sensitive screen and something like electrostatics to produce "texture".

Honestly I don't think that "textured" screens would help all that much, audio can just convey so much more information.

For example, when you have the android accessibility stuff on, when you scroll in any kind of list, it makes these tones. They start at one pitch at the top, and end at another pitch at the bottom. After a bit of use, you start to intuitively understand where you are in the list just by the pitch it's making.

Other things like vibrations for when you are hovering over important elements and more really lets you "see" what is there pretty well.

What would improve it is something 3d-touch-esque. Being able to have a light touch be "hover" and a hard press be "touch" would make it much easier than the current system (IIRC it's double-tapping for touch, and other things like 2-fingers for scrolling)

What if you could produce braille on a touch screen?

There are a few such things out there, http://www.popsci.com/new-touch-screen-design-could-display-...

The cost to produce them is probably very high, I imagine, especially when alternatives (like good audio/voice stuff) can do the job.

That's remarkable.

Years ago, I imagined a mouse with haptic feedback. You'd feel it tock when your pointer passed over a boundary. Different sensations for different boundaries and areas.

I didn't even think of some kind of braille like reading device. That'd be great.

iOS has excellent assistive technology. A good friend of mine who is blind (not totally, can see shadows) adores his iPhone as it allows him to go out independently with ease. He can text and use Maps to get around in places he has never been before. He has said to me on multiple occasions how the iPhone literally changed his life.

When I asked him to show me how he used it I was blown away at just how excellent the blind UX is in iOS.

I saw a GREAT conference talk earlier this year that discussed a number of reasons why phones and tablets are so valuable for visually-impaired users. It's definitely worth watching if anyone is interested in the topic and can spare ~30 minutes:


I have quite a few blind readers of my website at http://www.confessionsoftheprofessions.com. I'm not sure if they came before or after I made the decision, when starting the website, to write out every single infographic in detail in a text-friendly version. It takes me an extra 10-20 minutes per article, depending on the length of the infographic, but it has advantages for both readers and SEO.

It was actually unintentional: when I started doing it I had only search engines in mind, because they can't read infographics, and I wanted them to be able to categorize the article better. Turns out, there are hundreds to thousands of people who also can't read infographics.

The side effect was that it began attracting a good amount of people who just need everything written out for good reason. Not just the blind, but I have gotten emails from non-blind people who are autistic, dyslexic, or just have trouble reading who use e-readers both on computer and mobile devices thanking me for doing it.

Highly recommended for everyone to start taking the handicapped into account when it comes to the Internet. EVERYONE is using the Internet. Handicapped, blind.. they aren't just sitting there twiddling their thumbs. They are searching the Internet!
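One cheap way to get started on this: audit your own pages for images that have no text alternative. Here's a minimal sketch using only Python's standard-library HTML parser (a real audit would use a proper accessibility checker, and note that purely decorative images legitimately use an empty alt, so review anything flagged by hand):

```python
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    """Collect the src of every <img> tag whose alt attribute is
    absent or blank. Flagged items need a human decision: either add
    descriptive alt text, or mark the image as decorative."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            alt = attr_map.get("alt")
            if not alt or not alt.strip():
                self.missing.append(attr_map.get("src", "<no src>"))

def images_missing_alt(html):
    auditor = AltTextAuditor()
    auditor.feed(html)
    return auditor.missing
```

Running it over a page with `<img src="chart.png">` and no alt text returns `["chart.png"]`, which is exactly the kind of image that deserves a written-out description.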

Everyone (especially sighted developers!) should try enabling a screen reader on their phone once in a while. Of course it's great for finding accessibility bugs, but it's also a cool trick to be able to read emails and such without someone being able to look over your shoulder.

If you want to try the Android screen reader, turn on accessibility with Settings -> Accessibility, TalkBack On. Then, tap a control once to hear what it is, then double tap to activate. (This is very important to know, or else you won't be able to turn TalkBack off again.)

Tip: don't try TalkBack in your office, or you will rapidly drive your officemates crazy.

I often have to help a family member who is blind. Trying to use a phone with VoiceOver or TalkBack is the most horrific experience. Each button or key press takes three time-sensitive taps, while it speaks the contents of that button/key in a robotic voice.

I say this as a sighted person, but is this really the best we can do? If you're blind I can imagine how important it is that it works at all, but it still sucks.

Annoyingly, what my family member really needs for their use case is an old-style phone with tactile keyboard - if it weren't for the fact that they also need to voice dial, hear who's calling, and hear incoming texts.

I have pondered trying to use TalkBack to operate my phone while I need my eyes elsewhere.

So they can listen over your shoulder instead?

I guess you mean while using earbuds…

This thread comes at a funny time. I just saw my cousin and roommate come back from surgery. He had the same surgery before in one eye, now he's impaired the other as well. Now it's just waiting and hoping for recovery.

The thing he wants most, in his current blind state, is to not rely on other people. Accessibility allows him to be that independent, empowered person. Thank you to everyone who furthers that cause.

Apple is doing great things here. I welcome feedback and suggestions from any visually impaired programmer that can improve the usability of the D programming language website for them. dlang.org

The article mentions that 70% of blind people are unemployed. That seems like a big problem.

Is "transforming the tech world" a bit strong? I get turned off by all this headline hyperbole.

"Apple hires blind woman" would be a much fairer title.

Though I never met her when she was at Michigan State, I've got a number of friends who did, and she was a rockstar developer at the school even as an undergrad.

She gets mentioned on my Twitter on at least a monthly basis and it's kind of neat to see someone who was once a local on the front page of HN.

> rockstar developer

Could you expand a little on what you mean by this?

It means "exceptionally good, and known for it"

Apple's accessibility features are even useful as workarounds for hardware failure. If one of the physical buttons fails, you can turn on the accessibility features to stand in for it: power on/off, screen lock, volume, and so on.

The assistive touch widget on my iPhone has meant I can wait a little before replacing it (failed home button).

It gets more extreme here in Thailand though - my sister in law and mother in law, (and I am led to believe, a lot of other people) use the assistive touch widget exclusively, instead of using the working home button, as they don't want the hardware button to break/wear out.

I don't quite understand the logic myself, but it's definitely a thing.

Same in Taiwan, everyone seems to use Assistive Touch there. I guess it's a bit like force-killing all apps before putting your phone away. It feels good to invest some effort into taking care of your expensive device.

It's the iOS equivalent of never removing the protective film from the factory. Keeps your device looking pristine, too bad you never get to actually see it.

I've seen people here with 5+ year old TVs with the promotional stickers still on them, obscuring part of the screen.

Yep, sometimes my trackpad would get stuck or glitchy, so I enabled an option that let me use the keyboard instead. (It lets you move in 9 directions, centered around the 'I' key.) There are also other useful options, like one that lets you zoom in with ctrl+scroll, which has helped me tons of times to examine individual pixels or magnify small text.

Glad I'm not the only one who thought of this. I'm a VoiceOver user (totally blind), but when my lock/power button broke on my iPhone it took me a while to figure out that I could just use assistive touch to avoid having to buy a new phone.

This reminds me of a few things with respect to Apple and accessibility (listed in chronological order here).

* In a speech Tim Cook made in December 2013 on receiving the "International Quality of Life Awards" at Auburn University [1] (link goes to the exact time in the video when this was said):

> "These values guide us to make our products accessible for everyone. People with disabilities often find themselves in a struggle to have their human dignity acknowledged, they frequently are left in the shadows of technological advancements that are a source of empowerment and attainment for others, but Apple’s engineers push back against this unacceptable reality, they go to extraordinary lengths to make our products accessible to people with various disabilities from blindness and deafness to various muscular disorders."

* In early 2014, in the context of an NCPPR representative asking Apple CEO Tim Cook to "commit right then and there to doing only those things that were profitable", this is how Tim Cook responded [2]:

> "When we work on making our devices accessible by the blind," he said, "I don't consider the bloody ROI."

* In July 2015, marking 25 years of the "Americans with Disabilities Act", Tim Cook tweeted: [3]

> "Accessibility rights are human rights. Celebrating 25yrs of the ADA, we’re humbled to improve lives with our products. #ADA25"

Regardless of profit motives and the higher prices of iDevices (in comparison to others), Apple is leading on a few different fronts, one of them being accessibility and its commitment to improving it continuously. This kind of inclusivity, which should be a default, is what progress of humankind is about.

[1]: https://youtu.be/dNEafGCf-kw?t=272

[2]: https://www.macobserver.com/tmo/article/tim-cook-soundly-rej...

[3]: https://twitter.com/tim_cook/status/624584736862679040

This reminds me of the "SignAloud" gloves that vocalize American Sign Language.


It's exciting to think about how tech is unlocking accessibility barriers. Huge :+1: to Apple for innovating in the accessibility space.

Great article. I became interested in how blind programmers work when a blind programmer emailed me about 20 years ago with some questions about the code in my book Portable GUI Applications with C++. Apple, and other companies who support people with disabilities, deserve a lot of credit, but people who work through their disabilities greatly impress me.

Was just talking about this yesterday with my newly blind friend.

The unemployment rate among the blind is at least 63% right now. It's remarkable.

If any developers are thinking about getting started with accessibility, I would definitely encourage you to!

Here's a short video of my co-founder talking about our journey with this in our startup, and how you can take some first steps: https://youtu.be/g0bjrTCKxZw

A bit of a technical question about blind people, especially those blind from birth. Does the spatiality of the screen cease to be a factor? How do they "visualize" a computer interface internally, especially a GUI one? Thanks.

Could we get a more descriptive title? I'm not clicking on clickbait.

This is marketing tactics. How would she know whether the iPad she received works great out of the box? Seriously, our screens are not yet tactile, programmably textured, or temperature-controlled. She should work on that stuff, perhaps.

Just curious, but have you tried a brand-new iDevice out-of-the-box as a disabled person? I have. I am quadriplegic, and within five minutes of help from an able-bodied person, I had as much access to my iPhone as any able-bodied person does.

Sometimes good news is just good news.

Just noting that if accessibility were front and center, the tablet wouldn't be priced exorbitantly. Let's take this for what it is: good PR.

That's an extremely cynical take.

Sad, but true. There couldn't be a better PR plug than this for Apple, as they continue to price their products outside the affordability of the disabled. If 70% of the blind are unemployed, how can they afford an iPad?

Go check the price of JAWS. By your logic, anyone making great assistive tech is just doing it for PR.

> Go check the price of JAWS. By your logic, anyone making great assistive tech is just doing it for PR.

Worth noting, perhaps, that very few blind people actually own a JAWS license that they paid for themselves. The majority[citation needed] either get grants from charities, financial help if they're in work or studying, or use a pirated copy.

They are probably getting disability benefits, and there are many private charities that assist the blind. Very likely that a blind person who needs a tablet or PC can get one at minimal to zero cost.
