There really isn't another company in the world, certainly not another phone manufacturer, that has thought of people with motor skill problems in the way that Apple has. When people think of accessibility they think of the blind and the deaf; people with motor function problems are definitely the red-headed stepchildren of the accessibility software world.
Apple, pay your fng taxes but thank you very very very much for the accessibility software, my life would be much much poorer intellectually without it.
My cousin's child is quadriplegic with cerebral palsy and a lot of his eye controlled tech runs on Windows
Eye control tech exists for every platform, but OS X is the only OS that you can take out of the box as a quadriplegic and, with minimal help from a Designated Pair of Hands™, be up and running and independently messing around on HN.
Believe me, if I could buy a cheap laptop running Windows or, preferably, Linux and get the same user experience I get from OS X I would do it in a heartbeat. But in the past decade of trying, nothing comes close to OS X, and even then OS X has only been this accessible for the past two or three editions.
All Microsoft did in this instance you're talking about is make sure that the drivers worked. Eye trackers are essentially just mice as far as the computer is concerned; all of the clever stuff is done inside the eye tracker itself.
Just my two pence worth. But you know, It's 2 pence that I've earned through a decade of being a quadriplegic geek getting very annoyed at not being able to use my computer like everybody else.
I long for the day that I could use voice dictation software on Linux, any Linux distribution would be fine, then I could switch to some nice open source software and I would be a very happy pussycat. But none exists as far as I'm aware.
Edited to add: for clarity, please excuse the odd grammatical mistake; Dragon doesn't make spelling mistakes, but it does put the wrong word in the wrong place sometimes.
This is much better for overall scriptability IMHO than AppleScript
Things like StickyKeys are valiant attempts at a complex area with so many different user types.
Pretty much anything used to drive a UI on OS X is going through the accessibility APIs to do so.
So for instance I will have a button on my Switch Control on-screen keyboard, and when it's pressed it sends a request to If This Then That, which communicates with my LIFX lightbulbs and flashes all of them in the house blue, notifying my carer that I need assistance. That's just one of many. I love AppleScript.
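For anyone curious what a button like that actually fires off: IFTTT's Maker Webhooks service triggers an applet via a simple GET request to a well-known URL pattern. A minimal sketch in Python (the event name `carer_alert` and the key are placeholders, and the LIFX flashing itself is configured on the IFTTT side, not in this code):

```python
from urllib.parse import quote
from urllib.request import urlopen


def ifttt_trigger_url(event: str, key: str) -> str:
    """Build the IFTTT Maker Webhooks trigger URL for an event."""
    return "https://maker.ifttt.com/trigger/%s/with/key/%s" % (quote(event), quote(key))


def flash_lights(key: str, event: str = "carer_alert") -> None:
    """Fire the webhook; the IFTTT applet does the actual LIFX flashing."""
    urlopen(ifttt_trigger_url(event, key), timeout=5)


# The on-screen keyboard button just needs to run something equivalent to:
print(ifttt_trigger_url("carer_alert", "MY_KEY"))
```

The nice part of this design is that the accessibility layer only has to trigger one HTTP request; all the device-specific logic lives in the IFTTT applet.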
To be clear, I control my computer with my right index finger, which gives me left mouse click; a small reflective dot on my glasses, which gives me cursor movement; and voice dictation software, which enables me to leave comments on Hacker News.
On Windows, AutoHotkey is my go-to for automation.
Not just GTK. Qt, Java, Electron-based... on Windows at least, the only way to create a truly screen reader accessible app is to use native controls. And I mean native controls, via the Windows API, not just ones which look native. Qt has some academic accessibility support but it's practically unusable.
Ironically, given the article I'm commenting on, the situation is worse on OS X.
"But why does that matter? Well it turns out that many enterprises (both governments and corporations) have requirements that prevent them from purchasing equipment that lacks accessible technologies, and that meant that you couldn't sell computers that didn't have beep hardware to those enterprises."
Source: https://blogs.msdn.microsoft.com/larryosterman/2010/01/04/wh... (search for ADA)
To echo your sentiment, Apple devices were far-and-away the easiest to use from an accessibility standpoint, both as a tester, and as a developer.
Anyone else want to give it a try?
I have a feeling that I got lucky. There's another setting under "Keyboard" > "Accessibility", at the bottom, called "Full Keyboard Access". Mine is set to "All controls". I have a feeling yours is set to "Text boxes and lists only", which may be the default.
So I am kinda proving your point here :)
(The non-discoverable way is to press Ctrl-Fn-F7; it's non-discoverable because the only place these instructions are shown is the Keyboard -> Shortcuts tab, but you can't get to that tab without a mouse.)
Very close Apple. Just a little more polish for your next update.
Then when I go to the panel, I tab and it only highlights the left-side portion. Confused me at first, but after mashing some keys I found that you need to press up/down in the left pane to select the "mouse portion", THEN tab again and it will go "into" the place with the mouse keys. Here's a gif: http://imgur.com/a/SedKz
Edit: See nearby replies -- the reason you were able to do this is that you had previously changed a default setting which has no obvious way of being changed without a mouse.
Tim Cook says Apple is the U.S.'s largest taxpayer.
If the IRS doesn't do its job because e.g. legislators leave loopholes, or are in bed with big corps, then it's up to "commenters on HN" to call both on them. That's what being a citizen is about.
Ireland is accused (and convicted) of giving illegal tax benefits to Apple. Of course they don't like the verdict.
It's crazy people support the EU telling a sovereign country how to tax their citizens.
of course, that's supposedly what they signed up for by joining the EU. it's no wonder countries want to leave it though. who voluntarily decides to give up their sovereignty?
Everyone who chooses not to live in anarchy; it's a very normal and healthy thing to do.
Groups of people cooperate to agree on rules that achieve situations far superior to anarchy, where might makes right and little progress is made. My various levels of localities "give up" sovereignty to national government, which "gives up" sovereignty to international institutions. My person "gives up" sovereignty to all of the above. That way we can coordinate things on a local, national, and international scale as needed, from taxes to trade to health care to nuclear disarmament.
In this case, Europeans felt the need to agree on rules to prevent a race to the bottom in tax benefits to corporations. Everyone benefits by following these rules.
But perhaps they should still be paying more - but that's a discussion for another time.
Assuming you're from the US, every year you have the choice to take the standard deduction or itemize. Do you ever feel guilty for choosing the option that lessens your tax burden? Why would you hold a company to a higher standard?
If you think that companies should pay more taxes, then push your government to close the loopholes they're using and/or raise taxes. If you're counting on people to 'do the right thing' when they're directly incentivized not to, you're going to get nothing but disappointment.
You're talking about incentives, which is a different discussion. There are incentives that go the other way too: evading taxes looks bad, it's bad for business if society crumbles for lack of revenue, and morally it's wrong to benefit from infrastructure, education, etc. and not contribute back. But that's not the point I was addressing.
If I owned a company overpaying taxes I'd seek to have them acquired (priced using a tax-efficient income assumption) or management replaced. Suing would be unlikely.
That said, I agree with your general point that no fiduciary obligation is likely violated. It's just bad business.
What you're proposing is referred to as 'Social Darwinism'. In this application it is used in a pseudo-logical fashion to beg the question entirely and falsely imply the justification of a lemma.
Your argument is as follows. "If you believe in (1) [natural selection applied to companies], then (2) [causal hypothesis: '1' leads to 'type'], meaning that (3) the [companies of type] we see now are the result of [natural selection applied to companies]."
I hope it is apparent when you look carefully that the 'companies of type' we are seeing now would be simply the result of natural selection in any case, that there is no support for the causal hypothesis presented, and that the causal hypothesis is not in fact a logically necessary implication of the original lemma.
The reason that this might feel meaningful, rather than be at first glance a tangle of errors, is that it involves the juncture between a hypothesis for which you feel you have evidence (and do!) with a core belief. Your hypothesis: "companies that do better at minimizing taxes are more likely to survive than those that don't." Your belief: "markets are efficient and the best economy arises when companies are allowed to compete without interference."
The difficulty here is in the confusion of the two intellectual contexts. Natural selection is a descriptive theory about how what is came to be. Market libertarianism is a prescriptive theory about how what should be can come to be. They are not interrelated. Under condition of Soviet Market, natural selection will still exist. Even if regulatory capture helps a company survive instead of perish, libertarian theory still advocates against it.
If you believe in natural selection applied to companies, then what we have here is the natural and almost predictable extinction event of those companies that over-optimized on minimizing taxes.
However, the belief in "natural selection applied to companies" is essentially meaningless once you decouple it from action or advocacy; coupled to advocacy, it is incoherent. It inherits absolutely no scientific justification or intellectual rigour from biology.
Minimizing taxes as a financial optimization strategy has everything to do with competitive advantage. If we were genuinely looking to take lessons from biological history, it would teach us that subsequently perishing as a consequence of that adaptation is entirely natural and normal. 'Natural selection' is whatever happens. It is often conflated with various notions of progress, merit, laissez-faire, et cetera, but seldom honestly.
Selection bias is a real thing, however. For example, "All politicians are corrupt" doesn't mean they all are, but given corruption as a competitive advantage, the ones that weren't corrupt are not successful enough to be known. You don't have to be corrupt to be a politician, but you might have to, to be a successful one. Businesses are much more like that, given hyper-competitive environments.
It's a massive waste of resources to find the right combination of tech that ends up magically working for all three. The running joke at my workplace is "fix JAWS, break VoiceOver; fix VoiceOver, break NVDA" etc. And yet it's not the kind of thing you want to let go because not many people will have the chance to switch to another device/reader to see if the site works there. Nor should they have to.
I spent several weeks working on a project that had to pass an accessibility audit. Came in partway through, spent weeks getting things to pass (and there were false positives with their scanning tool). Finally we were 'done', but we then learned that just meant the next step of the audit, where they'd use JAWS to pick random app screens and 'just see how things work'.
They worked horribly. But... I didn't have JAWS to test against. I used the company's one license of JAWS on a remote desktop (everything had to be remote - we were not allowed to pull the code local at all). JAWS installed, then died. Wouldn't really work on remote desktop properly. Used NVDA - that worked (awkward, but it worked). Spent days trying to 'patch stuff', and it made everything worse under JAWS.
The 'solution' was "make changes, then email someone at company X to run it through JAWS, and they'll email back what it said, then you can fix things based on that". That was really their solution. Project was put on hold shortly after that, and it was sunset a few months ago (after ... literally millions of dollars poured in over several years).
One of the big lessons from that - build accessibility in from the start - you can't arbitrarily graft this on years later and expect anything to actually 'work'.
However, the audit company only gave us half of what was needed, in a sense. The first half, we'd get back a list of "problems" (which, again, were problems under JAWS, although some were problematic under NVDA as well). The results had links to "best practices", so we could head off some other things ahead of time.
After that was passed, though, they'd just use the system randomly, and determine what was 'good' or 'acceptable' or not. It felt like we had the goalposts constantly moved. I kept getting asked "when will this be 'done'?" and I kept saying "I don't know what done is, and we only get to learn part of what the definition of 'done' is every few weeks, so... effectively, there is no answer. It's done when it's done". Which, of course, no one wanted to hear.
EDIT: and to be clear on your point, it was an absolutely ridiculous way to develop web software. Build a PHP app, but you're not allowed to use Composer, or bring in any external/third-party code whatsoever (the network was completely blocked from giving us access to even pull down code). They'd gone out of their way to make sure it was not possible to use any external code, so things like security hashing and whatnot were all hand-rolled.
Though honestly, I think that we should give up on trying to present exactly the same interface structure to people who are completely blind. An application design which works well for sighted people can easily seem incredibly convoluted when described by your screen reader. I've not had time to build out a framework for this, but I think it would not be hard to start delivering something better. The idea being that you would provide a structural description of interactions and content, and the framework would present this through whatever means is available on the platform. This way, in combination with platform detection, it will always present something functioning.
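To make the idea concrete, here's my own toy sketch (not an existing framework) of what a "structural description of interactions and content" might look like, with one renderer that flattens it into phrases a screen reader could speak. The class names and fields are invented for illustration:

```python
from dataclasses import dataclass, field


@dataclass
class Action:
    label: str       # what the control does, in plain language
    hint: str = ""   # extra context a screen reader can speak


@dataclass
class Section:
    title: str
    actions: list = field(default_factory=list)
    children: list = field(default_factory=list)


def speak(section: Section, depth: int = 0) -> list:
    """Flatten the structural description into screen-reader phrases.

    A visual renderer could walk the same tree and lay out widgets
    instead; the description itself stays presentation-neutral.
    """
    lines = ["  " * depth + section.title]
    for a in section.actions:
        lines.append("  " * (depth + 1) + f"{a.label}. {a.hint}".strip())
    for child in section.children:
        lines.extend(speak(child, depth + 1))
    return lines


ui = Section("Compose message",
             actions=[Action("Send", "Sends to all recipients")],
             children=[Section("Attachments", actions=[Action("Add file")])])
print("\n".join(speak(ui)))
```

The point is that the application author describes intent once, and each platform backend (visual, spoken, switch-driven) chooses its own presentation.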
Morgan Dixon did some wonderful work at the University of Washington called Prefab. Some of the links from my page to his papers are broken, but here's his web site and a demo:
Prefab: The Pixel-Based Reverse Engineering Toolkit
Prefab is a system for reverse engineering the interface structure of graphical interfaces from their pixels. In other words, Prefab looks at the pixels of an existing interface and returns a tree structure, like a web-page's Document Object Model, that you can then use to modify the original interface in some way. Prefab works from example images of widgets; it decomposes those widgets into small parts, and exactly matches those parts in screenshots of an interface. Prefab does this many times per second to help you modify interfaces in real time. Imagine if you could modify any graphical interface? With Prefab, you can explore this question!
Imagine if every interface was open source. Any of us could modify the software we use every day. Unfortunately, we don't have the source.
Prefab realizes this vision using only the pixels of everyday interfaces. This video shows how we advanced the capabilities of Prefab to understand interface content and hierarchy. We use Prefab to add new functionality to Microsoft Word, Skype, and Google Chrome. These demonstrations show how Prefab can be used to translate the language of interfaces, add tutorials to interfaces, and add or remove content from interfaces solely from their pixels. Prefab represents a new approach to deploying HCI research in everyday software, and is also the first step toward a future where anybody can modify any interface.
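The core trick the Prefab description mentions, exactly matching small example patches of widget pixels against a screenshot, can be sketched in a few lines. This toy version (2-D lists of pixel values standing in for a real framebuffer) is my own illustration, not Prefab's code:

```python
def find_patch(screen, patch):
    """Return (row, col) positions where `patch` exactly matches inside `screen`.

    `screen` and `patch` are 2-D lists of pixel values. Matching is exact,
    which works for rendered (not photographed) interfaces and is what
    lets a Prefab-style system run many times per second.
    """
    ph, pw = len(patch), len(patch[0])
    sh, sw = len(screen), len(screen[0])
    hits = []
    for r in range(sh - ph + 1):
        for c in range(sw - pw + 1):
            if all(screen[r + i][c + j] == patch[i][j]
                   for i in range(ph) for j in range(pw)):
                hits.append((r, c))
    return hits


screen = [
    [0, 0, 0, 0],
    [0, 1, 2, 0],
    [0, 3, 4, 0],
]
button = [[1, 2], [3, 4]]
print(find_patch(screen, button))  # → [(1, 1)]
```

A real system decomposes widgets into reusable parts (corners, edges, fills) so one example generalizes to widgets of different sizes, but the exact-match core is the same.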
Maybe in the future we won't need computer vision to reverse-render display elements, one can dream.
For example, you could use the accessibility APIs to find the screen position of the video window in the Skype application, perform facial recognition and tracking, and screencast the video onto a texture of a VR chat application.
Orca I think does not get enough attention.
A lot of software I use is difficult to screen read due to (lazy) programmers just not labeling things. Nothing is worse than moving around software and all you hear is "push button, push button", not knowing what they do. When you take the time to leave feedback it seems like talking to a wall; no one does anything.
I am going to make sure I label things and be more descriptive on my own web pages now, inspired by this section of comments.
I remember a poignant article by Melanie Reid of The Times (UK), who broke her back horse riding. Melanie vividly described an assistive device trade show with every type of small manufacturer featuring their kit.
Just goes to show how different things can be between perception and reality. I've worried that the move away from keyboard-based inputs would increase the marginalization of disabled computer users. Now that I think (more) about it, the physical form of a tablet can be just as interpretable as a keyboard, with the added bonus that app designers don't have the choice to build (and prioritize) a mouse-based interface -- I'm assuming that mouse-driven interfaces are especially difficult for the visually impaired.
The uniformity of interface that iOS imposes is probably especially useful for the visually-impaired, provided that they have employees (like the one featured in the OP) who are on the engineering and design teams.
I've tried it myself (mostly to test out how some of our web apps work through that system) and it's actually pretty damn intuitive after you spend like 15 minutes getting the basics down.
For example, when you have the android accessibility stuff on, when you scroll in any kind of list, it makes these tones. They start at one pitch at the top, and end at another pitch at the bottom. After a bit of use, you start to intuitively understand where you are in the list just by the pitch it's making.
Other things like vibrations for when you are hovering over important elements and more really lets you "see" what is there pretty well.
What would improve it is something 3d-touch-esque. Being able to have a light touch be "hover" and a hard press be "touch" would make it much easier than the current system (IIRC it's double-tapping for touch, and other things like 2-fingers for scrolling)
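The pitch cue described above amounts to a simple mapping from list position to tone frequency. The actual tones TalkBack plays aren't specified in this thread, so the frequencies in this sketch are made-up illustrative values; the point is the linear interpolation:

```python
def scroll_pitch(index: int, count: int,
                 top_hz: float = 880.0, bottom_hz: float = 440.0) -> float:
    """Map a list position (0 = top) to a tone frequency.

    top_hz/bottom_hz are illustrative defaults, not TalkBack's real
    tones; after some use, the pitch alone tells you roughly how far
    down the list you are.
    """
    if count <= 1:
        return top_hz
    t = index / (count - 1)          # 0.0 at the top, 1.0 at the bottom
    return top_hz + t * (bottom_hz - top_hz)


print(scroll_pitch(0, 10))   # → 880.0 (top of the list)
print(scroll_pitch(9, 10))   # → 440.0 (bottom of the list)
```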
The cost to produce them is probably very high, I imagine, especially when alternatives (like good audio/voice stuff) can do the job.
Years ago, I imagined a mouse with haptic feedback. You'd feel it tock when your pointer passed over a boundary. Different sensations for different boundaries and areas.
I didn't even think of some kind of braille like reading device. That'd be great.
When I asked him to show me how he used it I was blown away at just how excellent the blind UX is in iOS.
It was actually unintentional when I started doing it and I had only done it with search engines in mind, because they can't read infographics, and I wanted them to be able to categorize the article better. Turns out, there are hundreds to thousands of people who also can't read infographics.
The side effect was that it began attracting a good amount of people who just need everything written out for good reason. Not just the blind, but I have gotten emails from non-blind people who are autistic, dyslexic, or just have trouble reading who use e-readers both on computer and mobile devices thanking me for doing it.
Highly recommended for everyone to start taking the handicapped into mind when it comes to the Internet. EVERYONE is using the Internet. Handicapped, blind... they aren't just sitting there twiddling their thumbs. They are searching the Internet!
Tip: don't try TalkBack in your office, or you will rapidly drive your officemates crazy.
I say this as a sighted person, but is this really the best we can do? If you're blind I can imagine how important it is that it works at all, but it still sucks.
Annoyingly, what my family member really needs for their use case is an old-style phone with tactile keyboard - if it weren't for the fact that they also need to voice dial, hear who's calling, and hear incoming texts.
I guess you mean while using earbuds…
The thing he wants to do the most, in his current blind state, is not rely on other people. Accessibility allows him to be that independent, empowered person. Thank you to everyone who furthers that cause.
She gets mentioned on my Twitter on at least a monthly basis and it's kind of neat to see someone who was once a local on the front page of HN.
Could you expand a little on what you mean by this?
It gets more extreme here in Thailand though. My sister-in-law and mother-in-law (and, I am led to believe, a lot of other people) use the AssistiveTouch widget exclusively, instead of using the working home button, as they don't want the hardware button to break/wear out.
I don't quite understand the logic myself, but it's definitely a thing.
* In a speech Tim Cook made in December 2013 on receiving the "International Quality of Life Awards" at Auburn University (link goes to the exact time in the video when this was said):
> "These values guide us to make our products accessible for everyone. People with disabilities often find themselves in a struggle to have their human dignity acknowledged, they frequently are left in the shadows of technological advancements that are a source of empowerment and attainment for others, but Apple’s engineers push back against this unacceptable reality, they go to extraordinary lengths to make our products accessible to people with various disabilities from blindness and deafness to various muscular disorders."
* In early 2014, in the context of an NCPPR representative asking Apple CEO Tim Cook to "commit right then and there to doing only those things that were profitable", this is how Tim Cook responded:
> "When we work on making our devices accessible by the blind," he said, "I don't consider the bloody ROI."
* In July 2015, marking 25 years of the "Americans with Disability Act", Tim Cook tweeted: 
> "Accessibility rights are human rights. Celebrating 25yrs of the ADA, we’re humbled to improve lives with our products. #ADA25"
Regardless of profit motives and the higher prices of iDevices (in comparison to others), Apple is leading on a few different fronts, one of them being accessibility and its commitment to improving it continuously. This kind of inclusivity, which should be a default, is what progress of humankind is about.
It's exciting to think about how tech is unlocking accessibility barriers. Huge :+1: to Apple for innovating in the accessibility space.
The unemployment rate among the blind is at least 63% right now. It's remarkable.
Here's a short video of my co-founder talking about our journey with this in our startup, and how you can take some first steps: https://youtu.be/g0bjrTCKxZw
Sometimes good news is just good news.
Worth noting, perhaps, that very few blind people actually own a JAWS license that they paid for themselves. The majority either get grants from charities, financial help if they're in work or studying, or use a pirated copy.