Computers are presently absolutely terrible for the visually impaired and there is no product I can find that even begins to address it.
As the article said, screen readers right now are ass-backwards: one might read the screen to you, but you still need to be able to see to move the mouse and actually do anything with it.
Screen reader accessibility can be achieved by coding pages in clean, standards-based HTML. `alt` attributes are basic, but they are often missing or have unclear text. Form elements need to be properly labelled, links need clear descriptions, etc.
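A minimal sketch of the kind of markup described above (all names and text here are illustrative): a descriptive `alt` attribute, a form control explicitly associated with its label, and link text that makes sense out of context.

```html
<!-- Descriptive alt text, not "image" or a missing attribute -->
<img src="chart.png" alt="Bar chart: 2023 revenue by quarter, Q4 highest">

<!-- Label explicitly associated with its input via for/id,
     so screen readers announce it when the field gets focus -->
<label for="email">Email address</label>
<input type="email" id="email" name="email">

<!-- Link text that is meaningful on its own, not "click here" -->
<a href="/pricing">View pricing plans</a>
```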
I agree that screen readers aren't the greatest technology, but the truth is developers of websites and applications aren't doing their part to make their products accessible.
Some in the hacker community are working on voice solutions that enable programming with off-the-shelf software, for example:
Someone could probably use the same solution to help the visually impaired.
Throw in the new motion gesture technology, like that by Intel: http://www.intel.com/content/www/us/en/architecture-and-tech...
And soon we'll all have more natural ways to interact with computers.
The tactile feedback from a key press is at best only a small proportion of that. You get to know you mushed a key, but not what (if anything) it changed. Assistive tech using keyboard input will generally fill in that gap with voice feedback. Gesture input with haptic and audio/voice feedback should be usable in some scenarios. I'm currently working on a couple of iOS apps in the fitness space that have this as a key part of their UI.
I've mixed feelings about conversational UI personally, but one of the things that makes accessibility so hard is that there really is no 'one size fits all' solution.
As the author signposts, people who lose their sight later in life tend to struggle a great deal more than those of us lucky enough never to have known anything different - we had the advantage of naturally developing all the appropriate strategies and skills while our brains were still plastic.
That group would doubtless benefit from an assistive tech stack that 'just works' and doesn't require a bunch of training and knowledge acquisition to get even the simplest things done.
How do you anticipate programming as you become legally blind? Will you start dictating code and having it read back to you?
In the future, I think there is a lot of potential in pairing a conversational UI with a natural language programming... language. Eve looks really promising, and could change the way that everyone writes software, not just the blind: http://eve-lang.com/
Here is a demo: https://youtu.be/VZQoAKJPbh8?t=46m52s
> When it came to the people I met, many who have been blind since they were very young and who have worse vision than I will ever have, I was blown away. The term disabled does not apply to them. They are extremely independent, good natured, and very successful in whatever field they chose to be in.
Instead of saying that the term disabled does not apply to successful people, I think better phrasing would be that meeting these people changed what he considers 'disabled' to mean or imply.
Other than that, I like the overall message. I do think that it's a good idea to focus on making purpose-built applications for vision-impaired (or otherwise impaired) people, maybe even trying to grow a small niche ecosystem.
That said, I didn't think this was the author's intent. But I can't count how many times someone didn't think I was blind because I wrote well, and they couldn't conceive of how someone who couldn't see could write a coherent sentence. :) Try applying for jobs in that expectational landscape and, well, it's challenging and is why I don't even like working for others anymore.
I usually ask those people if they'll let me drive their car.
I identify as blind or low-viz and wouldn't want to take away from others who also identify as such or as disabled.
That line was meant to be a response to the fears that it seemed my family and friends had about my diagnosis. Fears that I wouldn't be able to remain independent or be successful. You know how loved ones generally fear the worst possible scenario. Anyway, wording it this way was terrible, and I'm sorry. I'll see if I can make an edit and rephrase things.
I coach at a tennis club for players with low or no vision, and we have a standing joke around how we're technically classed as 'vulnerable adults', but this is very much not the impression that people take away from meeting us for the first time. It's funny from where we're standing.
Are screen readers difficult to use? Sure, but no more so than typical desktop environments. Describing them as "requiring special training from organizations like the Lighthouse for the Blind" feels like a bit of a stretch. Would some folks require training? Sure, just as they might require training to learn OS X if they're familiar with Windows, but that's true of anyone, sighted or not, and hardly a requirement. Also, while I agree that licensing costs for some screen readers are enormous, just about every mobile and desktop platform has at least one free, competent alternative.

Someone newly blind, or blind after years of sighted life, might certainly need training, but in general I find picking up a new screen reader to be no different than learning any other semi-complex app. If using Word/LibreOffice is part of your job, then you sit down, learn it, and become competent with it over time, or you get trained. Categorizing screen readers as different and worse in this regard feels unfair.
"All of the time and energy that goes into creating the perfect user experience for an app is wasted, or even worse, adversely impacting the experience for blind users." Again, I strongly disagree with this. The problem is that UX time and energy is misspent almost exclusively on visual aesthetics, or even worse, on a certain type of visual aesthetic. For instance, say you use icon fonts because they're a cool thing to do. That choice has accessibility impact beyond blindness. Had a UI/UX designer focused on universal design and on crafting experiences that both look good and perform nicely from a keyboard/switch user's perspective, that interface would be visually appealing, nice to use via the keyboard, easy for users with low hand dexterity using switch controllers, etc.
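To make the icon-font point concrete (illustrative markup, not from the original post): an icon-font button with no text has no accessible name, while the universally designed version works for everyone without a separate interface.

```html
<!-- Inaccessible: the glyph is just a styled character,
     so screen readers announce nothing useful -->
<button class="icon icon-trash"></button>

<!-- Accessible: aria-label gives the control a name, and
     aria-hidden keeps the decorative glyph out of the
     accessibility tree -->
<button aria-label="Delete item">
  <span class="icon icon-trash" aria-hidden="true"></span>
</button>
```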
Again, kudos to you for driving innovation in this space. I wish access tech didn't always feel like an industry stuck a few decades in the past, or wasn't being driven forward by companies with a medical-model concept of disability who view us as a market to be pitied rather than sold to. To that end, I like seeing exciting tech in this space, even if I can't imagine ever preferring to shop by a conversational interface. But I read headlines like "Apps are a nightmare for the visually impaired" and think "huh, I'm building an app for the visually impaired and hope it isn't a nightmare." And I worry that, rather than learning about how universal design can build experiences that look and work well for everyone, software developers and companies will read this post and think "I don't have to make my app accessible because blind people are using special custom interfaces anyway, so when a blind person tells me my app isn't accessible, I can pretend ARIA attributes are hard and write them off." Again, not trying to be negative, I just wish folks understood that we don't need separate interfaces and experiences.
I also agree that we shouldn't be creating separate experiences for the blind. I think it's generally acknowledged that they end up being worse than a combined interface, never getting the resources or new features that experiences for sighted users get.
Where we seem to disagree is on the role that screen readers play in limiting the usability of technology. On the one hand they are amazing, because they provide access to technology that would otherwise not exist. On the other hand, by virtue of the way they function, mapping a two-dimensional visual experience into a one-dimensional stream of audio, using a screen reader can only be so efficient.
This lack of usability puts access to technology beyond the reach of many who are less tech savvy than you or I, and given that the vast majority of people losing their vision in the US are elderly, there are a lot of people who fall into that category. What's worse, the rate of vision loss is set to double as baby boomers age.
I totally agree that the medical model of accessibility sucks, but I think screen readers fall into that category. They seek to adapt an experience designed for others to the needs of the disabled. Conversational interfaces have the potential to create a consumer-quality experience that, by its very nature, is accessible (at least to the blind).
And accessible by default is the best possible outcome.
Anyhow, I look forward to reading more about your SDK. Where can I learn more? I'm building an app that could benefit from a conversational UI on top of the traditional one and would be interested in reading up on what you offer, particularly as it's meant for blind users too.
One thing screen readers are super good at is exposing shitty IA design, which is regrettably common.
That said, it cuts both ways. There is a public transport app in the UK (Traveline GB) that as a low viz (legally blind) user I find incredibly frustrating to use, but my no viz pals absolutely love.
In this case it seems the IA is there but the visual interface to it is worse than what voiceover exposes.
Accessibility is hard.
That's a very low bar. Expert computer users like you and me have no problem with a screen that has even thousands of pieces of information on it. We use contextual cues to help us navigate that space, relying on good information design. But most people find current computers overly complicated. I think one of the reasons smartphones have eclipsed PCs is that the 4 inch screen forces you to cut down to a more human scale of choice complexity.
But that's why I am finding myself agreeing with OP that conversational interfaces are the future, for the same reason. They will force us to design down even further, into a one-dimensional UI. Choice, choice, choice, choice, etc.
That UI will be a slight simplification on the current phone UIs, and of course multiple threads can be shown on the screen at once to allow better use of screen real-estate for those who need it. Essentially, you just add lots of these one-dimensional UIs to the screen and multi-task. But the capital-T Truth is that even without a screen reader, using a Desktop computer and indeed vision itself is a largely linear process. Experts get fast at it, but we still must scan about to discover new information and places to interact. I think we will be surprised how much of our computational life actually becomes conversational.
One of the most exciting things to me about conversational interfaces is that writing automation software that uses them is far simpler than writing automation software for a big, complex GUI. And building AIs that live in them is also easier.
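A toy sketch of that point (the "app" and its commands are entirely made up): automating a conversational interface is just exchanging strings, whereas automating a GUI means simulating clicks at pixel coordinates and scraping the screen.

```python
# Toy conversational "app": text in, text out (hypothetical example).
def conversational_app(message: str) -> str:
    catalog = {"apples": 3, "bread": 2}
    if message.startswith("price of "):
        item = message[len("price of "):]
        if item in catalog:
            return f"{item} costs ${catalog[item]}"
        return f"sorry, no {item}"
    return "I didn't understand that"

# Automating it is just string plumbing -- no pixel coordinates,
# no OCR, no brittle widget hierarchies.
def buy_cheapest(items):
    quotes = {}
    for item in items:
        reply = conversational_app(f"price of {item}")
        if reply.startswith("sorry"):
            continue  # item unavailable, skip it
        quotes[item] = int(reply.rsplit("$", 1)[1])
    return min(quotes, key=quotes.get)

print(buy_cheapest(["apples", "bread", "milk"]))  # -> bread
```

The same string-level protocol an AI or a script consumes is the one a blind user hears, which is part of why these interfaces are accessible by default.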
In general, I think there is something very important happening in the space of UI for beginners, mobile users, AIs, and people with disabilities. A kind of crystallization of intent. I'm only hearing people like OP start to talk about it now, but I actually think this is the next big wave in UI. And I do agree with you that most time spent improving UIs is wasted in exactly the way you describe.
So yes, I generally agree that conversational interfaces are a good direction, and I'd like to incorporate them into the app I'm currently building. I just worry that the tone of this article makes the current state of the art sound terrible, when it's less the state of the art and more the implementation details of that state that makes things difficult.
If you build conversation into the core of your UI framework in an intrinsic way, it means everything written in that framework will be done right. Every 2D interaction has a first class conversational analog because conversation is the base interaction primitive and all of the 2D interactions exist within that space.
I might have 10 years, 20, 30... we don't know.
Although NVDA is free, it's not as high a priority for some reason.
> For the first time accessible technology for the blind can drive innovation for everyone else, rather than having to play catchup, and that’s pretty cool
I wouldn't say for the first time, though. Many solutions designed to address accessibility needs result in improved UX for all users. My team puts a lot of emphasis on accessibility for this reason. Plug: I'm currently hiring a web accessibility engineer too, so if you're interested, find me on Twitter @kenwarner