She has an Android phone, but it's not even close in terms of the accessibility options provided. Apple is excellent in this regard.
The differences probably stem more from the splintered Android landscape and Apple's stricter review process.
Source: met him
Anything except the price that made you/her go with an Android as phone?
What a great citizen of the IT community and the world. It would be a great loss if new management came in and decided that all of this wasn't profitable.
They made their decision on what their externally-facing corporate culture would be. It's no one's fault but their own that people hold them to a higher standard than they may hold others.
What a great citizen of the IT community and the world.
They're leading the day-to-day normalization of the idea that end users don't get (and shouldn't have) ultimate control over their devices. Calling them a "great citizen" is an incredibly frustrating thing to read.
Or said another way, the vast majority of users don't care to have complete control over their devices and place more value on Apple's approach.
I think many people feel 'empowered' using an iPhone just the way it is. It allows them control over their lives, not the technology.
Control is just the ability to direct a thing to behave in line with your intentions. I think Apple understands that better than others, even if that's not always in line with what us nerds want it to mean.
In the video it looks like they've given a blind person ultimate control over their device.
(I know what you meant by "ultimate control," but don't think it's a very human thing in your context.)
- Easy to turn on/off with triple tap home button shortcut
- Doesn't require looking at the screen at all
- Double-tap with two fingers to play/pause the podcast player or music
- Automatically reads incoming notifications
If you want to see more of it in action, you can check out the Apple Design awards, where VoiceOver engineers demo apps:
https://developer.apple.com/videos/play/wwdc2015/103/ (36:30, disclaimer: I work on this app)
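The gestures above are the system side; the app-side work is mostly labeling. A minimal Swift sketch (assuming UIKit; the class and names are illustrative, not from the demoed app) of exposing a custom play/pause control to VoiceOver:

```swift
import UIKit

// Hypothetical custom control. VoiceOver can't infer anything from
// raw drawing, so we describe the control explicitly.
final class PlayPauseButton: UIControl {
    var isPlaying = false {
        didSet { updateAccessibility() }
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true    // expose this view to VoiceOver
        accessibilityTraits = .button    // announced as "button"
        updateAccessibility()
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) not supported")
    }

    private func updateAccessibility() {
        // VoiceOver reads the label first, then the hint after a pause.
        accessibilityLabel = isPlaying ? "Pause" : "Play"
        accessibilityHint = "Double-tap to toggle playback"
    }
}
```

With that in place, the system gestures in the list above (two-finger double-tap, swipe navigation, etc.) work against the control for free.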
Misusing a word is literally the least interesting thing about conradev's post.
For the record though, since it seems important to you - there was no "heavy downvoting". Even if there had been, please consider that you don't speak for HN, and couldn't possibly know the motivation.
You're right, I don't speak for HN. But note that your comment here got flag-killed, and your previous comment got downvotes. Some people want to know why their comments get downvotes and flags. I'm telling you: low value English usage nitpicks get downvotes.
The user takes a picture of the object with their phone and asks a query, and it is uploaded to a crowdsourcing platform (Mechanical Turk, I think) where workers answer the query, e.g. "What are the ingredients in this can of food?"
"combines automatic image processing, anonymous web workers, and members of the user's social network in order to collect fast and accurate answers to their questions."
VoiceOver is currently an extra layer over a visual interface, so it amplifies the gesture cost of a visual+touch UI.
After writing an NLP interface for a productivity app, I learned that most verbal interactions took less time than the visual alternative. It didn't matter if you were sighted or blind: Fitts's law meant that navigating a visual+touch UI was too slow.
So, I'm now working on a verbal+touch UI. In that regard, voiceover users are power users.
1) use voiceover when headphones are connected
2) use the Taptic Engine for cues about marked events
This year, pervasive Bluetooth headsets make 1) a lot more viable.
Because sighted users may use headsets a bit less, I've also included graphic cues.
"VUI" - first time I've heard of a VUI. Like it.
How about you?
On the other side, I'm concerned that people have to use proprietary software to enable their sight. I doubt that there is open source software for blind people of similar quality. Am I wrong? I hope there is such software.
What would RMS do if he had a choice between using a mobile phone with proprietary software to enable sight and remaining blind?
Everything an able-bodied person can do with an iPhone, I can do with my chin. With Microsoft, Android, and other open source offerings, the accessibility software is always some subset of options and not the full experience. With Apple I get to use every aspect of every device, despite the fact that I can't move my arms and legs.
I would absolutely love to switch to Linux as my daily operating system, but I can't, because the development just hasn't been done and it isn't possible for me to use it as it is. So stuck with Apple I am. If Linux had the kind of support Apple provides, I would switch in a heartbeat, but it doesn't, unfortunately.
They really are world leading in providing accessibility software for those of us with profound disabilities, and I've spent a decade looking.
Full disclosure: I helped beta test the last couple of versions of the accessibility software for iOS.
As someone who's used Linux for many, many, many years... I want to find fault with your conclusion that Linux and Open Source in general can't do this, but everything I've seen suggests I'm not very likely to be able to.
Even for able-bodied people, Linux isn't all that accessible. One example is poor font rendering (yes, Chrome renders nicely, but it's nowhere near consistent across all the applications I use day to day). Another is the fractured UI landscape, where apps tend to be KDE, GNOME, or Java, and each platform's idiosyncrasies leak through, leaving things like iconography, menu placement and organisation, and general UI style different between applications.
I can't help but think, until we can sort this style of issue out, we have no chance of sorting out true accessibility. I'm reminded of XKCD's competing standards comic, where the most likely outcome here is probably yet another competing "standard".
I read Free Software, Free Society and I was amazed at how RMS's economic plan was basically 1) unis fund early development, 2) developers flesh out the ecosystem for free, 3) users start flooding in, and 4) these users magically start paying (or the hardware manufacturers do).
Back in the 60s it might have looked like the majority of effort to develop software just took some time by developers, but nowadays in many areas the developer time is a fraction of the overall effort required. It takes huge investment and if there is one thing Capitalism is good at, it's the productive and efficient deployment of capital. It's right there in the name.
The big difference they've made is that they've created tools that make integration with VoiceOver so easy that it's a no-brainer to do what little work needs to be done to make an app accessible, and to increase demand for it.
They've managed to turn a burden into something that has an obvious economic benefit for app developers.
Also I'm fairly sure that there is actually a good RoI. It might not be massive but in the long term having the monopoly on phones for blind people surely pays off.
I don't believe they mandate it (how would that work for games?), but Apple provides excellent tooling and support throughout the system and it's built into all native controls so adding it to an application requires relatively little effort. See joshaidan's comment above: https://news.ycombinator.com/item?id=13846160
And yes, there are open source and free options available. iOS' accessibility APIs are closed to third parties and VoiceOver is the only option on that platform. However, there is free and open source screen reading software for Windows, Linux and Android.
I can't say what the state of Orca, the free Linux screen reader, is these days, but I know development is still going on. Orca itself wasn't so much of a problem when I tried it in the past; it was more that it could be a real pain to get everything working together. Think reasonable low-latency sound output for speech, driving a braille display through the BRLTTY software, getting the screen reader running at the login screen, and so on. I hope that has improved by now, but I only interact with Linux through SSH sessions or a local text console these days.
Then there is NVDA for Windows, a free and open source screen reader mainly developed by two blind guys. On many fronts it has feature parity with the very expensive commercial offerings, and it even surpasses them on certain points. I use it as my daily driver.
In the past I also used a Mac near full-time, but VoiceOver on Mac OS became too buggy for my professional work. Also, updates usually only came when the OS was updated, so fixes and new features could take a while. So, long story short, an open screen reader on a closed operating system that provides stable APIs seems to be the best of both worlds for now.
He considers his efforts to have been wildly successful.
Makes sense. I don't eat meat, but I'll eat the meat of someone else who's decided to make that tradeoff.
People don't eat meat for many reasons. Those reasons may or may not include a moral component, and if they do, there are varying amounts of pragmatism you can add to it. I eat meat, but I'd happily switch over to vegetarianism if, say, I lived in India, where the vegetarian food is amazing and everyone around me was also eating vegetarian.
> He considers his efforts to have been wildly successful.
You could look at it as a problem, or as an opportunity.
I think it's excellent that someone (in this case Apple) is setting a high standard for others (libre or not) to follow.
See, that assumes you have infinite time and money at your disposal. But most of us have jobs and other commitments, which means we can't just drop everything to spend years of our lives researching and building something.
The same he does now - not use any mobile phone.
This is hard to replicate in the open source world where the model is "everyone contributes what they have an economic incentive to create".
On the other hand, regular command line tools are fairly accessible to blind users (I imagine, though I wouldn't like to read a man page in Braille, or by TTS), though of course the usual issues of inconsistencies between tools are magnified.
iOS VoiceOver ranked as one of the 3 most popular screen readers, second to JAWS for Windows (closed source, paid) and above NVDA for Windows (open source, free, donations encouraged).
ChromeVox is, AFAICT, the only other free and open source screen reader that came up in the survey, at 1%.
-  https://accessibility.blog.gov.uk/2016/11/01/results-of-the-...
-  https://www.nvaccess.org
-  http://www.freedomscientific.com/Products/Blindness/JAWS
-  http://www.chromevox.com
Apple has guided developers into using this technology in the apps they write. For example, Apple's VoiceOver screen reader relies on widgets being tagged with information about what kind of control they are, and developers can reuse this same tagging when writing automated tests for their apps. That dual use is a great way to get developers to add assistive technology to their apps without it feeling like extra work.
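This "tag once, use twice" idea can be sketched with XCTest's UI testing support, which locates elements via the same accessibility information VoiceOver reads (the "Play"/"Pause" labels here are hypothetical, not from any specific app):

```swift
import XCTest

// XCUITest queries resolve through the accessibility tree, so any
// control labeled for VoiceOver is automatically findable in tests.
final class PlaybackUITests: XCTestCase {
    func testPlayButtonToggles() {
        let app = XCUIApplication()
        app.launch()

        // "Play" is the control's accessibility label: set once,
        // read aloud by VoiceOver, and used by this query.
        let playButton = app.buttons["Play"]
        XCTAssertTrue(playButton.exists)

        playButton.tap()

        // After tapping, the label updates, so both VoiceOver and
        // the test now see a "Pause" button.
        XCTAssertTrue(app.buttons["Pause"].waitForExistence(timeout: 2))
    }
}
```

An app with no accessibility tagging is also hard to UI-test, which gives developers a selfish reason to do the labeling work.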
My favorite option I've discovered is the magnifier. When it's on, you can triple-tap the home button to open the camera and turn the iPhone into a little hand lens, even if it's locked. The shutter button doesn't save the picture; it just freezes it (with some fancy optical stabilization) so you can pinch and zoom even further with your fingers. This is more convenient than swiping around on the lock screen to open the ordinary camera.
It's really helpful to read signs and restaurant menus. Settings -> General -> Accessibility -> Magnifier.
I'm not blind, but I have played with the iPhone accessibility features and it really is surprisingly easy to navigate around the phone and apps with it.
The haptics in the iPhone 7 also let you feel buttons, which is very cool and could be put to good use in accessibility.
Anyway, it's really cool how well the VoiceOver stuff seems to work on Apple devices, including OS X. A lot of companies tend to just do the bare minimum to meet regulatory requirements.
However, I am not a disabled person, just a curious dev, so I'd definitely defer to someone with more experience using accessibility tools. Just saying it looks pretty good on the surface to me.
Allows him to do a lot of stuff that wasn't possible before.
The state of the art is still two lines of text on fairly big braille devices (for ~$5k-$10k). There is some very interesting work being done at the University of Michigan on a new mechanism based on microfluidics, but it remains to be seen if that mechanism will reach a form factor even approaching a phone.
Edit: That was sort of tongue-in-cheek, but I could see being able to run your finger around the screen and when you hit a 'dot' there's a haptic response.
-- it's very basic
Yes, it might not work magically in every app, but with a good choice of apps, a blind person should be able to do most of the normal activities.
If I'm wrong, please let me know.
That seems like it would be offensively dismissive to people who have to deal with this issue.
Apple puts a phenomenal amount of work into allowing people with all sorts of different handicaps and disabilities to be able to use their devices. I remember being in college before the iPhone was out. I had a blind classmate who had a special Nokia Symbian phone with thousands of dollars of software on it just so he could use it. All paid for by insurance, and it crashed all the time.
I don't think I've ever seen an article say a bad thing about Apple's assistive technologies. They are built-in, they don't cost extra, they basically work everywhere. Developers don't even have to put in that much work to make use of it. It's a core part of the system which means that when they decided to build watchOS it was already there and ready for people with disabilities. When they updated the Apple TV to use a version of iOS it got lots of the accessibility features as well.
Basically accessibility seems to be a fundamental part of Apple's UI philosophy, while on most other platforms support for it is pretty uneven. I'm sure the TalkBack team work hard at it for example, but at Apple accessibility isn't just the responsibility of one team, every team that works on UI is responsible for accessibility.