
Sound: AR's Unsung Modality - bolamike
https://arinsider.co/2019/10/02/sound-ars-unsung-modality/
======
moron4hire
I've been building a VR app at work, and one of the requirements is
accessibility (well, that's a requirement for all software, but we're taking
it very seriously).

So I've been working really hard on audio interactions for people with low or
no vision. I'm finding that, for simple stuff, it's entirely possible to build
3-DOF, maybe even 6-DOF, interfaces that are completely audio-based.

One of the first things I did was create a system to make sure that every
clickable thing in the scene had interaction sounds. When you hover over
something that is interactable, it makes a "fwip" sound (in addition to
animating the graphic). When you leave the object, it makes a "fwup" sound.
The leap in usability from this simple feedback of "this is interactable" was
pretty huge.
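A minimal sketch of that enter/exit feedback, with hypothetical names (`HoverFeedback` and `play_sound` are illustrative, standing in for whatever audio engine call the app actually uses):

```python
class HoverFeedback:
    """Fires a sound once on hover-enter and once on hover-exit."""

    def __init__(self, play_sound):
        self.play_sound = play_sound
        self.hovered = None  # the currently hovered interactable, if any

    def update(self, target):
        """Call each frame with the interactable under the gaze/cursor (or None)."""
        if target is not self.hovered:
            if self.hovered is not None:
                self.play_sound("fwup")  # leaving the old object
            if target is not None:
                self.play_sound("fwip")  # entering the new object
            self.hovered = target
```

Calling `update` every frame keeps the sounds edge-triggered: nothing plays while you dwell on the same object, and moving directly from one object to another plays the exit sound before the entry sound.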

I've integrated good speech-to-text and text-to-speech (thanks, Microsoft
Azure Cognitive Services! Easy to use, fast, and extremely high quality!).
That has gone a long way toward improving the experience, even for me as a
fully sighted user.

But I've also been experimenting with some things like making the currently
selected object hum a little bit. The further away from the center of your
view it is, the more it hums, thereby anchoring it in your spatial reference,
even when you can't see it. With the limited FOV of VR and AR headsets, this
is helping to overcome some of the object permanence issues and helping find
lost things.
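One way to sketch that hum behavior, under my own assumptions (the function name is made up, and I'm mapping the angle between the view direction and the object linearly onto a gain range, silent when centered and loudest when directly behind you):

```python
import math

def off_center_hum_gain(forward, to_object, min_gain=0.0, max_gain=1.0):
    """Hum gain for the selected object: 0 when centered in view,
    rising as it drifts toward the edge of (and out of) view.

    `forward` is the view direction; `to_object` points from the
    listener to the object. Both are 3D vectors of any length.
    """
    def normalize(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)

    f = normalize(forward)
    d = normalize(to_object)
    # Clamp the dot product to guard against floating-point drift.
    cos_angle = max(-1.0, min(1.0, sum(a * b for a, b in zip(f, d))))
    angle = math.acos(cos_angle)  # 0 = dead center, pi = directly behind
    t = angle / math.pi           # normalize to [0, 1]
    return min_gain + (max_gain - min_gain) * t
```

Feeding the result into the hum's volume each frame gives the "louder as it leaves your view" anchoring effect, and it keeps working once the object is fully outside the headset's FOV.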

I'm also considering making "the closest non-selected object in view" hum in a
different tone, growing louder the _closer_ you get to it. So when you are
scanning around an area, you'd hear objects come into and out of view before
actually needing to hover over them.
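That idea could look something like this; all the names here are illustrative, and I'm assuming a simple view cone for "in view" and a 1/(1+distance) falloff for "louder the closer you get":

```python
import math

def nearest_in_view(listener_pos, forward, objects, selected, fov_deg=90.0):
    """Pick the closest non-selected object inside the view cone,
    plus a hum gain that rises as the listener gets closer to it.

    `objects` maps object ids to positions; `forward` must be unit length.
    Returns (None, 0.0) when nothing qualifies.
    """
    cos_half_fov = math.cos(math.radians(fov_deg) / 2)
    fx, fy, fz = forward
    best, best_dist = None, math.inf
    for obj, pos in objects.items():
        if obj == selected:
            continue
        dx = pos[0] - listener_pos[0]
        dy = pos[1] - listener_pos[1]
        dz = pos[2] - listener_pos[2]
        dist = math.sqrt(dx * dx + dy * dy + dz * dz)
        if dist == 0.0:
            continue
        # Cosine of the angle between the view direction and the object.
        cos_angle = (dx * fx + dy * fy + dz * fz) / dist
        if cos_angle >= cos_half_fov and dist < best_dist:
            best, best_dist = obj, dist
    if best is None:
        return None, 0.0
    gain = 1.0 / (1.0 + best_dist)  # louder as the object gets closer
    return best, gain
```

Running this each frame while the user scans around would make objects fade in audibly as they cross into the cone and fade back out as they leave, before any hover ever happens.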

Anyway, just some of my thoughts.

