Signtime.apple: One-on-one sign language interpreting by Apple (signtime.apple)
286 points by popcalc 6 months ago | 116 comments



In an era where having customer support at all is not a given, paying real human beings to offer customer support through ASL is pretty next level.


Agreed! However, I also wonder what their ratio of spam vs. actual ASL-requiring customer calls is. Since it's so easy and quick to connect to a real human with complete audio and video enabled on both ends, won't it be prone to abuse?

The more attention such a service gets from mainstream media (instead of the niche, targeted audience it aims to serve), the more I fear it'll become another case of "why we can't have nice things".


It’s already pretty easy for anyone to talk to a human at Apple tbh, be it by phone or by going to a physical store; their support is already really accessible for everyone.

Here, if I understand correctly, you don’t get access to support but to an interpreter who will call support for you. It’s of no interest if you don’t know ASL.


Agreed, their customer side gets a person very quickly, and so does their Apple Business Manager program. After spending many hours fighting AT&T’s automated and unhelpful customer support, I got connected instantly through ABM with a knowledgeable person who was able to diagnose my issue at a level I never dreamed possible for what I’ve come to think of as “support” from most companies.

They were able to confirm the issue was on AT&T’s end, give me the exact language to use when talking to AT&T, and were nice and easy to work with. At one point they needed a picture of something and I braced myself for what that process would look like; it was stupid simple: they sent me an email with a link to upload the picture. It literally couldn’t have been easier, but I don’t think most support operations could have accomplished that task.

I have been an Apple customer for well over a decade and that ABM experience blew me away and made me happy I had gone with iPads for my business over cheaper Android tablets (one of _many_ reasons).


Something that baffles me about modern business is that Apple is far and away the most valuable company in the world, they’re phenomenally profitable, they’ve got an absolute mountain of cash, and they’ve done so by basically eschewing every page in the MBA playbook, and this has had no influence whatsoever on how the business world talks about proper business strategy.

They’re obsessive about quality, they treat design and customer experience as first-class parts of the process, they invest heavily in long-term R&D, they have world-leading support - they’ve basically done the opposite of what every financial analyst and business consultant in the world thinks is the right move, and as a result they’re worth more than Saudi Aramco, and yet somehow every company out there is still sprinting in the exact opposite direction of how Apple operates.


Apple had one leader who thought differently, and that paid off; now there is already some inertia inside Apple because that is part of their brand. They put out one bad keyboard or mess with the ports on their Macs and the world judges them harshly; everything becomes a something-gate.

Just because Apple's way works doesn't mean other ways don't. Those other ways are far easier and cheaper, and businesses are fine chasing lower-hanging fruit.


One part is probably that they are so secretive, externally and internally, so there have been no business books on Apple's culture, playbook, etc. No TED talks from ex-Apple leaders.

I think Tesla is the closest to trying to follow their model, in a much scrappier way of course.


Tesla in no way follows Apple’s model. Musk has a very specific way of running companies reflected in both SpaceX and Tesla. That involves two key things Apple would never do:

- ship 90% solutions to learn very early what doesn’t work - iterate and make changes very rapidly to the product

This can be seen in the model 3, which has had significant variations even within a year. Both improvements (better fit and finish) and likely mistakes (deleting lidar).

The same is seen on the SpaceX side. The starship launches were wildly successful despite the catastrophic endings. This can be seen in the live streams of the employees cheering wildly at the end (which completely baffled armchair critics).

I’m on the fence about whether this is a good way to run companies long term, but the general point is that it’s about as far from Apple’s culture as you can get.


Nit: Model 3 never had lidar, radar is what they removed.


That Apple doesn't ship 90% solutions is laughable. The initial versions of OSX, iPhone etc. had a lot of rough edges, and were practically public betas.

The only difference is that Jobs, and later his disciples, were able to blow smoke up your ass and sell the obviously incomplete product as focusing on some other distraction, or something. Clearly the reality distortion field worked.


Hardware vs software. Big difference because one can be easily updated.


You must not recall Antennagate.


I do; that’s one phone in how many models? And that wasn’t even “bad” compared to the flaws SpaceX/Tesla are willing to accept.


I'm not suggesting that Apple has egregious hardware or software problems, but (as a fanboy of neither company) that you're cherry-picking by claiming that Apple isn't "ship[ping] 90% solutions to learn very early what doesn’t work".

Panel gaps or whatever at Tesla are obviously something they know about, it's not like nobody at Tesla is aware that their fit and finish doesn't match a Mercedes S-Class.

Whereas Antennagate is something that shouldn't have escaped the lab; I can't really recall a similar incident with Tesla.

The recent Starship explosion was a success; if you want to see the cost of avoiding rapid iteration when it comes to rockets, look no further than the SLS.



Tesla has not been known for great customer service. They are user-friendly...ish, but the human-to-human experience is hit or miss.


They were in the early days. I remember someone posting about how their Tesla broke down and a Tesla flatbed pulled up with a working car, swapped them out, and left the working car as a loaner.


> Here, if I understand correctly, you don’t get access to support but to an interpreter who will call support for you. It’s of no interest if you don’t know ASL.

I might catch some flak here, but I gave it a try (I don't know ASL). Within 5 seconds of clicking the button, I was connected to a live human, with both my video and sound, as well as the other person's, turned on (although I did have the ability to switch them off). There was no intermediate step to filter out spam calls. I did cut the call quickly, though, to avoid wasting their time!


Wouldn’t the spam calls be pretty minimal if they make it known that they will not provide support to non-ASL callers?


Not just ASL but a total of nine signing options (click the Languages button in the top right).


And once again, the strange way Apple operates in Canada appears. They offer ASL, but not LSQ, the common sign language for francophones.


There's probably fewer than 1000 deaf speakers of LSQ, many of whom are also fluent in ASL.

ASL is a more common language for deaf people in Francophone communities than LSQ.


Do you have sources for that?


Canadian census table: under "mother tongue", Quebec Sign Language is listed at 990; under "language spoken most often at home", 1475; under "other language spoken at home", 6200.

cf. ASL at 4965, 8175, and 37620.

I perhaps overstated the case, but...

Surely a good part of the 990-1500 are also functional in ASL. Those who aren't are a vanishingly small subpopulation to justify having a team of interpreters on staff, scheduled in shifts, ready to videoconference.


> Surely a good part of the 990-1500 are also functional in ASL.

Do you, once again, have sources for that? I'm only trying to highlight the weird exceptions Apple constantly carves out when operating in Canada, and you're carrying their water for them. They're adults; they could have explained why they don't serve that language community.


I think I've done pretty well at providing sources, and you're being pretty adversarial. e.g.

> you're carrying their water for them

[You made a statement, and anyone who chooses to disagree with you is "carrying [Apple's] water for them". So I guess we should just STFU?]

> they could have explained themselves why they don't serve that language community

I think this is ridiculous. Are you aware of any company making a statement about why they don't serve a specific language community?

In any case, sure, I will take the bait and further strengthen my point:

Within Quebec, 1310 in the last census indicated that they are conversant in ASL.

The number of people in Quebec who would prefer to speak Quebec Sign Language is dwarfed by the number of people in Quebec who would prefer to speak any one of Morisyen, Telugu, Bosnian, Swiss German, Nepali, etc. I have not heard any statements by any firms about why they do not cater to these language communities in Quebec.

There are also hundreds of people in Quebec who are native speakers of sign languages other than LSQ or ASL. Apple does not cater to these people, either.


100%. Wonder how soon this will become automated though? Since ASL seems like an image recognition task, and AI models are getting pretty good at that.


It’s not, really. ASL is its own distinct language, separate from English, like most sign languages. ASL is most closely related to French Sign Language, which in turn isn’t related to French and has its own linguistic features and independent development. Because of this, translating ASL to English (or any other language) is no easier than translating between any two spoken languages.


And to add on a little, ASL is like any other language in that it has regional dialects. It's why many people introduce themselves with their name and where they learned to sign.

One of the more common examples from teachers is the sign for SLOW. On the west coast, we sign it as the flat palm being pulled slowly back along the extended non-dominant arm. But in the south and parts of the east, it is signed as the Y sign, thumb against the cheek, being tilted forward.

The way you fix this casually is to ask the other person to finger spell or mime what they meant. I can't see AI being able to do that well for a while but maybe..?


Sure, but machines are actually pretty good at spoken and written translation these days, so I would expect that once the image recognition is solved they could handle ASL readily as well.


Oh absolutely, I agree. I just wanted to make certain it was clear that it isn’t as simple as classifying images as words.


Since ASL doesn't have a massive corpus of scrapeable training data (at least I would think), it's probably on the harder side when it comes to creating machine translators.


There must be a huge corpus of TV broadcasts, filmed live events etc. where a person speaking is being translated into sign language in real time.


Those are usually subtitled.


in theory, with a mastery of speech recognition, one would assume you could apply that to any diction that has an accompanying ASL translator to derive ASL elements from the context given by the spoken word.

in practice that's all difficult, if at all possible -- but just sayin; we have decently good speech interpretation at this point, perhaps we aren't far from self-training ASL against something similar.


Usually, but I'm referring to the broadcasts that have a live sign language interpreter on-screen.


If Apple is filming its ASL employees, it’s about to get a lot more


I suspect that buoys and placement make signed languages harder for machine translation than spoken ones.


> ASL seems like an image recognition task

Not trying to be trite, but I imagine you don't know ASL. It is an extremely loose and context-dependent language, really. I am sure it can be done eventually, but I still think it's very far off in terms of recognition. I would love to find out more about the state of the art for ASL or BSL recognition, though.


Never; the whole point is to employ ASL-fluent people to keep the culture and language alive.


You mean a video recognition task.


I thought it was AI


Sounds like a smart way to build a high quality dataset of captioned ASL videos while also providing a great service. Guessing in 2 years they’ll have a decent ML model.


I’m not saying they won’t do that, but Apple has been committed to accessibility, in a way that no other company comes close to, for decades. It’s in their core DNA and something their CEOs have said they will do even though the return on investment doesn’t really exist. Though I think over time, it will.

I wish “accessibility” wasn’t such a dirty word for most people (thinking it’s only for “disabled” people, or people with a problem). Accessibility is for everyone and I encourage people to dig into that section of their phone settings. Most people will be surprised at what is there and that there are probably 1-2 things at a minimum that they might be interested in turning on for themselves.


I've seen more people using iPhone accessibility features without "needing" them than I have with people they're supposedly designed for. Having the rear light flash for notifications was a huge one a couple of years back, which I mostly saw out in nightclubs. (perfect application, if a bit annoying to get flashed in the face a few times a night.)

People using that floating AssistiveTouch button when they have a damaged screen/home button was huge too.

I think most non-technical people either don't know they're there or view them as hidden extra features. (which they kinda are, for things like mouse support)


Great point. My favorite to use is the back tap for shortcuts. Handy but unknown.


I think this is wonderful! The features benefit everyone


You are spot on in your second para. A lot of the "cool" advanced settings on iOS are hidden inside accessibility settings.


It is for anyone whose glasses get broken while they are out and about and don't have a spare pair.


I'm surprised to hear that "Apple has committed to accessibility in a way that no other company comes close to for decades", because Microsoft has already been doing that for over a decade, and Windows is by far the most-used OS for accessibility reasons.

Apple is actually pretty behind in accessibility on desktop from what I've seen. Their Apple-designed peripherals (keyboard/mouse/etc.) are extremely anti-accessible and seem to be designed for the youngest and coolest crowd with zero thought for anyone with any disability. Unlike Microsoft, they do not sell any accessibility- or disability-focused peripherals. And their OS doesn't make accessibility a first-class experience.

Third party support for hardware and software is massive for accessibility, and OSX is notoriously bad for third party support. Major third party peripherals from the biggest companies often have pretty bad support, and so the accessibility minority gets the short end of a short stick on OSX.


> I'm surprised to hear that "Apple has committed to accessibility in a way that no other company comes close to

Blind users who want to use Windows most often end up needing to pay over a thousand dollars for a JAWS license, since the built in accessibility software isn't very good. Remote support from them costs an additional $200.

https://www.freedomscientific.com/products/software/jaws/

Apple includes full featured accessibility software with the device, and free support for it as well.


Two comments here

1: my understanding is that blind folks DRAMATICALLY prefer JAWS to the built-in OSX tools by an overwhelming margin, something like 9 in 10. That suggests that the OSX tools are extremely inferior if nearly everyone feels obligated to spend.

2: You can literally get a Windows computer AND add a $1000 license to it and still pay as much as or less than an OSX computer. My M1 MBP cost well over $2000! You could put JAWS on a Windows computer for $1500 or less. Just pointing out the extreme, extreme cost difference for accessibility users with OSX vs. Windows! You can't even get into OSX for under $1k these days, and even then it's rough and you should spend more.


Your claim was that Microsoft is as committed to accessibility as Apple is.

> I'm surprised to hear that "Apple has committed to accessibility in a way that no other company comes close to for decades" because Microsoft has already been doing that for over a decade

The accessibility features built into Windows are just terrible, so your premise doesn't track.

Also, Amazon was selling an M1 Macbook Air for $750 this week, so you can get the hardware and the software cheaper than just buying the additional software you would need to make Windows at all viable for a blind user.


>The accessibility features built into Windows are just terrible, so your premise doesn't track.

Low effort quoting empowering false arguments. I'm disappointed in you.

I also said:

>Third party support for hardware and software is massive for accessibility, and OSX is notoriously bad for third party support. Major third party peripherals from the biggest companies often have pretty bad support, and so the accessibility minority gets the short end of a short stick on OSX.

And arguably "third-party support" was the biggest plank in my entire argument and the core problem with accessibility in OSX: worst-in-class support for third party hardware and mediocre support for third-party software.

To ignore the largest part of my argument to target an intentionally misunderstood snippet is intellectual malice, completely unserious, and undeserving of a serious reply.

Please re-read my comment which extensively discusses hardware, peripherals, price, and software (both first-party AND third-party support) if you want to have a serious discussion with me.


Say what you want about Apple, but as someone with unilateral hearing loss I can tell you this is absolutely not true. Literally out of the box, Apple products for at least the last decade have had specific features such as basic sound channel balance (missing from Xbox, unsure about Windows) and best-in-class hearing aid support (a third-party peripheral). Reliability and ease are important for these features, and I don’t think buying add-ons from Microsoft is remotely in the ballpark.


iPhones are essentially the only smartphones with enough accessibility features to be used by the blind; I suggest you watch a video on it because it’s quite fascinating. What do you find anti-accessible about the Apple peripherals? I’ve also had some experience using VoiceOver and found it acceptable.


I have a friend who is legally blind. Her husband and kid use iPhones and iPads, and she has an iPad, but for her phone she prefers Android.

It’s very confusing to me because of all I’ve read about Apple accessibility but she says she finds Android much easier to use!


ASL does not have a 1:1 correspondence with spoken language, and it uses a large amount of "shorthand" and "placeholders" for various situations. Throw in fingerspelling, which many signers can do at a very fast rate, and I strongly doubt the ability of a fixed-location camera to accurately read an ASL conversation at any useful level.


That's such an HN answer. Someone does something nice for accessibility, and the first thought is: how can we extract value from this?


This is a tech company in 2023. Of COURSE this is a data mining thing.


Why would a profit-seeking engine not automate this? When was the last time any company at the FAANG level did something "nice" just to be nice? There's always a hockey stick graph somewhere.


I was hoping to see something along the lines of ASL recognition. I strongly suspect that (an extension of) ASL could be an excellent way to interact with VR UIs. ASL natively exploits spatial context, which is what initially piqued my interest. Sadly this is an extremely difficult problem to solve.

As a bonus more people would learn ASL.


What would be the advantage of ASL over speech recognition for VR?

I suppose if you were on an airplane you could avoid disturbing other passengers, but I don't expect many people to actually use VR on airplanes.


And using VR on planes will be strange enough for at least a few years without adding waving your arms in all directions on top of it.


Stranger than speaking out loud with your headset on?


Well, if you're on an airplane, you could do neither of those things and just watch a movie.

And if you're not out in public, you can use speech recognition more easily than learning ASL.


The format of ASL is better suited for spatial problems. Speech is one-dimensional; it occurs at your mouth.


Could you please describe an example?


ASL recognition is an insanely difficult problem to solve because it requires solving hand, finger, and arm pose estimation with occlusion, which hasn't even been properly solved in VR, let alone through webcams.
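Even setting pose estimation aside, recognition would be a sequence-matching problem, not per-frame classification. A toy, purely illustrative sketch (the keypoints and sign templates are made up, not real ASL data) using nearest-neighbor dynamic time warping over hand-keypoint trajectories:

```python
# Toy sketch: each "sign" is a trajectory of 2D hand keypoints, and an
# observed trajectory is matched against templates with dynamic time
# warping (DTW), which tolerates signers moving at different speeds.

def frame_dist(a, b):
    """Euclidean distance between two frames of (x, y) keypoints."""
    return sum(((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
               for (ax, ay), (bx, by) in zip(a, b))

def dtw(seq_a, seq_b):
    """Classic O(n*m) dynamic-time-warping distance between two sequences."""
    n, m = len(seq_a), len(seq_b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = frame_dist(seq_a[i - 1], seq_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
    return cost[n][m]

def classify(observed, templates):
    """Return the label of the nearest template sign by DTW distance."""
    return min(templates, key=lambda label: dtw(observed, templates[label]))

# Hypothetical one-keypoint templates: "SLOW" as a steady horizontal drag,
# "HELLO" as a rising arc. Real signs would need many keypoints per frame.
templates = {
    "SLOW":  [[(0.1 * t, 0.5)] for t in range(10)],
    "HELLO": [[(0.1 * t, 0.5 + 0.05 * t)] for t in range(10)],
}

# A faster signer produces half as many frames; DTW still matches the shape.
observed = [[(0.2 * t, 0.5)] for t in range(5)]
print(classify(observed, templates))  # prints "SLOW"
```

This sidesteps all of the genuinely hard parts mentioned above (occlusion-robust pose estimation, facial grammar, placement), but it shows why the input unit is a trajectory, not a single image.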


Face/head too.


Yep, where is “AI” when you need it?


I think if you had automatic interpretation, removing much of the financial incentive to learn it, and even likely a large portion of the social incentive, far fewer people would learn it.


I'm not sure I understand; why would automatic interpretation reduce the incentive to learn ASL?

I learned ASL, as a hearing person, to be able to communicate with deaf friends anywhere and with my (hearing) spouse and other friends in noisy or large places. It's also a hedge against my future potential hearing loss from tinnitus.

Automatic interpreting wouldn't solve any of those and neither it nor automated captions take the place of human-understood interpreting. I think putting ASL in something like VR where it is more visible to the hearing world might broaden its reach?


You think if you make it a less commercially viable skill and make it easier to communicate with deaf people without learning it, more people will learn it? Or even the same number of people?


I’m confused. Is the statement that people learn ASL for money and some social kudos?


Ok guys, please actually explain instead of just pouncing. Why wouldn't text chat be more appropriate in this situation? I understand that ASL could be much more quick/fluent in a person to person interaction, plus it doesn't need any technical means of writing, whether pen and paper or a computer. But isn't technical support a rare use case where typing speed is not a big bottleneck and where also writing things out can help precision and cut down on having to repeat yourself? Given that resources are always limited, wouldn't it be better to invest in accessibility where that has the most impact rather than a corner case where other workarounds are available?


On the basis of that argument, why provide a voice service either?

I understand that talking could be much more quick/fluent in a person-to-person interaction; plus, it doesn’t need any technical means of writing, whether pen and paper or a computer. But isn’t technical support a rare use case where typing speed is not a big bottleneck, and where writing things out can help precision and cut down on having to repeat yourself? Given that resources are always limited, wouldn’t it be better to invest in accessibility where that has the most impact?

Accessibility is about meeting people where they are, rather than just trying to cater to the lowest common denominator by finding “workarounds”. Apple already provides a technical chat service, and I’m quite sure deaf people are capable of using it. If people are choosing to use this video relay service, it’s for a reason, presumably because they find ASL easier to work with than written language. Perhaps it’s an age thing, personal preference, or maybe they have other difficulties that make reading and writing difficult. But forcing people to use the cheapest form of communication simply because catering to their needs is a “corner case” isn’t providing accessibility; it’s ignoring the needs of individuals because they’re inconvenient.


ASL doesn't seem to have a written form in common use. That user might read English, but only as a second language.


Written English is often only a second language for Deaf and hard-of-hearing people. Sign languages have their own idioms, culture, and identity; ASL isn’t signed English.

I previously worked at a video relay service company (VRS in the US is a service paid for by the TRS fund through the FCC, and allows Deaf and hard-of-hearing people to make phone calls through video chat with an ASL interpreter). In written-English-based interactions with Deaf and hard-of-hearing colleagues, there is often a communication barrier, as there often is with anyone speaking a second language.

In my personal experience working on tickets written by D/deaf colleagues, while sometimes we could communicate by whiteboard or text-based chat, it was indispensable to have the option of discussing the ticket with an interpreter present.


Good to know, thanks! I was thrown off by the "American" bit, so I assumed it was fairly close to English.


For anyone unclear of the purpose, this seems to be an Apple-run Video Relay Service.

https://en.wikipedia.org/wiki/Video_relay_service


This one could have the benefit that the "interpreter" is aware of common apple-specific product topics and be able to directly solve problems themselves, advise on stuff, etc.

And then fall back to interpreting when more expertise is needed - with the added benefit of having a baseline understanding of the topics discussed.


I didn't know that Apple had their own TLD.

Apparently they got it in 2015.



here's the full zone file for the TLD...

https://pastebin.com/NAnMzk3P


Interestingly enough, it seems they don't use it for much, even redirects. Domains like store.apple don't resolve.

You can of course find pages via Google:

site:*.apple


The body governing TLDs really jumped the shark. The out in the open corruption and extortion is a serious dark mark on the tech industry.


The funny thing is they’re actually doing themselves in. By refusing to moderate they only further reinforced “not .com == scam”.


I was going to comment the same thing. I’m not surprised that they have one, but this is the first I’ve ever seen it.


I worked on software that enables users to share their screens with customer service for troubleshooting purposes. Certain "customers" would call and hang up if the representative sounded male; if the rep sounded female, they would turn on their Photo Booth app and flash themselves naked. One particular user flashed himself 100 times. Every time there is non-automated customer service, certain groups tend to abuse it, leading to automated customer service.


All companies with Apple-like cash reserves should be doing this.


Interesting that it's actual people. I would have expected it to be generated AI, something like Signapse

https://www.signapse.ai/


Same, though Apple would never call it AI. I got moderately excited about iPhones interpreting one or more forms of sign language from reading the title.

I'm sure that's not too far off though. I expect it would be somewhat limited until sane conventions evolve for adjusting the grammar to English/user-language. (or sidestep that, and just display and store messages using memoji)


Maybe this is a stupid question but what are the use cases for this where text and subtitles don't work?


ASL is not signed English. It has its own distinct grammar. Many people who use ASL are more comfortable and fluent using ASL than communicating using English text.


Okay, but now you need this very complicated video system to communicate, rather than text, which works everywhere and which you can read at your own pace. Also, with this system it would still be auto-translated from written English, so I can't see it being more expressive than the source text.

How many people find the video of the guy easier to understand than the text saying King's Cross - 14:29 - Cancelled?

https://static.wixstatic.com/media/c30a4d_0430c75ba26545c0b7...


Are you always this hostile about other languages existing?


I have no problem with the language, I'm just genuinely curious who this tech is for.

I totally get that it's useful to have a separate sign language for communicating in person. But machine translating English text into a video of a person signing when the original English text is right next to it? Does anyone find that useful? How many people know ASL/BSL but can't read English?


How do you reply to a text display? It’s a replacement for dialog, not prose.


I'm talking about https://www.signapse.ai/ mentioned above.


I wish we could see more of this in other industries.

Not related, but as a deaf person fluent in ASL -- where do I find people to talk to online these days? Where I live now (NW Europe), there is sadly no one to talk to.


If people are already able to video chat, then why not call Apple directly?


Because most customer service agents don’t understand American Sign Language, and it’s quite hard to view sign language over a phone call, hence the whole sign language interpreter thing.


Because (a) video chat doesn't work brilliantly with sign language (signing can be rather quick, so any latency or artefacts could result in words or expressions being missed), and (b) not every Apple CSA knows how to sign ASL -- most don't.

Most places don't have ASL interpreters on-call or even to any degree among their staff, and often rely on external companies to provide (often pre-booked) ASL interpreter sessions to D/deaf clients.

It's really nice that Apple is making this available so easily and accessibly, and in-house. Just being able to click and connect to an Interpreter is a great improvement in accessibility for ASL users.


And then wait while they get redirected to the ASL agent or bring an ASL interpreter in on the call? Why would anyone want that?


It’s so confusing (and could lead to security issues) for Apple to use the .apple TLD.

It’s super easy for someone to get confused between example.apple and example.app/le (since both .apple and .app are TLDs)
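For what it's worth, the ambiguity is purely visual; any URL parser separates the two, which is also what the browser's address bar highlighting keys off. A quick stdlib sketch (hostnames here are hypothetical):

```python
# Hypothetical lookalike URLs: parsing shows the hosts differ, because the
# confusable part of ".app/le" lives in the path, not the host.
from urllib.parse import urlsplit

confusables = {
    "https://example.apple/support": "example.apple",  # real .apple TLD
    "https://example.app/le": "example.app",           # .app TLD; "le" is a path
}

for url, expected_host in confusables.items():
    host = urlsplit(url).hostname
    assert host == expected_host
    print(url, "->", host)
```

So the risk is a human skimming the address string, not software confusing the two domains.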

They did this with music first, now this.

https://learn.applemusic.apple/apple-music-classical

(And it’s further confusing because sometimes .apple redirects to .com)

What’s the logic Apple uses to decide between .com or .apple TLD usage?


> It’s so confusing (and could lead to security issues) for Apple to use .apple TLD

Rest assured: anything under the .apple brand TLD is Apple's.

https://www.apple.com/legal/intellectual-property/tld/regist...


.app/e or .app/le are not, but look quite similar.


Don't most (and possibly all popular) browsers these days hide or de-emphasize the path in the address bar, showing only the domain?


the full host, including subdomains.


In my URL bar, .apple is bold and bright white, the /le on .app/le is grey and faded.

With modern browsers, I don't think this is a problem anymore. Subdomain attacks and query-length attacks have already pushed browsers to build mitigations for address shenanigans into the address bar.


According to [1], this is already possible with more than 30 TLDs: .silk (.si is a TLD), .google (.goog is one as well), .college (.co), .calvinklein and .cal (.ca), .gallery and .gal (.ga), .select (.se), .afl, .aol, .srl, .delivery, etc.

[1]: https://data.iana.org/TLD/tlds-alpha-by-domain.txt
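The check behind [1] can be sketched in a few lines: scan a TLD list for pairs where one TLD is a proper prefix of another, since "host.short/rest" can then visually mimic "host.longer". The sample below is hand-picked from the IANA file (the real list has over a thousand entries):

```python
# Find TLD pairs where one is a proper prefix of another, enabling the
# ".app/le"-style lookalike. Sample drawn from the IANA list linked above.
tlds = ["si", "silk", "goog", "google", "co", "college",
        "ca", "cal", "calvinklein", "ga", "gal", "gallery",
        "se", "select", "app", "apple"]

pairs = sorted(
    (short, long_)
    for short in tlds
    for long_ in tlds
    if long_.startswith(short) and long_ != short
)
for short, long_ in pairs:
    print(f".{short} / .{long_}")
```

Running the same scan over the full IANA file would surface the complete set of confusable pairs.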


Both Chrome and Firefox make it very clear what the domain is in all of these cases.

I don't recall the last time I saw a spam/phishing attempt that was this sophisticated ... mostly they are things like www.apple-support.com which is ironically for sale right now.


Visiting a domain that doesn't end in a regular TLD always feels like landing on my router's page, stuffed with JavaScript by my ISP. Does the .apple domain really belong to Apple? Nobody is questioning that; what a shame for HN users.

Edit: it seems legit: https://support.apple.com/en-us/101572


There are multiple comments talking about the .apple TLD…


Yup, shame on me



