Hacker News
Apple enters the generative AI race (elpais.com)
16 points by geox 30 days ago | 29 comments



This article trash-talks Siri a lot, but I use Siri much more than generative AI tools. It can set timers, alarms, play music, tell me the weather, control smart home accessories, etc. ChatGPT can't do any of that stuff. And since it's just a voice interface for existing systems, it either works or it doesn't - no making stuff up. And if you want a bit of whimsy, it has a selection of jokes and stories.


The article is from a non-English-speaking country. Siri was (and sometimes still is) really horrendous in some languages, so that might explain the trash talking.

For example, until about two years ago I had the following happen: "Siri, turn off the lights" "Do you mean here or in the bedroom?" "Here" "Sorry, I can't do that"

So you always had to include the room with your command.

The most recent bug is Siri pretends to not know some people when calling, but will happily message them.


Those of us “of a certain age,” will get this joke: https://www.joyoftech.com/joyoftech/joyarchives/3080.html


> no making stuff up

"Hey, turn on Sucky-Chan"[my robovac]

"Sorry, I couldn't find a device called Study-Jam"


That's an example of it not working. The feedback clearly indicates that it failed & why, then you can try again. I fear with an LLM voice assistant it will constantly hallucinate 'success' responses.


Nah - sometimes after 5 attempts and Siri misunderstanding me every single time (not native speaker - Google for Siri being discriminatory against non native accents) I just give up and do the thing by hand.

Siri saying “I booked your appointment for 3 am” when I said “5 pm” is also a hallucinatory success response - no LLM needed.


I don't think they are actively discriminating. Non-native accents are generally harder to understand, even for people. This is more or less true depending on what kind of accent it is and how heavy it is.

I'm sure if a voice assistant was developed in Japan and I tried to speak Japanese to it with my American accent, I would have a really hard time. I wouldn't view this as discrimination, I'd view it as my Japanese being hard to understand.

I have issues with Siri trying to set a timer for 50 minutes. It always thinks I say 15. This isn't a hallucination, it's a misunderstanding, as the two words sound very similar. Humans can also make this mistake and generally will ask for clarification if the number doesn't make sense in context. The typical way to clear this up with a human is by saying "five zero", which is what I started doing with Siri and it works. To be fair, 15 is a much more normal and common timer length than 50.


I would give up and rename my vacuum the first time that happened.


A voice-recognition-based task app is not the same thing as an LLM. They solve two completely different problems. It's like saying you prefer bike riding to grilled cheese sandwiches.

Ask Siri to write unit tests for a function you've written or to "please explain the common misunderstanding of the phrase 'blood is thicker than water'" and then get back to me about how much more capable Siri is.


I agree with you. Siri has its limitations, but it also serves a truly massive number of users.



I’ve had several experiences where I ask Siri to play X on Spotify, then it says it’s going to play X on Spotify, and then nothing happens. The music never starts playing.


I find Siri to be rather useful - I don't use it for complicated tasks, but I use it daily. That's about all I want. If Apple can keep processing all of that on the device, then great. If they have to ship it off somewhere else to analyze it and quantify it for marketing purposes, then not only no, but Hell No.


I can’t upvote this enough. The average person doesn’t want a semibroken digital god (ChatGPT); they want to know the weather (Siri, when it works).


But I’d like Siri to have the comprehension and attention of the semibroken digital god. Knowing everything anyone has ever written might be a nice bonus.


Siri already is semi-broken if it can't understand basic names in your phone book or smart devices.

Siri being like ChatGPT would be a massive improvement.


I’m not sure what to expect. Apple is better than average at finding good applications of technology for regular people.

On the other hand, Apple is below average in integrating network functionality (iCloud’s many early problems, frequent performance issues with Music and the App Store).

Finally, they had a huge lead with Siri but never settled on a path for growing it.

I’m hopeful they’ll do some really interesting stuff here. I also wouldn’t be surprised if they miss the mark.


Did they have a lead? It seemed like they were behind Google Assistant from the get-go.


Siri's original release came four or five years before Google Assistant's.


I really hope this is true.

Siri is so annoying to use. Frequently, I say, "Siri, turn off Alex's Bedroom," to which it responds, "Did you mean Alex's Bedroom?" Or I ask it to play some song, and it picks something totally random, which is not what I said. Or I tell it to turn on the living room lights, and it turns on all the lights.

In many ways, I've given up on using Siri for anything more than setting timers because nothing else works reliably. It will be hard to retrain myself to actually use it once they fix it.


Bologna clickbait title.

“Analysts expect a total update to Siri and an agreement with OpenAI or Google to integrate chatbots into the iPhone.”


Why? Neither OpenAI nor Google provides good models that can run locally on phones, which is Apple's advantage.


They’re going to be cloud models. Also, Google’s Gemma 2B could totally run on an iPhone.

But Apple has its own LLM research, and MSFT and others have built good local models. Apple will likely offer local LLM APIs, while Siri and other Apple-provided products will still rely on a cloud service, as they always have.


I said clickbait title: the title uses a present-tense indicative-mood verb, i.e., factual, then the first sentence immediately switches to the subjunctive "alternate possible future reality" mood.

It's total clickbait to announce as fact something that is only a possible future.


I don’t know the stats because they’re obviously private, but I’d suspect that 10x more people use Siri every day than ChatGPT and Gemini. I would much rather they focus on improving Siri’s voice recognition and increasing the number of actions it can take than try to make it a general-purpose talking computer.

I use Siri all the time to set reminders and alarms, to control devices, and to change buried settings like auto brightness etc.

What I care about is responsiveness/latency and accuracy, not having the thing pass the Turing test.


Google phones now have the option to use the old or new assistant.

The new version breaks almost every use case for Assistant.


Hand-tuned Dialogflow pipelines vs. generalized transformers


Not sure how using OpenAI means they are entering the race.


It's more like they are betting on a horse that's in the race.



