Google’s most advanced voice search has arrived on iOS (googleblog.blogspot.com)
304 points by cleverjake 1843 days ago | 184 comments

This is Next Level Shit. This is absolutely next-level execution. The responsiveness is incredible, and it immediately falls through to a well-formatted search result if it can't give you a soundbite or a Knowledge Graph result.

Unit conversions provide in-line converter widgets... it'll gleefully show you pictures of anything safe-search while playing dumb if you search for something "naughty"... web links you select pile up in little tabs that let you slide right back to the original query... it looks good... it makes pleasing sounds that let you know what's happening...

If Siri can stage a question to Wolfram Alpha, the result is great. But if she can't, she just lamely offers a button to (Search the web for ______?) that then kicks you out to Safari. Google voice search makes Siri feel clunky.

The voice recognition is verging on instantaneous. This is amazing work.

> The voice recognition is verging on instantaneous.

Exactly! Shows you how much it has recognized as you speak. Immediate feedback. More important is the speed of getting back results. Google Now is certainly a little faster in that regard as well.

Reminded me of a 21 questions test with Siri and Google Now (on a Nexus with Android 4.1 I think) side by side as they listened simultaneously. It's interesting to see how both perform in different scenarios. Generally when Google search has the answer Google Now is a little faster. When Google search doesn't have a direct answer Google Now boils down to Google Search results - right there - as opposed to an offer to do a web search. In my experience Siri tends to have an answer more often.

The general theme was:

1. Questions that get answered by Wolfram Alpha were slower on Siri. In one test Wolfram Alpha didn't have an answer and Siri offered a web search, whereas Google's Knowledge Graph did have an answer.

2. For requests like "Call BestBuy," Siri looked up BestBuy stores and offered you the choice of which one to call, whereas Google Now said there were no numbers for BestBuy in the phonebook.

3. Sports questions (which I don't much care about personally, sorry) tend to be answered way better by Siri. I remember a question about which of two sports personalities was taller, and Siri had a direct answer with other stats, whereas Google Now gave a list of search result links (no ads yet). It was the same for another sports trivia question.

4. Siri would sometimes have to wait - almost like a timeout - on a response from Apple. Google Now didn't face such issues.

5. In some cases Siri would take a little longer but give back more information: when you ask about restaurants it'll include reviews, price range and distance.

Source: http://youtu.be/z_pclCFpjgw (video)

>> "Exactly! Shows you how much it has recognized as you speak."

That's the one thing about it I didn't like. The fact that I could see a mistake made me want to try and correct it (but I don't see any obvious way to say 'that last word is incorrect').

There is something about the voice recognition algorithm that uses later words to reinterpret what you said earlier. For example, whenever I say "Show me pictures of humpback whales," Google Voice starts off with "Show me pictures of home," and then when I add the word "back," it corrects "home" to "hump."

To some degree, that's probably how the human brain works. If you say out loud "Show me pictures of Hump" and then just stop, it really does sound pretty close to "Show me pictures of home"

I am very, very impressed on how much better than Siri the voice recognition is in terms of speed and responsiveness. And, it works just fine on my iPhone 4, where Siri isn't an option.

The standard machine learning approach for doing exactly what you describe is to use a conditional random field. I don't think Wikipedia has a great page on it, but check out http://en.wikipedia.org/wiki/Conditional_random_field as a jumping-off point.

CRFs are used in places like voice and images where recognition or decoding of a segment logically depends on the pieces near it.

For clarification, it's much more likely a hidden Markov model. Most speech recognition systems use some form of Bayesian probability model, the HMM being the most commonly used, e.g. CMU Sphinx.
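For the curious, here's a toy Viterbi decoder over an HMM, sketching why a recognizer "corrects" an earlier word once later audio arrives (the home -> hump example upthread): the decoder returns the globally most likely path of words, not the best word at each instant, so a new observation can flip which path wins. All states, observations, and probabilities below are invented for illustration; real systems model phonemes with far richer models.

```python
# Toy Viterbi decoding over a hidden Markov model. States are words,
# observations are crude "sound" symbols; everything here is made up.

def viterbi(obs, states, start_p, trans_p, emit_p):
    # best[s] = (probability, word path) of the best path ending in s
    best = {s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}
    for o in obs[1:]:
        new = {}
        for s in states:
            prob, prev = max((best[p][0] * trans_p[p][s], p) for p in states)
            new[s] = (prob * emit_p[s][o], best[prev][1] + [s])
        best = new
    return max(best.values())[1]

states = ["home", "hump", "back"]
start_p = {"home": 0.5, "hump": 0.3, "back": 0.2}
trans_p = {  # "hump" is very likely to be followed by "back"
    "home": {"home": 0.8, "hump": 0.1, "back": 0.1},
    "hump": {"home": 0.1, "hump": 0.1, "back": 0.8},
    "back": {"home": 0.45, "hump": 0.45, "back": 0.1},
}
emit_p = {  # the sound "hVm" is ambiguous between "home" and "hump"
    "home": {"hVm": 0.6, "bak": 0.01},
    "hump": {"hVm": 0.5, "bak": 0.01},
    "back": {"hVm": 0.01, "bak": 0.9},
}

print(viterbi(["hVm"], states, start_p, trans_p, emit_p))
# -> ['home']
print(viterbi(["hVm", "bak"], states, start_p, trans_p, emit_p))
# -> ['hump', 'back']
```

With only the ambiguous sound, "home" wins; once a "back"-like sound arrives, the hump -> back transition makes the hump path the global winner, exactly the live correction people are describing.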

Remember Google has no incentive to get you to upgrade your phone, Apple does.

Yes, but Google has incentives to display ads. Which is the lesser of two evils?

The former. I want to see relevant ads.

It's pretty easy to ignore ads...

Wow. I haven't tested iOS 6 with Siri, but when it was Google Voice vs. Siri on iOS 5.1.1, there was no comparison. Google Voice won hands down, even in a noisy environment. I've just bought a second-hand iPhone 4S that will be upgraded to iOS 6 so I can compare for myself. Until a month ago, Google was easily king of the hill.

If it acts like Android voice recognition, it is smart enough to correct speech on the fly.

Agreed. I'm running an iPhone 4. The speech recognition almost keeps up to my speech in real time. The results are typical Google.

This application gave me one of those "sufficiently advanced technology" moments. Magic.

Agreed. I want this to be the triple-click function on my iPhone 4.

Indeed. I really wish there was a way to use this instead of Siri with the standard home button shortcut. Pretty much all it's missing is a way to do basic phone interaction (call X, text Y, remind me tomorrow) which should be pretty simple considering they apparently have real time voice recognition.

It's not simple on iOS since iOS simply does not allow it.

You can get pretty close, the tel URL scheme will let you make calls. MFMessageComposeViewController will let you send texts. Calendar stuff is already possible with Google Calendar so that could work pretty easily.

It's not that they couldn't do it. It's that they're expressly prohibited by Apple's patents. (Something about a unified search interface for your phone as well as the web.)

Can you cite a source here? That's an important distinction.

Here's an editorial that gives an overview of how the patent was relevant in the Apple v. Samsung case earlier this year, along with a link to the patent itself: http://www.androidpolice.com/2012/07/25/editorial-samsung-ha...

And never would allow that much integration across all apps. If that's what you want, best get a Nexus 4.

I agree with everything you've said, but unfortunately, it still just won't be used on my iPhone. There's no way I will launch the Google app then click on the microphone icon. Replace Siri with this and I'll use it, but as long as it's a separate app on my phone it won't get used.

I will; I have an iPhone 4.

Sorry Siri... You've got a new big brother in my opinion.

Whoa... this may be one of the few times so many hyperbolic adjectives in an HN post seemed justified ;)

Just fired it up now, and can attest it simply blows Siri's UX right out of the water.

Kudos to the Google team!

The voice recognition is outstanding but every query I'm trying (sports, news, weather) is returning text results.

In a situation where I'm using voice recognition I need it to read the response back to me, without that it's next to useless.

Am I missing something? Or is it the things I'm asking?

I think you have to phrase it in the form of a question in order to get a voice response.

Things like "How many meters in a mile", "Did the Tigers win?" and "What is the weather like?" all came up with an audible reply for me.

Try something with a one-shot answer, like "When was the constitution of Argentina ratified?" It should speak, assuming the latest version of the Google app.

I've tried weather in Glasgow, the time in San Francisco and a bunch of other very simple stuff.

I believe for the app to talk back, your language settings have to be set to "US English". That's how it is on Android.

Seems to be this.

Which is odd. I see no reason why English UK (or "actual" English as I like to think of it ;-) ) couldn't use English UK for recognition but speak back as if it were English US.

As it is, it's unworkable: set to English US it doesn't recognise stuff at all well with my accent, and English UK doesn't read stuff back, which means it's of massively limited use.

So great if you're in the US; for the rest of us, it's still a (very impressive) work in progress.

Well, it's great for search, but things that people use Siri mostly for are "please call X", "please remind me to do Y" or "make an appointment for Z".

As long as the Google app cannot interface with Contacts, Reminders or Calendar, this is just an interesting tech demo and nowhere near a Siri replacement.

Exactly. I am amazed that so few people get this. It's not about speed or accuracy, but rather about being tightly integrated with other apps on the phone.

+1. Voice recognition is mainly about hands-free utility: interacting with the phone's features and functions. Google's implementation is impressive, but for now it will just be something fun to show off.

> This is Next Level Shit. This is absolutely next level execution.

Crashed the first time I tried it.

And thus a million server blades evaporated into the night, taking my precious opinion with them. Reboot your fucking phone and try again.

A year ago, Apple was asked whether the iPhone 4 or even the 3GS would get Siri. They said no: to do good voice search, you need advanced tech (several microphones, special noise-canceling DSPs, fast processors), so get out and buy a new iPhone.

Well, I just did the test. Google voice search on a 40 month old iPhone 3GS is more responsive and much more precise than Siri on the latest and best 1 month old iPhone 5.

Apple has so much egg on their face.

Google hasn't always been great in this area. Google dictation search is borderline unusable on my HTC android phone. All this will do is force Apple to improve. Great for everybody.

I tested on an iPhone 5, a Nexus S and an iPad 2. Most things worked well across the board, but there is some weirdness. E.g. "what's 2 + 2" returned the correct result on the iPhone 5, worked on the iPad 2 after several tries and just failed every time on the Nexus S.

Sometimes the failed result was just weird, but most often it was "what's to plus 2."

What's going on? Still more impressive than Siri.

Did you use it with a headset or via device microphone?

The noise correction on the older device microphones isn't as good as on the newer ones (i.e., anything that's Siri-compatible has at least two microphones for noise reduction).

So perhaps Apple does have a legitimate (although self-serving) reason not to deploy Siri on older tech - it may suck if you have lots of ambient noise or distortion.

I also have a very poor experience with my Nexus S when using Google Now and voice search. Unfortunately, I don't believe the Nexus S was specced to be able to run 4.x. I get the impression 4.x was designed for newer devices.

Wow...I can't remember being this blown away by a search development since...I don't know when (being able to Google mathematical formulas is pretty amazing, but not as everyday-useful)...

I asked both Google and Siri, “How much damage did Hurricane Sandy do?”

Google heard it as “How much damage did Hurricane Sandy too?” and returned with official Hurricane Sandy emergency info and latest news stories literally as I stopped talking.

Siri took nearly five seconds to register my question as “How much damage did hurricane you do” and responded with hockey league standings for the Hurricanes team.

And the execution of Google's product is more stylish than Apple's... given Google's lead in collecting voice data, never mind their lead in search technology and algorithms... how can Apple hope to even compete in voice search except by forcing Siri on iOS users?

*edit: Here's a screenshot comparison: http://danwin.com/words/wp-content/uploads/2012/10/google-vs...

I tried the same test.

Google got the translation right first try, and the first result, which returned in less than a second, had "over $20 billion in damage" visible.

Siri took ~8 seconds to return "Ok sports fans, the Hurricanes appear to be in first place in the Southeast right now" followed by AHL team standings.

Apple has a lot of catching up to do here. They're going to have to start translating client-side, which Google has obviously figured out, and they're going to need a data source as rich as Google's. I think the first part will get done at some point, but how will they match Google on the data side?

They won't. They can't.

It's important to remember that Apple is a hardware company with a software habit. Google is a software company.

While I understand why Apple hates Google so much, I think breaking with Google is a mistake. Nobody can beat Google at software; Apple hardware + Google software is a wonderful combination, and if the two companies worked together we'd really see the apex of user experience.

The hardware/software split is too simplistic a view. The companies have their various strengths and weaknesses. For example, the graphics stack on iOS embarrasses Android's. It took how many years and how many hardware advances before flagship Android got scrolling as smooth as the first iPhone?

Another example: Google released Renderscript in early 2011 (http://android-developers.blogspot.com/2011/02/introducing-r...) only to deprecate it a little over a year later (https://developer.android.com/about/versions/android-4.1.htm...). We can cherry pick examples all day long, but the bottom line is that each company excels at certain things.

It's interesting how Samsung is basically stuck to Google at this point. I'm sure they'd love to switch to their own Bada or Android fork and capture more of the profit, but Samsung is absolutely abysmal at software.

Google is a data company and its design approach is data-driven; whereas Apple is a design company and its data approach is design-driven.

False. Apple is a software company. If you've ever had to develop any non-trivial smartphone application, you would see how the iOS SDK is lightyears ahead of Android's.

Apple is a software company (who also fuses design and software really well). Google is a computer science company.

If only Google hadn't stuck it to them with Android ALL users would be better off. We'd have Google apps on iPhone hardware and UIs. It hurts to think about how much better it could be.

We'd be better off in a world with no Android, where RIM still hasn't shipped a modern smartphone OS, and with a Windows Mobile/Phone environment that has broken application compatibility twice in the past two years?

I'm 90% sure this is an example of Poe's Law (that it is impossible to tell the difference between sincere extremism and a parody of extremism - http://en.wikipedia.org/wiki/Poes_law).

But it is worth remembering the various ideas that iOS "borrowed" from Android (or the improvements Android forced iOS to make) - including decent notifications, wireless syncing, multitasking, etc. Google for more.

Apple (and everyone) improves more rapidly with a viable competitor.

It's incredibly myopic to argue that the mobile landscape would be better off without Android providing a compelling competitive force.

What makes you think that iPhone would be as good as it is if it wasn't fighting for customers who can get other smartphones?

Interesting that Apple - the device company - opted to do the recognition on the server and Google - the cloud company - chose to do it on the device.

I'm pretty sure Google is doing most of it on the server as well. Why do you think otherwise?

The sheer speed of it.

Maybe I'm wrong but I'm getting near instant voice recognition off a poor wifi connection linking to a slow internet connection.

It is, indeed, on the server (at least on my Galaxy Nexus). It won't work if you turn the connection off. Maybe they have some sort of hybrid, though, because it does have offline recognition that can call people without a net connection.

They do both, so it'll work totally offline, but it's better online and can utilise the two together seamlessly.
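A rough sketch of what such a hybrid could look like (names and structure invented for illustration, not Google's actual design): run the fast on-device recognizer for instant feedback, and fall back to it whenever the server result can't be fetched.

```python
# A minimal hybrid-recognizer sketch: instant on-device results,
# upgraded by the server when the network cooperates.

def recognize(audio, local_model, server=None):
    """Return (transcript, source), falling back to the on-device
    model whenever the server is unavailable or unreachable."""
    local_guess = local_model(audio)      # fast, always available
    if server is None:                    # no connectivity at all
        return local_guess, "offline"
    try:
        return server(audio), "online"    # slower but more accurate
    except ConnectionError:
        return local_guess, "offline"

# Toy stand-ins for the two models:
local = lambda audio: "call mom"          # on-device guess
good_server = lambda audio: "call mum"    # server's refined transcript

def dead_server(audio):                   # simulates a lost connection
    raise ConnectionError

print(recognize(b"...", local, good_server))  # ('call mum', 'online')
print(recognize(b"...", local, dead_server))  # ('call mom', 'offline')
```

That would explain the observed behaviour upthread: "call X" keeps working offline because the local model handles it, while richer queries degrade or fail without a connection.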

Because you can do a local search offline. (at least on the Android version)

On Android, voice recognition is built into the OS. On iPhone they don't have this luxury and the app is relatively small, so I'm thinking almost all of it is done on the server. A link to a Google source verifying this would be nice, though.

This was exactly my experience. My iPhone struggled to understand me at times (albeit only 2 out of 20 times), but the Nexus S (a fairly old phone) understood me 20 out of 20 times. It really whipped the pants off Apple; I was quite surprised. To see so many posts above say that Google Voice was far below Apple's was a surprise to me. Note: I haven't tested iOS 6 yet, so things may be different.

How do we know that Google is translating client-side? Is there a link to a Google source to verify this? The app is only 10 MB, so it would be very impressive if they actually accomplished this, but I can't find verification anywhere.

For current events, Google will always have the lead here. They have the full weight of all the searches that are being performed around the globe at any second.

I've said it before - and I'll say it again - Google is about to crush Apple into the ground. The fundamental difference between the two is that Apple brings style but Google brings substance.

Now in the short term style > substance for the simple reason that it is easy to repackage something that is difficult to use into something simple.

Making things easy to use is obvious to designers, but not to engineers, because engineers focus on actually making complex things work instead of on making them easy to use.

However, in the long run substance > style, for the simple reason that anything that can be repackaged to make it simple will either become a minority player or a commodity item. Style and veneer are easily copied; substance isn't. Substance is a natural monopoly, and monopolies make lots of money.

What you see here is the fruition of substance over style - big data is a monopoly and Google owns it hard.

Google will be the first trillion dollar corporation. And it will do so by making everything else apart from Artificial Intelligence a commodity.

Disclaimer: I'm looking to buy Google stock and I recently exited an Apple short.

Apple has a monopoly on Apple. What other company can put out one new phone per year and still rake in money? HTC and Samsung wish they could do that, but they still each spread themselves over a half dozen budget phones and a new high-end phone every 3 months. Apple has built up so much good will and trust that they can do what no other company can: nothing. They can just work on one phone, all the engineering and supply chains, for an entire year before releasing it. That's why you're never going to see Android hardware like the iPhone 5. I wouldn't write off Apple quite yet.

Goodwill and trust work if your product is differentiable. Apple's products are glass screens with grey backgrounds. Android has surpassed iOS in many ways and reached parity in the others.

Both the Nexus 4 and 10 surpass, or are equivalent to, the iPhone 5 and the iPad 3 respectively. Apple has no more products to grow into, outside of augmented-reality glasses, where they'll just get squeezed again since Google already has the lead.

Apple's strength is not, and never was, in their hardware. It was software, and software is now a commodity. Apple is the new Microsoft.

Linux kills everything that cannot differentiate.

"Linux kills everything that cannot differentiate."

Yeah, and Windows differentiated by offering an insane level of backward compatibility, which makes developers happy (they don't have to maintain and patch their apps as much) and users happy (they don't lose their favorite piece of software). That's why Linux has never overtaken the desktop. Windows runs the apps people want: MS Office, Photoshop... The Mac also runs some of those, but you can see how fragile backward compatibility is with Apple. You can install Windows 8 on the early Intel Macs; you can't install Mac OS X Mountain Lion on those. MS offers better support for old Apple hardware than Apple themselves.

Apple is not going to survive Android in the long term unless they truly manage to turn the iPhone into a fashion statement and multiply their efforts to make it look like jewelry. Apple's revenue depends entirely on the iPad and iPhone nowadays; Macs aren't making them any money. Unlike with Windows, people don't have any long-term or big-dollar investment in mobile software. Transitioning from iOS to Android is not like going from Windows to Linux and losing Office, Photoshop, and thousands of dollars' worth of a game library built over a decade.

Backward compatibility is for chumps. If you hide behind that and inertia in tech you will be replaced.

Everything will run in the cloud on the latest tech and whatever software you want will be streamed to you direct. And those servers will run on Linux.

The excitement over voice recognition threatens to obscure the most important breakthrough in Siri, which is that Siri tracks conversational context. For example with Siri you can have this interaction:

Me: Will it be cold tomorrow?
Siri: Yes, the low temperature will be 42 degrees.
Me: What about Friday?
Siri: Looking better. The low temperature will be 58 degrees.
(exact wording paraphrased)

Try this series of questions in the new Google search app. The first gives me a wiki.answers.com page as the top result, the second a Woody Allen quote.

Or try it on any conversation bot. Using pronouns to obscure the topic of conversation has always been the best way to reveal the stupid machine underneath. Siri is a little less stupid.

Google voice search cannot do this because it is fundamentally transactional--you ask one question, get one answer. It is just another interface to their web search, albeit one with seemingly great voice recognition.

Siri is not designed primarily as a search engine. It is designed to be a personal assistant and is optimized to accomplish tasks, answering certain questions in the process of doing so.

I had no problem asking weather questions of Google. I asked "will it be cold tomorrow" and got back the forecast temperature for tomorrow in my location. And a whole lot faster than Siri (which took me three tries to get anything about weather returned). You have to provide context every time with Google, but that's a very small thing in my opinion, and since it just works it still ends up being faster.

No surprise. As far as I can tell, Scott Forstall was the (post-Steve) executive who wanted to go to war with Google. He was in charge of Siri (Apple Search) and Maps. The minute he's canned, Google and Apple are suddenly best friends (though I'll expect Apple will continue trying to sue Android out of existence).

Google wants its apps on iOS, as they mostly care about ad revenue (not the few bucks they might make on Nexus, which is just one of many Android brands). Google has always been platform agnostic. Apple wants Google (Android) dead, but simply doesn't have the ability to beat Google at search.

Scott Forstall probably wanted Apple to create a massive data division, so they could go toe to toe with Google on search, and hope that people would still want iOS even if Google was locked out. I'm guessing the other execs were beginning to question this strategy - Google can make a "good enough" mobile OS better than Apple can make a "good enough" search engine and mapping platform. It's far better to let Google own search, and focus on doing what Apple does best.

> Google can make a "good enough" mobile OS better than Apple can make a "good enough" search engine and mapping platform.

I think this is the heart of it right now. A "good enough" OS for Google has a bit of scroll jank and some inconsistent UI elements. "Good enough" maps for Apple will get you lost and cause you actual trouble.

I just tried it, and it's much faster than Siri. I wish I could have it replace Siri, but alas, iOS would never allow that. I much prefer the Google Now-style voice to Siri's as well.

Google originally announced this app back in August, and said it'd be in the App Store "shortly"... It's pretty obvious why Apple held this back in the approval process since it definitely competes with Siri's functionality.

Jelly Bean adds offline speech recognition (which is at least partly responsible for Now being wicked fast); there's no mention of it here, but I wouldn't be surprised if they ported it to iOS for this app.

It doesn't seem to work offline.

It's different from Google Now on Android, as the results are just the Google search results for the query. But when the Google Knowledge Graph provides an actual answer, it puts that up top and reads it out.

That's exactly how search in Now works on Android. Search obviously can't work offline, but that doesn't mean the recognition isn't running offline.

It's not. Say "listen to Muse": it doesn't work without a net connection. Only "call X" works.

The voice recognition is shockingly fast. It's printing words on screen as I'm saying the next word. Any idea if that's being done client side?

I would love to know the details here. What I'm suspecting is that Google has figured out a way to combine both client- and server-side processing for both maximal responsiveness and accuracy. Maybe there's even some Google Instant magic going on, so it'll basically predict what you are going to say and pre-fetch something based on that. I don't know much about voice recognition, but I guess it would speed up the client-side algorithm if it has knowledge that your next word is probably going to be either banana or apple.
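The "predict the next word" idea can be sketched with a toy bigram model: count which word follows which in a corpus, then rank likely continuations. (Entirely illustrative, with an invented corpus; a real recognizer would use a vastly larger language model over phoneme lattices.)

```python
# Toy bigram "next word" predictor: knowing likely continuations could
# narrow the acoustic search space or drive result pre-fetching.
from collections import Counter, defaultdict

def train_bigrams(sentences):
    model = defaultdict(Counter)
    for s in sentences:
        words = s.split()
        for a, b in zip(words, words[1:]):
            model[a][b] += 1           # count each (word, next word) pair
    return model

def predict_next(model, word, k=2):
    # Most frequent continuations of `word`, best first
    return [w for w, _ in model[word].most_common(k)]

corpus = [
    "show me pictures of humpback whales",
    "show me pictures of cats",
    "show me the weather",
]
model = train_bigrams(corpus)
print(predict_next(model, "pictures"))  # ['of']
print(predict_next(model, "me"))        # ['pictures', 'the']
```

If the client knows "pictures" is almost always followed by "of," it can bias the recognizer toward that word and even start fetching results before you finish speaking.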

I distinctly remember Google talking about them having achieved client-side voice recognition at the last I/O, but I can't find anything about it now.

They did, but I think performance is degraded when there's no connection. When there is a connection, it's some kind of crazy hybrid approach.

Wow, imagine if Apple actually exposed APIs that allow Siri to do what it does? This app would destroy Siri.

Edit: I'd go so far as to say this is eerily similar to the issues levied during Microsoft's antitrust case. Google is clearly unable to compete here for no reason other than the artificial walls Apple has put up in its devices' software. This is mobile's IE vs. Netscape.

Is it? Apple doesn't have a monopoly on the smartphone market. The objection to Microsoft was that it was using its complete and utter dominance of desktop computers in anti-competitive ways.

So far as I can tell anyone who doesn't like iOS' walled garden can pick up and leave - to the market leader, Google.

This doesn't at all seem like monopolistic behavior, just rather restrictive and perhaps unwise.

Yeah from a monopolistic point of view, it's not really the same. But there is an argument to be made that Apple has a monopoly on the tablet market right now.

My point was less about the monopolistic nature and more that this is very similar, from a product perspective, to IE vs. Netscape. Microsoft used its internal platform APIs as a massive amount of leverage to force IE down people's throats, even though Netscape was technically superior at the core task of rendering web pages. The seamless integration caused IE to win out and eventually catch up. The stark disadvantage of Google's app is nearly identical in nature to that of third-party browsers in the mid-1990s.

  > Netscape was technically superior on the core ability of
  > rendering web pages
No. Unless you are comparing IE3 with Netscape 3. NN4 and onward got worse; IE4+ got better. We now call it a crappy browser, but by the time IE6 was out it was a clear leader (well, some may argue for IE5 on the Mac). I still remember the first widespread CSS bits that started to appear in the wild with IE4: hover on links, fixed backgrounds. It was possible to duplicate the hover behaviour on NN4, but it was nightmarish.

Yeah, I was actually comparing IE3 with Netscape 3, which was the point at which Microsoft decided to integrate it with Windows Explorer and the point at which Netscape's fate was sealed.

The big difference being, Microsoft could make a better browser than Netscape, if they put their best team on it. I don't think Apple can compete with Google at search, unless they want to spend several billion a year (as Microsoft does).

This is the first time I've used any Google voice recognition. I use Siri daily (mostly for setting reminders and checking sports scores) and find it works well. I was shocked at how quickly Google was able to convert my speech to text and get a result; it was almost real time. I was considering trying an Android device because I really like the look of Google Now, and by letting me try the voice recognition part on my iPhone, Google may have got me. Only problem is that I'm so locked into the Apple ecosystem. I think this is becoming a problem and hindrance to competition. People spend so much money on apps, and have to select specific music/video services for each phone OS, that it becomes very costly to switch regardless of which phone has the best features and technology.

> Only problem is that I'm so locked into the Apple ecosystem. I think this is becoming a problem and hindrance to competition.

No shit. Seriously, this isn't just dawning on you now, is it? This is the exact history of the PC's victory over the Mac, the initial leader, and it's playing out in a similar way with iOS and Android today. It's exactly why everyone has been pouring billions into mobile development - everyone wants to be the next MS with a monopoly on the OS, because that's the natural outcome when the barriers to conversion are so high.

It's not just dawning on me, but it has gotten worse in the last year or two. Because of the deep integration of iCloud in both OS X and iOS, switching doesn't just mean losing the apps I've bought; it means a difficult migration process to different cloud services. It's possible, if a little time-consuming, but for non-geeks it'll be very difficult. Not using iCloud in the first place would be the best solution, but then I lose the (usually) seamless syncing experience that makes my device so useful.

I tried this app on my wife's iPhone and I swear the voice transcription is _faster_ than my GS3. It looks faster than the Jelly Bean demos you can find on Youtube too. Whatever magic they are doing on iOS, it would be nice if they brought it to Android soon as well. :D

I was going to suggest Amazon's Cloud Player music service, EXCEPT no app in the App Store. Did Apple block this?

My mistake, sorry. Just found the app.

I'm pretty sure iOS is the only one with special OS-specific DRM.

I have always felt that it was a weakness of Siri that there wasn't a direct connection to a world-class search engine. The whole "I don't know, Google it" exit point in the interaction flow really is such a huge hole. The question then is: can iOS voice assist compete with Android voice assist when Android comes with a readily accessible search engine? It's an interesting marketing challenge.

Wow, wow, wow.

Apple vs. Google, style vs. big data. While I love Apple's sensibilities, here's more evidence that data will win in the long run.

Though in this case Google has Apple beat on style as well. The Siri presentation looks extremely lacking next to Voice Search.

"Siri, open the Google Search App."

"Sorry, I can't help you with that"

Of note: this same "type it as you speak" feature, also using Google's voice recognition, is available on Android phones using the SwiftKey keyboard replacement, which means it will work wherever text input is available, be it Facebook, a text, an email, etc. It works so well my mind was blown, and I actually use it.

Apple of course does not allow keyboard replacement either, so we're all stuck with the crappy voice recognition in iDevices (I have several, statement of fact, not fanboy-ism)

One refinement... the best part of the SwiftKey implementation is support for multiple languages at the same time. Switch input language, two clicks, and it will work. Apple doesn't allow you to switch language context (keyboard, yes, voice input, no).

Let's keep in mind here that one of Siri's core strengths, and why it must be a system-level service, is that it can delegate queries to apps and 3rd party APIs. As impressive as Google's Voice Search is, it cannot execute tasks for users (reminders, setting appointments, sending messages) and, what's more, it would probably be a huge security hole to let it access apps, the data of those apps, and execute code. This is the job of the OS. So let's be cautious about trumpeting this as a Siri replacement. At best it's a Nuance or Wolfram Alpha replacement.

> This is the job of the OS.

No, Apple has made it a job of the OS. There's absolutely nothing fundamental to the problem that makes it a job of the OS.

This is true up to a point. To my understanding, in a sandboxed environment a third party AI can only work if it is incapable of choosing which application should execute the intents. All it can do is determine the intent and pass it along to the OS, which then presents the user with a confirmation or, in the case of multiple apps receiving the intent, a tiebreaking interface.

A possible workaround would be a cumbersome two-way permissions system ("can app X access app Y?" [and, for purposes of apps asking the AI to ask follow-up questions] "can app Y access app X?", ad nauseam), but this is something of an impractical solution, because the AI would need to be granted permission every time the user installs a new app to access that app.

So yes, Apple makes it the job of the OS. This is a design compromise that mitigates the risk that the user is overburdened with confirmation dialogs, choice dialogs, and/or permission dialogs. And for an AI that just works, I would say this is pretty key.

Android has a centralised 'account' system, to access things like twitter, facebook, gmail etc. These can be used for applications to connect to these services. Android also provides permissions for accessing contacts, call logs, calendars and latitude. So in some sense, Google have had to make Android do this as well, but it would be hard to do it with the way iOS currently works.

As others have hinted, Android already enables this via intents. Google Now doesn't even have to be aware of the app you want to use to handle a specific action. For example, it just has to fire off a "schedule an event" intent, and Android will route that to your preferred schedule management app. There is no security hole here - it's simple event firing and consumption, except it happens across apps. Security is handled by allowing the user to select which app they want to consume the event, and annoyance is mitigated by allowing users to specify that the selected app should always be used as the default. It's a marvelous system, and is one of the design decisions that, IMO, makes Android a massively more powerful platform than iOS.

Siri must be system-level because Apple has prevented other apps from being capable of assuming system default roles. It is an effect of iOS's design, not an inherent flaw in mobile OSes.
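The intent routing described above can be sketched as a toy model. To be clear, this is plain Python illustrating the dispatch pattern, not the real Android API: apps register for actions, and the "OS" routes an intent to the user's chosen default handler, falling back to a chooser when several apps qualify.

```python
# Toy model of Android-style intent dispatch (illustrative only, not the
# android.content.Intent API): apps declare which actions they handle,
# the router delivers an intent to the user's default or the sole handler.

class IntentRouter:
    def __init__(self):
        self.handlers = {}   # action -> list of app names that can consume it
        self.defaults = {}   # action -> app the user marked "always use"

    def register(self, app, action):
        self.handlers.setdefault(action, []).append(app)

    def set_default(self, action, app):
        self.defaults[action] = app

    def dispatch(self, action, extras):
        apps = self.handlers.get(action, [])
        if not apps:
            return None  # no installed app can consume this intent
        if action in self.defaults:
            chosen = self.defaults[action]   # user picked a default earlier
        elif len(apps) == 1:
            chosen = apps[0]                 # only one candidate, no dialog
        else:
            chosen = apps[0]                 # real Android shows a chooser here
        return f"{chosen} handles {action} with {extras}"

router = IntentRouter()
router.register("CalendarApp", "schedule_event")
router.register("OtherCalendar", "schedule_event")
router.set_default("schedule_event", "CalendarApp")
print(router.dispatch("schedule_event", {"title": "Lunch"}))
# → CalendarApp handles schedule_event with {'title': 'Lunch'}
```

The key point the comment makes survives even in this sketch: the voice assistant never needs to know which calendar app exists; it only fires the action, and routing plus user choice happen in one central place.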

I responded to a similar objection here, same thread: http://news.ycombinator.com/item?id=4719725

That's what Intents are for. You can have intents that can talk between different applications in a standard format.

Security is just an excuse for not wanting to try and enable this.

>>Security is just an excuse

Not when it comes to giving Google access to my personal data.

But it _could_ issue reminders, set appointments, and send messages if channeled through Google's servers, couldn't it? I'm sure there are many iOS users like myself who point the default apps at Google's calendars and mail. If the app calls back to Google and says "put x on calendar for authenticated user y", then it would show up in the iOS apps.

From my basic understanding of Android it could just use an intent to at least pre-fill some forms with it.

This is exactly how I feel, there is no comparison, Siri is king of the hill for an engineer like me. I use it all the time (in very noisy environments, think power-plant levels of noise) to set appointments, countdown timers/alarms, dictate texts, open applications (as well as play songs or other content), and ask it engineering calculations with the help of Wolfram Alpha integration. Google's attempt can barely understand me at times and is useless in every other sense.

Heavily considering Android now after this.

Between this, the unlocked/contract-free prices for the Nexus 4 and Google Maps? I'm typically an iOS guy and I'm right with you. If only they had put LTE in the Nexus 4!

HSPA+ at 42 Mbps is still Pretty Damn Fast, even if it's not LTE. I'd have loved for there to be an LTE variant, but I don't think it's a dealbreaker given the alternatives.

Isn't the LTE issue overblown?

Yes, for the moment. LTE will be standard in a couple of years' time, and by then voice over LTE will (hopefully) have arrived, so we won't need the ridiculous hack of falling back to the 3G connection for voice, with its negative effects on battery life.

Until then, LTE is for those who have to have the latest thing regardless of price, or who actually need the data rates achievable with LTE and have coverage where they're likely to be using it.

>>Until then, LTE is for those who have to have the latest thing

Um, no. There's a mountain of difference between HSPA+ and LTE. The former is like having medium-speed cable Internet, and the latter is like having FIOS. This has a direct impact on user experience.

From my non-scientific tests, it seems like LTE has much lower latency than 3G networks. That's a big deal for me.

Ask it "how much wood would a woodchuck chuck ... " and prepare to laugh.

Why would this be down voted when the blog post suggests you should do this?

Just getting a generic web search, what am I missing?

You have to say the whole line (including the "if a woodchuck could chuck wood" piece).

Agree with everyone else, on LTE this is blindingly fast. I also played with the Goggles feature, which read text on a watch face, identified the building I work in, and seemingly instantaneously OCR'd text on a postcard!

Haven't yet tried on iOS, but on Android I was pretty amazed when I realized their voice search actually recognized some pretty complex words spoken in Finnish. And the market share of our language is pretty small (around ~6M speakers).

I believe Google must be doing something right on the voice frontier when they can accomplish things like this. They must have some pretty efficient methods for teaching the system new languages.

Too bad Google isn't doing that much advertising of this for its own Android phones. Because Samsung sure as hell won't do it. They'd rather advertise their own bad replica of Siri.

Why doesn't Google release such a thing for desktops? Voice commanding my PC or using it as a kind of a personal assistant has been my dream for like 10 years. I've tried quite a number of apps and nothing compares to Siri or Google's voice search.

You can kind of do the voice search stuff already (click the microphone on the Google search box). Desktop searches seem more reluctant to fork over Knowledge Graph results for some reason, though. Voice Actions, of course, aren't possible from a browser.

The Google search app on Windows 8 has this

Also happens to be one of the only apps in the store that's only available on x86.

Surprised no one's mentioned the goggles search. I was pretty blown away to maximize an application on my desktop, snap a picture and have it recognize it (down to the version number, for Outlook 2010)

you must try the sudoku thing

This is awesome.

Note for non-Americans: The app only speaks results back if your selected voice search language is 'English (US)'.

Still doesn't speak back to me

Try resetting the app by exiting it and removing it from the multitasking bar to force it to load from scratch on the next run.

That should work, because I've noticed the app is a little buggy and occasionally gets into a confused state where voice search, voice responses or both stop working. Doing that reset fixes it.

This may be a stupid question, but is there an equivalent Android app?

This functionality is part of Android 4.1 (Jelly Bean) and later. You can access it through the "Google" app, the search widget on the home screen, the lock screen, etc.

And on phones with soft buttons from anywhere just by swiping upwards from the soft buttons.

Wow, this is fantastic. Eddy Cue has a tough road ahead getting Siri even to this level, I think.

Google, please release your maps app for iOS now!

Perhaps in order to compete, Apple will need to get more nimble. For example, in iOS3, I could use voice recognition on my 3GS to play songs, skip forward, call people, etc (similar to Google's voice control) - it did not need an internet connection.

As of iOS 5, you couldn't get this if you enabled Siri. So in order to do mundane stuff, you still had to make a round-trip to an Apple server.

They should revert this so strictly local, mundane commands don't incur wireless latency.

The inflection is nice, especially at the end, sometimes falling, sometimes rising. I guess this isn't yet available on desktop voice search?

Since google knows so much about me, can I say:

- what movie should I see? - book tickets - mark the route to the cinema

to eventually (with a self-driving car):

- take me out to the movies, google

It's weird that this is available on iOS, but not on Android yet.

Even if you're lucky enough to live in the US and can use Google Now (voice search like in this iOS app, and like Siri), you won't have the same features available. If you, God forbid, change your language away from US English to something like horrible UK English, the features are disabled and it just becomes dictation instead.

I'm sorry about being a little bit annoyed, but I can't understand at all why Google put in place extremely stupid restrictions that require their users to hex-edit their binaries in order to get access to the available features. It's moronic.

What features are available in the iOS voice search that aren't in Google Now? I live in Canada and every example in the video I can do on my gnexus.

For some reason this app seems to miss a lot of its advertised functionality in Germany, or maybe just on my phone. The image search results aren't displayed in a scrolling slideshow at all, instead each result links to the desktop version of a typical image search result, showing the preview and information in a sidebar and the embedding web page on the left.

Yes, the voice recognition is very fast, but then again, most questions only work in English, no chance to get anything useful in German or other languages. That's not really competition to Siri in this department.

Very cool. I was pissed that Siri was not available for my iPad 2 but this works well. Nice!

Impressively fast. Is the voice recognition being done exclusively on Google's servers or is there a local component?

With as fast as it's working, I would be shocked if a chunk of it wasn't being done locally. I think the refinement may be server-side (i.e., looking at the words in context against popular search strings to see if it may have misinterpreted a word in the search string).

Android 4.1 introduced local voice recognition, and I expect they put some of that juice into this product. It works incredibly fast in either case.

I would guess all server side but maybe they extract features on the device to save some processing/data.

It's so, so fast, but seems to have trouble with accuracy, at least when I've tried speaking to it. It almost seems to decide on what I've said earlier in my sentence before I finish my sentence, which I'm guessing is lower accuracy compared to waiting for me to finish talking before analyzing.

Except it seems to retroactively change its guess of what you said earlier based on what you said later.

I haven't used an iOS device since I switched to the Galaxy Nexus about 5 months ago. How does this compare to Google Now on android 4.1?

In general I've been very pleased with voice search on Google Now -- just reading the blog post I wasn't too amazed by the examples they gave for iOS because it sounds identical to what Google Now provides. I assumed that Google would release these features for Android before iOS, but am surprised by the overwhelmingly positive comments others here have. Can anyone do a comparison and shed some light?

I do have to say that Google Now is sometimes rather slow -- the voice recognition is very fast (type as you talk realtime) but web search can sometimes take 10+ seconds to load even when already connected to wifi. Other times, it just works.

I'm seriously thinking about switching, but the number of phones and features is a bit confusing.

If you're switching from an iPhone, and can make do with 16GB of space, buy the Nexus 4 once the reviews come in (unless the reviews show something startlingly wrong). The nexus phones are where you get the iPhone equivalent experience of fast, long term updates and no carrier BS. (Except the Verizon Galaxy Nexus, which is why Verizon isn't getting the Nexus 4).

Alternatively, if the 16GB isn't enough, or you must go Verizon, your choices are either the HTC One X or the Samsung Galaxy S3. Read the reviews and see which suits you better.

There are a horde of cheaper phones, and yes, the choice in the mid-to-low end is confusing. But if you're switching from an iPhone, those aren't aimed at the same market. You can get one of them, but unless you're still using a 3GS it will be an inferior experience.

The Galaxy Nexus had an awful screen though. Not sure about the Nexus 4.

The Nexus 4 has a 320 ppi IPS display. From what I've heard, it's the best you'll find on any phone right now. No more pentile AMOLED.

The GNex has a pentile screen. Reports are that the Nexus 4 has an IPS display, so it should be appreciably sexier.

Am I doing something wrong? It plays the beep noises even if I have my phone set to silent (iPhone 4S).

I know that Siri works this way as well (which I also find irritating). Why won't it obey my settings!

I think it's terrible. Perhaps it's just my Scottish accent, which is probably one of the harder English accents to account for, but it's especially bad. Siri doesn't have very many problems with me, but the Google services on my iPhone and now guffed Nexus 7 were really terrible, to the point it was embarrassing how inaccurate it was.

I use Siri a lot, and it's powerful because it actually feels part of the phone. I can actually do useful things with it.

I'm surprised it still "misses" on a lot of phrasings and sends the search to Google which is generally useless since Google loses the context. Especially phrasings that Siri advertises as working.

"how far is it to X" doesn't work in Google but works in Siri. "how did (sports team) do" doesn't work in Google but works in Siri. "what movies are playing" works but "where is Argo playing" doesn't. And this is just weird because "where is Looper playing" works.

Now if Google really wants to finish the job and stick a fork in Siri, they should make a developer SDK available so other iOS apps can use this technology. Right now, the only ad hoc speech-to-text service available is Nuance's (which, yes, powers Siri), but the API rate restrictions (and pricing tiers) make it a much less than satisfactory option.

My take on Google Voice Search vs. Siri, including the mystery of the missing NHL standings data. http://www.forbes.com/sites/anthonykosner/2012/10/31/google-...

Irritating how the app doesn't let you clear your search terms. You can only delete a search term by entering a new one, even with "search history" disabled. Really tweaks my OCD sense.

Great app technically though. Hopefully this'll push Apple to make Siri more responsive.

It does not read the results back to me. Is there something wrong with my settings?

No, it only speaks the result if there is a single answer to say. If you ask it what 15% of 51 is, it will speak the result. But if you ask a broad question (what is quantum mechanics) it will just list all results.

I tried 15% of 51 and also queries like what is the population of Tokyo similar to the demo. It does not seem to want to speak the results back.

Which phone do you have? I suspect this could be disabled on older iPhones. I have a 4 and I don't get any voice feedback.

I tried it on iPhone 4 as well and it does speak some answers. Touch the "i" button on the left of the red microphone button and make sure "Speak answers back" at the bottom is turned on.


I can see the "i" button in that video, but my app does not have it, and I don't get voice answers.

Edit: Switching to English (US) does give vocal answers, but then it doesn't seem to understand my South African accent anymore.

I am on the iPhone 5

For some reason I had English (Australian) active in the Google language settings even though I don't live there. It started reading the answers when I switched that to English (US).

Yes, that did the trick! English (US).

Under "Voice Search" in settings try changing the default language to US English.

I want this for my Gingerbread phone. I am at the mercy of Motorola for updates and they won't give them. So now Google can at least update the voice search app on my phone to this, hopefully.

> I am at the mercy of Motorola for updates and they won't give them.

I know that feel bro. That's why I dumped my Samsung Droid Charge and got an iPhone 5.

So frustrated not getting any updates.

It's great and a black eye to Apple, because Google showed that it can deliver the same thing that Apple has failed to deliver by making Siri available only on the iPhone 4S and above.

I was getting into the video, but the responses by the anonymous users playing with the iPhone are annoying. "Show me pictures of whales", shows some whales, "COOL!!!"

Why do I get kicked into safari when I search for directions and click on the map? Whereas when I click on external web sites it loads in a tab within the app?

If this is the same one on android, I don't see where the excitement comes from.

I love their image search though. Take a picture of a painting, and it shows you a lot of info.

It can recognise my Indian English accent and feels instantaneous. I've moved this app to my home screen.

Now Apple needs to catch up on Siri.

A few fun things to ask:

"Who made you?", "Tell me a joke?", "Who am I?", "Will you marry me?"

Some years ago, I was showing off my Nexus One to my sister, and she was so impressed with the voice-to-text that she exclaimed "Wow! Can Google really do that?". The resulting text from Google was "Yes we can. Google is amazing. :-)"

Never been able to reproduce it though.

I've been waiting for this feature since I upgraded to an iPhone...

Between Google Now, local voice recognition and Google's recent attempts to extract more factual data out of search results, they've created and are expanding some amazing stuff.

With the data that Google has, I can ask it math questions, ask it questions about release dates of movies or video games. And now I can query that data through Google Now (or will be able to as they pipe through from that dataset to exposing it through Google Now).

Funny, even with some of the features just in 4.2, Now became as much or more of an assistant than Siri. I still can't get over it will scan my email for packages and give me notifications about it. That to me is the epitome of why I love what Google does. They are good at data.

This is great and I will use it right up until it starts showing ads. I really hope Google can find another income stream because ads are a deal breaker for me. The first time Google maps dropped two pins on a location search, one an ad, was the last time I used Google maps.
