
Google Handwrite - sama
http://techcrunch.com/2012/07/26/search-in-cursive-google-now-lets-you-hand-write-search-queries-on-phones-and-tablets/
======
jonknee
In case anyone is interested, I watched the network requests go by to see how
it works. Google is doing the detection server-side (no surprise there, but it
was smooth enough that I had to check) and receives the drawing area
(writing_area_width, writing_area_height), the "ink", which is an array of
points, and any characters already in the search field. The return data is
the best guesses for the input. Curiously, multiple options are returned; I'm
not sure how those are used. Another AJAX request is made for Google Instant.
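Roughly, you could sketch the payload like this (a hypothetical
reconstruction: writing_area_width/height and "ink" come from what I saw, but
the exact stroke/point encoding and the name of the pre-existing-text field
are guesses):

```python
# Hypothetical sketch of the handwriting request described above.
# Field names other than writing_area_width/height and "ink" are invented.
import json

payload = {
    "writing_area_width": 320,
    "writing_area_height": 240,
    # "ink": an array of strokes; each stroke a list of sampled
    # (x, y, t) points traced by the finger.
    "ink": [
        [[10, 12, 0], [14, 15, 16], [20, 22, 31]],  # first stroke
        [[40, 10, 250], [42, 30, 270]],             # second stroke
    ],
    "pre_context": "goog",  # characters already in the search field (assumed name)
}

body = json.dumps(payload)
print(body)
```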

------
ericdykstra
Fun feature. Would be killer with support for non-Latin-based languages.
Imagine walking down a street in Japan, seeing a restaurant, and wanting to
search for reviews. Oh no! It's written in kanji and you have a hard enough
time reading basic words, much less names. Open up Google on your phone, write
the kanji and add the Japanese word for review after it, and off you go.

Great potential, anyway.

~~~
Permit
Judging from the commercial it does. Although I won't embarrass myself by
guessing the language.

<http://www.youtube.com/watch?v=uyeJXKfAcpc>

~~~
wheels
That was Japanese (hiragana, one of the scripts that Japanese uses).

~~~
w1ntermute
The search was actually in kanji, not hiragana. And the first result (a
Wikipedia page) had sakura in katakana.

Sakura in kanji: 桜

Sakura in hiragana: さくら

Sakura in katakana: サクラ

------
mikek
I wonder if, like (800) GOOG-411, the primary reason Google is doing this is
to gather data.

~~~
mikek
Why am I being downvoted for this?

~~~
mikeevans
I'm guessing because everything Google does is to gather data.

------
agscala
Basically, Google wants more data to improve its handwriting recognition for
something else down the line. Smart move.

~~~
SubZero
Now THAT would be amazing. Incredibly smart move. What better way to create
your own handwriting recognition program than to have all 4 billion people
field-testing it for you.

~~~
SoftwareMaven
They've already done that. That was the entire purpose of Google 411 (except
with voice instead of handwriting).

------
grantjgordon
This works _remarkably_ well. I'm shocked.

~~~
RandallBrown
Yeah, I've done handwriting with a stylus on Windows and Mac before and had
much worse results than I'm getting with my finger on this. Very impressive.

------
dazbradbury
OK, so Google wants to improve its handwriting recognition. Fine. But from a
user's point of view this just tells me one thing:

- Using an on-screen keyboard is such a pain for people that Google changed
its _HOMEPAGE_ to accommodate this.

Why does this irritate me?

1) None of the latest generation of smartphones are coming out with a hardware
keyboard. The best are always keyboardless.

2) Stylus-based input, which is the other extremely fast form of text entry,
has all but died out (I know some products still exist - but it was
essentially dead when capacitive screens won out).

Hence, we've gone backwards from 10 years ago. It was faster for me to send a
text/email from my phone 10 years ago than it is today. That is hugely
disappointing.

~~~
asto
Yes, this exactly. I desperately want to move from my BlackBerry to an Android
phone, but all the best phones seem keyboard-less. Typing on a touch screen is
a major annoyance for anything more than a search string or an SMS.

~~~
hessenwolf
Go horizontal and use two thumbs. You get pretty quick. There are likely other
techniques.

------
swombat
TechCrunch appears to be mostly unreadable on the iPad, thanks to their fancy
template reloading/redirecting... Smart.

~~~
dredmorbius
Mostly unreadable on many platforms, along with their affiliate sites,
especially with JS disabled.

------
danyork
I'll note that despite the TechCrunch headline about Google Handwrite
accepting _cursive_ handwriting, it doesn't really do so. In fact, Google's
help pages specifically say you should use block printing versus cursive. I
tried some cursive and the recognition didn't work all that well. However, the
block printing was generally quite accurate.

I wrote about some of my views on this at:
<http://www.disruptiveconversations.com/2012/07/google-now-lets-you-handwrite-search-queries-on-ipad-iphone-android.html>

I actually see great value in it for times when I'm walking around a
conference with my iPad and want to do a quick Google search. I can't really
use voice (i.e. Siri), and I can't use two hands to type. This handwriting
could be an interesting option.

------
mtgx
I wonder, do the latest versions of Android support third-party Wacom
styluses? Although I'm not sure Wacom even sells third-party styluses for
capacitive tablets that are as accurate as Samsung's S Pen or those old
Windows tablets.

The reason I'm asking is that I'd like to buy, say, a Nexus tablet to use
with a stylus, but I want the same level of accuracy as Samsung's S Pen or
better, without having to buy Samsung's Galaxy Note devices. Do those need a
special panel as well to get that kind of accuracy, or do they only require
that Android has the necessary APIs and support for those styluses?

------
dredmorbius
As anyone who mastered PalmOS's Graffiti can attest, sometimes writing on a
PDA / smartphone is a lot more usable than typing, especially on a soft
keyboard.

Input remains the weak point of many or most of these devices.

~~~
gbog
The beta version of Swype is really a major improvement. I'd say I can write
faster on my phone than on my PC, under certain conditions. The much slower
part is copyediting, but that's mostly because sites like HN break the flow
with some fancy textarea tweaks.

------
realmickey
When is Google's/Android's competitor to OneNote going to be available?

~~~
stcredzero
Android/iOS/Metro -- whichever one has a properly executed answer to OneNote
is going to succeed like gangbusters.

(Putting it that way, it speaks to serious dysfunction at Microsoft if it's
not them, and indeed, there's a good chance that it may not be them.)

~~~
rhplus
You're right that note-taking will be one of the killer apps for the next
generation of tablets. OneNote for Metro ("OneNote MX") is already available
in beta, so I'm guessing Microsoft is chasing that:

<http://apps.microsoft.com/webpdp/en-US/app/onenote-mx/f022389f-f3a6-417e-ad23-704fbdf57117>

<http://www.winsupersite.com/article/office-2013-beta2/office-2013-public-preview-metrostyle-onenote-143721>

------
ramanujam
Draw a heart and it'll spell it out. Nice!

~~~
Splines
It handles diacritics too, which is handy.

------
tsahyt
From a technical point of view this is remarkable. Seems like the recognition
works really well, and that's great.

However, from a usability point of view this is probably awful. I've had an
app on my phone for a while now that doesn't do handwriting recognition but
just stores the handwritten stuff as images. Using that without a proper
pen-like stylus is a pain. Writing with your fingers just doesn't really
work. Fingertips are not accurate enough for this kind of movement, which is
precisely why we guide a pen with multiple fingers instead of just one.

Additionally, it's _much_ slower than just typing it out, even on the sucky
on-screen keyboard that you're provided with these days.

So it's slow and surprisingly hard to use. And frankly, I don't see styluses
making their big comeback next year. People are just too addicted to touching
things with their fingers.

------
dag11
This is pretty cool, and works really well! For me, though, Jelly Bean voice
search from my lock screen is still the quickest.

~~~
setheron
My company has forced a PIN-pad lock screen on my Jelly Bean.

Can't seem to find the voice search from there :(

~~~
dag11
I use pattern unlock, but I installed WidgetLocker to give me a primary
lock screen where I was able to make one of the swipe positions a Google Now
shortcut and another a Google Now Voice Search shortcut, in addition to the
two existing positions: a Camera shortcut and the Unlock shortcut.

------
pragmatic
Note: this doesn't seem to work in the Opera Mobile browser. It seems to work
OK in the default Android browser.

~~~
MatthewPhillips
Google doesn't release mobile products for any browser except WebKit-based
ones. They specifically whitelist.

------
wasd
I wasn't able to find the settings page in Chrome while I was signed in. Am I
blind, or was this a UI oversight?

~~~
tonfa
It's at the bottom of the search homepage.

------
ivanilla
It doesn't work on my iPhone with iOS 4. Anyone else having problems getting
it to work?

------
darkstalker
Can this be activated on desktop?

------
jenius
Sometimes opening a keyboard and typing is just too efficient. Sometimes you
need a technology that will slow your life down and make it more prone to
errors. Google handwrite: why type when you can awkwardly scribble?

~~~
bokonon
I agree with you, but at the same time I think it's great that somebody is at
least trying to innovate on the tools we use to interface with technology. I
know handwriting recognition isn't anything new, but the idea that the
keyboard is all we will ever need can only contribute to the stagnation of
advancement.

I'm still waiting for the day I can telepathically control all my devices.

~~~
stcredzero
There could be a lot done with eyeball tracking. We already have setups that
can accurately determine where you will be looking a fraction of a second _in
the future_.

Eyeball tracking combined with speech or handwriting translation could be
quite powerful. The eyeball tracking could provide a lot of contextual
information to make speech/handwriting recognition more accurate.
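As a toy sketch of that idea (everything here is invented for illustration,
not any actual Google API): a recognizer's candidate list could be reranked by
boosting words the user's gaze recently landed on.

```python
# Toy gaze-context reranker: boost recognition candidates that match
# words near the user's recent gaze fixations. All names are hypothetical.
def rerank(candidates, gazed_words, boost=0.3):
    """candidates: list of (word, score); gazed_words: words near recent gaze."""
    gazed = {w.lower() for w in gazed_words}
    rescored = [
        (word, score + boost if word.lower() in gazed else score)
        for word, score in candidates
    ]
    # Highest adjusted score first.
    return sorted(rescored, key=lambda ws: ws[1], reverse=True)

# "form" and "farm" are confusable in speech/handwriting;
# gaze context ("Farm", "Tractor" on screen) breaks the tie.
print(rerank([("form", 0.51), ("farm", 0.49)], ["Farm", "Tractor"]))
```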

~~~
bokonon
_We already have setups that can accurately determine where you will be
looking a fraction of a second in the future._

Do you have a link to more information about this? That's really impressive.

~~~
stcredzero
One's saccadic eye movements have a very regular acceleration profile, so
it's possible to determine where you're going to look from the start of the
saccade.

I could not find the original reference, but there is some highly technical
information here:

<http://books.google.com/books?id=27jCNmafYU4C&pg=PA150>
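A rough sketch of the idea, using the textbook "main sequence" relationship
between saccade amplitude and peak velocity (the constants are assumed,
illustrative values, not from any particular study):

```python
# Main-sequence sketch: peak eye velocity rises with saccade amplitude,
# so an early velocity reading lets you estimate the landing point
# before the eye arrives. Constants are rough, illustrative values.
import math

V_MAX = 500.0   # asymptotic peak velocity, deg/s (assumed)
C = 14.0        # amplitude constant, deg (assumed)

def peak_velocity(amplitude_deg):
    """Main-sequence model: V_peak = V_max * (1 - exp(-A / C))."""
    return V_MAX * (1.0 - math.exp(-amplitude_deg / C))

def predicted_amplitude(v_peak):
    """Invert the model: estimate amplitude from an observed peak velocity."""
    return -C * math.log(1.0 - v_peak / V_MAX)

# A 10-degree saccade produces a characteristic peak velocity...
v = peak_velocity(10.0)
# ...from which the 10-degree amplitude can be recovered mid-flight.
print(round(predicted_amplitude(v), 3))  # → 10.0
```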

~~~
jenius
This is totally incorrect; eye tracking is not sufficiently advanced to be in
any way useful for entering input. There are 'visual keyboards' for people
who are paralyzed which allow them to select a key based on where they are
looking. These, while a fantastic tool for helping disabled people, are not
even close to approaching the speed we achieve with a manual keyboard.
There's an example here: your eye has to linger for at least a second for the
computer to be confident in your choice.

<http://www.denverpost.com/recommended/ci_20768271>

And it only takes common sense to realize that we don't look at every key we
type, and that this makes typing faster. The quickest typists don't have to
look at the keyboard at all, and that increases efficiency. Both handwriting
and eye tracking are much, much slower than typing, no matter what. You can
be the fastest handwriter on the planet, and a moderately talented typist
will still burn the shit out of you. It just takes less time to hit a key
than it does to write an entire letterform.

I realize my first comment was kind of mean and sarcastic, but that was
because this idea is so completely stupid and not progressive at all that I
thought the relatively intelligent community on hacker news would realize this
immediately. All my friends and co-workers who saw it were like "this is
completely dumb"... immediately.

I understand that people like things that are 'different' and 'progressive',
but this particular tool is neither of the above. It's a fun little trick that
is totally not practically useful in any way.

~~~
stcredzero
_> This is totally incorrect, eye tracking is not sufficiently advanced to be
in any way useful in terms of entering input._

Actually, you are totally incorrect. If you look back at the thread, I am not
proposing eye tracking as a sole means of input, but as a means of providing
contextual information to other means of input.

 _> I realize my first comment was kind of mean and sarcastic, but that was
because this idea is so completely stupid and not progressive at all that I
thought the relatively intelligent community on hacker news would realize this
immediately. All my friends and co-workers who saw it were like "this is
completely dumb"... immediately._

Then at least you, or you and your friends, are guilty of sloppy reading of a
level I do not expect on HN. Again, this is not proposed as a primary means
of input, but as an enhancement providing contextual information for speech
and handwriting input.

~~~
jenius
Hey, sorry this is so late. I didn't realize what you meant by contextual
input or how that could be useful, but you're right: I can't deny such a
fuzzy statement. It's definitely possible that this could help something
sometime in the future. Apologies for calling you incorrect. Eye tracking is,
however, not efficient or useful as a main input for typing and will very
likely not take a leading role in helping to recognize word/sentence input to
computers.

On the second statement, I meant my original comment about google handwriting,
the one at the top of this tree - not your comment about eye tracking. So I
apologize again - I must have not been clear in the way I stated that and I'm
sorry you took offense.

