

Researchers use accelerometers to keylog an Android smartphone - ukdm
http://www.extremetech.com/mobile/92946-a-wiggly-approach-to-smartphone-keylogging

======
ChuckMcM
I agree it is overly alarmist, but it is a clever hack using the
accelerometer.

Many people know that the 'Enterprise' class of Seagate SATA drives has a
vibration sensor which is used to adjust the seek algorithm under high
vibration loads. But you can also read it from the SMART pages. If you have an
idle drive on a system, you can sit there reading and re-reading that sensor to
get a sense of whether or not your machine is being shaken, and if you have a
lot of such drives you can make some more interesting observations, not only
about what is happening locally but about the environment around the drives.
Not what Seagate planned, I'm sure, but a fun result nonetheless.

------
tomelders
I'm speculating that it would be possible to disable the accelerometer
whenever a number pad is on screen. That might be one way to defend against
this... I can't think of a reason to have the accelerometer active while I'm
typing numbers.

Of course, someone could then present a fake number pad to the user, but then
someone could also hit the user over the head with a brick and steal their
wallet.

------
eridius
The bit at the end is a bit of FUD:

> but in theory, someone could take the work of Chen and Cai, implement it in
> JavaScript, and then use it to steal your login details and credit card info
> when you surf the web

That's only true if the JavaScript can actually get accelerometer data when a
password prompt is up. I have no idea about Android, and this is pure
conjecture, but I'd be quite surprised if the iPhone gave JavaScript
accelerometer access while the page is in the background.

------
ryanpetrich
The conclusion is alarmist. Web pages can't receive accelerometer or gyro
events when they are in the background on iOS or Android. Applications can't
receive those events in the background on iOS either (possibly also on
Android?)
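
This claim is easy to probe empirically: a page can tally devicemotion events by the page's visibility state and see whether any arrive while hidden. The `devicemotion` event and `document.visibilityState` are real browser APIs; the tally helper below is my own sketch:

```javascript
// Sketch: probe whether a browser delivers devicemotion events while
// the page is hidden. The tally logic is plain JS; the event wiring
// at the bottom only runs in a browser.
function makeTally() {
  const counts = { visible: 0, hidden: 0 };
  return {
    // Record one motion event under the given visibility state.
    record(visibilityState) {
      counts[visibilityState] = (counts[visibilityState] || 0) + 1;
    },
    counts: () => ({ ...counts }),
  };
}

if (typeof window !== "undefined") {
  const tally = makeTally();
  window.addEventListener("devicemotion", () => {
    tally.record(document.visibilityState);
  });
  // Switch to another tab for a while, come back, then check the log:
  // a nonzero "hidden" count would mean motion data leaks to
  // backgrounded pages.
  setInterval(() => console.log(tally.counts()), 10000);
}
```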

~~~
wxs
Would it not be possible to present a sensitive page in a full-screen iframe,
and then capture an approximation of what the user entered into that iframe by
picking up accelerometer events from the outer page? Not a perfect hack by any
stretch, since the URL would be wrong, but potentially problematic.
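
The outer-page attack would look roughly like this. Only `devicemotion` is a real API here; the threshold, the burst detector, and the iframe URL are all illustrative, and recovering *which* keys were hit would still need the classifier from the paper:

```javascript
// Sketch of wxs's scenario: the outer page embeds a sensitive page in
// a full-screen iframe and records motion while the victim types into
// it. Threshold and helper names are illustrative guesses.
const TAP_THRESHOLD = 0.5; // m/s^2 above resting; a guess

// Turn a stream of {t, magnitude} samples into tap timestamps: a tap
// is the first sample of a run that crosses the threshold.
function detectTaps(samples, threshold = TAP_THRESHOLD) {
  const taps = [];
  let inBurst = false;
  for (const s of samples) {
    if (s.magnitude > threshold) {
      if (!inBurst) taps.push(s.t);
      inBurst = true;
    } else {
      inBurst = false;
    }
  }
  return taps;
}

if (typeof window !== "undefined") {
  const samples = [];
  window.addEventListener("devicemotion", (e) => {
    const a = e.acceleration || { x: 0, y: 0, z: 0 };
    samples.push({
      t: e.timeStamp,
      magnitude: Math.hypot(a.x || 0, a.y || 0, a.z || 0),
    });
  });
  // The page would then show a full-screen iframe of the target site
  // and later run detectTaps(samples) to approximate keypress times.
}
```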

------
cambriar
I think it's a great 'out of the box' topic to research.

Initially, I thought it was a couple of students who had pulled this off,
which led me to think, "Wow, what an awesome curriculum". However, the fact
that it was a couple of research scientists doesn't surprise me. It's still
very clever and innovative.

On a side note: having worked for my local school district's special
education program during the summers, I'm very certain that there are many
disabled students who rely on weak technologies as their way of broadcasting a
message. (They cannot simply walk into the classroom and 'text'.)

For instance, I was astonished when I saw this poor girl, in a push
wheelchair, swaying back and forth a little, using a head button* to select a
letter/word/picture/phrase from a 5 x 10 table, scrolling along the x axis,
then the y, to a final destination. 2 minutes later, she said "Good Mrning".
I can almost guarantee that a nice bit of software to detect a few twitches
and movements is far more welcome to them.

*Head Button: Imagine the chair in the picture below with an over-sized arcade button taped to the chair's headrest. The display she used looked worse than an old Gateway of mine.

<http://imgur.com/2cQ7h>

~~~
stevenbedrick
If you're interested, the field that studies these sorts of technologies is
called Augmentative and Alternative Communication (AAC), and it is a very
active area of research with a lot of great stuff going on. My department does
a lot of work in this area; drop me an email if you're interested and I can
point you in the direction of some reading.

You are 100% correct that current AAC solutions are often suboptimal, but one
of the big challenges facing the field is that individual capabilities (both
in terms of cognitive function and motor control) vary _extremely_ widely from
user to user, and so there's no such thing as a "one-size-fits-all" solution.
Oh, and head buttons like the one you saw typically use "over-sized arcade
buttons" because the user doesn't have the fine motor control needed to
operate anything smaller. The idea is to give them as big and easy a target to
hit as possible. There are users who, literally, are limited to twitching one
of their eyebrows as their sole voluntary muscle motion, and sometimes even
that is unreliable (i.e., it only twitches some of the time, or for a short
period of time). In such a case, you want the button to be as easy as possible
to trigger (think Fitts's law). :-)

Accelerometer-based solutions such as the one described in the article have
the potential to be extremely useful; however, in the case of a user who's
limited to a head button (or an elbow button, a nose switch, etc. etc.), the
physical movements involved could well be too variable and irregular to be
usefully decoded. That said, one of the active areas of research within the
AAC world is how to build machine learning algorithms into AAC software such
that an individual's system adapts over time to their patterns of use, so
depending on what's going on with a particular user it might be possible to
train something up... but, of course, doing this in any kind of repeatable way
often ends up being a crapshoot because, again, users vary incredibly widely
in which muscles they can control and the extent to which they can do so.

------
d_r
Sadly, very hard to read on my iPhone because the "mobile layout" magic makes
the page erratically jump up and down when I scroll.

------
barista
So why is this specific to Android again?

