
ExtraSensory Dataset and ExtraSensory App - yonatanv
http://extrasensory.ucsd.edu
======
Behavioral Context Recognition: automatically recognizing a person's context
(what is she doing? where is she? with whom? etc.) from sensors on everyday
devices (smartphones, smartwatches).

The ExtraSensory Dataset contains more than 300k minutes of data, recorded by
60 people in-the-wild (while they engaged in their regular, natural behavior in
their natural environments - home, work, commuting...). It includes sensor
measurements and combinations of context labels describing the people's
behavior. It is publicly available for free at
[http://extrasensory.ucsd.edu](http://extrasensory.ucsd.edu).
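To give a feel for working with such data, here is a minimal sketch of loading one person's recording as (features, labels) pairs. It assumes a per-user CSV table with a timestamp column, sensor-feature columns, and binary `label:` columns; the column names below are illustrative, not the dataset's exact schema.

```python
import csv
import io

# Hypothetical miniature of a per-user table: one row per recorded minute,
# sensor features plus binary "label:" context annotations (names are
# illustrative assumptions, not the dataset's exact schema).
csv_text = """timestamp,acc_mean,acc_std,label:SITTING,label:WALKING
1443000000,0.98,0.02,1,0
1443000060,1.10,0.35,0,1
1443000120,0.99,0.03,1,0
"""

rows = list(csv.DictReader(io.StringIO(csv_text)))

# Split columns into sensor features and context labels.
label_cols = [c for c in rows[0] if c.startswith("label:")]
feature_cols = [c for c in rows[0] if c != "timestamp" and c not in label_cols]

# One (features, labels) pair per recorded minute; labels are multi-label,
# since several contexts (e.g. sitting AND at home) can hold at once.
X = [[float(r[c]) for c in feature_cols] for r in rows]
Y = [[int(r[c]) for c in label_cols] for r in rows]
```

Note that each row carries a *combination* of labels, so per-minute behavior is naturally multi-label rather than single-class.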

The ExtraSensory App is the mobile application that was used to collect the
dataset in-the-wild. It automatically records sensors from the phone and
watch, and it has a flexible UI with many methods for self-reporting your
behavior. You can also run the ExtraSensory App in the background of the phone
to get real-time context recognition (the classifier on the server side
provides probabilities for 51 context labels), and you can integrate these
recognized contexts with your own application. The ExtraSensory App is
publicly available for free (full source code) at
[http://extrasensory.ucsd.edu/ExtraSensoryApp](http://extrasensory.ucsd.edu/ExtraSensoryApp).
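A client integrating with such a service would typically receive per-label probabilities and keep the confident ones. The sketch below assumes a hypothetical JSON payload with a `label_probs` mapping; the real server response format and label names may differ.

```python
import json

# Hypothetical JSON payload, sketching per-label probabilities of the kind a
# server-side context classifier might return (assumed format, not the actual
# ExtraSensory server protocol; label names are illustrative).
response_text = json.dumps({
    "label_probs": {
        "SITTING": 0.81,
        "PHONE_ON_TABLE": 0.64,
        "WALKING": 0.07,
        "AT_HOME": 0.72,
    }
})

def top_contexts(payload: str, threshold: float = 0.5):
    """Return recognized context labels above threshold, most confident first."""
    probs = json.loads(payload)["label_probs"]
    hits = [(label, p) for label, p in probs.items() if p > threshold]
    return sorted(hits, key=lambda lp: lp[1], reverse=True)

print(top_contexts(response_text))
# → [('SITTING', 0.81), ('AT_HOME', 0.72), ('PHONE_ON_TABLE', 0.64)]
```

Thresholding rather than picking a single winner fits the multi-label nature of context recognition: several contexts can be true at the same time.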

Cheers, Yonatan Vaizman
([http://acsweb.ucsd.edu/~yvaizman](http://acsweb.ucsd.edu/~yvaizman)).

