
Project Tango - psbp
http://www.google.com/atap/projecttango/
======
martythemaniak
This is essentially putting a Kinect in your phone and hooking it up with
hopefully high-level APIs. It may take 2-3 years to make it into regular
phones, but when it does it will be huge. Apple's acquisition of PrimeSense
(the makers of the 1st gen Kinect) means they're also working on this.

To give you a real-world example: when I started BarSense
([http://www.barsense.com](http://www.barsense.com)) the core problem was
tracking the path and velocity of a weightlifter's bar. I bought a PrimeSense
camera because it can extract a lot more data and with greater accuracy out of
an image than a regular camera. After some prototyping, I decided to use a 2D
camera and deliver the software as an app because I thought wide distribution
and ease of use were more important than the fidelity and correctness of the
data - ie, the "worse is better" approach. When these cameras make their way
into regular phones, "worse is better" will suddenly become "better".

~~~
evanlivingston
I don't understand the application beyond gaming. The other applications
discussed are basically shopping.

Essentially this would make it much easier to represent the physical world
digitally. But what use cases does a consumer or the average phone user have
for digital representations of the physical space around them, particularly
given that the user is already aware of the physical space around them?
How can this sort of digital device extend our ability to interact with
physical space?

~~~
groby_b
* Mapping the physical world

Ever wanted a floor plan for your home? Just wave your phone around.

* Visual annotation

Add direction overlays, see the plan for a play your team is executing on the
helmet HUD, highlight "dangerous"(weaving, too fast, whatnot) drivers on the
car HUD.

* Integrate sensor data to extend human perception

Add an IR overlay. Sample sound across the room, do a volumetric display of
noise levels. "See" the strength of your WiFi signal.

* Image post processing

You have a 3D map of an area, plus pictures of all textures - rearrange to
your heart's content.

* Alternate Reality

Completely change the look of the world around you, just because you can.
(Semi-useful application: Interior decoration. See that couch right in your
living room before you buy it)

There are tons of applications there. It mixes the "reality" of physical space
with the malleability of the digital space.

~~~
evanlivingston
I think I'm a neo-luddite.

~~~
gilgoomesh
I don't think you're a luddite. With any technology like this, it's important
to ask: what applications does this really enable and will those applications
simply be a waste of time.

This is an unveiling of a technology but the "applications" they show in this
video are probably a waste of time to most users. Google have not shown a
killer app that uses this tech. Maybe they're working on something (they hint
that they may be planning to integrate indoor mapping into Google Maps which
might be interesting) but they're not showing it in this video.

That doesn't mean the tech is bad. But we haven't seen enough to judge it as
useful to end users.

~~~
jrockway
Google Maps already has indoor maps.

~~~
girvo
Yeah but they even said in the video that they struggle with indoor navigation
and positioning.

------
SiVal
"Imagine measuring your room by just walking around in it." Yes, and imagine
transmitting each of those footsteps in real time back to Google, so they can
have a map of your room, too, in case they, or other parties, ever need it.

I was just having a discussion yesterday with a friend who works at Google
about what data they store when you query their search engine. Every single
keystroke, including backspaces, is stored. They don't just know what you ask.
They know how well you can spell and know how well you type, not just in
general but down to specific letter sequences. With this data, they can tell
if you are regularly more impaired (fine motor control) at some times than at
others, or if you're growing more impaired over time and match that against
the content of your queries, etc.

"Phones that don't limit their boundaries to a touchscreen", meaning, we're
not satisfied limiting our knowledge of you to just what we can extract from
what you enter and how you enter it and when on a touchscreen. We want to know
every step you take, when you sit, when you stand, how and where you walk....
SO much more data about you and your world that we can mine for treasure!

I'm not saying that Google is evil. My friends at Google certainly aren't.
It's just that they are like kids in a candy store with unprecedented access
to data and so many great, new algorithms for extracting information from it
that they are just loving it, the way geeks would. But we're really going down
a rabbit hole here.

~~~
blhack
Good god I am so sick of reading comments like this. This view of the world is
just _lazy_. Almost as lazy as "remember, if you're using a service for free,
you're not the customer, YOU'RE THE PRODUCT!", which seems to get parroted
every single time there is any discussion related to google, facebook,
microsoft, instagram, twitter, or any other web company that uses ads as a
revenue model.

Yes, google can know what your room looks like. Google could also know
literally every place you go, every message you send, every transaction you
make through a bank, every website you visit, every photo you take, every
person you meet, when you use the bathroom, when you sleep, etc. etc. etc.

Who _cares_? If you don't like these services, don't use them. Nobody is
forcing you to use this device.

Have you, or any of the other people on HN who repeat this ad nauseam, ever
considered the possibility that the people running google are actually just
hackers who got lucky enough to have the resources to pursue projects that
they think are cool?

I mean...what would you do if you had google's resources? If I had google's
resources I'd probably be doing exactly what they're doing. Things like trying
to improve the broadband situation in the United States, protecting rhinos from
African poachers, developing cool future-tech like self-driving cars, etc.

If you don't like modernity, stop using modernity. Move to a homestead in the
pacific northwest, never use a telephone, never use grid power, grow your own
vegetables, grow your own cattle, and keep away from the evil, scary spies at
_gooooogggllleeee_ who are trying to...uh...give you better targeted
advertisements? Or provide you with better search results?

~~~
discostrings
> Good god I am so sick of reading comments like this. This view of the world
> is just lazy.

I think your view of the world is the lazier of the two--you're basically
saying "let's not worry about the negative implications of the things we
create as long as our intentions are good".

> Who cares? If you don't like these services, don't use them. Nobody is
> forcing you to use this device . . . If you don't like modernity, stop using
> modernity. Move to a homestead in the pacific northwest, never use a
> telephone, never use grid power, grow your own vegetables, grow your own
> cattle, and keep away from the evil, scary spies at gooooogggllleeee who are
> trying to...uh...give you better targeted advertisements? Or provide you
> with better search results?

Here's how your entire comment reads to me: "If you're interested in thinking
about the future implications of the new things we're creating, please shut up
and go live in the woods, because you obviously hate modernity. I'm really
sick of people thinking about the future and how we might be creating things
that will eventually hurt us. The market comes before foresight, and
commentary isn't welcome; when you have potential concerns for the future, the
only acceptable response is voting with your feet. Please do that and don't
ruin our fun."

What you miss is that one can object to a single, potentially dangerous aspect
of recent developments (personal information collection and exploitation)
while embracing technology in general at the same time.

In creating new technology, it's essential that we examine the paradigms we're
creating. If no one's doing that, we're almost sure to run ourselves into
serious trouble sooner or later. There's a reason you look before you leap.

Which is lazy: choosing not to use awesome new technology because it has some
potentially dangerous implications, or just using any new and seemingly
convenient thing that people come up with and assuming it's got to be okay?

~~~
nostrademons
I think the grandparent's point is to focus on things that are _actionable_.
If you are worried about the information Google collects on you and so choose
not to use Google products - fine, that's your right. But if you're worried
about the information Google collects on you and so choose to plaster that all
over message boards every time Google is mentioned - exactly what are you
accomplishing? Do you think the people reading your comment don't already know
what information Google has available? Do you think Google's going to change
its practices because you complain?

~~~
discostrings
> if you're worried about the information Google collects on you and so choose
> to plaster that all over message boards every time Google is mentioned -
> exactly what are you accomplishing?

If you're worried about the information Google collects about /everybody/, and
you're posting in /relevant topics/, I think you're accomplishing quite a lot.
I think it's really important that when new technology is announced,
interested parties discuss both "here are some exciting new possibilities" and
"here are some areas for concern". In an impartial discussion forum, a product
announcement shouldn't just be an excitement-fest.

> Do you think the people reading your comment don't already know what
> information Google has available?

This technology will likely result in Google and others collecting and having
access to new sources of information about people. I think that many people
reading the announcement might not immediately think about some of these
implications, and I think it's a great time for review and examination of the
topic.

> Do you think Google's going to change its practices because you complain?

I think some engineers and others working on this project and other projects
at Google probably read HN, and I think comments here could have an effect on
how they build this product and what data is eventually sent to the company.
Reading these concerns might be a good reminder to reflect on these things
from time to time as well.

> I think the grandparent's point is to focus on things that are actionable.

Action without reflection is a dangerous formula. Google is taking action by
building this and announcing it. What we do at HN is learn, reflect, and
discuss.

------
magicalist
"- Johnny Lee and the ATAP-Project Tango Team"

Johnny Lee was the guy with the awesome Wii Controller demos back in 2007
(can't believe it's been that long).

[http://www.youtube.com/watch?v=Jd3-eiid-Uw](http://www.youtube.com/watch?v=Jd3-eiid-Uw)

edit: here's the full set of demos:
[http://johnnylee.net/projects/wii/](http://johnnylee.net/projects/wii/)
(also, I'm assuming it's the same guy, but his site says he's at Google now)

~~~
yelnatz
I am so happy to see a familiar face when I played the video.

Back in 07/08 when he demoed his hacks in TED it really blew my mind. [1]

I was really hoping someone would pick him up and let him loose on some
projects.

I guess Google did just that.

[1]
[http://www.youtube.com/watch?v=0H1zrLZwPjQ](http://www.youtube.com/watch?v=0H1zrLZwPjQ)

~~~
bradyd
He worked on the Kinect project at Microsoft as well. In fact he put up the
prize money for the Open Kinect contest run by Adafruit to develop drivers for
the Kinect.

[http://procrastineering.blogspot.com/2011/02/windows-drivers-for-kinect.html](http://procrastineering.blogspot.com/2011/02/windows-drivers-for-kinect.html)

------
skywhopper
I wish I could go back to the time when I thought this kind of thing was
awesome. Nowadays, it's just one more aspect of our private lives that'll be
stored on Google's (or whomever's) servers, sold to advertisers, harvested by
the NSA, and abused by law enforcement. Add some object recognition to the 3D
scanning, and you can start getting marketing messages about how much better
[Featured Brand]'s refrigerator would suit your needs, courts will rule that
no reasonable person would expect the interior contents of their house to be
private and require a search warrant, and the DEA will be able to find
something that looks enough like drug paraphernalia in the images from anyone's
home to justify a home invasion of whoever their preferred target of the
moment is.

~~~
rl3
I suppose the good news is that 3D point cloud data in context of a public
setting wouldn't be much more invasive than the already-pervasive video
surveillance we have today.

The bad news is as you say: having mapping data of your private residence
siphoned off by third parties, be it government or private industry.

I would say I'm as excited as I am worried about the potential for this
technology. That said, I think the privacy implications are very similar to
Kinect, which is already here. The only difference perhaps being mobility.

~~~
psbp
Did HN react to the Kinect or Kinect 2 this way? It seems like there's a lot
of burden put on Google to not do stuff that could be creepy even though
that's all they do.

~~~
bri3d
I can't comment on "HN's reaction" because I don't tend to put much stock in
trying to draw a consensus from HN.

However, my recollection is that there was a reasonably vocal reaction to the
privacy implications of Xbox One / Kinect 2 in gaming and tech media, although
they mostly related to the timing of the first Snowden leaks. This article is
a pretty good example of the media hype around it:
[http://www.theverge.com/2013/7/16/4526770/will-the-nsa-use-the-xbox-one-to-spy-on-your-family](http://www.theverge.com/2013/7/16/4526770/will-the-nsa-use-the-xbox-one-to-spy-on-your-family).

I think the "always-on" nature was more criticized than the point-cloud/time-
of-flight abilities of the camera, but pushback against more and more ways to
collect data attached to equipment made by big tech corporations isn't
exclusive to Google.

------
k-mcgrady
Really cool. I love how this could help the blind (my first thought when I saw
how it worked - glad it was acknowledged in the video as a use case). The
stuff Google's working on (and the fact that they're making it public) is
really exciting to me (this, Glass, contact lenses, etc.). The more I see, the
more I think I'm going to spend more time developing for Google's platforms
and less time developing for Apple's. I'm sure Apple has a few cool projects
they'll be sharing this year, but Google seems to be the one pushing the
boundaries.

~~~
connerdc
Since you mentioned it, here is a project I'm working on with blind and
visually impaired users with Glass:
[https://www.youtube.com/watch?v=CEDg0k1HsH8](https://www.youtube.com/watch?v=CEDg0k1HsH8).
I'm interested in extending it using something like Tango.

~~~
k-mcgrady
Excellent work. Tango would make this incredibly powerful although I wonder
how long it'll take to get those sensors packed into Glass.

------
Hermel
I'm looking forward to the first first-person shooter built on Tango. E.g. a
ghostbusters game in which you walk around in your own house, the ghosts only
being visible through your Tango.

~~~
bryogenic
How about tactical coordination and mapping for teams (i.e. paintball /
special operations)? I'm picturing the HUD 'map' from every FPS game, except
it is based on the current view and relative locations of the team's devices.

~~~
JshWright
How about a HUD that projects a map onto a firefighter's SCBA facepiece?

That would be absolutely incredible... Firefighters die every year because
they get disoriented and lost in zero-visibility conditions. Even if it didn't
have a pre-generated map of the structure loaded... it could build a map as it
went, and at least be able to provide a 'retrace my steps' view.

~~~
lotyrin
This just adds a depth camera to the phone and uses it, along with the visible
light camera, to map the space and orient the device. If a human is
disoriented in a no-visibility situation due to smoke, the cameras will likely
not be able to orient them either - except maybe in something like a power
outage, where visible light is unavailable but the depth camera can still
function.

~~~
JshWright
Why does it need to be a visible light camera? Why not infrared?

~~~
lotyrin
The depth camera isn't visible light, but it will likely also be blocked by smoke.

------
tbenst
My submission:

3D Printing is a big data problem where the data is not being collected.
Sensors in desktop 3D printers are usually restricted to simple limit switches
on the axes.

We would use Project Tango for a real-time feedback system for 3D Printing.
Initially, we would demonstrate a simple functionality: recognizing when a
print is failing and instructing the machine to stop, rather than waste more
material. Next, with the help of the open-source community, we would expand
functionality to dynamically adjust machine instructions to compensate/fix
problems observed during the print. Here are a few examples:

- adjust bed height for different layer heights via software rather than
manual hardware tinkering

- dynamically change extrusion rate if under-extrusion/over-extrusion is
observed

- detect if belts are slipping & correct extruder positioning

- pause print if no filament is extruding

- intelligently resume print if stopped (e.g. power failure)

- inform slicing software if/where/why a print fails so the software can
reslice and repeat properly

For users, no new hardware will be needed besides Project Tango - a computer
will stream GCODE instructions via USB to a RepRap-like 3D printer (e.g.
Makerbot, Ultimaker, etc.). Project Tango is precisely the breakthrough we
have been waiting for to make 3D printing more user friendly.
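To make the feedback loop concrete, here is a toy Python sketch of the "stop or compensate" decision step. Everything in it is an illustrative assumption - the function name, the thresholds, and the idea that a depth scan can be reduced to a single per-layer height measurement - not any real Tango or printer-firmware API:

```python
def decide_action(expected_mm, observed_mm, tolerance_mm=0.1):
    """Compare the expected layer height against the depth-scan
    reading and pick a corrective action for the printer."""
    error = observed_mm - expected_mm
    if observed_mm == 0.0:
        return "pause"               # nothing is being extruded at all
    if abs(error) <= tolerance_mm:
        return "continue"            # print is on track
    if error < 0:
        return "increase_extrusion"  # under-extrusion observed
    return "decrease_extrusion"      # over-extrusion observed

# Scanner sees a 0.15 mm layer where 0.30 mm was expected:
print(decide_action(0.30, 0.15))  # prints "increase_extrusion"
```

A real implementation would sit between the slicer and the GCODE stream, translating these decisions into flow-rate or feed-rate overrides.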

~~~
aw3c2
You can do that right now with a gazillion applications that do
photogrammetry. Project Tango is nothing new per se. Just add a stereoscopic
camera, choose software and add the analysis/decision-making. Some links to
get you started:

[http://insight3d.sourceforge.net/](http://insight3d.sourceforge.net/)

[http://ccwu.me/vsfm/](http://ccwu.me/vsfm/)

[http://www.cs.cornell.edu/~snavely/bundler/](http://www.cs.cornell.edu/~snavely/bundler/)

[http://www.arc-team.homelinux.com/arcteam/ppt.php](http://www.arc-team.homelinux.com/arcteam/ppt.php)

[http://ti.arc.nasa.gov/tech/asr/intelligent-robotics/ngt/stereo/](http://ti.arc.nasa.gov/tech/asr/intelligent-robotics/ngt/stereo/)

------
jlas
"Unfortunately, due to FCC restrictions, we can only send development units to
incorporated entities or institutions at this time."

That's a bummer.

Also, the page's default background-color should be set to black (or something
dark). Most of the text is white(ish) and with a slow connection the
background images take a while to load, making it impossible to read while you
wait. /rant

~~~
cakeface
This is a little creepy actually when you think about it. Here is this amazing
new technology and the government says that only corporations are allowed to
possess it. I'm sure that there are reasons but on the face of it I feel like
it is right out of a cyberpunk book.

~~~
stan_rogers
That just indicates that it hasn't passed Class B certification yet (under
Title 47 CFR Part 15), which basically means that as an unintentional RF
radiator (that is, at least in part not an intentional radiocommunication
device), it hasn't yet been shown not to interfere with other devices. Even
the wall wart for charging your phone needs to meet Class B in order to be
sold or supplied for household use.

------
TophWells
Here's the application I thought was immediately obvious: memory palaces.

Based on the old mnemonic trick of taking a real physical location that you
know well, and associating memories with objects in that space. The digital
version of this would be having files and data stored in a "physical" place -
although they're not solid, they'd be tied to a single location.

Harder to organise, but I know several people who have completely filled their
computer's desktop with shortcuts, because they don't like futzing around with
folders. The folder metaphor isn't the be-all and end-all, there are times
when it's appropriate and times when it's not. The metaphor of icons that are
dragged around the screen is limited by available screen space - Project Tango
gives you a house-sized (or even just room-sized) 3D space to play with, more
than enough for all the files you could need to be immediately visible.

The main risk is that my virtual room could end up as messy as my real room.

I don't know how anyone could make money from this, but it would be really
damn cool.

------
kodablah
Here's the best idea I can think of with this technology.

Imagine taking a scan of your pantry, refrigerator, and/or laundry room. Then
mark everything as what it is (e.g. "box of cheez-its", "milk", etc.). Come
back a few days later, do the scan again, and it'll tell you what's missing.
Once you return from shopping, scan again, saying what the new items are (even
if they aren't what was there before). The software would probably need to
recognize certain shapes so a slight rearrangement/movement doesn't throw it
off. It'd be like history/bookmarks/favorites for perishables!
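Once objects are labelled, the diff itself is trivial - a toy Python sketch, where the labels (and the assumption that shape recognition has already happened) are doing all the hard work:

```python
def pantry_diff(previous_scan, current_scan):
    """Diff two labelled scans; recognizing shapes from the point
    cloud is assumed to have happened already."""
    previous, current = set(previous_scan), set(current_scan)
    return {
        "missing": sorted(previous - current),  # shopping-list candidates
        "added": sorted(current - previous),    # newly shelved items
    }

before = {"box of cheez-its", "milk", "cereal box A", "cereal box B"}
after = {"cereal box A", "cereal box B", "orange juice"}
print(pantry_diff(before, after))
# {'missing': ['box of cheez-its', 'milk'], 'added': ['orange juice']}
```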

~~~
nivla
Isn't it much easier to just look into your refrigerator/pantry and keep a
mental track of things? There are a few pitfalls to your suggested method:

* It takes a good amount of time scanning and cataloging everything, and we are talking about a world where people don't find enough time to scribble down these things on a piece of paper.

* What if you pulled out cereal box A and cereal box B but switched places when putting them back?

* Since it's scanning the boundary of objects, it won't be able to tell you if your milk is empty, half empty, or full.

The one thing I could think of where this is useful is in scanning
rooms/objects which are for sale. Like 3d scanning the bike you are going to
sell, or a realtor using it to provide a virtual 3d tour of a house. However,
Microsoft already did it with their Photosynth program and regular photos you
take [1], so I am not sure how this is going to fare if it turns out to be
expensive.

[1] [http://photosynth.net/](http://photosynth.net/)

~~~
kodablah
My idea was more of a replacement for a shopping list, not necessarily one's
memory. To do diffs on physical environments may help situations where the
contents of the diff matter. Maybe you don't have to catalog, maybe it can
just show you what shape appears to be gone. Yes, there are many
practicalities not ironed out, but I could see it being useful.

------
sirkneeland
I swear you could just write a "HN comment generator" script for these sorts
of articles:

input: $new_google_tech

output string: "$new_google_tech will let Google know more about you! I
disapprove! Also, NSA."

Repeat. Occasionally sprinkle with insightful comment about the actual
technology being introduced.

------
twelvechairs
This kind of technology will lead to very big changes in the surveying
industry, which is big. Also will lead to open 3d city models (the next
openstreetmap will be in 3d and textured - IMO gmaps and apple maps will lose
out to this eventually) and real-estate models (see your apartment without
actually visiting) which will lead to knock-on effects for a range of other
property industry fields. Academically - urban studies (a field mostly based
on experience and conjecture) will benefit hugely from realistic 3d
modelling...

------
uptown
That link seems dead to me. This article has more info:

[https://news.ycombinator.com/item?id=7272849](https://news.ycombinator.com/item?id=7272849)

[http://techcrunch.com/2014/02/20/google-launches-project-tango/](http://techcrunch.com/2014/02/20/google-launches-project-tango/)

------
patrickaljord
Funny, the official tagline of the project is "We like epic shit".
[http://i.imgur.com/mvcyUX4.png](http://i.imgur.com/mvcyUX4.png)

[https://plus.google.com/115422404677762786098/about](https://plus.google.com/115422404677762786098/about)

------
snotrockets
I wonder if Google has a stack of experimental projects to release whenever a
competitor announces some headline grabbing news.

~~~
chaz
PR works better when you don't compete with competitors for attention. You
either beat them to the headlines, or you wait a couple of days to find a
clearing. This announcement is interesting, but not nearly the scale of
WhatsApp/FB.

An exception might be if you have a very similar announcement. Doing so would
get you included in the same articles as your competitor.

~~~
MichaelGG
>announcement is interesting, but not nearly the scale of WhatsApp/FB.

Wait, what? Why would Google view the WhatsApp news as competitive? Other than
it setting an insane price and inflating bubbles more, how is the FB
acquisition really useful news to anyone?

This tech will have far more impact on the world. Apart from the few people
doing startups, FB buying WhatsApp will have almost zero impact in day-to-day
lives.

~~~
chaz
> FB buying WhatsApp will have almost zero impact in day-to-day lives.

I was only responding in the context of parent's PR comment, re: scale of
press worthiness of the day. Tech sites have been covering WhatsApp/FB non-
stop for the past 24 hrs.

As for impact on the world, this is just a call for developers. I'm encouraged
by the ambition, but it's too hard to judge if this will be successful.
FB/WhatsApp will almost immediately affect 400M users (+1M more per day).

------
iceIX
Interesting to see that this is coming from ATAP - the one part of Motorola
that Google kept in the sale to Lenovo[1]. I was curious to see if they would
merge the ATAP group into Google X, but I guess they're going to keep it
separate and just rebrand it as Google ATAP.

[1] [http://www.theverge.com/2014/1/29/5359068/google-keeping-motorola-advanced-technology-group-project-ara-phone](http://www.theverge.com/2014/1/29/5359068/google-keeping-motorola-advanced-technology-group-project-ara-phone)

------
ihsw
I'm shocked by the distinct focus on fear and doubt about Google's intentions
with this technology -- does it really matter _that_ much?

The rising tide lifts us all, and this technology can be ubiquitous within
twenty years. Isn't that worth it?

~~~
gkya
Google is enormously huge, and the breadth of their product line is
jaw-dropping. This causes the instant reaction of connecting every
"innovation" that comes out of it to greed for more and hunger for data. This
would be an overtly exciting Kickstarter project to find out about, but when
it comes from Google, the perception is that they are managing to invade one
more market.

They _are_ the Internet, they have half of the cellphone market, they sell
apps, books, CPU cycles, they sell Chromebooks, tablets, they make maps, they
provide apps for chat, video conferencing, they've made an online office
suite, they store your documents, they make the mobile OS that is sold on
other companies' mobile phones, they do mobile payment, they create programming
languages, they employ Ken Thompson, Rob Pike, and probably the guy you envy
the most, they have quantum computers, and they do a sh*t load of other things
that I can't be bothered to remember (Translate, Youtube, G+, Blogger, Groups,
...).

It is not possible (for me, at least) to attach to Google the childish
excitement that two geeky weirdos with scruffy hair bear for their fundraiser.
Google is now the hot kid in the school, who's tall, strong, handsome, and
athletic, who also plays guitar and is very successful at his exams, speaks
three languages, represents his school in the drama festival, and also plays
whichever role in the school band gets the most attention.

Innovation is good for science, but it is harmful to collaborative efforts
when it comes from companies that should've been saturated. When Google dies
(like anything else), it will fall on the crowd that lives under its shadow.

------
ok_craig
This technology inside Google Glass would be pretty killer.

~~~
sailfast
As long as it's in the phone, couldn't you tether it to Google Glass to
generate a real-time HUD based on where you were? (and perhaps also capture
images to attach to the 3D model)

That was the first thing I thought about - actual augmented reality down to
the inch would be quite something.

~~~
Crito
With future versions of Google Glass perhaps. The current version does not
have a display that can provide a HUD in the way that it seems you are
thinking.

------
shanselman
This is the return of Johnny Chung Lee!
[http://en.wikipedia.org/wiki/Johnny_Lee_(computer_scientist)](http://en.wikipedia.org/wiki/Johnny_Lee_\(computer_scientist\))

------
radley
Isn't the next _win_ turning your house / apartment into a game world /
holodeck using something like this together with an Oculus Rift?

Household layout and furniture are mapped out, then re-textured to represent a
castle, evil lair, enemy corporation, etc. Textures can update allowing the
story to reuse each room as different places as you progress - the same way
the holodeck area is actually small but uses optical illusions to give you a
sense of greater mobility.

Not saying we have a holodeck. But it's a step towards one.

------
gamegoblin
I wonder if it uses the machine vision aspect to prevent the typical problem
of accelerometer drift. e.g. by orienting itself relative to walls/other
stationary things.

Also, imagine making a 3D "scanner" that you can scan objects with into a
virtual world, or print out on a 3D printer.

~~~
gallamine
Accelerometers usually don't have significant drift - digital gyroscopes do.
The accelerometers are quite noisy though (and measure both gravity and
acceleration). From some of the brief images in the intro movie it looks like
the cameras are doing tracking and alignment of "interesting" features. Things
like that fall into the Simultaneous Localization and Mapping (SLAM)
technology area. Very cool stuff.

~~~
boucherm
Well, maybe I was just unlucky, but I tried to use an IMU for my SLAM and was
shocked by the extravagant amount of drift of the accelerometers.

~~~
greendestiny
Do you mean the accelerometer values drifted over time (i.e. the direction of
gravity shifted on a stable accelerometer) or do you mean the value when
doubly integrated drifted?

~~~
boucherm
Right: the doubly integrated values shifted while the sensor didn't move.
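For anyone wondering why the doubly integrated values drift so badly: even a tiny constant bias, which every consumer-grade accelerometer has, turns into quadratically growing position error once integrated twice. A quick Python illustration (the bias and sample rate are made-up but plausible numbers):

```python
def integrate_position(bias_ms2, dt, steps):
    """Doubly integrate a constant accelerometer bias (sensor at rest)."""
    velocity = position = 0.0
    for _ in range(steps):
        velocity += bias_ms2 * dt   # first integration: velocity error
        position += velocity * dt   # second integration: position error
    return position

# 0.05 m/s^2 of bias, sampled at 100 Hz, for a sensor that never moves:
for seconds in (1, 10, 60):
    drift = integrate_position(0.05, 0.01, seconds * 100)
    print(f"{seconds:3d} s -> {drift:.2f} m of position drift")
```

After a minute the "stationary" sensor thinks it has moved tens of meters, which is why pure-IMU dead reckoning needs an external reference (vision, in Tango's case) to stay honest.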

~~~
boucherm
*drifted

------
benhirashima
It's great to see mobile 3D scanning receive validation from a major player
like Google. My company, RealityCap, is working on bringing 3D scanning to
mobile devices without the need for specialized 3D cameras. We use the sensors
already present in smartphones: the camera (monocular), accelerometer, and
gyroscope. That means that we can bring 3D scanning to millions of mobile
devices that already exist. [http://realitycap.com](http://realitycap.com)

------
Aoyagi
Every time something new comes out from Google or Microsoft, I just wait to
see where is the catch. There is always _the catch_.

~~~
nivla
I guess the _catch_ in this would be that they get to integrate the public
scans with Google Maps and/or help train Google Glass about the surroundings.

------
ulyssesgrant
As someone who's worked on visual odometry projects before, I'm extremely
interested in how they're able to compute the phone's motion estimations so
accurately and in real time on a mobile device. Is any of this project open
source? Is anyone here familiar with the vision techniques they're using to
accomplish this feat? Perhaps there's an FPGA or ASIC designed for visual
odometry inside the phone.

~~~
siera
It uses a Myriad 1 processor from Movidius
[http://www.movidius.com/tango](http://www.movidius.com/tango). It's a custom
media coprocessor aimed at mobile devices. I found more information about the
processor architecture in these slides: [http://www.hotchips.org/wp-
content/uploads/hc_archives/hc23/...](http://www.hotchips.org/wp-
content/uploads/hc_archives/hc23/HC23.19.8-Video/HC23.19.811-1TOPS-Media-
Moloney-Movidius.pdf)

Edit: Video of the talk associated with the slides:
[http://www.youtube.com/watch?v=KiTaPRtN2yM](http://www.youtube.com/watch?v=KiTaPRtN2yM)

~~~
ulyssesgrant
Thanks for the info. Still very impressed with the speed and accuracy,
assuming it works as well as they have presented. To be able to compute visual
odometry in an arbitrary scene on a mobile device seems like a really big
accomplishment, no?

~~~
siera
Visual odometry is getting quite accurate. For example, it is used on the Mars
Exploration Rovers (Spirit and Opportunity) to improve the position obtained
from wheel odometry [1]. As you mentioned, the big accomplishment is doing
this in real time on a mobile device. On a typical mobile ARM processor, it
would probably be very slow and drain the battery in no time. That's why they
are using a custom coprocessor designed specifically for image processing and
computer vision. The next challenge is porting the existing visual odometry
algorithms to this new processor. It looks like hiDOF did the implementation
[2].

[1]
[http://onlinelibrary.wiley.com/doi/10.1002/rob.20184/abstrac...](http://onlinelibrary.wiley.com/doi/10.1002/rob.20184/abstract)

[2] [http://hidof.com/projects/project-
tango/](http://hidof.com/projects/project-tango/)
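
For context on what the coprocessor has to do: once the relative motion between consecutive frames has been estimated from feature matches, visual odometry just composes those small motions into a trajectory, and any per-frame error compounds over the whole path. A minimal 2D pose-composition sketch in Python (the motion values are made up; a real front end would estimate them from images):

```python
import math

def compose(pose, motion):
    """Compose a global 2D pose (x, y, theta) with a relative motion.

    motion is (dx, dy, dtheta) expressed in the current camera frame,
    as a visual odometry front end would estimate between two frames.
    """
    x, y, th = pose
    dx, dy, dth = motion
    # Rotate the body-frame translation into the world frame, then add.
    wx = x + dx * math.cos(th) - dy * math.sin(th)
    wy = y + dx * math.sin(th) + dy * math.cos(th)
    return (wx, wy, th + dth)

def accumulate_odometry(motions):
    """Chain frame-to-frame estimates into a trajectory (dead reckoning)."""
    pose = (0.0, 0.0, 0.0)
    trajectory = [pose]
    for m in motions:
        pose = compose(pose, m)
        trajectory.append(pose)
    return trajectory
```

Four "move forward one unit, turn left 90 degrees" steps trace a square back to the origin; with noisy real estimates, the endpoint misses the origin by the accumulated error, which is what loop closure in full SLAM corrects.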

------
roycehaynes
Tango looks a lot like The Structure Sensor by Occipital -
[http://structure.io/](http://structure.io/)

~~~
rch
I still like that Occipital is planning to ship a standalone unit, and support
people wanting to 3D print various housings.

I guess the positive interpretation is that crowdfunding seems to be putting
some pressure on larger companies to go public with research projects earlier
than they normally would.

------
_random_
"written in Java, C/C++, as well as the Unity Game Engine" - good news and a
discreet way of saying they support C# via Mono!

~~~
pjmlp
Except you can't really use Unity for standard line of business applications
and their Mono runtime is stuck in .NET 3.5 land.

~~~
_random_
I personally don't like Unity, but at least Google is starting to think out of
the mainstream OSS box.

~~~
pjmlp
I was looking forward to the FOSDEM talk about ART, but they cancelled it.

The only thing we can infer from the public source code is that it has been
enabled by default for whatever might be the next release.

So no news if with ART, other languages could be better supported or not.

------
rcthompson
Ulterior motive: [https://xkcd.com/1204/](https://xkcd.com/1204/)

------
pyvek
Is anyone else getting 404?

    
    
        The requested URL /atap/projecttango/ was not found on this server. That’s all we know.

~~~
kfinley
Yes.

Google Cached:
[https://webcache.googleusercontent.com/search?q=cache:VI0ttG...](https://webcache.googleusercontent.com/search?q=cache:VI0ttGWUYloJ:www.google.com/atap/projecttango/)

Video:
[https://www.youtube.com/watch?v=Qe10ExwzCqk](https://www.youtube.com/watch?v=Qe10ExwzCqk)

------
ultimatedelman
I think a game that maps your home or yard or street or whatever, recognizes
common objects like trees, benches, cars, houses, people, etc, skins them with
some sort of other-worldly skin, and allows you to walk around in the world
finding virtual treasures and going on dynamic, virtual quests and fighting
monsters by _actually walking around_ would be the first truly fully-
immersive game. Incorporate events cued by actual sound coming in through
the mic and you have the coolest game ever made, to date.

Think Ingress, but overlaid in real time on what your camera sees and phone
hears, instead of just on Google Maps.

------
Nate630
Google needs more data so that Google Maps can map the inside of your house.
:-)

------
runewell
Did I understand this correctly, will this actually map onto virtual Earth
geometry at the same position and scale as real-life? That would be amazing,
you would literally be able to use Google Maps to walk inside buildings and
look around, not to mention it could be the first Earth-accurate metaverse.
Imagine sitting down with a more advanced pair of Google glasses and being
able to see both your real-life environment and its identical online copy with
virtual avatars navigating the hallways along with your real-life co-workers.

------
znowi
GPS tracking + complete 3D environment history of your presence.

I can imagine how NSA is excited :)

------
pinaceae
Interesting that they built this into a phone. But I guess once you need a
mobile computing device with a screen and a camera, you end up with a "phone"
nowadays.

The reliance on lenses is a hindrance though: it has two, and holding it
without covering them both requires user effort. You can see it at the end of
their video, where a finger is partially over the second lens.

Still, exciting!

~~~
salgernon
Surely the right place for this is in a google glass. Or at that point, given
the sensors required, a google helmet.

------
d0
Is it just latent paranoia, or can I just see evil uses for this?

~~~
carbocation
Every new technology, as well as every old technology, can be used for "good"
or for "evil." More broadly, many activities to which "goodness" can be
assigned will vary in "goodness" based on the context (in many ethical
systems).

------
bambax
> _updating it’s position and orientation in real-time_

ITS NOT IT'S

~~~
72deluxe
Or, you could reword your comment (which I upvoted):

It's not "it's"! It's "its".

------
rasz_pl
[https://www.youtube.com/watch?v=Qe10ExwzCqk#t=19](https://www.youtube.com/watch?v=Qe10ExwzCqk#t=19)

19 second mark shows they are using monocular structure from motion.

A Japanese university has a working solution (HDL inside an FPGA) doing the same thing too:
[http://hackaday.com/2014/01/12/autonomous-quadcopter-fits-
in...](http://hackaday.com/2014/01/12/autonomous-quadcopter-fits-in-the-palm-
of-your-hand/)

You can try it yourself; the code is on GitHub:
[https://www.youtube.com/watch?v=E35xbo3r8rA](https://www.youtube.com/watch?v=E35xbo3r8rA)
[https://github.com/nymanjens/ardrone-
exploration](https://github.com/nymanjens/ardrone-exploration)

------
kamaal
This is really amazing. This is basically the stepping stone for 'self driving
car' equivalent applications for indoors.

One wonders what Google is up to. Big investments in AI research, and then
things like this. You could build domestic robots to do all sorts of work
with technology like this.

And I'm not just talking of toys or games here. The very nature of consumer
electronics can be redefined with technologies like these.

Something to think of:

2015:

-> By now, it is likely that "clean a house" will be within the capabilities of a household robot.

[http://en.wikipedia.org/wiki/Predictions_made_by_Ray_Kurzwei...](http://en.wikipedia.org/wiki/Predictions_made_by_Ray_Kurzweil)

------
pazimzadeh
This should be interesting to the guys who made Minecraft Reality.

[http://13thlab.com/](http://13thlab.com/)
[http://minecraftreality.com/](http://minecraftreality.com/)

~~~
trippy_biscuits
So how long will it be before someone can take a picture of a person and find
all the similar matches on Tinder modulo any profile preferences and desired
attributes? Maybe save it for later and use it in a completely different
location? In other words, take a picture and have an app find as many
matches for that person as it can, all while applying any filter preferences
the user has in his or her own profile. One person may see exciting matching
opportunities. Others may use these modern technology aids in becoming the
ultimate modern predator. It's one thing when you want to find a great date.
It's quite another when someone uses it to find your son or daughter.

------
deletes
I understand the idea that you can get the exact positions of the surroundings,
if you have an accurate position of the device. But how are they achieving
that? I don't know of any such small tech capable of being accurate to 1cm,
without external help.

I don't see how they can do this without an accurate position of the device.

An example when it breaks down without the position information:

Pointing the device perpendicular to a uniformly colored wall and moving
parallel to the wall: the visual information is then undefined, and you have to
rely on very accurate position sensors (which don't exist) to correctly scan
the environment.
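
The degenerate case above can be made concrete: feature trackers need image gradients, and a uniform wall has none, so vision alone gives zero information about sideways motion. A toy "trackability" score in Python (the images are made-up toy data, not from any real tracker):

```python
def gradient_energy(image):
    """Sum of squared finite differences: a crude 'trackability' score.

    Feature trackers need image gradients to lock onto; on a uniform
    wall this score is zero and frame-to-frame motion parallel to the
    wall is unobservable from vision alone.
    """
    rows, cols = len(image), len(image[0])
    energy = 0.0
    for r in range(rows):
        for c in range(cols - 1):
            energy += (image[r][c + 1] - image[r][c]) ** 2
    for r in range(rows - 1):
        for c in range(cols):
            energy += (image[r + 1][c] - image[r][c]) ** 2
    return energy

uniform = [[128] * 8 for _ in range(8)]     # blank wall
textured = [[(r * 7 + c * 13) % 256 for c in range(8)] for r in range(8)]
print(gradient_energy(uniform))  # 0.0 -- nothing to track
```

This is why practical systems fall back on inertial sensors (or depth cameras) in texture-poor scenes, despite their drift.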

~~~
fudged71
I'm sure tiny gyros and accelerometers have improved in the past couple years.
They've probably advanced dead-reckoning using existing sensors. And maybe
they can use both front and back cameras for position information.

~~~
deletes
But they haven't improved. Oculus is having these problems right now. Their
best solution is having an additional camera looking at the user, and they
have a much simpler task with a stationary user.

------
Pro_bity
Forget the phone, this technology is ideal to merge into Google Glass.

~~~
hellbanTHIS
Google Glass: now with Spider-Sense!

------
flyt
Not a single woman in this entire video.

edit: actually there's one I saw in the background as a camera pans over a
table, but 99.9999% of the people represented here are men.

~~~
uglycoyote
Yeah, and I thought it was a bit odd that they were all sitting so close to
one another.

------
belgianguy
I like the mapping ability. Imagine mapping out your house and being able to
use it as a level in a game, like walking on your ceiling with the Oculus Rift.

------
atburrow
It would be very interesting to see an application for this using Glass. Could
this potentially help blind people navigate in unknown environments?

~~~
georgemcbay
Yeah it could be used for that, though I'm not sure why Google Glass would be
used in that situation. A chest strap to mount the device looking forward and
maybe headphones in the blind user's ears (to speak out TTS directions, or
even just tones which represent directions) would be a better feedback
mechanism.

~~~
atburrow
I agree. I suppose a Google Glass would be a bit silly since the screen would
be unused, but seeing as it has most of the technology built in already, it
would be easier to get a prototype working. An ear piece is another bonus
added to the device as it could notify the user of obstacles.

------
archagon
Space detection is very exciting to me, but this feels like a technological
stopgap. Is there any research being done into "blind" space detection, that
doesn't require the use of cameras? Is that even possible? It would be amazing
if I could have my phone in my pocket and (for example) adjust the rotation of
a nearby camera based solely on my hand gestures.

------
cmollis
Awesome. I continue to be amazed at Google's breadth of ambition. If it works,
which I assume it mostly does, I think it will be beneficial to most. Some
(perhaps even some division of Google itself), will do something totally
asinine and attempt to subvert the original intents of the engineers and
designers who are currently involved. But that's what happens with any new
technology that broadens access and visibility. I don't believe anything
Google does is inherently 'evil' on its face.. unfortunately, given their
reach, they are a natural access point for some federal agencies to execute
their primary tasks, as they fall under the same vague, broadly defined
national security laws as we all do. The only difference is that we don't run
the Internet and its major consumer touch points. Google does and they need to
make money from their capital investments. It would be smart to use it and do
something cool.

------
stefek99
Contest finished the moment it starts?

[http://www.google.com/atap/projecttango/terms/](http://www.google.com/atap/projecttango/terms/)

"The Contest begins at 12:00:00 A.M. Pacific Time (PT) Zone in the United
States on February 19th, 2014"

"On or about February 19th, 2014, each Essay will be evaluated by the Judges."

#memonkey

------
return0
Cool technology and all, but I find that it is an odd fit for phones. Apart
from games and other gimmicky applications, or exceptional cases like
navigation for the blind, what are the awesome applications of space detection
for a phone? I can think of much more interesting applications in mobile
devices like cars.

~~~
icefox
Think of a phone not as a phone, but more like a really cheap way to have a
portable computer with a screen and battery that you can put sensors on. It
comes in the phone package because the industry is very much geared to
churning out that form factor on the cheap.

Edit: and you can duct tape it to [car|bike|etc].

------
mbrameld
I hope I can strap one on an RC plane or multi-rotor and map my neighborhood.
That would be kind of cool.

------
fidotron
If you were to build an Android powered robot isn't this exactly what you'd
need inside it?

------
mixerjoe
Cool would be if this mixed-reality game here would be on Tango:
[http://www.indiegogo.com/projects/gbanga-famiglia-rise-
and-f...](http://www.indiegogo.com/projects/gbanga-famiglia-rise-and-fall-
game/x/5121343)

------
adamzerner
I think the video made the mistake of focusing on features, and not benefits.
(Not saying that the benefits aren't there, just that I personally don't see
them, and the video didn't do a good job of making them apparent).

------
chacham15
I can see this being HUGE in the real estate market. Imagine if a house seller
can walk through his house with one of these, then upload the data to a
website and now you can have a truly online "open house".

~~~
User8712
How is this different from, say, walking around your house with your smart
phone recording video, and then pushing a button to upload it to your real
estate ad?

Advantage Tango: The user could manually navigate the virtual house.

Advantage Standard Video: Improved accuracy, no wonky 3D model and texture
recreations, but actual video. Runs on low end hardware, no fancy plugins
required. You can turn on lights, open drawers, show how a sliding door works,
or how the fridge looks inside.

If I had my choice, I'd take video over Tango for real estate. Even so, video
has been around for ages, it's dead simple to create and upload, yet never
used. Therefore, if video hasn't taken off in the past decade, I can't see
Tango having any noticeable impact on the market.

------
cynusx
Generating CS maps from the university campus now becomes ridiculously simple
:)

------
jogzden
I can see tech like this being used in the next Bond film. Every day, the
advances in modern technology surprise and impress me. I love our line of
work, where we can set our mind to something and just make it.

------
arjn
Anyone recall the mapping drones from the recent Ridley Scott film
"Prometheus" ? (similar concepts in other films too I think)

Combine this with a small smart-ish drone and it could get very interesting.

------
pbreit
Some of these applications sound more promising than the user input
applications Leap Motion has been touting. I bet there are massive commercial
opportunities for technology like this.

------
ingend88
Get this story and 4 other in top5HN Newsletter. Signup at
[http://top5hn.launchrock.co](http://top5hn.launchrock.co)

------
TeMPOraL
Applied for our local Hackerspace. I hope Google will consider applications
not only from companies, but from community R&D labs as well :).

------
wegi
I am a little bit disappointed that I have to be part of a company to apply.
No time to work full time while doing my master's.

------
bzalasky
Looks cool. This reminds me of what Dekko was working on while they were still
in business, though I suppose at Google scale.

------
Nogwater
Could this do real-time position, rotation, and gesture tracking if it was
embedded in a VR head-mounted display?

------
epmatsw
Makes me think of the arm computers from the Neanderthal Parallax. I'd love to
get my hands on one of these

------
boucherm
Argh, this is my PhD subject. Not sure whether this is good (for the lights) or
bad news... What do you think?

------
infocollector
Paracosm as a partner? Isn't this exactly what the Paracosm startup was doing?
Does anyone know the story?

------
mrbill
Surely I'm not the only Charlie Stross fan who immediately thought "SCORPION
STARE"...

------
z3phyr
Couple it with neuromorphic engineering, and we have a good robot AI
technology in our hands!!!!

------
web007
This is the embodiment of the phone-scanner-mapping thing from The Dark
Knight. Fascinating!

------
Keyframe
With enough precision this could work really well for set
extensions/reconstruction.

------
ejstembler
Pretty much all junk and ancient looking. It's sad this genre has not
advanced.

------
hope1985
Great idea! I think it can be very helpful in helping blind people navigate.

------
tunesmith
Fun to imagine this tech being put into a contact lens.

------
jcoleh
Google's answer to iBeacons?

~~~
Synaesthesia
Totally different. iBeacons are a low power Bluetooth transmitter that can
notify phones of their presence.

This is a prototype phone for mapping scenes in 3D

------
jankeromnes
s/updating it’s position/updating its position/

------
mendelk
Getting a 404.

------
nikunjk
404 error

~~~
chengyang
Where do you get the 404?

~~~
PavlovsCat
I get one when clicking the submitted link.

------
Allower
This is not creepy at all...

