
ARKit + CoreLocation [video] - gfredtech
https://twitter.com/AndrewProjDent/status/886916872683343872
======
joezydeco
The _really_ interesting stuff is in Part 2 of his demo:

[https://twitter.com/AndrewProjDent/status/888380207962443777](https://twitter.com/AndrewProjDent/status/888380207962443777)

~~~
timdorr
If you could somehow project that as a HUD in a car, you'd have a great
solution to a big problem in the navigation space. Many people have trouble
reading maps because they have difficulty translating the pictographic
display on some other screen into the roads and signage they see in front of
them. If they no longer have to do this translation, following directions
becomes significantly easier for a large class of people who struggle with
current navigation systems.

~~~
AJ007
It could have some negative side effects, as in: "But the arrow pointed
straight off of the edge of the cliff!"

~~~
tiernano
If they drive off a cliff when using the system, would they not be more
eligible for a Darwin Award?

------
tasoeur
Don’t get me wrong, this is pretty cool, but one thing that annoys me a bit
is the way the author frames the video. With the current version of
ARKit and most AR libraries these days, there’s no occlusion of the virtual
geometry, meaning that you can actually see the blue line and arrows overlaying
the real world at all times (so a far-away virtual arrow will sit on top of a
real wall right in front of you). That’s a bit less sexy, and it’s clearly
hidden here. Some directions may also look confusing given the lack of
occlusion and the number of virtual objects on screen.

I’m looking forward to AR libraries using their point cloud technology to
also do object reconstruction and build virtual occluders from real-world
objects!
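
(For what it’s worth, the rendering side of an occluder is already
straightforward in SceneKit once you have reconstructed geometry: give the
proxy mesh a material that writes depth but no color, so it silently hides
any virtual content behind it. A rough sketch; the reconstructed mesh itself
is the missing piece and is just a placeholder parameter here.)

    import SceneKit

    // Hypothetical helper: `reconstructedMesh` stands in for whatever geometry a
    // future reconstruction pass would produce for a real-world surface.
    func makeOccluder(from reconstructedMesh: SCNGeometry) -> SCNNode {
        let material = SCNMaterial()
        // Write depth but no color: the occluder itself stays invisible,
        // yet any virtual arrows behind it are hidden.
        material.colorBufferWriteMask = []
        reconstructedMesh.materials = [material]

        let node = SCNNode(geometry: reconstructedMesh)
        // Render it before the rest of the scene so its depth values are
        // already in the buffer when the navigation geometry is drawn.
        node.renderingOrder = -1
        return node
    }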

~~~
nobbis
We're planning to bring occlusion to ARKit with Forge. The accuracy can't
match a depth camera, but there are still plenty of use cases. Here's an
example on a Pixel:

[https://www.youtube.com/watch?v=K9CpT-sy7HE](https://www.youtube.com/watch?v=K9CpT-sy7HE)

~~~
mendeza
This is awesome! Is the SDK available?

------
epaulson
I haven't upgraded anything to an iOS 11 beta, so I haven't been able to try
this myself - but how effective is ARKit if the camera loses its view for a
little bit?

~~~
mendeza
[https://twitter.com/AndrewMendez19/status/888765856225923072](https://twitter.com/AndrewMendez19/status/888765856225923072)

Here is a video from a July 4th ARKit experience I made. If you have small
movements and some occlusion, ARKit does well. You can see in the video that
after larger movement and occlusion it loses its bearings.

But what's crazy is that when you return the view, ARKit fixes itself.

------
agumonkey
I can see Apple dropping some Glass reinvention swiftly.

~~~
tigershark
They'd have to get back in line. Google is already using Google Glass in some
manufacturing environments. Microsoft has already released HoloLens, although
for now mainly for devs. Magic Leap is hopefully coming next year and,
according to the rumours, should be quite a big leap.

~~~
nihonde
If there's one lesson to be drawn from Apple's history, it's that first to
market is not a good metric for blockbuster success.

------
Artemis2
Nokia had something similar years ago:
[https://youtu.be/55Qdem9pJxY](https://youtu.be/55Qdem9pJxY)

~~~
bhouston
I swear I had an app installed on my iPhone 3G (I guess 2008/2009) that did
something similar. Was it Sam Altman's Loopt that had a 3D view with
geomarkers displayed as you rotated the phone, with the camera view showing
through? I believe it was L*something.

~~~
ipsum2
I believe you're thinking about Layar.
[https://www.youtube.com/watch?v=b64_16K2e08](https://www.youtube.com/watch?v=b64_16K2e08)

~~~
bhouston
That was it. Thanks.

------
friedman23
Eventually we are going to fall back to a Google Glass-style device, when
people realize how stupid it is to have to walk around holding a phone up.

------
dantle
Hasn't Yelp's app implemented this feature (called Monocle) for half a decade
now? Could anyone explain the difference?

~~~
k-mcgrady
I remember that (haven't used it in years though). I remember it being a big
deal when it launched. The difference here is that any reasonably skilled
iOS dev could write this in a few hours, whereas Yelp certainly spent more
time and money doing it. So the big deal is that the tools have finally
reached a point where doing useful AR work isn't difficult.
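
For a sense of scale, the skeleton of such an app, before any of the
CoreLocation math, is roughly a view controller that runs a world-tracking
session and drops a node into the scene. A sketch of the general shape, not
the demo author's actual code:

    import UIKit
    import SceneKit
    import ARKit

    class MinimalARViewController: UIViewController {
        let sceneView = ARSCNView()

        override func viewDidLoad() {
            super.viewDidLoad()
            sceneView.frame = view.bounds
            view.addSubview(sceneView)

            // Place a marker two meters in front of where the session starts.
            let marker = SCNNode(geometry: SCNSphere(radius: 0.1))
            marker.position = SCNVector3(0, 0, -2)
            sceneView.scene.rootNode.addChildNode(marker)
        }

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            sceneView.session.run(ARWorldTrackingConfiguration())
        }

        override func viewWillDisappear(_ animated: Bool) {
            super.viewWillDisappear(animated)
            sceneView.session.pause()
        }
    }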

------
ice109
Is this a trick, or is geolocation on iPhones better than the 5m typically
quoted?

~~~
bflood
I generally saw 1-3m accuracy when I was trying out ARKit with some GIS
data, YMMV.

[https://twitter.com/bFlood/status/888485889248157697](https://twitter.com/bFlood/status/888485889248157697)

That said, when using worldAlignment gravityAndHeading, any location
inaccuracy when the ARSCNView starts up will throw off the AR illusion,
sometimes considerably. I hope apps will be able to correct during an
ARSession when better location data is detected.
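
To be concrete, the alignment I mean is set on the session configuration;
with .gravityAndHeading the scene's -z axis points true north, so whatever
fix CoreLocation has at startup effectively becomes the geographic origin of
the AR world. A minimal sketch (the function name is mine):

    import ARKit
    import CoreLocation

    // Hypothetical helper: starts a heading-aligned session and returns whatever
    // location fix is available at that moment, which anchors the AR world.
    func startGeoAlignedSession(on sceneView: ARSCNView,
                                using locationManager: CLLocationManager) -> CLLocation? {
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()

        let configuration = ARWorldTrackingConfiguration()
        // With .gravityAndHeading: +y is up, +x points east, -z points true north,
        // so geographic offsets map directly onto scene coordinates.
        configuration.worldAlignment = .gravityAndHeading
        sceneView.session.run(configuration)

        // Any error in this fix shifts every geo-placed node by the same amount.
        return locationManager.location
    }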

~~~
ice109
>relative to a nearby iBeacon

Where is the iBeacon in the parking lot?

~~~
bflood
There is no iBeacon (not sure where you're quoting that from?); it's just
spatial data (lat/lons) projected into the SCNScene coordinate space, which
is relative to the location and heading of the device when the ARSession
starts.
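
Roughly like this, assuming worldAlignment gravityAndHeading (scene +x east,
-z true north) and flat-earth math, which is fine over parking-lot distances.
The helper name is mine, not from the demo:

    import CoreLocation
    import SceneKit

    // Project a target coordinate into SCNScene space relative to the device
    // location captured when the ARSession started. Assumes worldAlignment
    // gravityAndHeading: +x = east, +y = up, -z = true north.
    func scenePosition(of target: CLLocationCoordinate2D,
                       relativeTo origin: CLLocationCoordinate2D,
                       altitude y: Float = 0) -> SCNVector3 {
        let metersPerDegreeLatitude = 111_320.0
        let metersPerDegreeLongitude = 111_320.0 * cos(origin.latitude * .pi / 180)

        let northMeters = (target.latitude - origin.latitude) * metersPerDegreeLatitude
        let eastMeters = (target.longitude - origin.longitude) * metersPerDegreeLongitude

        return SCNVector3(Float(eastMeters), y, Float(-northMeters))
    }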

~~~
ice109
>Core Location provides services for determining a device’s geographic
location, altitude, orientation, or position relative to a nearby iBeacon.

[https://developer.apple.com/documentation/corelocation](https://developer.apple.com/documentation/corelocation)

~~~
jafingi
As the quote states, it can (determine the geographic location, altitude,
and orientation) OR (determine position relative to a nearby iBeacon).
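
In other words, the same CLLocationManager drives both; the demo only needs
the first mode. A minimal sketch, with a placeholder beacon UUID:

    import CoreLocation

    // Sketch of the two modes the docs describe, using one CLLocationManager.
    func demonstrateCoreLocationModes(with manager: CLLocationManager) {
        manager.requestWhenInUseAuthorization()

        // Mode 1: geographic location, altitude, and heading (what the AR demo uses).
        manager.startUpdatingLocation()
        manager.startUpdatingHeading()

        // Mode 2: position relative to a nearby iBeacon, only useful where beacons
        // are actually deployed. The UUID below is just a placeholder.
        let region = CLBeaconRegion(
            proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
            identifier: "example-beacon")
        manager.startRangingBeacons(in: region)
    }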

