
Magic Leap SDK - wheresevan
https://creator.magicleap.com/home
======
pavlov
This is off-topic, but these footers that read “Made with [emoji heart] in
[random place]” are such an irritating startup meme.

It makes particularly little sense for Magic Leap: they’ve got a billion
dollars in VC funding, and their product is supposedly “made with heart” like
an artisanal Brooklyn cupcake?

~~~
tudelo
Is it really that annoying that someone possibly made x thing with love in y
location? What does money have to do with that? You can get paid to do
something and do it with love.

~~~
pavlov
It just sounds weird and fake in a corporate setting. Sort of like "Grandma's
Homemade Cookies, by NestléKraftUnilever"

~~~
tudelo
True. It's generic and cliché. But if the cookie package said they were made
with love in factory location X, what's so bad about that? They are supposed
to be Grandma's cookies :)

------
anchpop
Is it possible to see details about the sdk without having to give Magic Leap
my email?

~~~
nassyweazy
anonymous@protonmail ?

~~~
ocdtrekkie
You also have to enter your name, birthdate, sign some agreements, etc.

~~~
Erlangolem
Yeah, I lied. It worked. Bob John 02/22/1992 has access now. They get to spam
the email I specifically have for useless logins.

Win-win?

~~~
ocdtrekkie
I suspect by falsifying your information, you violated the agreement. :P

------
diggan
This requires an account with Magic Leap to even be able to read any content;
this has no place on HN...

~~~
runesoerensen
Here's the announcement blog post
[https://www.magicleap.com/stories/blog/developer-journey-starts-now](https://www.magicleap.com/stories/blog/developer-journey-starts-now)

~~~
diggan
Cool, hopefully mods change the link to this instead.

------
Ajedi32
Lots of interesting details about the hardware in the developer documentation:

* 6-DOF tracked controller

* Eye tracking

* Multiple depth-of-field display

* Persistent location tracking across multiple rooms

And limitations:

* FOV is going to be rather small

* Variable depth-of-field doesn't work well with objects less than a foot (30 cm) from your face

* Takes a few seconds for the system to adjust when you move real objects around (so if you move your couch, it won't notice instantly)

~~~
zawerf
This is huge news for this secretive company so it's unfortunate that this
submission got flagged. I don't think they were ever upfront about what their
device can really achieve before.

A lot of the stuff you highlighted is especially interesting, since other
popular HMDs don't have it yet (for example eye tracking and of course their
lightfield stuff). Even a 6DOF tracked controller was a surprising feature to
me, since they don't seem to use any extra external sensors like Oculus/Vive do.

~~~
Holomakerbot
“I don't think they were ever upfront about what their device can really
achieve before.”

They’ve been very upfront about this. The problem is all the noise that’s been
generated by armchair critics who have spammed every article about them
calling them a scam/vaporware/the next Theranos, etc.

------
zawerf
For those who don't want to log in, here's the snippet of their features from
[https://creator.magicleap.com/downloads/lumin-sdk/overview](https://creator.magicleap.com/downloads/lumin-sdk/overview)

    
    
      #### C API
    
      - Audio: Stereo audio output and voice microphone recording are supported.
      - Camera: Capture still images and videos from the color camera.
      - Dispatch: Allows apps to open URLs using the Magic Leap Browser or other apps.
      - Eye Tracking: Ability to retrieve fixation point position and eye centers. Blinks can also be detected.
      - Graphics: OpenGL ES, Desktop, and Vulkan rendering paths.
      - Hand Gestures & Key Point Tracking: Recognize the user's hand poses (gestures) and track the position of identifiable points on hands such as the tip of the index fingers.
      - Head Tracking: Headpose is tracked in full six degrees of freedom (DOF).
      - Image Tracking: Track the position and orientation of specified image targets in the user’s environment.
      - Input (Control / MLMA Support): Retrieve either 3 DOF (orientation) or full 6 DOF (position and orientation) from the Magic Leap Control. Detect button and touchpad presses and the analog trigger. Trigger values range from 0 to 1. A range of touchpad gestures are also supported, as are haptic vibration and LED ring feedback. This interface works seamlessly with both physical Magic Leap Controls and the Magic Leap Mobile App.
      - Light Tracking: Provides information (luminance, global color temperature) about the ambient light of the user's environment.
      - Media Codec: Low-level, hardware-accelerated media encoding and decoding.
      - Media Player: Simple, straightforward media playback interface.
      - Meshing: Converts the world's depth data into a connected triangle mesh that can be used for occlusion and physics.
      - Music Service: Supports connecting and listening to a streaming music service.
      - Occlusion: An interface for feeding depth data to the Magic Leap platform for hardware occlusion.
      - Planes: Recognize planar surfaces in the user's environment for placing content. This includes semantic tagging for ceilings, floors, and walls.
      - Raycast: Fire a ray and get the point of intersection with the world's depth data.
      - Secure Storage: Save data from your app to the device's encrypted storage.
    
      #### Magic Leap Remote
    
      - Supported ML Remote APIs:
          - Head pose
          - Eye tracking
          - Hand Gestures and Keypoint tracking
          - Control Input
          - World Raycast
          - World Meshing
          - World Planes
          - Image Tracking (Simulator only)
          - Graphics API (OpenGL)
    
      - Properties panel in the GUI lets you manually set the state of API data such as head pose, detected gestures, and so on.
      - Shows the virtual room in a Mini Map as well as an Eye view that depicts your application content composited over the virtual room.
      - Supports gamepad, keyboard, and mouse bindings. Use a gamepad to drive head position, define keys to trigger detected gestures, and more.
      - The bundled virtual room generator app procedurally generates random rooms that you can customize even further to test your game logic in the Simulator.
      - Experimental support for faster graphics rendering in ML Remote can be enabled by setting the environment variable ML_REMOTE_ENABLE_GPU_SHARING = 1. This feature is experimental, because it has not yet gone through rigorous testing and might lead to a crash or visual artifacts under some circumstances.

