Thank you for your comment, because it triggered an interesting chain of thoughts about a semi-related problem I’m working on at work.

Usually with a Kalman filter you account for measurement noise (error in the gyro-measured roll rate, the accelerometer-measured acceleration, etc.), but I don't think I've ever encountered a system that explicitly modelled variation in sensor latency relative to the timestamps. Based on the description of the problem they encountered here, I suspect what happened is that the system dropped a frame but didn't adjust the photo timestamps accordingly, so every frame that arrived afterwards carried an incorrect timestamp. Even if the Kalman filter was set up to handle "this photo was taken 20ms ago" when doing its forward integration, if they didn't model "this photo was actually taken 50ms ago but is reporting that it was taken 20ms ago" then you'd pretty readily get the kind of oscillation they were seeing.
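To make that concrete, here's a toy sketch (my own, not anything from the article): a 1-D constant-velocity Kalman filter that retrodicts each position measurement by an assumed latency, via the measurement model H = [1, -assumed_latency]. Feed it frames that are really 50ms old while telling it they're 20ms old and the position estimate settles into a persistent bias of roughly v * (50ms - 20ms); if the latency jumps around (dropped frames), that bias flips around and reads as oscillation.

```python
import numpy as np

def run_filter(actual_latency, assumed_latency, steps=200, dt=0.02, v=1.0):
    """Track a target moving at constant velocity v; each measurement is
    actual_latency seconds stale, but the filter only retrodicts by
    assumed_latency. Returns the mean position error over the last 50 steps."""
    x = np.array([0.0, 0.0])                  # estimated [position, velocity]
    P = np.eye(2)                             # state covariance
    F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity dynamics
    Q = np.eye(2) * 1e-4                      # process noise
    H = np.array([[1.0, -assumed_latency]])   # "this measurement is this old"
    R = np.array([[1e-4]])                    # measurement noise
    errors = []
    for k in range(1, steps + 1):
        t = k * dt
        x = F @ x                             # propagate to "now"
        P = F @ P @ F.T + Q
        z = v * (t - actual_latency)          # the frame is *really* this old
        y = z - H @ x                         # residual
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        errors.append(abs(x[0] - v * t))
    return float(np.mean(errors[-50:]))
```

With matching latencies the steady-state error is essentially zero; with a 50ms-actual / 20ms-assumed mismatch the estimate lags by about v * 0.03.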

Edit: yeah, just like the sibling comment said :)

HoloLens provides a timestamp with every frame from each sensor, which can be used for sensor fusion outside the system's own pipeline. Windows.Perception.PerceptionTimestamp works both for recorded data (e.g. a camera frame) and for future predictions (e.g. the predicted device position). The predicted latency is also used to adjust rendering so the viewer's perspective stays correct even though the draw calls may lag slightly behind the viewer's motion.
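A rough illustration of what a per-frame timestamp buys you (a Python stand-in, not the actual WinRT API; PoseHistory and its methods are made up): keep timestamped pose samples, interpolate for queries in the past (lining up a camera frame with the pose at its capture time) and extrapolate for queries in the future (predicting where the device will be when the frame actually hits the display).

```python
import bisect

class PoseHistory:
    """Timestamped 1-D pose samples (hypothetical helper). Queries at past
    timestamps interpolate between recorded samples; queries beyond the
    newest sample extrapolate at constant velocity."""

    def __init__(self):
        self.ts = []     # monotonically increasing timestamps, seconds
        self.poses = []

    def add(self, t, pose):
        self.ts.append(t)
        self.poses.append(pose)

    def query(self, t):
        if len(self.ts) < 2:
            return self.poses[-1]
        if t >= self.ts[-1]:
            # future query: extrapolate from the last two samples
            t0, t1 = self.ts[-2], self.ts[-1]
            p0, p1 = self.poses[-2], self.poses[-1]
            v = (p1 - p0) / (t1 - t0)
            return p1 + v * (t - t1)
        # past query: linear interpolation between bracketing samples
        i = max(1, bisect.bisect_right(self.ts, t))
        t0, t1 = self.ts[i - 1], self.ts[i]
        p0, p1 = self.poses[i - 1], self.poses[i]
        a = (t - t0) / (t1 - t0)
        return p0 + a * (p1 - p0)
```

The same timestamped-query shape serves both uses the comment above describes: "what was the pose when this camera frame was captured" and "what will the pose be at photon time".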
