A simple method of measuring audio latency (2018) (blog.nirbheek.in)
37 points by luu 43 days ago | 4 comments

> So the simplest reliable implementation is to have only one wave traveling down the pipeline at a time. If we send a wave out, say, once a second, we can wait about one second for it to show up, and otherwise presume that it was lost.

This is incidentally a problem also faced by radar systems: if more than one radar pulse is in flight at a time, an echo cannot be uniquely matched to the pulse that produced it.
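In software, the quoted one-wave-at-a-time scheme reduces to finding the sent pulse in the captured stream. A minimal sketch, assuming a 48 kHz loopback capture and a simulated 120 ms pipeline delay (the sample rate, pulse parameters, and function names are illustrative, not from the article):

```python
import numpy as np

SAMPLE_RATE = 48_000  # Hz (assumed)

def make_pulse(freq=1000.0, duration=0.01):
    """A short single-frequency test pulse (10 ms of a 1 kHz sine)."""
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    return np.sin(2 * np.pi * freq * t)

def measure_latency(sent, captured):
    """Locate the sent pulse in the captured stream via cross-correlation."""
    corr = np.correlate(captured, sent, mode="valid")
    offset = int(np.argmax(np.abs(corr)))
    return offset / SAMPLE_RATE  # latency in seconds

# Simulate a pipeline that delays the pulse by 120 ms.
pulse = make_pulse()
delay_samples = int(0.120 * SAMPLE_RATE)
captured = np.zeros(delay_samples + len(pulse))
captured[delay_samples:] = pulse

print(measure_latency(pulse, captured))  # → 0.12
```

With only one pulse in flight, the largest correlation peak is unambiguous; send another before the first returns and two equally valid peaks appear.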

One solution is to not send single-frequency pulses, but instead send chirps (https://en.wikipedia.org/wiki/Chirp). These pulses _can_ overlap, since the end of one pulse and the beginning of the next can be distinguished by their frequency components.
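To illustrate why chirps resolve the ambiguity: after matched filtering (pulse compression), each chirp collapses to a sharp peak, so two echoes can be separated even when their waveforms overlap in time. A sketch with assumed parameters (48 kHz rate, a 500 Hz to 5 kHz linear sweep, 30 ms and 50 ms echoes):

```python
import numpy as np

SAMPLE_RATE = 48_000  # Hz (assumed)

def linear_chirp(f0, f1, duration):
    """Linear chirp sweeping from f0 to f1 Hz over `duration` seconds."""
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    # Instantaneous phase of a linear frequency sweep.
    phase = 2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * duration))
    return np.sin(phase)

chirp = linear_chirp(500, 5000, 0.05)

# Two echoes whose chirps overlap in time: 30 ms and 50 ms delays.
d1, d2 = int(0.030 * SAMPLE_RATE), int(0.050 * SAMPLE_RATE)
rx = np.zeros(d2 + len(chirp))
rx[d1:d1 + len(chirp)] += chirp
rx[d2:d2 + len(chirp)] += chirp

# Matched filter: each overlapping echo compresses to a narrow peak.
corr = np.correlate(rx, chirp, mode="valid")
mag = np.abs(corr)

# Take the two largest well-separated peaks.
first = int(np.argmax(mag))
guard = len(chirp) // 10
mag2 = mag.copy()
mag2[max(0, first - guard):first + guard] = 0
second = int(np.argmax(mag2))
delays = sorted([first, second])
print([d / SAMPLE_RATE for d in delays])  # ≈ [0.030, 0.050]
```

A single-frequency pulse of the same length would produce one broad, merged correlation lobe here; the chirp's changing frequency is what keeps the two echoes distinguishable.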

> But unlike network packets, we lose all context once the waves leave our pipeline and we have no way of uniquely identifying each wave.

It would take additional processing, but this limitation (the cause of the problem above) could also be remedied by encoding real information (a sequence number) into each wave using ordinary signal modulation techniques.
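One simple modulation that would work for this is binary FSK: send one of two tones per bit, then recover each bit by comparing energy at the two frequencies. A hedged sketch, with tone frequencies, symbol length, and bit count chosen arbitrarily for illustration:

```python
import numpy as np

SAMPLE_RATE = 48_000      # Hz (assumed)
SYMBOL_LEN = 480          # samples per bit (10 ms, assumed)
F0, F1 = 1_000.0, 2_000.0 # FSK tones for bit 0 / bit 1 (assumed)

def encode_seq(seq, bits=8):
    """Modulate a sequence number as binary FSK, MSB first."""
    t = np.arange(SYMBOL_LEN) / SAMPLE_RATE
    symbols = []
    for i in range(bits - 1, -1, -1):
        f = F1 if (seq >> i) & 1 else F0
        symbols.append(np.sin(2 * np.pi * f * t))
    return np.concatenate(symbols)

def decode_seq(signal, bits=8):
    """Demodulate by comparing spectral energy at the two tone bins."""
    k0 = round(F0 * SYMBOL_LEN / SAMPLE_RATE)
    k1 = round(F1 * SYMBOL_LEN / SAMPLE_RATE)
    seq = 0
    for i in range(bits):
        sym = signal[i * SYMBOL_LEN:(i + 1) * SYMBOL_LEN]
        spec = np.abs(np.fft.rfft(sym))
        bit = 1 if spec[k1] > spec[k0] else 0
        seq = (seq << 1) | bit
    return seq

print(decode_seq(encode_seq(42)))  # → 42
```

In practice the decoder would also need to locate symbol boundaries in the captured stream and tolerate noise, but even a few bits per wave are enough to tag each wave with a distinct sequence number.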

I've used jack_iodelay to do the exact same thing, and it works great.

More information on it and audio latency in general can be found in the Ardour manual here:


Or you can put a microphone close to the speaker, let the loop feed back on itself, and measure the frequency of the resulting tone.

While clever and elegant, this method suffers from the flaw that all multiples of f = 1/latency will also resonate.
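Concretely, a measured resonance frequency only pins the latency down to a family of candidates, since any latency L with f = n/L (n = 1, 2, 3, ...) fits. A small illustration with a hypothetical measured frequency:

```python
# If the feedback loop resonates at f_meas, the true latency is ambiguous:
# any L satisfying f_meas = n / L for integer n >= 1 is consistent.
f_meas = 25.0  # Hz, hypothetical measured resonance
candidates = [n / f_meas for n in range(1, 5)]
print(candidates)  # → [0.04, 0.08, 0.12, 0.16]
```

Picking the fundamental (n = 1) gives the smallest candidate, but without extra information you cannot rule out the loop having locked onto a harmonic.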
