
Jitter – Does It Matter? (2011) - PascLeRasc
https://nwavguy.blogspot.com/2011/02/jitter-does-it-matter.html
======
Animats
_" Cables can “smear” digital signals by attenuating the highest
frequencies."_

Yeah, right. If you get "smeared" digital signals beyond the sampling
threshold, you'll get so many data errors that nothing will work.

The only thing that matters is whether whatever is clocking the DAC has a
constant clock rate and that you don't get a buffer underrun on whatever feeds
it data.

If this were a real problem, it would have shown up on analog VGA monitors as
a wobbly image. Those are also an analog signal from a DAC coming from a
clocked data stream. Several orders of magnitude faster than for audio. There
have been VGA devices with jitter problems, and it's really obvious - vertical
lines aren't straight. That was pretty much fixed by the time VGA went away.

(There was once a $13,500 HDMI cable. Really[1]. It was laughed at on
Amazon.[2])

[1] [https://www.forbes.com/sites/ianmorris/2015/01/02/please-
don...](https://www.forbes.com/sites/ianmorris/2015/01/02/please-dont-
buy-a-13500-hdmi-cable/#588ce29639b6)

[2] [https://www.amazon.com/WireWorld-Platinum-Starlight-Cable-
Me...](https://www.amazon.com/WireWorld-Platinum-Starlight-Cable-
Meter/dp/B00KY2NKCO)

~~~
hatsunearu
>Yeah, right. If you get "smeared" digital signals beyond the sampling
threshold, you'll get so many data errors that nothing will work.

I don't want to feed the audiophile nonsense, but you do get high-frequency
attenuation with shitty cables, and human hearing is pretty dang sensitive.
Even slight attenuation can and will introduce jitter into the signal.

It's just that one can mitigate the damage with some clever circuitry and by
using anything other than coat hangers or wet string...

Also, come on, our sense of hearing is likely much more capable than our
sight. If you introduce things like delta-sigma modulation, the jitter
requirements become pretty ridiculous too - something on the order of
two-digit picoseconds RMS.

It's cool to hate on ridiculous 1000 dollar HDMI cables and monster sized
mains power cables but legit audio engineering can be pretty challenging,
since our sense of hearing is actually way more sensitive than most other
things we encounter.
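For a rough sense of scale, the standard aperture-jitter limit for sampling a full-scale sine, SNR = -20·log10(2πf·t_j), can be rearranged into a jitter budget. The function name and the example figures below are purely illustrative:

```python
import math

def max_rms_jitter(f_hz: float, snr_db: float) -> float:
    """RMS sampling jitter that limits SNR to snr_db for a full-scale
    sine at f_hz, from SNR = -20*log10(2*pi*f*t_j)."""
    return 1.0 / (2 * math.pi * f_hz * 10 ** (snr_db / 20))

# A 20 kHz tone at ~16-bit performance (98 dB) tolerates about 100 ps RMS;
# pushing toward 120 dB shrinks the budget to single-digit picoseconds.
print(max_rms_jitter(20e3, 98))   # ~1.0e-10 s
print(max_rms_jitter(20e3, 120))  # ~8.0e-12 s
```

So "two-digit picoseconds RMS" is indeed the right ballpark once you aim past 16-bit performance at the top of the audio band.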

~~~
ohazi
The reason the "smearing" argument doesn't hold water when looking at jitter
from something like a USB cable is that this interface is being used to send a
bitstream, not timing information (at least not directly).

The way it usually works is that first the computer sends some sort of mode-
setting message. This is where the computer tells the Audio DAC controller
what output sampling rate to use.

The DAC then uses _its own internal circuitry_ to generate the timing signals.
This circuitry is what determines the jitter.

The computer then sends the audio data over the cable. This data is captured,
buffered, and then finally sent to the DAC _when the internally generated
timing system is ready for it_.

So the only thing that cable smearing can do is introduce errors into the
digital messages that the computer sends. If it's particularly bad, the mode-
setting message won't make it intact and you won't hear anything. If the mode
does get set correctly, but there are occasional bit errors in the bitstream,
you'll hear occasional (but obvious) pops. If the computer can't send the
bitstream at the expected rate, the buffers will overrun or underrun and
everything will stop.

But what you won't get is more jitter.

The original argument assumes that the cable is sending a signal whose edges
are used for clock recovery, _and that this recovered clock is used as the
timebase for the sampling system_. But nobody actually does this [1].
Reasonably high jitter / phase noise on the bitstream signals is fine, as long
as the data can still be decoded.

[1] Okay fine, HDMI sort of does this, but they're almost always using a more
sophisticated retiming system.
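The scheme described above can be sketched as a toy model. All the numbers here (48 kHz rate, ±5 µs arrival jitter, 64-sample prefill) are made up for illustration: samples arrive over the "cable" at irregular times, but the playout instants depend only on the DAC's own clock.

```python
import random

FS = 48_000        # output rate, fixed once by the mode-setting message
PREFILL = 64       # samples buffered before playback starts

def simulate(n_samples: int, seed: int = 0):
    """Jittery delivery over the cable, fixed-rate playout from the buffer.
    Returns (arrival_times, playout_times). Purely illustrative."""
    rng = random.Random(seed)
    # Producer: each sample arrives roughly on time, +/- 5 us of host jitter.
    arrivals, t = [], 0.0
    for _ in range(n_samples):
        t += 1 / FS + rng.uniform(-5e-6, 5e-6)
        arrivals.append(t)
    # Consumer: after the prefill, the DAC clocks samples out at exactly 1/FS.
    start = arrivals[PREFILL - 1]
    playout = [start + (k + 1) / FS for k in range(n_samples)]
    # Underrun check: every sample must arrive before its playout instant.
    assert all(a <= p for a, p in zip(arrivals, playout)), "buffer underrun"
    return arrivals, playout

arrivals, playout = simulate(1000)
# Playout intervals are exactly 1/FS: the arrival jitter never reaches the output.
```

The only way the cable jitter shows up at all is if delivery falls so far behind that the underrun assertion trips, which matches the "everything stops" failure mode rather than a subtle timing smear.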

~~~
hatsunearu
OK yeah, I'm not sure what I was thinking when I wrote about cabling.

I agree that clock recovery and PLL filtering can take care of jitter.

------
pickdenis
NwAvGuy was a true beacon of light in the sea of bullshit that the audiophile
community can sometimes be. It's sad he's not around anymore. I had a lot of
fun building his Objective2 amplifier.

~~~
corey_moncure
I bought the assembled O2/ODAC from JDS Labs, and then the black edition a few
years later. It's essentially a perfect product, and I fully believe that any
claims of superior transparency in audio reproduction over it from any other
manufacturer describe differences 'outside the realm of human perception'
(i.e. snake oil).

There might be a few nice utility bells and whistles on other products but
that's it.

Probably someone in the industry made him a hush money deal he couldn't refuse
and that was that. Good for him.

------
ldoughty
So the article is in reference to audio... Jitter when sending data/requests
to a third party, such as a cloud service provider, is definitely a good
thing: it helps differentiate your requests from a DDoS.

Skimmed the article to see the subject, saw it was audio, but still wanted to
make this comment.
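As a sketch of that idea, here's the common "full jitter" retry strategy (the function name and the base/cap defaults are arbitrary choices): each client sleeps a random fraction of an exponentially growing window, so retries don't arrive at the service in synchronized waves.

```python
import random

def backoff_with_jitter(attempt: int, base: float = 0.1,
                        cap: float = 30.0) -> float:
    """'Full jitter' backoff: pick a random delay in
    [0, min(cap, base * 2**attempt)] so retries from many clients
    spread out instead of hammering the service in lockstep."""
    return random.uniform(0.0, min(cap, base * 2 ** attempt))

# Example: delays (in seconds) for the first few retry attempts.
delays = [backoff_with_jitter(a) for a in range(5)]
```

Without the randomness, every client that failed at the same moment retries at the same moment, which is exactly the traffic pattern a DDoS filter is looking for.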

------
GregoryPerry
Jitter is always introduced by non-RTOS operating systems that don't have a
guarantee for preemptive realtime scheduling. The kernel scheduler introduces
jitter, supervisory processes introduce jitter, etc. And an easy test to see
the effects of this is to try to control a servo motor with a GPIO pin without
RT_PREEMPT or comparable RTOS.

A simple fix would be an RT_PREEMPT-linked Linux sound player.

~~~
Fice
Scheduling jitter does not cause jitter in audio, because audio is buffered
and the buffer is consumed by the audio device using its own clock. But the OS
has to fill the buffer in time or an underrun will happen, which is audible as
crackling and stuttering. For some applications the input and output audio
buffers need to be very small to avoid introducing noticeable delay into
audio processing, and in that case real-time capabilities are required to
prevent buffer overruns and underruns.
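The timing budget is easy to quantify; this tiny helper (hypothetical name, illustrative buffer sizes) shows why small buffers are where real-time scheduling starts to matter:

```python
def refill_deadline_ms(frames: int, sample_rate: int) -> float:
    """Time the OS has to top up an audio buffer before the device
    drains it (the underrun deadline)."""
    return 1000.0 * frames / sample_rate

# A 4096-frame buffer at 48 kHz gives the scheduler ~85 ms of slack;
# a low-latency 64-frame buffer leaves only ~1.3 ms.
print(refill_deadline_ms(4096, 48_000))  # ~85.3
print(refill_deadline_ms(64, 48_000))    # ~1.3
```

A music player can happily use the big buffer, so scheduling jitter is invisible; it's the ~1 ms deadlines of live processing that need RT_PREEMPT-style guarantees.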

------
samstave
Jitter was a HUGE issue in ~2004 for Lucas/ILM wrt rendering speeds... but due
to how long ago that was, I don't exactly recall why...

They were the first Foundry customer, for these reasons...

They had render farms that were sensitive to jitter... Raleigh Mann later went
on to run netops for Google... though he has left there and now runs Williams
Sonoma - but he was super jitter-allergic...

------
rurban
It does for realtime. This article only cares about audio, but in realtime
systems jitter is the worst problem of all.

Comparable to memory failures, segfaults, out-of-bounds accesses, ... Usually
caused by a HW problem, but sometimes SW is at fault as well.

------
new_realist
Love NwAvGuy’s DAC and amp. I own two of each and attempted to upgrade to
expensive alternatives only to be disappointed.

