
Jitter: Why does it matter to digital video?

I'll take it as obvious that people reading my drivel understand that good signal transfer is paramount in analog connections (say component video).  Something that irks me, however, is when I see the argument "HDMI/DVI is digital so cable quality doesn't matter".  The truth of the matter is that it does matter.  I'll explain why now.

Myth:  Digital signals are either on or off; cable quality does not matter beyond connecting the signal from point A to point B.
Reality:  The on/off part is true, but the quality of the signal does matter, and a poor cable will impair your video quality more than you may imagine.  Why?

Myth:  Monster Cable is a rip-off and overpriced; a cheap eBay cable will work just as well.
Reality:  Yes, it is overpriced and there are several alternatives, but a cheap (likely Chinese) eBay cable is not a good one.  Why?

Why? Jitter.

What is jitter?  Jitter is deviation of a signal in time or amplitude from the mathematical ideal.  An ideal signal would recur in exactly the same timeslice and at exactly the same amplitude every time.  Since no transistor is perfect (and a chip with thousands to millions of them is even less so), there are specifications that define how much jitter is acceptable.
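
To make that concrete, here is a minimal sketch (pure Python, all numbers invented purely for illustration) of how timing jitter is commonly quantified: compare measured clock-edge times against an ideal, evenly spaced grid and look at the spread of the error.

    import random
    import statistics

    IDEAL_PERIOD_NS = 40.0   # a 25 MHz clock has a 40 ns period (example value)
    RMS_JITTER_NS = 0.15     # assumed Gaussian edge noise, illustration only
    NUM_EDGES = 10_000

    # Simulated "measured" edge times: ideal position plus a random timing error.
    edges = [n * IDEAL_PERIOD_NS + random.gauss(0.0, RMS_JITTER_NS)
             for n in range(NUM_EDGES)]

    # Time Interval Error (TIE): deviation of each edge from its ideal position.
    tie = [t - n * IDEAL_PERIOD_NS for n, t in enumerate(edges)]

    # Period jitter: variation in the spacing between consecutive edges.
    periods = [b - a for a, b in zip(edges, edges[1:])]

    print(f"RMS TIE:           {statistics.pstdev(tie):.3f} ns")
    print(f"RMS period jitter: {statistics.pstdev(periods):.3f} ns")
    print(f"Peak-to-peak TIE:  {max(tie) - min(tie):.3f} ns")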

What does jitter do?  Timing jitter (tJit) affects when the signal arrives.  Many digital systems are synchronous, that is to say the data is synchronized to a clock, and that clock determines when the data is looked at (sampled).  The sampling window is defined by the setup and hold times: the amounts of time before and after the clock edge, respectively, that the data must be stable to ensure it is read correctly.  In a perfect world the data would be sampled at the exact instant the clock transitions, but in reality our friend tJit causes some uncertainty as to when the data is actually latched into the chip's buffers, and the setup and hold specifications compensate for that uncertainty.  If the signal you are interested in has excessive jitter, you start encroaching on that timing budget, and the possibility exists that you will read the data before it has transitioned to the intended state.

What does this mean for your video quality?  That pixel that should have been blue is now black, or worse, the one that should have been black is now cyan!  In all fairness you are not likely to notice one mis-sampled pixel in one frame of video.  But what if 103 thousand pixels are mis-sampled in every video frame?  That's ~5% of a 1080p picture.  The affected pixels will be different in every frame (barring a periodic anomaly), so you won't see them as "bad pixels" per se.  What you will see is noise.
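
As a quick back-of-the-envelope check of those numbers (the 5% error rate is a hypothetical scenario; the arithmetic below just uses 1920 x 1080 and the "103 thousand" figure from above):

    # Sanity-check the "103 thousand pixels is ~5% of 1080p" figure.
    width, height = 1920, 1080
    total_pixels = width * height            # 2,073,600 pixels per 1080p frame
    bad_pixels = 103_000                     # the figure quoted above

    print(f"total pixels:         {total_pixels:,}")
    print(f"mis-sampled fraction: {bad_pixels / total_pixels:.1%}")   # ~5.0%

    # At 60 frames per second the errors land in new places every frame,
    # which is why they read as noise rather than as stuck pixels.
    print(f"mis-sampled pixels per second at 60 fps: {bad_pixels * 60:,}")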

The top picture at right shows a signal with a large amount of leading-edge jitter.  You can tell that the bulk of the edges show up at the right side of the transition, but there is a non-trivial number of times where the signal "bleeds" to the left (arriving early), and there are two edges *way* out there.

The next picture shows what is referred to as a data "eye".  The goal when designing a transmitter is to make the opening in the middle as big as possible, since that is the sampling area (where the signal is most stable), while simultaneously keeping the transition zone (which represents the jitter) as narrow as possible.
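
For a rough feel of that trade-off, here is a small sketch with assumed numbers that simply shrinks an idealized eye by the jitter and amplitude-noise budgets.  The 673 ps bit period corresponds roughly to a 1.485 Gb/s TMDS lane (1080p60); the 500 mV swing is an assumed nominal value.

    UNIT_INTERVAL_PS = 673.4   # one bit period at roughly 1.485 Gb/s (1080p60 TMDS)
    SWING_MV = 500.0           # assumed nominal signal swing

    def eye_opening(pp_jitter_ps, amplitude_noise_mv):
        """Return (eye width in ps, eye height in mV) left after the noise."""
        width = max(UNIT_INTERVAL_PS - pp_jitter_ps, 0.0)
        height = max(SWING_MV - amplitude_noise_mv, 0.0)
        return width, height

    for jitter_ps, noise_mv in [(50, 50), (200, 150), (500, 300)]:
        w, h = eye_opening(jitter_ps, noise_mv)
        print(f"p-p jitter {jitter_ps:3d} ps, amplitude noise {noise_mv:3d} mV"
              f" -> eye opening {w:6.1f} ps x {h:5.1f} mV")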

Five percent noise is not outside the realm of possibility for tJit, and it effectively kills the HD experience.  And we're not even done yet!  There are still two more sources of crappy video awaiting us!

So now that we've covered tJit (the most common source of frustration for engineers designing chips), let's look at a form of jitter that rarely plagues the I/Os of PCB-mounted components, but that your cheapie cable is almost certain to introduce: aJit.  Amplitude jitter is when the signal, while possibly timed correctly, fails to open the data eye in the height direction.  There is some minimum voltage required to represent a "1" to the system.  What happens when that level is not consistently reached?
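
Here is a small sketch of that failure mode, with invented voltage levels: every bit arrives on time, but random sag on the "1" level occasionally drops it below the receiver's input-high threshold, so the bit reads as "0".

    import random

    V_HIGH_NOMINAL = 0.50   # assumed nominal "1" level, volts
    V_IH_THRESHOLD = 0.30   # assumed minimum voltage the receiver reads as "1"
    SAG_SIGMA = 0.10        # assumed amplitude noise on a poor cable, volts
    NUM_BITS = 100_000

    failures = 0
    for _ in range(NUM_BITS):
        level = V_HIGH_NOMINAL - abs(random.gauss(0.0, SAG_SIGMA))  # sag only
        if level < V_IH_THRESHOLD:
            failures += 1

    print(f"'1' bits read as '0': {failures / NUM_BITS:.2%}")  # roughly a few percent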

Here we can see three signals: a clock on channel 3, a stream of data pulses on channel 1, and a strobe pulse one cycle wide on channel 2.  The clock looks good, the strobe looks OK, but what is going on with the datastream?

In this plot the colors go from blue to red, with red being where the majority of signals are located and blue being where signals only occasionally show up.  What we see is the majority of the data signal being where it belongs, but a very non-trivial amount of the data ending up too low to cleanly register as valid.  With such a poor datastream it is plausible that the monitor would fail to display a picture at all, and in any event the image quality would be so poor that you would likely suspect something was wrong.  If it were less pronounced we could again have poor picture quality, ruining the HD experience while still allowing "acceptable" performance.  How about another 5-10% reduction in picture quality?  One likely culprit for a signal like this is a connector that was oxidized when it was soldered to the cable, leaving a highly resistive path that the signal must traverse before reaching its destination.
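
To put a rough number on that, here is a sketch (assumed values throughout) of how a resistive contact eats into the signal: the series contact resistance forms a voltage divider with the receiver's termination, so less of the transmitted swing arrives at the far end.

    def received_swing_mv(v_swing_mv, r_contact_ohms, r_termination_ohms=50.0):
        """Swing left at the termination after a series contact resistance."""
        return v_swing_mv * r_termination_ohms / (r_termination_ohms + r_contact_ohms)

    # 500 mV assumed nominal swing; 50-ohm termination assumed at the receiver.
    for r_contact in (0.1, 5.0, 20.0, 50.0):
        v = received_swing_mv(500.0, r_contact)
        print(f"contact resistance {r_contact:5.1f} ohm -> {v:5.1f} mV at the receiver")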



The strobe pulse shows another artifact that, when significant enough, can look like jitter and can also impact the ability to recover a valid signal.  There is a reflection showing up at the end of the pulse, as evidenced by the extra hump starting just shy of the second division after center and terminating at roughly division 3.5.  This could easily be encountered with a cable that has an impedance mismatch (built to the wrong spec) or a cracked solder joint at the connector (poor assembly quality).
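
The size of a hump like that follows from the classic transmission-line reflection coefficient: any impedance discontinuity bounces part of the edge back, and the bigger the mismatch, the bigger the bump.  A minimal sketch, assuming a 100-ohm nominal system (typical for TMDS differential pairs) and a few illustrative mismatches:

    def reflection_coefficient(z_discontinuity_ohms, z_nominal_ohms=100.0):
        """Fraction of the incident edge reflected at an impedance discontinuity."""
        return ((z_discontinuity_ohms - z_nominal_ohms)
                / (z_discontinuity_ohms + z_nominal_ohms))

    for z in (100.0, 85.0, 75.0, 150.0):
        gamma = reflection_coefficient(z)
        print(f"{z:5.1f} ohm section: {abs(gamma):5.1%} of the edge reflects back")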

Of course there are sources of jitter other than the cable.  In the R&D world components are always selected for performance, not cost, since you are talking about tens to at most the low hundreds of systems being built.  In the production world there are other considerations.  If you can save $0.20 on a component and you are going to build 10 million systems, that translates into $2,000,000 in savings.  Oscillators run the gamut from tens of cents for LC resonators to upwards of $20 for high-frequency, high-stability crystal oscillators.  For the three following pictures the source oscillators were all 25 MHz, ranging in price from about $2.50 to about $6.00 in 10K-unit quantities.  What a difference component selection makes.  In all three shots everything except the oscillator was the same: same device under test, same board, same scope, same probe, same power supplies.

The final quality of your picture is determined by the total signal integrity of all three color channels (RGB) getting to your monitor.  All the various sources of noise (tJit, aJit, reflections) are additive in the effect they have on degrading your picture quality.  While tJit is likely to be inherent in your system, reflections and aJit will only serve to worsen your viewing experience.  Controlling these two sources of noise is easily accomplished by choosing cables built to the proper specification, with good-quality connectors, and keeping the connectors free of debris and oxidation (hence the nearly ubiquitous gold plating on higher-end connectors).  You can also control tJit to some extent, as heat tends to increase jitter in VLSI components, so proper ventilation for your HD equipment is paramount.
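
One common way engineers roll those sources up into a single budget is to add the deterministic contributions (reflections, crosstalk, intersymbol interference) linearly, add uncorrelated random jitter root-sum-square, and compare the total against the bit period.  A sketch with invented numbers (the 673 ps bit period again assumes roughly 1.485 Gb/s for 1080p60):

    import math

    UNIT_INTERVAL_PS = 673.4     # one bit period at roughly 1.485 Gb/s (1080p60 TMDS)

    deterministic_ps = [80.0, 60.0, 40.0]   # e.g. reflections, ISI, crosstalk (assumed)
    random_rms_ps = [8.0, 5.0]              # e.g. source oscillator, transmitter PLL (assumed)
    BER_MULTIPLIER = 14.1                   # converts RMS random jitter to p-p at ~1e-12 BER

    total_dj = sum(deterministic_ps)
    total_rj = BER_MULTIPLIER * math.sqrt(sum(r * r for r in random_rms_ps))
    total_jitter = total_dj + total_rj

    print(f"Deterministic jitter: {total_dj:6.1f} ps")
    print(f"Random jitter (p-p):  {total_rj:6.1f} ps")
    print(f"Total jitter:         {total_jitter:6.1f} ps")
    print(f"Remaining eye width:  {UNIT_INTERVAL_PS - total_jitter:6.1f} ps "
          f"of a {UNIT_INTERVAL_PS} ps bit period")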

About the images used in this piece:
Scope faces are for illustration purposes and may not be from an HD video stream.

