Continuous advances in high-speed communication and measurement systems demand higher levels of performance from system clocks and references. Performance that was acceptable in the past may not be sufficient to support high-speed synchronous equipment. Perhaps the most important and least understood measure of clock performance is jitter. One of the main challenges with jitter measurements is that no industry-standard methods have been established. There are multiple variables to consider, from the test equipment used to the actual test conditions. While JEDEC standards do provide definitions and suggested test conditions, there is a lack of consistency between measurements from different testers.
In this post, we will describe various practical methods of measuring jitter, including the relevance and ease of each method. These test methods are the basis for how all jitter measurements are conducted at Vectron.
Three components remain consistent across all forms of jitter testing: the device under test (DUT), a reference oscillator, and a power supply. We have previously discussed (in our PSR post) how critical a clean power supply is to accurate testing, but the reference oscillator is an equally vital part of the setup: the equipment measuring the DUT needs a better noise floor than the DUT itself, otherwise the measured performance of the DUT will be degraded by the noise of the equipment.
Standard measurement equipment (oscilloscopes, counters, signal source analyzers) contains an internal TCXO or OCXO, and for most timing devices these are sufficient. Vectron's test equipment uses a low phase noise OCXO locked to an external 10 MHz rubidium clock, which is in turn locked to a GPS receiver.
Time Domain Jitter Measurements
Oscilloscopes are the main device used for time domain jitter measurements. They allow for easy viewing of waveforms and pulses, and are considered indispensable in any time and frequency lab. Many vendors offer jitter measurement test packages, though often at an additional cost. An oscilloscope that is high-speed (1 GHz+) and has a high sampling rate (10 GS/s+) should be sufficient to gather the desired data. Recall that time domain jitter measurements (specifically period and cycle-to-cycle) are random and are expressed in terms of a mean value and standard deviation over a number of samples. JEDEC Standard 65 requires a minimum of 1,000 samples, but a 10,000-sample minimum is preferred for most applications.
Although the official definition of period jitter is the difference between a measured clock period and the ideal period, in real-world applications it is often difficult to quantify the ideal period. If we observe the output of an oscillator set to 125 MHz on an oscilloscope, the average measured clock period may be 7.996 ns instead of 8 ns. It is therefore more practical to treat the average observed period as the ideal period, which is common practice among timing device manufacturers. The standard procedure for measuring period jitter is to randomly measure the duration of one clock period 10,000 times, then use the recorded data to calculate the mean, standard deviation, and peak-to-peak values. Due to the random nature of period jitter, the peak-to-peak values can vary greatly, and period jitter often needs to be recalculated several times to arrive at an average value.
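The statistics step of this procedure is straightforward. Below is a minimal sketch in Python that treats the average observed period as the ideal period, as described above; the period samples are made-up values for illustration, not real measurements:

```python
import statistics

def period_jitter_stats(periods_ns):
    """Summarize period jitter from repeated single-period measurements.

    The mean observed period is treated as the "ideal" period, as is
    common practice among timing device manufacturers.
    """
    mean = statistics.mean(periods_ns)
    sigma = statistics.stdev(periods_ns)            # RMS period jitter
    peak_to_peak = max(periods_ns) - min(periods_ns)
    return mean, sigma, peak_to_peak

# Hypothetical period samples (ns) around the 7.996 ns example above;
# a real measurement would use 10,000+ samples.
samples = [7.9957, 7.9961, 7.9964, 7.9958, 7.9962, 7.9960]
mean, sigma, pk_pk = period_jitter_stats(samples)
```

In practice the sample list would come from the oscilloscope's measurement log, and the whole calculation would be repeated several times to average the peak-to-peak value.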
Below is an example of period jitter measured on a Wavecrest SIA-3300C signal integrity analyzer for a 200 MHz XO. This analyzer platform is set up to measure 30,000 samples at a time, and the measurement is executed three times in order to obtain an average peak-to-peak value.
Measuring cycle-to-cycle jitter is very similar to measuring period jitter, but with one additional step. The standard procedure is to randomly measure the duration of two adjacent clock periods 10,000 times and take the absolute difference between the two. The recorded data is used to calculate the mean and standard deviation, and the peak value is simply the largest difference in periods observed. As with period jitter, the peak-to-peak values can vary greatly, and cycle-to-cycle jitter often needs to be recalculated several times to arrive at an average value. Some digital oscilloscopes have a histogram feature, which simplifies much of the math.
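The extra absolute-difference step can be sketched the same way. Again, the period pairs here are hypothetical values, not measured data:

```python
import statistics

def cycle_to_cycle_stats(period_pairs_ns):
    """Cycle-to-cycle jitter: the absolute difference between two
    adjacent clock periods, collected over many random trials."""
    diffs = [abs(p1 - p2) for p1, p2 in period_pairs_ns]
    # Peak value is simply the largest period-to-period difference seen
    return statistics.mean(diffs), statistics.stdev(diffs), max(diffs)

# Hypothetical (adjacent period 1, adjacent period 2) pairs in ns
pairs = [(7.9961, 7.9958), (7.9957, 7.9964), (7.9960, 7.9962)]
mean_c2c, sigma_c2c, peak_c2c = cycle_to_cycle_stats(pairs)
```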
Below is an example of cycle-to-cycle jitter measured on a LeCroy WaveRunner 610Zi digital oscilloscope for a 50 MHz XO. In this case, a jitter measurement tool, assigned to P8 and labeled 'dper', is used to calculate the cycle-to-cycle jitter. The oscilloscope is set up to measure 30,000 samples at a time, and the measurement is executed three times in order to obtain an average peak-to-peak value.
Measuring TIE jitter is very difficult with only an oscilloscope. Typically, a histogram is needed to plot the measurement values against their frequency of occurrence. An example of a jitter histogram for a TIE measurement is shown below. In this case, the continuous variable is mapped into 500 bins, and the total population of the data set is 3,200,000. The mean value of TIE is theoretically zero, and as can be seen in this measurement, the mean value is 0 ns. For this plot, the distribution is approximately Gaussian with a standard deviation of 1.3 ps.
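The binning behind such a histogram is simple to reproduce. The sketch below uses the same bin count as the plot (500 bins), but a smaller, simulated Gaussian population (1.3 ps sigma, zero mean) rather than measured TIE data:

```python
import math
import random

random.seed(1)
# Simulated TIE samples in ps: Gaussian, mean 0, sigma = 1.3
tie_ps = [random.gauss(0.0, 1.3) for _ in range(100_000)]

# Map the continuous variable into 500 bins, as in the analyzer plot
lo, hi = min(tie_ps), max(tie_ps)
width = (hi - lo) / 500
bins = [0] * 500
for t in tie_ps:
    idx = min(int((t - lo) / width), 499)  # clamp the max sample into the last bin
    bins[idx] += 1

# Population mean and standard deviation of the TIE distribution
mean = sum(tie_ps) / len(tie_ps)
sigma = math.sqrt(sum((t - mean) ** 2 for t in tie_ps) / len(tie_ps))
```

Plotting `bins` against the bin centers reproduces the bell-shaped histogram; an analyzer does the same thing over its full 3.2-million-sample population.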
Frequency Domain Jitter Measurements
Whereas time domain measurements are handled primarily by an oscilloscope, frequency domain measurements are handled primarily by a signal source analyzer (SSA). Most SSAs have a very low noise floor (on the order of -180 dBc/Hz) and integrate cross-correlation techniques that further reduce the test system noise. Cross-correlation essentially cancels noise by taking the vector sum of the measurement results of two independent measurement channels.
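The cancellation can be illustrated numerically: when two channels share a common (correlated) signal but carry independent noise, averaging the product of one channel with the conjugate of the other drives the uncorrelated terms toward zero. This is a toy model of the principle, not an analyzer's actual algorithm:

```python
import random

random.seed(7)

N = 2000
dut = 1.0 + 0.0j          # correlated "DUT" component, unit power
acc = 0.0 + 0.0j
for _ in range(N):
    # Each channel adds its own independent instrument noise
    n1 = complex(random.gauss(0, 1), random.gauss(0, 1))
    n2 = complex(random.gauss(0, 1), random.gauss(0, 1))
    acc += (dut + n1) * (dut + n2).conjugate()

# Averaging the cross products leaves ~|dut|^2 = 1.0; the uncorrelated
# noise terms shrink roughly as 1/sqrt(N)
cross = acc / N
```

Even though each channel's instantaneous noise power is comparable to the signal here, the averaged cross term converges on the DUT's power, which is why cross-correlation lowers the effective system noise floor.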
For measuring phase noise, Vectron prefers the Agilent E5052B. The E5052B includes two independent PLL paths with two built-in reference sources that are uncorrelated with each other (there is also an option for an external reference source). Because the two references are uncorrelated, vector averaging the results of the two channels cancels the noise from the internal reference sources and related circuits, lowering the system noise floor, while the correlated noise of the DUT is preserved. This allows for fast and user-friendly testing, with the main downside being that only one device can be tested at a time. The E5052B can also calculate the integrated noise over a desired range (see the example below, measuring from 12 kHz to 5 MHz) and the integrated phase jitter.
If one does not have a signal source or spectrum analyzer that can calculate the jitter itself, any analyzer with a sufficiently low noise floor can be used, and the frequency domain phase jitter and integrated period jitter can be calculated using the formulas below:
RMS Noise (radians)
RMS Noise (degrees)
Integrated Period Jitter (seconds)
As an example, these formulas were applied to the same part shown above. Using only data points 4-7, we calculated an RMS phase jitter of 10.0468 degrees and an integrated period jitter of 178.611 fs, compared to the E5052B's results of 10.1156 degrees and 179.834 fs. There are also several free web tools that can calculate these values from entered data; Vectron has in the past used the JitterLabs application to confirm the measurements of our test setup.
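For readers reproducing this by hand, the calculation can be coded directly. The sketch below assumes the standard textbook relations, which are stand-ins for the formulas shown above: RMS phase jitter in radians is the square root of twice the integrated SSB phase noise power, and RMS period jitter is that value divided by 2*pi*f0. It uses trapezoidal integration over a few points; the offsets, noise values, and carrier frequency are illustrative, not the measurement data from this post:

```python
import math

def jitter_from_phase_noise(offsets_hz, l_dbc_hz, f0_hz):
    """Integrate SSB phase noise L(f) (dBc/Hz) over the offset range,
    then convert to RMS phase jitter (rad, deg) and period jitter (s).

    Assumed standard relations:
        phi_rms  = sqrt(2 * integral of 10^(L(f)/10) df)  [radians]
        jitter   = phi_rms / (2 * pi * f0)                [seconds]
    """
    area = 0.0
    for i in range(len(offsets_hz) - 1):
        s1 = 10 ** (l_dbc_hz[i] / 10)        # linear noise power density
        s2 = 10 ** (l_dbc_hz[i + 1] / 10)
        area += 0.5 * (s1 + s2) * (offsets_hz[i + 1] - offsets_hz[i])
    phi_rad = math.sqrt(2 * area)
    phi_deg = math.degrees(phi_rad)
    jitter_s = phi_rad / (2 * math.pi * f0_hz)
    return phi_rad, phi_deg, jitter_s

# Illustrative phase noise points only (offset Hz, L(f) dBc/Hz)
offsets = [12e3, 100e3, 1e6, 5e6]
noise = [-130.0, -140.0, -150.0, -155.0]
rad, deg, jit = jitter_from_phase_noise(offsets, noise, 200e6)
```

With only four points, linear trapezoidal integration is coarse; real tools interpolate the phase noise curve on a log-frequency grid before integrating, which is why hand calculations typically land close to, but not exactly on, the analyzer's result.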
Tags: Jitter, Oscillator, Phase Noise