<p>
<a
href="http://en.wikipedia.org/wiki/Latency_%28audio%29"><dfn>Latency</dfn></a>
is a system's reaction time to a given stimulus. There are many factors that
contribute to the total latency of a system. In order to achieve exact time
synchronization, all sources of latency need to be taken into account and
compensated for.
</p>
<h2>Sources of Latency</h2>

<h3>Sound propagation through the air</h3>
<p>
Since sound is a mechanical perturbation in a fluid, it travels at the
comparatively slow <a href="http://en.wikipedia.org/wiki/Speed_of_sound">speed</a>
of about 340&nbsp;m/s. As a consequence, your acoustic guitar or piano has a
latency of about 1&ndash;2&nbsp;ms, due to the propagation time of the sound
between your instrument and your ear.
</p>
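<p>
For instance, at 340&nbsp;m/s sound covers roughly 34&nbsp;cm per millisecond, so
a monitor speaker placed 2&nbsp;m away adds about 6&nbsp;ms before its sound
reaches your ears.
</p>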

<h3>Digital-to-Analog and Analog-to-Digital conversion</h3>
<p>
Electric signals travel quite fast (on the order of the speed of light),
so their propagation time is negligible in this context. But the conversions
between the analog and digital domain take a comparatively long time to perform,
so their contribution to the total latency may be considerable on
otherwise very low-latency systems. Conversion delay is usually below 1&nbsp;ms.
</p>
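<p>
As a rough illustration (the exact figure depends on the converter design), a
converter filter with a group delay of, say, 32 samples adds
32 / 48000 &asymp; 0.7&nbsp;ms at a 48&nbsp;kHz sample rate.
</p>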

<h3>Digital Signal Processing</h3>
<p>
Digital processors tend to process audio in chunks, and the size of that chunk
depends on the needs of the algorithm and on performance/cost considerations.
This is usually the main cause of latency when you use a computer, and the one
you can try to predict and optimize.
</p>
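<p>
The latency contributed by such a chunk is simply its length divided by the
sample rate: a 256-frame buffer at 48&nbsp;kHz corresponds to about 5.3&nbsp;ms,
a 64-frame buffer to about 1.3&nbsp;ms.
</p>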

<h3>Computer I/O Architecture</h3>
<p>
A computer is a general purpose processor, not a digital audio processor.
This means our audio data has to jump a lot of fences in its path from the
outside to the CPU and back, contending in the process with other parts
of the system vying for the same resources (CPU time, bus bandwidth, etc.).
</p>

<h2>The Latency chain</h2>
<img src="/images/latency-chain.png" title="Latency chain" alt="Latency chain" />
<p>
<em>Figure: Latency chain.</em>
The numbers are an example for a typical PC. With professional gear and an
optimized system the total round-trip latency is usually lower. The important
point is that latency is always additive: it is the sum of many independent factors.
</p>
<p>
Processing latency is usually divided into <dfn>capture latency</dfn> (the time
it takes for the digitized audio to be available for digital processing, usually
one audio period) and <dfn>playback latency</dfn> (the time it takes for the
processed audio to arrive at the analog output, again usually one audio period).
In practice, the combination of both matters. It is called <dfn>round-trip
latency</dfn>: the time necessary for a certain audio event to be captured,
processed and played back.
</p>
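<p>
As a rough rule of thumb for a typical JACK/ALSA setup running two periods per
buffer, the round trip amounts to about three periods plus the converter delay:
with 128-frame periods at 48&nbsp;kHz that is roughly 3 &times; 128 / 48000 &asymp; 8&nbsp;ms.
</p>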
<p class="note">
It is important to note that processing latency with JACK is a matter of
choice. It can be lowered within the limits imposed by the hardware (audio
device, CPU and bus speed) and audio driver. Lower latencies increase the
load on the system because it needs to process the audio in smaller chunks
which arrive much more frequently. The lower the latency, the more likely
the system will fail to meet its processing deadline and the dreaded
<dfn>xrun</dfn> (short for buffer over- or under-run) will make its
appearance more often, leaving its merry trail of clicks, pops and crackles.
</p>
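<p>
The period size and the number of periods are chosen when the JACK server is
started. A minimal sketch of a low-latency configuration (the ALSA device name
<samp>hw:0</samp> is a placeholder for your interface):
</p>
<pre>
# two periods of 64 frames at 48 kHz: about 1.3 ms capture latency
jackd -d alsa -d hw:0 -r 48000 -p 64 -n 2
</pre>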
<p>
The digital I/O latency is usually negligible for integrated or
<abbr title="Peripheral Component Interconnect">PCI</abbr> audio devices, but
for USB or FireWire interfaces the bus clocking and buffering can add some
milliseconds.
</p>

<h2>Low Latency use cases</h2>
<p>
Low latency is <strong>not</strong> always a feature you want to have. It
comes with a couple of drawbacks: the most prominent is increased power
consumption, because the CPU needs to process many small chunks of audio data;
it is constantly active and cannot enter power-saving mode (think fan noise).
Since each application that is part of the signal chain must run in every
audio cycle, low-latency systems will undergo <dfn>context switches</dfn>
between applications more often, which incur a significant overhead.
This results in a much higher system load and an increased chance of xruns.
</p>
<p>
For a few applications, low latency is critical:
</p>

<h3>Playing virtual instruments</h3>
<p>
A large delay between the pressing of the keys and the sound the instrument
produces will throw off the timing of most instrumentalists (save church
organists, whom we believe to be awesome latency-compensation organic systems).
</p>

<h3>Software audio monitoring</h3>
<p>
If a singer hears her own voice through two different paths (her head
bones and the headphones), even small latencies can be very disturbing and
manifest as a tinny, irritating sound.
</p>

<h3>Live effects</h3>
<p>
Low latency is important when using the computer as an effect rack for
inline effects such as compression or EQ. For reverbs, slightly higher
latency might be tolerable, if the direct sound is not routed through the
computer.
</p>

<h3>Live mixing</h3>
<p>
Some sound engineers use a computer for mixing live performances.
Basically that is a combination of the above: monitoring on stage,
effects processing and EQ.
</p>
<p>
In many other cases, such as playback, recording, overdubbing, mixing,
mastering, etc., latency is not important, since it can easily be
compensated for.<br>
To explain that statement: during mixing or mastering you don&#039;t care
whether it takes 10&nbsp;ms or 100&nbsp;ms between the instant you press the play
button and the sound coming from the speaker. The same is true when recording
with a count-in.
</p>

<h2>Latency compensation</h2>
<p>
During tracking it is important that the sound that is currently being
played back is internally aligned with the sound that is being recorded.
</p>
<p>
This is where latency compensation comes into play. There are two ways to
compensate for latency in a DAW: <dfn>read-ahead</dfn> and
<dfn>write-behind</dfn>. With read-ahead, the DAW starts playing a bit early
(relative to the playhead), so that when the sound arrives at the speakers a
short time later, it is exactly aligned with the material that is being
recorded. With write-behind, since playback latency is known, the incoming
audio can instead be delayed by the same amount to line things up again.
</p>
<p>
As you may see, the second approach is prone to various implementation
issues regarding timecode and transport synchronization. Ardour uses read-ahead
to compensate for latency. The time displayed in the Ardour clock corresponds
to the audio signal that you hear on the speakers (and is not where Ardour
reads files from disk).
</p>
<p>
As a side note, this is also one of the reasons why many projects start at
timecode <samp>01:00:00:00</samp>. When compensating for output latency, the
DAW will need to read data from before the start of the session, so that the
audio arrives in time at the output when the timecode hits <samp>01:00:00:00</samp>.
Ardour3 does handle the case of <samp>00:00:00:00</samp> properly, but not all
systems/software/hardware that you may inter-operate with behave the same.
</p>

<h2>Latency Compensation And Clock Sync</h2>
<p>
To achieve sample-accurate timecode synchronization, the latency introduced
by the audio setup needs to be known and compensated for.
</p>
<p>
In order to compensate for latency, JACK or JACK applications need to know
exactly how long a certain signal needs to be read ahead or delayed:
</p>

<img src="/images/jack-latency-excerpt.png" title="Jack Latency Compensation" alt="Jack Latency Compensation" />
<p>
<em>Figure: Jack Latency Compensation.</em>
</p>
<p>
In the figure above, clients A and B need to be able to answer the following
two questions:
</p>
<ul>
<li>
How long has it been since the data read from port Ai or Bi arrived at the
edge of the JACK graph (capture)?
</li>
<li>
How long will it be until the data written to port Ao or Bo arrives at the
edge of the JACK graph (playback)?
</li>
</ul>
<p>
JACK features an <abbr title="Application Programming Interface">API</abbr>
that allows applications to determine the answers to the above questions.
However, JACK can not know about the additional latency that is introduced
by the computer architecture, operating system and soundcard. These values
can be specified with the JACK command line parameters <kbd class="input">-I</kbd>
and <kbd class="input">-O</kbd>; they vary from system
to system but are constant on each. On a general purpose computer system,
the only way to accurately learn about the total (additional) latency is to
measure it.
</p>
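<p>
A sketch of what this looks like on the command line, assuming the extra
hardware latency has already been measured as 2048 frames in each direction
(both the device name <samp>hw:0</samp> and the frame counts are placeholders):
</p>
<pre>
jackd -d alsa -d hw:0 -r 48000 -p 128 -n 2 -I 2048 -O 2048
</pre>
<p>
The per-port latencies that result can be inspected with
<kbd class="input">jack_lsp -l</kbd>.
</p>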

<h2>Calibrating JACK Latency</h2>
<p>
Linux DSP guru Fons Adriaensen wrote a tool called <dfn>jack_delay</dfn>
to accurately measure the round-trip latency of a closed loop audio chain,
with sub-sample accuracy. JACK itself includes a variant of this tool
called <dfn>jack_iodelay</dfn>.
</p>
<p>
jack_iodelay allows you to measure the total latency of the system,
subtract the known latency of JACK itself, and get suggested values for
jackd's audio-backend parameters.
</p>
<p>
jack_[io]delay works by emitting some rather annoying tones, capturing
them again after a round trip through the whole chain, and measuring the
difference in phase so it can estimate with great accuracy the time taken.
</p>
<p>
You can close the loop in a number of ways:
</p>
<ul>
<li>
Putting a speaker close to a microphone. This is rarely done, as air
propagation latency is well known so there is no need to measure it.
</li>
<li>
Connecting the output of your audio interface to its input using a
patch cable. This can be an analog or a digital loop, depending on
the nature of the input/output you use. A digital loop will not factor
in the <abbr title="Analog to Digital, Digital to Analog">AD/DA</abbr>
converter latency.
</li>
</ul>
<p>
Once you have closed the loop you have to:
</p>
<ol>
<li>Launch jackd with the configuration you want to test.</li>
<li>Launch <kbd class="input">jack_iodelay</kbd> on the command line.</li>
<li>Make the appropriate connections between your JACK ports so the loop is closed (see the sketch below).</li>
<li>Adjust the playback and capture levels in your mixer.</li>
</ol>
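<p>
A minimal sketch of such a session, using an analog loopback cable on the first
channel pair. The ALSA device name and the <samp>jack_delay</samp> port names
are assumptions; check the actual names on your system with
<kbd class="input">jack_lsp</kbd>:
</p>
<pre>
# start JACK with the settings you want to verify
jackd -d alsa -d hw:0 -r 48000 -p 128 -n 2 &amp;

# start the measurement tool
jack_iodelay &amp;

# close the loop: tool output -> soundcard output -> cable -> soundcard input -> tool input
jack_connect jack_delay:out system:playback_1
jack_connect system:capture_1 jack_delay:in
</pre>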