<p>
Since sound is a mechanical perturbation in a fluid, it travels at
a comparatively slow <a href="http://en.wikipedia.org/wiki/Speed_of_sound">speed</a>
of about 340 m/s. As a consequence, an acoustic guitar or piano has a
latency of about 1&ndash;2&nbsp;ms, due to the propagation time of the sound
between the instrument and the player's ear.
</p>
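<p>
Put differently, sound covers roughly 34&nbsp;cm per millisecond, so the figure
follows directly from the distance between the instrument and the ear. The short
sketch below only illustrates that arithmetic; the 0.5&nbsp;m listening distance is
an assumed example value:
</p>
<pre><code>#include &lt;stdio.h&gt;

/* Acoustic propagation delay: delay = distance / speed_of_sound. */
int main(void)
{
    const double speed_of_sound = 340.0; /* m/s, as above                          */
    const double distance       = 0.5;   /* assumed metres from instrument to ear  */
    printf("propagation delay: %.2f ms\n",
           distance / speed_of_sound * 1000.0); /* about 1.5 ms */
    return 0;
}
</code></pre>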
<h3>Digital-to-Analog and Analog-to-Digital conversion</h3>
<p>
Electric signals travel quite fast (on the order of the speed of light),
so their contribution to the total latency is negligible. But the conversions
between the analog and digital domains take a comparatively long time to perform,
so their contribution to the total latency may be considerable on
otherwise very low-latency systems. Conversion delay is usually below 1&nbsp;ms.
</p>
<h3>Digital Signal Processing</h3>
<p>
Digital processors tend to process audio in chunks, and the size of that chunk
depends on the needs of the algorithm and performance/cost considerations.
This is usually the main cause of latency when using a computer and the one that
can be predicted and optimized.
</p>
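<p>
The latency added by one processing chunk is simply its length in frames divided
by the sample rate. The sketch below shows that calculation; the 256-frame buffer
and 48&nbsp;kHz rate are assumed example values, not recommendations:
</p>
<pre><code>#include &lt;stdio.h&gt;

/* One processing period adds buffer_frames / sample_rate seconds of latency. */
int main(void)
{
    const unsigned buffer_frames = 256;   /* assumed period size          */
    const unsigned sample_rate   = 48000; /* assumed sample rate in Hz    */
    printf("one period of %u frames at %u Hz = %.2f ms\n",
           buffer_frames, sample_rate,
           1000.0 * buffer_frames / sample_rate); /* about 5.33 ms */
    return 0;
}
</code></pre>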
<h3>Computer I/O Architecture</h3>
<p>
A computer is a general purpose processor, not a digital audio processor.
This means the audio data has to jump a lot of fences in its path from the
outside to the CPU and back, contending in the process with some other parts
of the system vying for the same resources (CPU time, bus bandwidth, etc.).
</p>
<h2>The Latency chain</h2>
<figure>
<img src="/images/latency-chain.png" alt="Latency chain">
<figcaption>
Latency chain
</figcaption>
</figure>
<img src="/images/latency-chain.png" title="Latency chain" alt="Latency chain" />
<p>
The numbers are an example for a typical PC. With professional gear and an
optimized system the total round-trip latency is usually lower. The important
point is that latency is always additive and a sum of many independent factors.
</p>
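<p>
Because the factors are additive, a round-trip estimate is just the sum of the
individual stages. The numbers in the sketch below are assumed examples in the
spirit of the figure, not measurements of any particular system:
</p>
<pre><code>#include &lt;stdio.h&gt;

/* Round-trip latency is the sum of independent stages (assumed example values). */
int main(void)
{
    const double stages_ms[] = { 1.0,   /* AD conversion     */
                                 5.3,   /* capture buffer    */
                                 5.3,   /* processing period */
                                 5.3,   /* playback buffer   */
                                 1.0 }; /* DA conversion     */
    double total = 0.0;
    for (unsigned i = 0; i &lt; sizeof stages_ms / sizeof stages_ms[0]; i++)
        total += stages_ms[i];
    printf("estimated round trip: %.1f ms\n", total); /* 17.9 ms */
    return 0;
}
</code></pre>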
<p>
Processing latency is usually divided into <dfn>capture latency</dfn> (the time
it takes for the digitized audio to be available for digital processing, usually
[&hellip;]
milliseconds.
</p>
<h2>Low Latency use cases</h2>
<p>
Low latency is <strong>not</strong> always a feature one wants to have. It
comes with a couple of drawbacks: the most prominent is increased power
consumption, because the CPU needs to process many small chunks of audio data;
it is constantly active and cannot enter power-saving mode (think fan noise).
[&hellip;]
<p>
For a few applications, low latency is critical:
</p>
<h3>Playing virtual instruments</h3>
<p>
A large delay between the pressing of the keys and the sound the instrument
produces will throw off the timing of most instrumentalists (save church
organists, whom we believe to be awesome latency-compensation organic systems).
</p>
<h3>Software audio monitoring</h3>
<p>
If a singer hears her own voice through two different paths (through the head
bones and through headphones), even small latencies can be very disturbing and
manifest as a tinny, irritating sound.
</p>
<h3>Live effects</h3>
<p>
Low latency is important when using the computer as an effect rack for
inline effects such as compression or EQ. For reverbs, a slightly higher
latency might be tolerable, if the direct sound is not routed through the
computer.
</p>
<h3>Live mixing</h3>
<p>
Some sound engineers use a computer for mixing live performances.
[&hellip;]
<p>
In many other cases, such as playback, recording, overdubbing, mixing,
mastering, etc., latency is not important, since it can easily be
compensated for.
</p>
<p>
To explain that statement: during mixing or mastering, one doesn&#039;t care
whether it takes 10&nbsp;ms or 100&nbsp;ms between the instant the play button is pressed
and the sound coming from the speaker. The same is true when recording with a count-in.
</p>
<h2>Latency compensation</h2>
[&hellip;]
by the same amount to line things up again.
</p>
<p>
The second approach is prone to various implementation
issues regarding timecode and transport synchronization. Ardour uses read-ahead
to compensate for latency. The time displayed in the Ardour clock corresponds
to the audio signal that is heard on the speakers (and is not where Ardour
reads files from disk).
</p>
<p>
As an example, consider a session that starts at
timecode <samp>01:00:00:00</samp>. When compensating for output latency the
DAW will need to read data from before the start of the session, so that the
audio arrives in time at the output when the timecode hits <samp>01:00:00:00</samp>.
Ardour does handle the case of <samp>00:00:00:00</samp> properly, but not all
systems, software or hardware that one may inter-operate with behave the same way.
</p>
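<p>
The amount of read-ahead is simply the output latency converted to frames (or
timecode). The values below, 12&nbsp;ms of playback latency at a 48&nbsp;kHz sample
rate, are assumed for illustration only:
</p>
<pre><code>#include &lt;stdio.h&gt;

/* How far before 01:00:00:00 must the DAW start reading? (assumed example values) */
int main(void)
{
    const double   playback_latency_ms = 12.0;  /* assumed output latency     */
    const unsigned sample_rate         = 48000; /* assumed sample rate in Hz  */
    unsigned frames = (unsigned)(playback_latency_ms * sample_rate / 1000.0);
    printf("read-ahead: %u frames (%.1f ms) before the session start\n",
           frames, playback_latency_ms); /* 576 frames */
    return 0;
}
</code></pre>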
[&hellip;]
In order to compensate for latency, JACK or JACK applications need to know
exactly how long a certain signal needs to be read ahead or delayed:
</p>
<img src="/images/jack-latency-excerpt.png" title="Jack Latency Compensation" alt="Jack Latency Compensation" />
<p>
<em>Figure: Jack Latency Compensation.</em>
</p>
<figure>
<img src="/images/jack-latency-excerpt.png" alt="Jack Latency Compensation">
<figcaption>
Jack Latency Compensation
</figcaption>
</figure>
<p>
In the figure above, clients A and B need to be able to answer the following
two questions:
[&hellip;]
measure it.
</p>
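<p>
JACK exposes this information through its latency API: a client can query the
latency range reported on any of its ports, and reports its own contribution from
a latency callback. The following is a minimal sketch only (client and port names
are arbitrary, error handling and the latency callback itself are omitted), not a
complete latency-aware client:
</p>
<pre><code>#include &lt;jack/jack.h&gt;
#include &lt;stdio.h&gt;

/* Minimal sketch: ask JACK how much capture latency is attributed to one input port. */
int main(void)
{
    jack_client_t *client = jack_client_open("latency-query", JackNullOption, NULL);
    jack_port_t *in = jack_port_register(client, "in", JACK_DEFAULT_AUDIO_TYPE,
                                         JackPortIsInput, 0);
    jack_activate(client);

    /* "How long ago did the data arriving at this port enter the JACK graph?" */
    jack_latency_range_t range;
    jack_port_get_latency_range(in, JackCaptureLatency, &amp;range);
    printf("capture latency: %u .. %u frames\n", range.min, range.max);

    jack_client_close(client);
    return 0;
}
</code></pre>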
<h2>Calibrating JACK Latency</h2>
<p>
Linux DSP guru Fons Adriaensen wrote a tool called <dfn>jack_delay</dfn>
to accurately measure the round-trip latency of a closed loop audio chain,
with sub-sample accuracy. JACK itself includes a variant of this tool
called <dfn>jack_iodelay</dfn>.
</p>
<p>
Jack_iodelay measures the total latency of the system,
subtracts the known latency of JACK itself, and suggests values for
jackd's audio-backend parameters.
</p>
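<p>
In practice this boils down to simple arithmetic: whatever part of the measured
round trip jackd cannot account for is attributed to the converters and cabling,
and is typically split evenly between the backend's extra input and output latency
parameters (<kbd class="input">-I</kbd> and <kbd class="input">-O</kbd>). The
sketch below shows that arithmetic with assumed numbers; it is an illustration,
not jack_iodelay's actual output:
</p>
<pre><code>#include &lt;stdio.h&gt;

/* Split the unaccounted-for round-trip latency between -I and -O (assumed example values). */
int main(void)
{
    const unsigned measured_roundtrip = 1322; /* frames measured through the loop (assumed)  */
    const unsigned known_to_jack      = 1024; /* frames jackd already accounts for (assumed) */
    unsigned extra = measured_roundtrip - known_to_jack;
    printf("extra loop latency: %u frames; try -I %u -O %u\n",
           extra, extra / 2, extra / 2);
    return 0;
}
</code></pre>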
<p>
Jack_iodelay works by emitting some rather annoying tones, capturing them
again after a round trip through the whole chain, and measuring the
difference in phase so it can estimate with great accuracy the time taken.
</p>
<p>
The loop can be closed in a number of ways:
</p>
<ul>
<li>
Putting a speaker close to a microphone. This is rarely done, as air
propagation latency is well known so there is no need to measure it.
</li>
<li>
Connecting the output of the audio interface to its input using a
patch cable. This can be an analog or a digital loop, depending on
the nature of the input/output used. A digital loop will not factor
in the <abbr title="Analog to Digital, Digital to Analog">AD/DA</abbr>
converter latency.
</li>
</ul>
<p>
Once the loop has been closed, one must:
</p>
<ol>
<li>Launch jackd with the configuration to test.</li>
<li>Launch <kbd class="input">jack_delay</kbd> on the command line.</li>
<li>Make the appropriate connections between the jack ports so the loop is closed.</li>
<li>Adjust the playback and capture levels in the mixer.</li>
</ol>