copy-edit chapter 19

This commit is contained in:
Jörn Nettingsmeier 2014-02-18 23:13:02 +01:00
parent 2127ca7c27
commit de15abf3ee
5 changed files with 494 additions and 262 deletions


@ -3,9 +3,6 @@ layout: default
title: Synchronization
---
<p>Ardour can be synchronized with a variety of external devices and other software.</p>
{% children %}


@ -4,42 +4,70 @@ title: On Clock and Time
---
<p>
<dfn>Synchronization</dfn> in multimedia involves two concepts which are
often confused: <dfn>clock</dfn> (or speed) and <dfn>time</dfn> (location
in time).
</p>
<p>
A <dfn>clock</dfn> determines the speed at which one or more systems
operate. In the audio world this is generally referred to as
<a href="http://en.wikipedia.org/wiki/Word_clock" title="http://en.wikipedia.org/wiki/Word_clock">Word Clock</a>.
It does not carry any absolute reference to a point in time: A clock is
used to keep a system's sample rate regular and accurate.
Word clock is usually at the frequency of the sample rate &mdash;
at 48&nbsp;kHz, its period is about 20&nbsp;μs. Word Clock is the most
common sample rate based clock but other clocks do exist such as Black and
Burst, Tri-Level and DARS. Sample rates can be derived from these clocks as well.
</p>
<p>
Time or <dfn>timecode</dfn> specifies an absolute position on a timeline,
such as <code>01:02:03:04</code> (expressed as Hours:Mins:Secs:Frames). It is
actual <em>data</em> and not a clock <em>signal</em> per se.
The granularity of timecode is <dfn>video frames</dfn>, which is orders of
magnitude coarser than, say, Word Clock, which is counted in
<dfn>samples</dfn>. A typical frame rate is 25&nbsp;<abbr title="frames
per second">fps</abbr> with a period of 40&nbsp;ms.
In the case of 48&nbsp;kHz and 25&nbsp;fps, there are 1920 audio samples
per video frame.
</p>
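<p>
The relationship between sample rate and frame rate can be checked with a
little arithmetic (a quick sketch; the numbers, not any particular API, are
the point):
</p>

```python
# Samples per video frame = sample rate / frame rate.
def samples_per_frame(sample_rate_hz, fps):
    return sample_rate_hz / fps

print(samples_per_frame(48000, 25))   # 1920.0 -> matches the text above
print(samples_per_frame(48000, 30))   # 1600.0
print(samples_per_frame(44100, 25))   # 1764.0
```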
<p>
The concepts of clock and timecode are reflected in JACK and Ardour:
</p>
<p>
JACK provides clock synchronization and is not concerned with time code
(this is not entirely true, more on jack-transport later).
On the software side, jackd provides sample-accurate synchronization
between all JACK applications.
On the hardware side, JACK uses the clock of the audio-interface.
Synchronization of multiple interfaces requires hardware support to sync
the clocks.
If two interfaces run at different clocks, the only way to align the
signals is via re-sampling (SRC, Sample Rate Conversion), which is
expensive in terms of CPU usage and may decrease fidelity if done
incorrectly.
</p>
<p>
Timecode is used to align systems already synchronized by a clock to
a common point in time. This is application-specific, and various
standards and methods exist to do it.
</p>
<p class="note">
To make things confusing, it is possible to synchronize clocks using
timecode, e.g. using a mechanism called <dfn>jam-sync</dfn> combined with a
<dfn>phase-locked loop</dfn>.
</p>
<p>
An interesting point to note is that LTC (Linear Time Code) is a
Manchester encoded, frequency modulated signal that carries both
clock and time. It is possible to extract absolute position data
and speed from it.
</p>


@ -1,152 +1,248 @@
---
layout: default
title: Latency and Latency-Compensation
menu_title: Latency
---
<h2>Latency</h2>
<p>
<a
href="http://en.wikipedia.org/wiki/Latency_%28audio%29"><dfn>Latency</dfn></a>
is a system's reaction time to a given stimulus. There are many factors that
contribute to the total latency of a system. In order to achieve exact time
synchronization all sources of latency need to be taken into account and
compensated for.
</p>
<h2>Sources of Latency</h2>
<h3>Sound propagation through the air</h3>
<p>
Since sound is a mechanical perturbation in a fluid, it travels at
comparatively slow <a href="http://en.wikipedia.org/wiki/Speed_of_sound">speed</a>
of about 340 m/s. As a consequence, your acoustic guitar or piano has a
latency of about 1&ndash;2 ms, due to the propagation time of the sound
between your instrument and your ear.
</p>
<h3>Digital-to-Analog and Analog-to-Digital conversion</h3>
<p>
Electric signals travel quite fast (on the order of the speed of light),
so their propagation time is negligible in this context. But the conversions
between the analog and digital domain take a comparatively long time to perform,
so their contribution to the total latency may be considerable on
otherwise very low-latency systems. Conversion delay is usually below 1&nbsp;ms.
</p>
<h3>Digital Signal Processing</h3>
<p>
Digital processors tend to process audio in chunks, and the size of that chunk
depends on the needs of the algorithm and performance/cost considerations.
This is usually the main cause of latency when you use a computer and one you
can try to predict and optimize.
</p>
<h3>Computer I/O Architecture</h3>
<p>
A computer is a general purpose processor, not a digital audio processor.
This means our audio data has to jump a lot of fences in its path from the
outside to the CPU and back, contending in the process with some other parts
of the system vying for the same resources (CPU time, bus bandwidth, etc.)
</p>
<h2>The Latency chain</h2>
<img src="/ardour/manual/html/diagrams/latency-chain.png" title="Latency chain" alt="Latency chain" />
<p>
<em>Figure: Latency chain.</em>
The numbers are an example for a typical PC. With professional gear and an
optimized system the total roundtrip latency is usually lower. The important
point is that latency is always additive and a sum of many independent factors.
</p>
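<p>
Because latency is additive, the total can be estimated by simply summing
the stages. The following sketch uses illustrative values, not measurements:
</p>

```python
# Illustrative latency budget: one period of capture buffering, two periods
# of processing/playback buffering, plus converter delays, at 256 frames
# per period and 48 kHz. The converter figures are example values.
period_ms = 256 / 48000 * 1000          # ~5.33 ms per period
stages_ms = {
    "A/D conversion": 0.5,
    "capture buffer (1 period)": period_ms,
    "processing/playback buffers (2 periods)": 2 * period_ms,
    "D/A conversion": 0.5,
}
total_ms = sum(stages_ms.values())
print(round(total_ms, 1))               # 17.0
```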
<p>
Processing latency is usually divided into <dfn>capture latency</dfn> (the time
it takes for the digitized audio to be available for digital processing, usually
one audio period) and <dfn>playback latency</dfn> (the time it takes for the
processed audio to be delivered out of the processing chain, at best also one
audio period).
In practice, the combination of both matters. It is called <dfn>roundtrip
latency</dfn>: the time necessary for a certain audio event to be captured,
processed and played back.
</p>
<p class="note">
It is important to note that processing latency in jackd is a matter of
choice. It can be lowered within the limits imposed by the hardware (audio
device, CPU and bus speed) and audio driver. Lower latencies increase the
load on the system because it needs to process the audio in smaller chunks
which arrive much more frequently. The lower the latency, the more likely
the system will fail to meet its processing deadline and the dreaded
<dfn>xrun</dfn> (short for buffer over- or under-run) will make its
appearance more often, leaving its merry trail of clicks, pops and crackles.
</p>
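<p>
The trade-off can be made concrete: one period's worth of latency, in
milliseconds, is simply the buffer size divided by the sample rate.
</p>

```python
# Latency of one audio period for common jackd buffer sizes at 48 kHz.
rate = 48000
for frames in (64, 128, 256, 1024):
    print(frames, "frames ->", round(frames / rate * 1000, 2), "ms")
```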
<p>
The digital I/O latency is usually negligible for integrated or
<abbr title="Peripheral Component Interconnect">PCI</abbr> audio devices, but
for USB or FireWire interfaces the bus clocking and buffering can add some
milliseconds.
</p>
<h2>Low Latency Use Cases</h2>
<p>
Low latency is <strong>not</strong> always a feature you want to have. It
comes with a couple of drawbacks: the most prominent is increased power
consumption, because the CPU needs to process many small chunks of audio data;
it is constantly active and cannot enter power-saving mode (think fan noise).
Since each application that is part of the signal chain must run in every
audio cycle, low-latency systems will undergo <dfn>context switches</dfn>
between applications more often, which incur a significant overhead.
This results in a much higher system load and an increased chance of xruns.
</p>
<p>
Reliable low latency (&le;&nbsp;10&nbsp;ms) on GNU/Linux can usually only be
achieved by running a
<a href="https://rt.wiki.kernel.org/">realtime kernel</a>.
</p>
<p>
For a few applications, low latency is critical:
</p>
<h3>Playing virtual instruments</h3>
<p>
A large delay between the pressing of the keys and the sound the instrument
produces will throw-off the timing of most instrumentalists (save church
organists, whom we believe to be awesome latency-compensation organic systems.)
</p>
<h3>Software audio monitoring</h3>
<p>
If a singer is hearing her own voice through two different paths, her head
bones and headphones, even small latencies can be very disturbing and
manifest as a tinny, irritating sound.
</p>
<h3>Live effects</h3>
<p>
Low latency is important when using the computer as an effect rack for
inline effects such as compression or EQ. For reverbs, slightly higher
latency might be tolerable, if the direct sound is not routed through the
computer.
</p>
<h3>Live mixing</h3>
<p>
Some sound engineers use a computer for mixing live performances.
Basically that is a combination of the above: monitoring on stage,
effects processing and EQ. Here, not only low latency matters, but also
minimal jitter, so that delay lines between front and rear speakers
stay exact.
</p>
<p>
In many other cases, such as playback, recording, overdubbing, mixing,
mastering, etc. latency is not important, since it can easily be
compensated for.<br />
To explain that statement: during mixing or mastering you don&#039;t care
if it takes 10&nbsp;ms or 100&nbsp;ms between the instant you press the play
button and sound coming from the speaker. The same is true when recording with
a count-in.
</p>
<h2>Latency compensation</h2>
<p>
During tracking it is important that the sound that is currently being
played back is internally aligned with the sound that is being recorded.
</p>
<p>
This is where latency compensation comes into play. There are two ways to
compensate for latency in a DAW, <dfn>read-ahead</dfn> and
<dfn>write-behind</dfn>. With read-ahead, the DAW starts playing a bit early
(relative to the playhead), so that when the sound arrives at the speakers a
short time later, it is exactly aligned with the material that is being
recorded. Write-behind takes the opposite approach: since playback is known
to be delayed, the incoming audio can be delayed by the same amount to line
things up again.
</p>
<p>
As you may see, the second approach is prone to various implementation
issues regarding timecode and transport synchronization. Ardour uses read-ahead
to compensate for latency. The time displayed in the Ardour clock corresponds
to the audio-signal that you hear on the speakers (and is not where Ardour
reads files from disk).
</p>
<p>
As a side note, this is also one of the reasons why many projects start at
timecode <samp>01:00:00:00</samp>. When compensating for output latency the
DAW will need to read data from before the start of the session, so that the
audio arrives in time at the output when the timecode hits <samp>01:00:00:00</samp>.
Ardour3 does handle the case of <samp>00:00:00:00</samp> properly but not all
systems/software/hardware that you may inter-operate with may behave the same.
</p>
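<p>
To see why reading "before" <samp>01:00:00:00</samp> matters, consider how a
timecode position maps to an absolute sample number (a sketch assuming an
integer frame rate; the function name is illustrative):
</p>

```python
# Convert an H:M:S:F timecode position to an absolute sample position.
def timecode_to_samples(h, m, s, f, fps=25, rate=48000):
    frames = ((h * 60 + m) * 60 + s) * fps + f
    return frames * rate // fps

print(timecode_to_samples(1, 0, 0, 0))   # 172800000 samples at 01:00:00:00
```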
<h2>Latency Compensation And Clock Sync</h2>
<p>
To achieve sample accurate timecode synchronization, the latency introduced
by the audio setup needs to be known and compensated for.
</p>
<p>
In order to compensate for latency, JACK or JACK applications need to know
exactly how long a certain signal needs to be read-ahead or delayed:
</p>
<img src="/ardour/manual/html/diagrams/jack-latency-excerpt.png" title="Jack Latency Compensation" alt="Jack Latency Compensation" />
<p>
<em>Figure: JACK Latency Compensation.</em> Excerpt from
<a href="http://jackaudio.org/files/jack-latency.png">jack-latency.png</a>.
</p>
<p>
In the figure above, clients A and B need to be able to answer the following
two questions:
</p>
<ul>
<li>
How long has it been since the data read from port Ai or Bi arrived at the
edge of the JACK graph (capture)?
</li>
<li>
How long will it be until the data written to port Ao or Bo arrives at the
edge of the JACK graph (playback)?
</li>
</ul>
<p>
JACK features an <abbr title="Application Programming Interface">API</abbr>
that allows applications to determine the answers to above questions.
However JACK can not know about the additional latency that is introduced
by the computer architecture, operating system and soundcard. These values
can be specified by the JACK command line parameters <kbd class="input">-I</kbd>
and <kbd class="input">-O</kbd> and vary from system
to system but are constant on each. On a general purpose computer system
the only way to accurately learn about the total (additional) latency is to
measure it.
</p>
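<p>
Conceptually, the calibration subtracts what JACK already knows from what is
actually measured. The function below is a sketch, not jack_iodelay's actual
code; splitting the remainder evenly between input and output is an
assumption, since the real split depends on the converters.
</p>

```python
# Estimate extra hardware latency (in frames) to pass to jackd as -I/-O,
# given a measured total roundtrip and the buffering JACK already accounts for.
def suggest_io(measured_roundtrip, jack_known_latency):
    extra = measured_roundtrip - jack_known_latency
    i = extra // 2
    return i, extra - i      # suggested (-I, -O)

print(suggest_io(2238, 2048))  # (95, 95)
```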
<h2>Calibrating JACK Latency</h2>
<p>
Linux DSP guru Fons Adriaensen wrote a tool called <dfn>jack_delay</dfn>
to accurately measure the roundtrip latency of a closed loop audio chain,
with sub-sample accuracy. JACK itself includes a variant of this tool
called <dfn>jack_iodelay</dfn>.
</p>
<p>
Jack_iodelay allows you to measure the total latency of the system,
subtracts the known latency of JACK itself and suggests values for
jackd's audio-backend parameters.
</p>
<p>
jack_[io]delay works by emitting some rather annoying tones, capturing
them again after a round trip through the whole chain, and measuring the
difference in phase so it can estimate with great accuracy the time taken.
</p>
<p>
You can close the loop in a number of ways:
</p>
<ul>
<li>
Putting a speaker close to a microphone. This is rarely done, as air
propagation latency is well known so there is no need to measure it.
</li>
<li>
Connecting the output of your audio interface to its input using a
patch cable. This can be an analog or a digital loop, depending on
the nature of the input/output you use. A digital loop will not factor
in the <abbr title="Analog to Digital, Digital to Analog">AD/DA</abbr>
converter latency.
</li>
</ul>
<p>
Once you have closed the loop you have to:
</p>
<ol>
<li>Launch jackd with the configuration you want to test.</li>
<li>Launch <kbd class="input">jack_delay</kbd> on the commandline.</li>
<li>Make the appropriate connections between your jack ports so the loop is closed.</li>
<li>Adjust the playback and capture levels in your mixer.</li>
</ol>


@ -3,108 +3,131 @@ layout: default
title: Timecode Generators and Slaves
---
<h2>Ardour Timecode Generators and Slaves</h2>
<p>
Ardour supports three common timecode formats:
<abbr title="Linear/Longitudinal Time Code"><dfn>LTC</dfn></abbr>,
<abbr title="MIDI Time Code"><dfn>MTC</dfn></abbr>, and
<dfn>MIDI Clock</dfn>, as well as
<dfn>JACK-transport</dfn>, a JACK-specific timecode implementation.
</p>
<p>
Ardour can generate timecode and thus act as timecode <dfn>master</dfn>,
providing timecode information to other applications. Ardour can also be
<dfn>slaved</dfn> to some external source in which case the playhead
follows the incoming timecode.<br />
Combining the timecode slave and generator modes, Ardour can also
<dfn>translate</dfn> timecode. e.g create LTC timecode from incoming MTC.
</p>
<h2>Ardour Timecode Configuration</h2>
<p>
Each Ardour session has a specific timecode frames-per-second setting which
is configured in <kbd class="menu">Session &gt; Properties &gt;
Timecode</kbd>. The selected timecode affects the timecode ruler in the main
window as well as the clock itself.
</p>
<p>
Note that some timecode formats do not support all of Ardour's available
fps settings. MTC is limited to 24, 25, 29.97 and 30 fps.
</p>
<p>
The video pull-up modes change the effective samplerate of Ardour to allow
for changing a film soundtrack from one frame rate to another. The concept is
beyond the scope of this manual, but Wikipedia's entry on
<a href="http://en.wikipedia.org/wiki/Telecine">Telecine</a>
may get you started.
</p>
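<p>
As a sketch of what a pull-up or pull-down does to the effective sample rate:
the factors below are the standard film/video transfer ratios, and the
listing is illustrative rather than Ardour's exact option set.
</p>

```python
# Effective sample rate of a 48 kHz session under common pull-up/down factors.
factors = {
    "+4.1667% (24 -> 25 fps)": 25 / 24,
    "-0.1%   (30 -> 29.97 fps)": 1000 / 1001,
    "+0.1%   (29.97 -> 30 fps)": 1001 / 1000,
}
for name, f in factors.items():
    print(name, "->", round(48000 * f, 1), "Hz")
```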
<h2>Ardour Timecode Generator Configuration</h2>
<p>
This is pretty straightforward: simply turn it on. The MTC and MIDI-Clock
generator do not have any options. The LTC generator has a configurable
output level. JACK-transport cannot be <em>generated</em>. JACK itself is
always synced to its own cycle and cannot do varispeed &mdash; it will
always be synced to a hardware clock or another JACK master.
</p>
<p>
The relevant settings for the timecode generators can be found in
<kbd class="menu">Edit &gt; Preferences &gt; MIDI Preferences</kbd> (for MTC,
MC) and
<kbd class="menu">Edit &gt; Preferences &gt; Transport Preferences</kbd>
(for LTC).
</p>
<p>
The timecode is sent to the JACK ports <code>ardour:MTC out</code>,
<code>ardour:MIDI clock out</code> and <code>ardour:LTC-out</code>. Multiple
generators can be active simultaneously.
</p>
<p class="note">
Note that, as of Jan 2014, only the LTC generator supports latency
compensation, because Ardour's MIDI ports are not yet latency
compensated.
</p>
<p>
In <kbd class="menu">Session &gt; Properties</kbd>, it is possible to
define an offset between Ardour's internal time and the timecode sent.
Currently only the LTC generator honors this offset.
</p>
<p>
Both LTC and MTC are limited to 30&nbsp;fps. Using frame rates larger
than that will disable the generator. In both cases, only 24, 25,
29.97df (drop-frame) and 30&nbsp;fps are well defined by specifications
(such as SMPTE-12M, EU and the MIDI standard).
</p>
<h3>MTC Generator</h3>
<p>
The <dfn>MTC generator</dfn> has no options. Ardour sends full MTC
frames whenever the transport is relocated or changes state (start/stop).
MTC <dfn>quarter frames</dfn> are sent when the transport is rolling and
the transport speed is within 93% and 107%.
</p>
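<p>
As a rough illustration (not Ardour code), the 8-message quarter-frame cycle
explains why a chasing application needs about two frames of signal before it
can lock:
</p>

```python
# Sketch: MTC timing arithmetic. A full timecode value is spread over 8
# quarter-frame messages, i.e. two frames' worth of time, so a chasing
# application can only lock after receiving a complete 8-message cycle.
def mtc_lock_time_ms(fps):
    """Minimum time needed to receive one full set of quarter frames."""
    return 2.0 / fps * 1000.0

print(round(mtc_lock_time_ms(25), 1))   # 80.0 ms at 25 fps
print(round(mtc_lock_time_ms(30), 1))   # 66.7 ms at 30 fps
```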
<h3>LTC Generator</h3>
<p>
The level of the <dfn>LTC generator</dfn> output signal can be configured
in the <kbd class="menu">Preferences &gt; Transport</kbd> dialog. By
default it is set to -18&nbsp;dBFS, which corresponds to 0&nbsp;dBu in an
EBU calibrated system.
</p>
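<p>
The relation between the two level scales is plain arithmetic, assuming the
EBU-style calibration mentioned above (0&nbsp;dBu = -18&nbsp;dBFS); the
constant below is an assumption of that calibration, not something Ardour
exposes:
</p>

```python
# Sketch: converting the LTC generator level between dBFS and dBu under
# an EBU calibration where 0 dBu corresponds to -18 dBFS (an assumption
# of this example, not a universal rule).
EBU_OFFSET_DB = 18.0

def dbfs_to_dbu(dbfs):
    return dbfs + EBU_OFFSET_DB

print(dbfs_to_dbu(-18.0))   # 0.0 dBu, the default generator level
```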
<p>
The LTC generator has an additional option to keep sending timecode even
when the transport is stopped. This mode is intended to drive analog tape
machines which unspool the tape if no LTC timecode is received.
</p>
<p>
LTC is sent regardless of Ardour's transport speed. It is accurately
generated even for very slow speeds (&lt;5%) and is only limited by the
soundcard's sample rate and filter (see
<a
href="http://en.wikipedia.org/wiki/Gibbs_phenomenon#Signal_processing_explanation">Gibbs phenomenon</a>)
at high speeds.
</p>
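<p>
To get a feel for the sample-rate limit, one can compute how many audio
samples are available per LTC bit (an LTC frame carries 80 bits per the
SMPTE 12M specification). This is a back-of-the-envelope sketch, not
Ardour's actual generator code:
</p>

```python
# Sketch: audio samples available per LTC bit. An LTC frame is 80 bits
# (biphase-mark encoded), so the bit length in samples depends on the
# sample rate, the frame rate and the transport speed.
def samples_per_ltc_bit(sample_rate, fps, speed=1.0, bits_per_frame=80):
    return sample_rate / (fps * speed) / bits_per_frame

print(samples_per_ltc_bit(48000, 25))        # 24.0 samples per bit at 1x
print(samples_per_ltc_bit(48000, 25, 10.0))  # 2.4 at 10x -- filter-limited
```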
<h2>Ardour Slave Configuration</h2>
<p>
The timecode source can be switched with the button just right of
Ardour's main clock. By default it is set to <kbd
class="menu">Internal</kbd>, in which case Ardour will ignore any external
timecode. The button toggles between Internal and the configured
timecode source, which is chosen in <kbd class="menu">Edit &gt; Preferences
&gt; Transport</kbd>.
</p>
<p>
When Ardour is <dfn>chasing</dfn> (synchronizing to) an external timecode
source, the following cases need to be distinguished:
</p>
<ol>
<li>the timecode source shares the clock</li>
<li>the timecode source is independent (not sample-locked to the audio clock)</li>
<li>the timecode source uses the same FPS setting as Ardour</li>
<li>the timecode source runs at different frames-per-second</li>
</ol>
<p>
In both cases the first option is preferred: clock sync + same FPS setting.
</p>
<h3>Frames-per-second</h3>
<p>
If the frames-per-second do not match, Ardour can either re-calculate
and map the frames, or the configured FPS (<kbd class="menu">Session &gt;
Properties</kbd>) can be changed automatically while the slave is active.
The behavior is configured with the checkbox <kbd class="option">Edit
&gt; Preferences &gt; Transport &gt; Match session video frame rate to
external timecode</kbd>.
</p>
<p>
When enabled, the session video frame rate will be changed to match that
of the selected external timecode source. When disabled, the session video
frame rate will not be changed to match that of the selected external
timecode source. Instead the frame rate indication in the main clock will
flash red, and Ardour will convert between the external timecode standard
and the session standard.
</p>
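<p>
The conversion performed in the latter case can be sketched as going through
absolute time and re-quantizing to the session rate; the helper names below
are made up for this illustration:
</p>

```python
# Sketch: mapping a timecode between frame-rate standards by way of
# absolute time (integer frame rates only, for simplicity).
def tc_to_frames(h, m, s, f, fps):
    return ((h * 60 + m) * 60 + s) * fps + f

def frames_to_tc(n, fps):
    s, f = divmod(n, fps)
    m, s = divmod(s, 60)
    h, m = divmod(m, 60)
    return h, m, s, f

# 00:10:00:00 received at 24 fps lands on the same wall-clock second
# in a 25 fps session:
seconds = tc_to_frames(0, 10, 0, 0, 24) / 24
print(frames_to_tc(round(seconds * 25), 25))   # (0, 10, 0, 0)
```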
<p class="warning">
29.97 drop-frame timecode is another corner case. While SMPTE 12M-1999
specifies 29.97df as 30000/1001 frames per second, not all hardware devices
follow that standard. The checkbox
<kbd class="option">Lock to 29.9700 fps instead of 30000/1001</kbd> enables
a compatibility mode for those devices.<br />
When enabled, the external timecode source is assumed to use 29.970000 fps
instead of 30000/1001. SMPTE 12M-1999 specifies 29.97df as 30000/1001. The
<abbr title="specification">spec</abbr> further mentions that drop-frame
timecode has an accumulated error of -86&nbsp;ms over a 24-hour period.
Drop-frame timecode would compensate exactly for an NTSC color frame rate
of 30 * 0.9990 (i.e. 29.970000). That is <em>not</em> the actual rate. However,
some vendors use that rate &mdash; despite it being against the spec
&mdash; because using exactly 29.97 fps yields zero timecode drift.
</p>
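<p>
The -86&nbsp;ms figure can be verified with a few lines of arithmetic using
only the spec values, nothing Ardour-specific:
</p>

```python
# Sketch: why SMPTE drop-frame timecode accumulates roughly 86 ms of
# error per day. Drop-frame counting compensates exactly for 29.97 fps,
# but the true NTSC rate is 30000/1001 fps.
nominal = 30000 / 1001            # true NTSC frame rate
dropframe = 29.97                 # rate the drop pattern compensates for
seconds_per_day = 24 * 60 * 60
frame_error = (nominal - dropframe) * seconds_per_day  # frames per day
drift_ms = frame_error / dropframe * 1000.0            # as milliseconds
print(round(drift_ms, 1))   # 86.4
```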
<h3>Clock Sync Lock</h3>
<p>
As described in the
<a href="http://manual.ardour.org/synchronization/on-clock-and-time/">On Clock and Time</a>
chapter, timecode and clock are independent. If the external timecode
source is not in sample-sync with the audio hardware (and JACK), Ardour
needs to run at varispeed to adjust for the discrepancy.
</p>
<p>
The checkbox <kbd class="option">External timecode is sync locked</kbd>
selects the behavior appropriate to the setup at hand. When enabled, it
indicates that the selected external timecode source shares sync (Black
&amp; Burst, Wordclock, etc.) with the audio interface.
</p>
<p>
In other words: if enabled, Ardour will only perform initial
synchronization and keep playing at speed 1.0 instead of vari-speed
adjusting to compensate for drift.
</p>
<p class="note">
Note that vari-speed is unavailable when recording in Ardour, and all
tracking happens at speed 1.0. So if you want to record in sync with
external timecode it must be sample-locked or it will drift over time.
</p>
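<p>
The magnitude of the varispeed correction involved is typically tiny. As an
illustration with invented numbers, a source whose nominal 48&nbsp;kHz clock
actually runs at 48.005&nbsp;kHz drifts by about 104&nbsp;ppm:
</p>

```python
# Sketch: the varispeed ratio needed to chase a timecode source whose
# clock runs slightly fast relative to the audio interface. The rates
# below are invented for illustration.
source_rate = 48005.0      # what the source's "48 kHz" really is
interface_rate = 48000.0   # the audio interface clock
speed = source_rate / interface_rate
print(round((speed - 1.0) * 1e6))   # 104 ppm of drift to compensate
```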
<h3>MIDI Clock</h3>
<p>
<dfn>MIDI Clock</dfn> is not a timecode format but tempo-based time. The
absolute reference point is expressed as beats-per-minute and Bar, Beat
and Tick. There is no concept of sample-locking for MIDI clock signals.
Ardour will vari-speed if necessary to chase the incoming signal.
</p>
<p>
Note that the MIDI Clock source must be connected to the
<code>ardour:MIDI clock in</code> port.
</p>
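<p>
Since MIDI Clock runs at 24 pulses per quarter note (per the MIDI
specification), the message rate follows the tempo rather than any frame
rate:
</p>

```python
# Sketch: MIDI Clock message rate as a function of tempo. The standard
# resolution is 24 clock messages per quarter note.
def midi_clock_hz(bpm, ppqn=24):
    return bpm / 60.0 * ppqn

print(midi_clock_hz(120))   # 48.0 messages per second at 120 BPM
```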
<h3>LTC - Linear Timecode</h3>
<p>
The <dfn>LTC</dfn> slave decodes an incoming LTC signal on a JACK audio
port. It will auto-detect the frame rate and start locking to the signal
once two consecutive LTC frames have been received.
</p>
<p>
The incoming timecode signal needs to arrive at the
<code>ardour:LTC-in</code> port. Port-connections are restored for each
session and the preference dialog offers an option to select it for all
sessions.
</p>
<p>
Ardour's transport is aligned to LTC frame start/end positions according
to the SMPTE 12M-1999 specification, which means that the first bit of an
LTC frame is aligned to different lines of a video frame, depending on the
TV standard used. Only for film (24&nbsp;fps) does the LTC frame directly
match the video frame boundaries.
</p>
<img src="/ardour/manual/html/diagrams/ltc-transport-alignment.png" title="LTC frame alignment" alt="LTC frame alignment"/>
<p><em>Figure: LTC frame alignment for the 525/60 TV standard</em></p>
<p>
Ardour supports vari-speed and backwards playback but will only follow
speed changes if the <kbd class="optoff">sync locked</kbd> option is
disabled.
</p>
<p>
While Ardour is chasing LTC, the main transport clock will display the
received Timecode as well as the delta between the incoming signal and
Ardour's transport position.
</p>
<p>
A global offset between incoming timecode and Ardour's transport can be
configured in <kbd class="menu">Session &gt; Properties</kbd>.
</p>
<p>
The user-bits in the received LTC frame are ignored.
</p>
<h3>MTC - MIDI Timecode</h3>
<p>
Ardour's MTC slave parses <dfn>full timecode messages</dfn> as well as
MTC <dfn>quarter-frame messages</dfn> arriving on the
<code>ardour:MTC in</code> port. The transport will only start rolling
once a complete sequence of 8 quarter frames has been received.
</p>
<p>
Ardour supports vari-speed and backwards playback but will only follow
MTC speed changes if the <kbd class="optoff">sync locked</kbd> option
is disabled.
</p>
<p>
When Ardour is chasing MTC, the main transport clock will display the
received Timecode as well as the delta between the incoming signal and
Ardour's transport position.
</p>
<h3>JACK Transport</h3>
<p>
When slaved to JACK, Ardour's transport will be identical to
JACK transport. As opposed to other slaves, Ardour can be used to control
the JACK transport states (stopped/rolling). No port connections need to
be made for JACK transport to work.
</p>
<p>
JACK-transport does not support vari-speed, nor offsets. Ardour does not
chase the timecode but is always in perfect sample-sync with it.
</p>
<p>
JACK transport also includes tempo-based time information, i.e. Bar:Beats:Ticks
and beats-per-minute. However, only one JACK application can provide this
information at a given time. The checkbox
<kbd class="option">Session &gt; Properties &gt; JACK Time Master</kbd>
configures Ardour to act as translator from timecode to BBT information.
</p>
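<p>
What the time master provides can be sketched as a mapping from transport
time to Bar|Beat|Tick. This toy version assumes a fixed tempo, a 4/4 meter
and a tick resolution of 1920 per beat, whereas Ardour's real tempo map
handles changes of all of these:
</p>

```python
# Sketch: translating absolute transport time to BBT, assuming a constant
# tempo and 4/4 meter (a simplification of what a JACK time master does).
def seconds_to_bbt(t, bpm=120.0, beats_per_bar=4, ticks_per_beat=1920):
    beats = t * bpm / 60.0
    bar = int(beats // beats_per_bar) + 1     # bars are 1-based
    beat = int(beats % beats_per_bar) + 1     # beats are 1-based
    tick = int(round((beats % 1.0) * ticks_per_beat))
    return bar, beat, tick

print(seconds_to_bbt(2.5))   # (2, 2, 0): five beats into the session
```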

---
title: Overview of all Timecode related settings
menu_title: Overview of Timecode settings
---
<h2>Accessing the Settings and Preferences</h2>
<p>
Timecode settings are accessed from the menu in three places:
</p>
<ul>
<li><kbd class="menu">Session &gt; Properties &gt; Timecode</kbd></li>
<li><kbd class="menu">Edit &gt; Preferences &gt; Transport</kbd></li>
<li><kbd class="menu">Edit &gt; Preferences &gt; MIDI</kbd></li>
</ul>
<h2>Timecode Settings</h2>
<dl>
<dt><kbd class="menu">Timecode frames-per-second</kbd></dt>
<dd>
Configures the timecode frames-per-second (23.976, 24, 24.975, 25, 29.97,
29.97 drop, 30, 30 drop, 59.94, 60). Note that all fractional
frame rates are actually fps&nbsp;&times;&nbsp;(1000.0/1001.0).
</dd>
<dt><kbd class="menu">Pull up/down</kbd></dt>
<dd>
Video pull-up modes change the effective samplerate of Ardour to
allow for changing a film soundtrack from one frame rate to another.
See <a href="http://en.wikipedia.org/wiki/Telecine">Telecine</a>
</dd>
<dt><kbd class="menu">Slave Timecode offset</kbd></dt>
<dd>
The specified offset is added to the received timecode (MTC or
LTC).
</dd>
<dt><kbd class="menu">Timecode Generator offset</kbd></dt>
<dd>
Specify an offset which is added to the generated timecode (so far only LTC).
</dd>
<dt><kbd class="option">JACK Time Master</kbd></dt>
<dd>
Provide Bar|Beat|Tick and other information to JACK.
</dd>
</dl>
<p>These settings are session specific.</p>
<h2>Transport Preferences</h2>
<dl>
<dt><kbd class="menu">External timecode source</kbd></dt>
<dd>
Select timecode source: JACK, LTC, MTC, MIDI Clock
</dd>
<dt><kbd class="option">Match session video frame rate to external timecode</kbd></dt>
<dd>
This option controls the value of the video frame rate <em>while
chasing</em> an external timecode source. When enabled, the
session video frame rate will be changed to match that of the selected
external timecode source. When disabled, the session video frame rate
will not be changed to match that of the selected external timecode
source. Instead the frame rate indication in the main clock will flash
red and Ardour will convert between the external timecode standard and
the session standard.
</dd>
<dt><kbd class="option">External timecode is sync locked</kbd></dt>
<dd>
Indicates that the selected external timecode source shares sync (Black
&amp; Burst, Wordclock, etc) with the audio interface.
</dd>
<dt><kbd class="option">Lock to 29.9700 fps instead of 30000/1001</kbd></dt>
<dd>
The external timecode source is assumed to use 29.97 fps instead of
30000/1001. SMPTE 12M-1999 specifies 29.97df as 30000/1001. The spec
further mentions that drop-frame timecode has an accumulated error of
-86&nbsp;ms over a 24-hour period. Drop-frame timecode would compensate
exactly for an NTSC color frame rate of 30 * 0.9990 (i.e. 29.970000). That
is not the actual rate. However, some vendors use that rate &mdash; despite
it being against the spec &mdash; because using exactly 29.97 fps has zero
timecode drift.
</dd>
<dt><kbd class="menu">LTC incoming port</kbd></dt>
<dd>
Offers a session-agnostic way to retain the LTC port connection.
</dd>
<dt><kbd class="option">Enable LTC generator</kbd></dt>
<dd>Does just what it says.</dd>
<dt><kbd class="option">Send LTC while stopped</kbd></dt>
<dd>
Enable to continue to send LTC information even when the transport
(playhead) is not moving. This mode is intended to drive analog tape
machines which unspool the tape if no LTC timecode is received.
</dd>
<dt><kbd class="menu">LTC generator level</kbd></dt>
<dd>
Specifies the peak volume of the generated LTC signal in dBFS. A good value
is 0&nbsp;dBu, which corresponds to -18&nbsp;dBFS in an EBU calibrated system.
</dd>
</dl>
<p>These settings are common to all sessions.</p>
<h2>MIDI Preferences</h2>
<dl>
<dt><kbd class="option">Send MIDI Timecode</kbd></dt><dd>Enable MTC generator</dd>
<dt><kbd class="option">Send MIDI Clock</kbd></dt><dd>Enable MIDI Clock generator</dd>
</dl>
<p>These settings are also common to all sessions.</p>