<?xml version="1.0" standalone="no"?>
<!DOCTYPE section PUBLIC "-//OASIS//DTD DocBook XML V4.4//EN" "http://www.oasis-open.org/docbook/xml/4.4/docbookx.dtd" [
]>
<section id="sn-synchronization_concepts">
<title>Synchronization Concepts</title>
<para>
As soon as you start handling audio on more than one device, it is
important to understand and to think about
<emphasis>synchronization</emphasis>: how to get the devices to have
the same sense of time and speed.
</para>
<para>
However, there are two fundamentally different kinds of synchronization:
</para>
<section id="sample-clock">
<title>Sample Clock</title>
<para>
As outlined in the <emphasis>introductory concepts</emphasis> section,
digital audio is created by taking a "sample" of an analog signal
level on a periodic basis, say 48000 times per second (the "sample
rate"). A dedicated clock, the "sample clock" (actually an oscillating
crystal, but technology people call such things clocks), "ticks" at
that rate, and every time it does, a new sample is measured. The way
the clock is used to convert digital audio back to an analog signal
(i.e. to be sent to some loudspeakers) is more complex, but the clock
is still an absolutely fundamental part of the mechanism.
</para>
<para>
Whenever you connect two digital audio devices together in order to
move audio data from one to the other, you <emphasis>must ensure they
share the same sample clock</emphasis>. Why is this necessary? The
oscillating crystals used for sample clocks are generally very stable
(they always tick at the same speed), but there are always minute
differences in the speed at which any two clocks tick. When each
device is used by itself, this makes no difference, but connect two
digital audio devices together and these minute differences accumulate
over time. Eventually, one of the devices will be trying to read a
sample "in the middle" of the other device's tick, and the result is a
small click or pop in the audio stream.
</para>
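<para>
To get a feel for how quickly this matters, the following sketch (plain
C written purely for illustration, not code taken from Ardour) computes
how long it takes two nominally identical 48000 Hz clocks, differing by
an assumed number of parts per million, to drift apart by one whole
sample.
</para>
<programlisting><![CDATA[
#include <stdio.h>

/* Illustrative only: how long until two nominally identical sample
 * clocks, differing by a small amount, disagree by one whole sample? */
int main(void)
{
    double rate_hz   = 48000.0; /* nominal sample rate */
    double drift_ppm = 10.0;    /* assumed difference, in parts per million */

    /* samples gained (or lost) per second by the faster clock */
    double samples_per_sec = rate_hz * drift_ppm / 1e6;

    /* time until the two clocks disagree by one whole sample */
    double seconds = 1.0 / samples_per_sec;

    printf("%.0f ppm drift at %.0f Hz: one sample of error every %.2f s\n",
           drift_ppm, rate_hz, seconds);
    return 0;
}
]]></programlisting>
<para>
With a 10 ppm difference, the two devices disagree by a whole sample
after roughly two seconds, which is why a shared sample clock is
essential whenever audio data moves between them.
</para>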
</section>
<section id="timeline-sync">
<title>Timeline Sync</title>
<para>
The concept of a timeline comes up over and over again when working
with a digital audio workstation, and also with video editing systems.
By "timeline" we mean nothing more than some way to define a "name"
for the point in time where certain sounds (and/or visual images)
occur. When you work in Ardour's editor window, the rulers near the
top provide one or more timelines in different units. You can look at
the editor window and say "this sound starts at 1 minute 32 seconds"
or "this track fades out starting at bar 13 beat 22".
</para>
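<para>
As a rough illustration of how positions named in different units all
refer to the same underlying timeline, the sketch below (plain C,
illustrative only) reduces a minutes:seconds position and a bar/beat
position to audio sample indices; the 48000 Hz sample rate, 120 BPM
tempo and 4/4 meter are assumptions invented for the example.
</para>
<programlisting><![CDATA[
#include <stdio.h>

int main(void)
{
    double sample_rate   = 48000.0; /* assumed sample rate */
    double tempo_bpm     = 120.0;   /* assumed tempo */
    double beats_per_bar = 4.0;     /* assumed 4/4 meter */

    /* "this sound starts at 1 minute 32 seconds" */
    double min_sec_samples = (1 * 60 + 32) * sample_rate;

    /* "this track fades out starting at bar 13 beat 2" */
    int bar = 13, beat = 2;
    double beats = (bar - 1) * beats_per_bar + (beat - 1);
    double bar_beat_samples = beats * (60.0 / tempo_bpm) * sample_rate;

    printf("1 min 32 sec   -> sample %.0f\n", min_sec_samples);
    printf("bar 13, beat 2 -> sample %.0f\n", bar_beat_samples);
    return 0;
}
]]></programlisting>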
<para>
But what happens when you want to share a timeline between two
different devices? For example, you may want to run a hardware video
editor in conjunction with Ardour, and always have the visual and
audio playback be at the same point "in time". How do they each know
what "in time" means? How do they know where the other one is? A
mechanism for answering these questions provides <emphasis>timeline
synchronization</emphasis>.
</para>
<para>
Timeline synchronization is entirely different from sample clock
synchronization. Two devices can share a sample clock but never
exchange timeline information. Conversely, two devices can share
timeline information while running on different sample clocks - they
might not even have sample clocks if they are analog devices.
</para>
</section>
<section id="word-clock">
<title>Word Clock</title>
<para>
"Word Clock" is the name given to a signal used to distribute the
"ticks" of a sample clock to multiple devices. Most digital audio
devices intended for professional use have a word clock connector and
a way to tell the device to use either its internal sample clock (for
standalone use) or the word clock signal as the sample clock. Because
of the electrical characteristics of the signal, it is very important
that any length of cable used to distribute word clock is "terminated"
with a 75 ohm resistor at both ends. Unfortunately, some devices
include this terminator themselves, some contain a switchable
resistor, and some do not. Worse still, the user manuals for many
devices do not provide any information on their termination
configuration. It is often necessary to ask the manufacturer in cases
where it is not made obvious by markings near the word clock
connectors on the device.
</para>
</section>
<section id="timecode">
<title>Timecode</title>
<para>
"Timecode" is a signal that contains positional or "timeline"
information. There are several different kinds of timecode signal, but
by far the most important is known as SMPTE. Its name is an acronym
for the Society of Motion Picture and Television Engineers, and
timecode is just one of the standards that body has defined, but it is
the best known. Because of its origins in the film/video world, SMPTE
is very centered on the time units that matter to film/video editors.
The base unit is called a "frame" and corresponds to a single still
image in a film or video. There are typically on the order of 20-30
frames per second, so the actual resolution of SMPTE timecode is not
very good compared to audio-based units where there are tens of
thousands of "frames" per second: at 30 frames per second, a frame
lasts about 33 milliseconds, whereas a single audio sample at 48000 Hz
lasts roughly 0.02 milliseconds.
</para>
</section>
<section id="SMPTE">
<title>SMPTE</title>
<para>
SMPTE defines time using a combination of hours, minutes, seconds,
frames and subframes, combined with the frame rate. In a film/video
environment, SMPTE is typically stored on the film/video media, and
sent from the device used to play it. There are different ways of
storing it on the media - you may come across terms like LTC and VITC
- but the crucial idea to grasp is that the film/video has a timecode
signal "stamped" into it, so that it is always possible to determine
"what time it is" when any given image is visible.
</para>
<para>
SMPTE timecode is sent from one system to another as an analog audio
signal. You could listen to it if you wanted to, though it sounds like
a screeching and generally unpleasant noise. What the SMPTE standard
defines is a way to encode and decode the
hrs:mins:secs:frames:subframes time into or from this audio signal.
</para>
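<para>
To make the relationship between these units and audio time concrete,
here is a small sketch (plain C, written for this explanation rather
than taken from Ardour) that converts an hours:minutes:seconds:frames
position into an audio sample index, assuming a fixed frame rate and
ignoring subframes and drop-frame complications.
</para>
<programlisting><![CDATA[
#include <stdio.h>

/* Illustrative conversion of a SMPTE-style hh:mm:ss:ff position into an
 * audio sample index. Assumes a constant frame rate; subframes and
 * drop-frame timecode are deliberately ignored to keep the idea clear. */
long long smpte_to_samples(int hh, int mm, int ss, int ff,
                           double fps, double sample_rate)
{
    double seconds = hh * 3600.0 + mm * 60.0 + ss + ff / fps;
    return (long long)(seconds * sample_rate);
}

int main(void)
{
    /* e.g. 00:01:32:12 at 25 frames per second, 48000 Hz */
    printf("%lld samples\n", smpte_to_samples(0, 1, 32, 12, 25.0, 48000.0));
    return 0;
}
]]></programlisting>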
</section>
<section id="mtc">
<title>MTC</title>
<para>
The other very common form of timecode is known as "MTC" (MIDI Time
Code). However, MTC is actually nothing more than a different way to
transmit SMPTE timecode. It uses the exact same units as SMPTE
timecode, but rather than sending the signal as audio, MTC defines a
transmission method that uses a MIDI cable and a data protocol. MTC
consumes a measurable, but small, percentage of the available
bandwidth on a MIDI cable (on the order of 2-3%). Most of the time, it
is wise to use a dedicated cable for MTC and MMC (MIDI Machine
Control) and not share it with "musical" MIDI data (the kind that an
instrument would send while being played).
</para>
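<para>
For the curious, the sketch below shows roughly how a receiver
assembles a position from MTC. A full position is spread across eight
two-byte "quarter-frame" MIDI messages: a 0xF1 status byte followed by
a data byte whose upper nibble says which piece it is and whose lower
nibble carries four bits of the value. This is an illustrative fragment
based on the MIDI Time Code specification, not code from Ardour.
</para>
<programlisting><![CDATA[
/* Sketch: accumulate hours:minutes:seconds:frames from the data bytes of
 * the eight MTC quarter-frame messages (the 0xF1 status byte has already
 * been stripped by the caller). */
struct mtc_time {
    int hours, minutes, seconds, frames;
    int rate_code; /* 0 = 24 fps, 1 = 25 fps, 2 = 29.97 drop, 3 = 30 fps */
};

static void mtc_quarter_frame(struct mtc_time *t, unsigned char data)
{
    int piece = (data >> 4) & 0x07; /* which of the eight pieces */
    int nib   = data & 0x0F;        /* four bits of payload */

    switch (piece) {
    case 0: t->frames  = (t->frames  & 0xF0) | nib;        break;
    case 1: t->frames  = (t->frames  & 0x0F) | (nib << 4); break;
    case 2: t->seconds = (t->seconds & 0xF0) | nib;        break;
    case 3: t->seconds = (t->seconds & 0x0F) | (nib << 4); break;
    case 4: t->minutes = (t->minutes & 0xF0) | nib;        break;
    case 5: t->minutes = (t->minutes & 0x0F) | (nib << 4); break;
    case 6: t->hours   = (t->hours   & 0xF0) | nib;        break;
    case 7: /* low bit: top bit of hours; next two bits: frame rate code */
        t->hours     = (t->hours & 0x0F) | ((nib & 0x01) << 4);
        t->rate_code = (nib >> 1) & 0x03;
        break;
    }
}
]]></programlisting>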
</section>
<section id="jack-transport">
<title>JACK Transport</title>
<para>
For Ardour and other programs that use <emphasis>JACK</emphasis>,
there is another method of doing timeline synchronization that is not
based on SMPTE or MTC.
</para>
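<para>
From a program's point of view, following the JACK transport is simply
a matter of asking JACK where the shared timeline currently is. The
sketch below uses the JACK transport API to do exactly that; the client
name "transport-peek" is arbitrary and error handling is kept to a
minimum.
</para>
<programlisting><![CDATA[
#include <stdio.h>
#include <jack/jack.h>
#include <jack/transport.h>

int main(void)
{
    /* connect to the JACK server under an arbitrary client name */
    jack_client_t *client = jack_client_open("transport-peek",
                                             JackNullOption, NULL);
    if (client == NULL)
        return 1;

    jack_position_t pos;
    jack_transport_state_t state = jack_transport_query(client, &pos);

    /* pos.frame is the shared timeline position, in samples */
    printf("transport %s at sample %u\n",
           state == JackTransportRolling ? "rolling" : "not rolling",
           pos.frame);

    /* if a timebase master (such as Ardour) publishes bar/beat/tick
     * information, it is available to every client as well */
    if (pos.valid & JackPositionBBT)
        printf("bar %d, beat %d, tick %d\n", pos.bar, pos.beat, pos.tick);

    jack_client_close(client);
    return 0;
}
]]></programlisting>
<para>
Because every JACK client receives the same answer to this query, all
programs that follow the JACK transport are automatically at the same
point on the shared timeline.
</para>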
</section>
<!--
<xi:include xmlns:xi="http://www.w3.org/2001/XInclude"
href="Some_Subsection.xml" />
-->
</section>