<?xml version="1.0" standalone="no"?>

<!DOCTYPE section PUBLIC "-//OASIS//DTD DocBook XML V4.4//EN" "http://www.oasis-open.org/docbook/xml/4.4/docbookx.dtd" [
]>

<section id="sn-monitoring">
<title>Monitoring</title>
<para>
If you are recording an acoustic instrument or voice with no
pre-existing recorded material as an accompaniment, then you probably
don't need to worry about monitoring. Just make sure you've made the
right <link linkend="sn-jack">connections</link> and you should be ready
to record without reading this section.
</para>

<para>
However, if a musician is playing an instrument (it doesn't matter what
kind) while listening to some pre-existing material, then it is
important that some mechanism exists to allow her to hear both her own
playing and the accompaniment. The same is true in a slightly different
way if the instrument makes no sound until the electrical signal it
creates has been amplified and fed to some loudspeakers. Listening to
the performance in this way is called monitoring.
</para>
<para>
So, if you are recording an electrical or software instrument/signal,
and/or the musician wants to listen to existing material while
performing, then you need to ensure that signal routing is set up to
allow monitoring. You have two basic choices:
</para>
<section id="hardware-monitoring">
<title>Hardware Monitoring</title>
<para>
Hardware monitoring uses the capabilities of your audio interface to
route an incoming signal (e.g. someone playing a guitar into a
microphone) to an output connection (for example, the speaker outputs,
or a dedicated analog monitoring stereo pair). Most audio interfaces
can do this, but how you get them to do so, and what else they can do,
varies greatly. We can divide audio interfaces into three general
categories:
</para>
<itemizedlist>
<listitem>
<para>
relatively simple, typically stereo, devices that allow the signal
being recorded to be routed back to the main outputs (most
"consumer" audio interfaces fit this description, along with
anything that provides an "AC97-compliant CODEC")
</para>
</listitem>

<listitem>
<para>
multichannel devices that allow a given input channel to be routed
back to its corresponding output channel (the main example is the
RME Digi9652)
</para>
</listitem>

<listitem>
<para>
multichannel devices that allow any input channel, along with any
playback channel, to be routed to any output channel (the RME HDSP
and various interfaces based on the envy24/ice1712 chipsets, such
as the M-Audio Delta 1010, EZ-8 and various Terratec cards)
</para>
</listitem>
</itemizedlist>
<section id="monitoring-consumer-audio-interfaces">
<title>"Consumer" audio interfaces and monitoring</title>
<para>
For interfaces in the first category, there is no standard method of
getting the signal routing correct. The variations in the wiring of
hardware mixing chips, and the capabilities of those chips, mean
that you will have to get familiar with a hardware mixer control
program and the details of your audio interface. In simple cases,
increasing the level of the control named "Line In" or "Mic" in the
hardware mixer control program will suffice. But this is not a
general rule, because there is no general rule.
</para>
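<para>
For example, with an ALSA-driven consumer interface you can often do
this from a terminal with <command>amixer</command>. The control names
and levels below are only illustrative; your card will almost certainly
use different ones, so list what it actually provides first:
</para>
<screen>
# list the mixer controls your card provides
amixer scontrols

# unmute the line input and raise its monitor level (names vary by card)
amixer sset 'Line' 75% unmute

# likewise for a microphone input, also enabling it for capture
amixer sset 'Mic' 60% unmute cap
</screen>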
<para>
The following diagram shows a fairly typical AC97-based audio
interface schematic:
</para>
<mediaobject>
<imageobject>
<imagedata fileref="images/simplemixer.png"/>
</imageobject>
</mediaobject>
<para>
Notice:
</para>
<itemizedlist>
<listitem>
<para>
there are multiple input connections, but only one can be used
as the capture source
</para>
</listitem>

<listitem>
<para>
it is (normally) possible to route the input signals back to the
outputs, and independently control the gain for this "monitored"
signal
</para>
</listitem>

<listitem>
<para>
it may or may not be possible to choose the playback stream as
the capture stream
</para>
</listitem>
</itemizedlist>
</section>
<section id="monitoring-prosumer-audio-interfaces">
<title>High-end "prosumer" interfaces and monitoring</title>
<para>
For the only interface in the second category, the RME Digi9652
("Hammerfall"), the direct monitoring facilities are simplistic but
useful in some circumstances. They are best controlled using
<emphasis>JACK hardware monitoring</emphasis>.
</para>
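<para>
As a sketch of what this means in practice, the ALSA backend of JACK
accepts a <option>-H</option> (hardware monitoring) flag when it is
started. The device name, sample rate and buffer sizes below are
examples only and will need to match your own system:
</para>
<screen>
# start JACK with the ALSA backend, asking it to use hardware monitoring
jackd -d alsa -d hw:0 -r 48000 -p 256 -n 2 -H
</screen>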
<para>
When using one of the interfaces in the third category, most people
find it useful to use hardware monitoring, but prefer to control it
using a dedicated hardware mixer control program. If you have an RME
HDSP system, then <command>hdspmixer</command> is the relevant
program. For interfaces based on the envy24/ice1712/ice1724
chipsets, such as the Delta1010, Terratecs and others,
<command>envy24control</command> is the right choice. Both programs
offer access to very powerful matrix mixers that permit many
different variations on signal routing, for both incoming signals
and the signals being played back by the computer. You will need to
spend some time working with these programs to grasp their potential
and their usage in different situations.
</para>
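<para>
If you are not sure which card these tools should address, it can help
to list the cards ALSA knows about first. The commands below are just
an illustration; only one of the two mixer programs will apply to your
hardware:
</para>
<screen>
# list the sound cards ALSA has detected
aplay -l

# then launch the matrix mixer that matches your interface
hdspmixer
# or, for envy24/ice1712 based cards
envy24control
</screen>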
<para>
The following diagram gives a partial view of the monitoring
schematics for this class of audio interface. Each input can be
routed back to any output, and each such routing has its own gain
control. The diagram only shows the routings for "in1" to avoid
becoming completely incomprehensible.
</para>
<mediaobject>
<imageobject>
<imagedata fileref="images/matrixmixer.png"/>
</imageobject>
</mediaobject>
</section>
</section>

<section id="jack-hardware-monitoring">
<title>JACK hardware monitoring</title>
<para></para>
</section>
<section id="software-monitoring">
<title>Software monitoring</title>
<para>
Much simpler than hardware monitoring is "software monitoring". This
means that any incoming signal (say, through a Line In connector) is
delivered to software (such as Ardour) which can then deliver it back
to any output it chooses, possibly having subjected it to various
processing beforehand. The software can also mix signals together
before delivering them back to the output. The fact that software
monitoring can blend incoming audio with pre-recorded material,
while adjusting for latency and other factors, is the big plus
for this method. The major downside is latency. There will always be a
delay between the signal arriving at your audio interface inputs and
it re-emerging from the outputs, and if this delay is too long, it can
cause problems for the performer who is listening. They will sense a
delay between pressing a key, pulling the bow, hitting the drum, etc.
and hearing the sound it produces.
</para>
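<para>
As a rough guide, the buffering latency added by JACK is the period
size multiplied by the number of periods, divided by the sample rate;
converters and other hardware add a little more on top. The figures and
settings below are illustrative only:
</para>
<screen>
#   64 frames/period x 2 periods / 48000 Hz = roughly 2.7 ms
# 1024 frames/period x 2 periods / 48000 Hz = roughly 42.7 ms (clearly audible)
jackd -d alsa -d hw:0 -r 48000 -p 64 -n 2
</screen>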
<para>
However, if your system is capable of low latency audio, it's likely
that you can use software monitoring effectively if it suits your
goals.
</para>
</section>

<section id="controlling-monitoring-within-ardour">
<title>Controlling monitoring choices within Ardour</title>
<para></para>
</section>
<!--
<xi:include xmlns:xi="http://www.w3.org/2001/XInclude"
href="Some_Subsection.xml" />
-->
</section>