Getting Audio In, Out and Around Your Computer
Before you can begin to use Ardour, you will need to get the audio
input/output capabilities of your system working and properly
configured. There are two aspects to this process: getting your audio
interface (soundcard) working, and configuring it to work with the
JACK Audio Connection Kit (JACK).
JACK
It is extremely important to understand that Ardour does not interact
directly with your audio interface when it is running. Instead, all of
the audio data signals that Ardour receives and generates are sent to
and from JACK, a piece of software that routes audio data between an
audio interface and audio applications, in real time.
Traditionally, most of the audio sources that you would want to
record, as well as a lot of the more significant effects processing,
existed outside the computer. Consequently one of the biggest issues
in integrating a computer into the operation of the studio is how to
move audio data in and out of the computer.
However, it is becoming increasingly common for studios to use audio
sources and effects processing that consist entirely of
software, quite often running on the same machine as an audio
sequencer or digital audio workstation (DAW). A new problem arises in
such situations, because moving audio in and out of the DAW no longer
involves your hardware audio interface. Instead, data has to be moved
from one piece of software to another, preferably with the same kind
of sample synchronisation you’d have in a properly configured
digital hardware system. This is a problem that has been solved at
least a couple of times (ReWire from PropellerHeads and DirectConnect
from Digidesign are the two most common examples), but JACK is a new
design developed as an open source software project, and is thusly
available for anyone to use, learn from, extend, *fix or modify.
New users may not initially realize that by using JACK, their computer
becomes an extremely flexible and powerful audio tool - especially
with Ardour acting as the "heart" of the system.
Getting Your Audio Interface Working
Although Ardour runs on OS X as well as Linux, this documentation
describes only a Linux (ALSA) system. The issues faced on OS X tend
to be entirely different, and are centered mostly on JACK. There are
also alternative audio device driver families for Linux, but they are
not discussed here.
Getting your audio interface working can be the hardest part of
setting your computer up to run Ardour, or it could be one of the
easiest. The level of difficulty you will face depends on the type of
audio interface ("soundcard") you are using, the operating system
version you are using, and your own understanding of how it all works.
In an ideal world, your computer already has a working audio
interface, and all you need to do is start up qjackctl and run JACK.
You can determine if you face this ideal situation by doing a few
simple tests on your machine. The most obvious test is whether
you’ve already heard audio coming out of your computer. If you are
in this situation, you can skip ahead.
Checking For an Audio Interface
If you’ve never tried to play audio on your computer before, you
should use a basic playback program such as play, aplay or possibly
xmms. Find an audio file on your machine (locate
.wav may help here), and try to play it. There are several
possibilities:
You may get an error from the program.
You may hear nothing.
You may hear something, but it’s too quiet.
You may hear something, but from the wrong loudspeakers.
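A quick way to run this test from a terminal is sketched below. The file path is an assumption: many distributions ship test sounds with ALSA, but yours may keep them elsewhere, and any .wav file on your system will do.

```shell
# Look for a test sound shipped with ALSA (path varies by distribution)
ls /usr/share/sounds/alsa/

# Play one of them through the default ALSA playback device.
# If this produces audible sound, your interface is basically working.
aplay /usr/share/sounds/alsa/Front_Center.wav
```

If aplay reports an error such as "audio open error", your interface is not yet configured; if it runs silently to completion, check your mixer levels and speaker wiring before assuming a driver problem.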
Selecting Capture Source
Many audio interfaces, particularly the cheaper varieties that are
often found built into computers, have ways to plug in both
microphones and instruments or other audio equipment to be recorded.
This immediately poses a question: how does Ardour (or any software)
know which signal to record, the one coming into the microphone input,
or the one arriving at the "line in" socket? The same question arises
also for "high-end" audio interfaces, though in different ways.
The short answer is: Ardour doesn’t. Instead, this is a choice you
have to make using a program that understands how to control
the mixing hardware on the audio interface. Linux/ALSA has a number of
such programs: alsamixer, gamix, aumix, kmix are just a few of them.
Each of them offers you a way to select which of the possible
recordable signals will be used as the "capture source". How you
select the preferred signal varies from program to program, so you
will have to consult the help documentation for whichever program you
choose to use.
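For command-line use, amixer (part of ALSA) can make the same selection non-interactively. The control names below ('Input Source', 'Mic', 'Capture') are typical examples only; the controls your card actually exposes will differ, so list them first.

```shell
# List the simple mixer controls your interface exposes
amixer scontrols

# Select the microphone input as the capture source
# (control and value names vary from card to card)
amixer sset 'Input Source' 'Mic'

# Enable capture on that input and set a recording level
amixer sset 'Capture' cap 80%
</imports>
```

Running amixer with no arguments prints the current state of every control, which is useful for checking that your selection took effect.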
There are also a few programs that offer ways to control just one
particular kind of audio interface. For example, the
hdspmixer program offers control over the
very powerful matrix mixer present on several RME audio interfaces.
envy24ctrl does the same for a number of
interfaces built around the common ice1712/envy24 chipset, found in
devices from M-Audio, Terratec and others. Please note that this is
quite similar to the situation for Windows and MacOS users, where each audio
interface often comes with its own control program that allows certain
critical configuration choices to be made.
Monitoring Choices
It’s unfortunate that we have to raise this issue at a point in the
manual where you, the reader, may not even know what "monitoring"
means. However, it is such an absolutely critical aspect of using any
digital audio workstation that we need to at least cover the basics
here. The only people who don’t need to care about monitoring are
those who will never use Ardour to record a live performance (even one
performed using a software synthesizer).
Monitoring is the term we use to describe listening to what Ardour is
recording. If you are playing a guitar and recording it with Ardour,
you can probably hear the guitar’s own sound, but there are many
situations where relying on the sound of the instrument is completely
inadequate. For example, with an electronic instrument, there is no
sound until the electrical signal that it generates has been processed
by an amplifier and fed to a loudspeaker. But if Ardour is recording
the instrument’s signal, what is responsible for sending it to the
amp+loudspeakers? It can get a lot more complex than that: if you are
recording multiple performers at the same time, each performer needs
to hear their own playing/singing, but they also probably need to hear
some of their colleagues’ sound as well. You might be overdubbing
yourself - playing a new line on an instrument while listening to
tracks you’ve already recorded - how do you hear the new material as
well as the existing stuff?
Well, hopefully, you’re convinced that there are some questions to
be dealt with surrounding monitoring; see
for more in-depth information.
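Because Ardour's audio travels through JACK, monitoring routes are just JACK connections. The sketch below shows how such a route could be wired up from the command line once JACK and Ardour are running; the port names are typical examples and will differ on your system, which is why listing the ports comes first.

```shell
# List every port the running JACK server knows about
jack_lsp

# Route Ardour's master bus to the hardware playback ports so you can
# hear what Ardour is playing back (port names are examples only)
jack_connect "ardour:master/out 1" "system:playback_1"
jack_connect "ardour:master/out 2" "system:playback_2"
```

qjackctl's "Connections" window performs exactly these operations graphically, so you will rarely need to type them by hand.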
Can I Use Multiple Soundcards?
There are many good reasons why you should not even attempt to do
this. Save your money for a while and buy yourself a properly
designed multichannel soundcard.
Qjackctl
JACK itself does not come with a graphical user interface - to start
JACK and control it you need to have access to a command line and a
basic knowledge of Unix-like operating systems. However,
qjackctl is a
wonderful application that wraps JACK up with a graphical interface
that is both nice to look at and useful at the same time. qjackctl is the
recommended way of using JACK.
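For reference, starting JACK from the command line looks like the sketch below; qjackctl simply builds and runs a command of this form for you. The device name and buffer settings are examples, and the right values depend entirely on your hardware.

```shell
# Start JACK with the ALSA backend on the first soundcard (hw:0),
# at 44.1 kHz, with 1024 frames per period and 2 periods per buffer.
# All of these values are examples; adjust them for your interface.
jackd -d alsa -d hw:0 -r 44100 -p 1024 -n 2
```

Smaller period sizes (-p) lower latency but increase the risk of audible glitches (xruns) on slower machines, so it is normal to experiment with these numbers.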
You should be able to start qjackctl from the “application menu”
of your system, typically found on the panel/appbar/dock or whatever
it’s called that lives at the top/bottom/left/right of your screen.